Feds preview rules of the road for self-driving cars
In this July 15, 2016 photo, a double-decker tour bus drives by an Audi self-driving vehicle parked on Pennsylvania Avenue, near the Capitol in Washington. Obama administration officials are previewing long-awaited guidance that attempts to bring self-driving cars to the nation’s roadways safely – without creating so many roadblocks that the technology can’t make it to market quickly.
Traditional automakers and tech companies have been testing self-driving prototypes on public roads for several years, with a human in the driver’s seat just in case. The results suggest that what once seemed like a technology perpetually over the horizon is now fast approaching, especially with car companies announcing a string of investments and acquisitions in recent months.
Federal officials have been struggling with how to capitalise on the technology’s promised safety benefits – the cars can react faster than people, but don’t drink or get distracted – while making sure they are ready for widespread use. The new guidance represents their current thinking, which they hope will bring some order to what has been a chaotic roll-out so far.
Self-driving cars have the potential to save thousands of lives lost on the nation’s roads each year and to change the lives of the elderly and the disabled, US President Barack Obama said in an op-ed published Monday by the Pittsburgh Post-Gazette.
“Safer, more accessible driving. Less congested, less polluted roads. That’s what harnessing technology for good can look like,” Obama wrote. But he added: “We have to get it right. Americans deserve to know they’ll be safe today even as we develop and deploy the technologies of tomorrow.”
One self-driving technology expert said the overall tenor of the guidance signalled that the federal government truly has embraced autonomous driving. “In terms of just attitude, this is huge,” said Bryant Walker Smith, a law professor at the University of South Carolina who closely tracks the technology. He also cautioned that many details remain unclear.
The government did make clear that the National Highway Traffic Safety Administration will seek recalls if semi-autonomous systems don’t make drivers pay attention.
The agency, which is part of the Transportation Department, released guidelines showing how NHTSA can use its recall authority to regulate new technology. “It emphasises that semi-autonomous driving systems that fail to adequately account for the possibility that a distracted or inattentive driver-occupant might fail to retake control of the vehicle in a safety-critical situation may be defined as an unreasonable risk to safety and subject to recall,” the department said in a statement.
NHTSA says the guidelines aren’t aimed at electric car maker Tesla Motors. But the bulletin would address events like a fatal crash in Florida that occurred while a Tesla Model S was operating on the company’s semi-autonomous Autopilot system. The system can brake when it spots obstacles and keep cars in their lanes. But it failed to spot a crossing tractor-trailer, and neither the system nor the driver braked. Autopilot allows drivers to take their hands off the steering wheel for short periods.
Tesla has since announced modifications so Autopilot relies more on radar and less on cameras, which it said were blinded by sunlight in the Florida crash. The company has maintained that Autopilot is a driver-assist system and said it warns drivers they must be ready to take over at any time.
Under the overall guidelines, federal transportation regulators, rather than states, should be in charge of regulating self-driving cars, since the vehicles are essentially controlled by software, not people, administration officials said.
States have historically set the rules for licensing drivers, but when the driver becomes a computer “we intend to occupy the field here,” Transportation Secretary Anthony Foxx said. States, he said, should stick to registering the cars and dealing with questions of liability when they crash.
Automakers should also be allowed to self-certify the safety of autonomous vehicles by following a 15-point checklist for safe design, development, testing and deployment, said officials who briefed reporters. Though companies are not required to follow the guidance – it is voluntary and does not carry the force of formal regulation – Foxx said he expects compliance.
“It’s in their vested interest to go through the rigours that we’re laying out here” to gain the confidence of both regulators and the public, Foxx said.
In somewhat contradictory fashion, officials also said the National Highway Traffic Safety Administration is examining whether it should have “premarket approval” authority, in which the government inspects and approves new technologies like autonomous vehicles. That would be a departure from the agency’s historic self-certification system and might require action from Congress.