San Francisco Chronicle

Tesla probe ends — no recall or fines

- By Tom Krisher, Joan Lowy and Dee-Ann Durbin

WASHINGTON — Tesla Motors Inc. won’t face a recall or fine as a result of a fatal crash involving its Autopilot system, but federal safety regulators are warning auto manufacturers and drivers not to treat semiautonomous cars as if they are fully self-driving.

The National Highway Traffic Safety Administration said Thursday it found that the system had no safety defects at the time of the May 7 crash in Florida, and that it was primarily designed to prevent rear-end collisions rather than other kinds of crashes.

Bryan Thomas, the agency’s chief spokesman, said automated driving systems still require a driver’s full attention. He warned that automakers need to keep tabs on how drivers use the technology, and should design vehicles “with the inattentive driver in mind.”

The probe began June 28, nearly two months after a driver using Autopilot in a 2015 Tesla Model S died when the system failed to spot a tractor-trailer crossing the car’s path on a highway in Williston, Fla.

Tesla’s Autopilot uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company has said that before Autopilot can be used, drivers must acknowledge that it’s an “assist feature” that requires both hands on the wheel at all times and that drivers must be ready to take control.

The agency’s criticism is expected to influence how automakers market semiautonomous systems. Just about every company has developed or is working on similar systems as the industry moves rapidly toward self-driving cars.

The investigation “helps clarify that cars are still supposed to be driven by attentive people, and if people behind the wheel aren’t attentive, it’s not the technology’s fault,” said Karl Brauer, executive publisher of Kelley Blue Book. That will help avoid the stigma that the technology causes accidents, he said.

The safety administration released guidelines last year that attempt to ensure safety without slowing development of semiautonomous and self-driving cars. The agency says self-driving features could dramatically reduce traffic deaths by eliminating human error, which plays a role in 94 percent of fatal crashes.

Thomas said the administration wants to encourage innovation “to get the best answer to how we use these automated systems to the best effect and saving the most lives.”

In its probe, the safety administration evaluated how the system functions and looked into dozens of other crashes involving Teslas, including a July crash on the Pennsylvania Turnpike that injured two people.

The Florida crash killed former Navy SEAL Joshua Brown, 40, of Canton, Ohio. Tesla said at the time that the cameras on Brown’s Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and that neither the car nor Brown applied the brakes.

Thomas said Brown set the car’s cruise control at 74 mph — 9 mph over the limit — less than two minutes before the crash. The safety administration’s crash reconstruction showed the tractor-trailer should have been visible to Brown at least 7 seconds before impact, enough time to react.

Detecting vehicles crossing the car’s path was beyond the capabilities of the Autopilot system, Thomas said on a conference call.

In a statement Thursday, Tesla said it appreciated the agency’s thoroughness in reaching its conclusion.

When Tesla released Autopilot in 2015, some safety advocates questioned whether the Palo Alto company and the safety administration should have allowed the public access to the system before testing was finished.

Consumer Reports magazine called on Tesla to drop the “Autopilot” name because it can give drivers too much trust in their car’s ability to drive itself.

In September, Tesla updated Autopilot software to rely more on radar sensors and less on cameras. The update also disabled the automatic steering if drivers don’t keep both hands on the wheel.

Another federal agency, the National Transportation Safety Board, has opened a broader investigation into the Tesla crash. It could be months before a final report that provides a probable cause for the collision is issued.

The company that made the camera and computer system for Tesla said in September that Tesla had ignored its warnings about possible safety problems.

Israel’s Mobileye said that before the release of Autopilot, it warned Tesla not to allow drivers to use the system without their hands on the steering wheel. Mobileye, a huge company in the self-driving business, has stopped supplying components to Tesla.

Thomas said investigators got information from Mobileye and evaluated the company’s statements, but still reached the no-defect conclusion. He wouldn’t comment on the company’s statements.

Tesla said at the time that Mobileye’s statements were inaccurate and stemmed from Tesla’s plans to develop its own vision system.

Photo: VCG, 2016. CEO Elon Musk’s Tesla has updated Autopilot software to rely more on radar sensors and less on cameras.

Photo: National Transportation Safety Board, 2016. This Tesla Model S, being driven by Joshua Brown, crashed while in Autopilot mode in May. Brown was killed.
