Los Angeles Times

Probe clears Autopilot in fatal Florida Tesla crash

NHTSA investigation concludes the car's driver-assist software had no safety defects.

By Russ Mitchell

Drivers need to pay attention while driving, even as technology gradually takes over the task.

That’s the message U.S. safety regulators delivered Thursday after closing an investigat­ion into a fatal Tesla crash in Florida last year involving the vehicle’s Autopilot system. The National Highway Traffic Safety Administra­tion concluded that the driver-assist software in the Tesla Model S had no safety defects and declined to issue a recall.

The agency also studied less-serious Tesla crashes and said it didn't find "any incidents in which the systems did not perform as designed."

But while completely self-driving vehicles may be on the way, automatic braking, adaptive cruise control and other driver-assist technologies now on the market still require "continual and full attention of a driver," NHTSA spokesman Bryan Thomas said in a conference call.

NHTSA said Tesla "fully cooperated" with the investigation and its requests for data. Tesla, in a prepared statement, said: "The safety of our customers comes first, and we appreciate the thoroughness of NHTSA's report and its conclusion."

Hod Lipson, professor of mechanical engineering at Columbia University, said NHTSA's findings were "a vindication not only of Tesla, but of the entire self-driving car industry."

"Yes, driverless cars are going to have accidents. But they're going to have fewer accidents than humans," he said. "And unlike humans, driverless cars are going to keep getting better, halving the number of accidents per mile every so many months. The sooner we get on that exponential trajectory, the better."

The May 2016 crash in Florida drew worldwide attention. The Model S electric sedan, its Autopilot engaged, drove under the trailer of a big-rig truck that was making a left-hand turn across a highway, killing the car's driver, 40-year-old Joshua Brown of Canton, Ohio. The truck driver told police he heard a "Harry Potter" movie playing in the crushed automobile after the crash.

The incident led critics to question whether automated driving technology is ready for highway deployment. However, it remains the only fatality involving Tesla's Autopilot to date, and Tesla Chief Executive Elon Musk has repeatedly insisted that there are fewer crashes per mile driven on Autopilot than per mile driven by a human.

The federal agency also investigated a Pennsylvania crash involving Autopilot that caused injuries, as well as "dozens" of other Tesla crashes, and found no system defects.

The agency pored over data on Tesla crashes in which air bags were deployed while Autopilot was engaged. Many of the crashes, NHTSA said, involved "driver behavior factors," including distraction, driving too fast for conditions and "mode confusion," which can arise when the car and the driver share driving tasks.

The agency said Tesla owner manuals and on-screen instructions make clear that the human driver alone is responsible for driving the car.

But, the agency said, manufacturers need to pay attention to how drivers actually use the technology, not just how they're supposed to use it, and to design vehicles "with the inattentive driver in mind." And companies need to do a better job of educating drivers on system limitations, such as through training sessions at dealerships. "It's not enough to just put it in the owner's manual," Thomas said.

For example, not all drivers may be aware that Tesla's automated braking system is intended to help avoid rear-end collisions, not collisions with trucks crossing a highway.

The closure of the investigat­ion without a recall “helps clarify that cars are still supposed to be driven by attentive people, and if people behind the wheel aren’t attentive, it’s not the technology’s fault,” said Karl Brauer, auto analyst at Kelley Blue Book.

However, one group criticized the agency’s findings. Consumer Watchdog, based in Santa Monica, said “NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the ‘Autopilot’ technology and Tesla’s aggressive marketing.”

The agency did criticize Tesla for its use of the term "Autopilot." Tesla has said its safety record proves that drivers understand Autopilot's limitations.

Officials at NHTSA believe that automated driver-assist technologies, and ultimately self-driving cars, will lead to fewer crashes and traffic fatalities. The agency's report noted a 40% drop in crash rates after an automatic steering feature was added to Tesla cars.

Bryant Walker Smith, an autonomous vehicle law expert at the University of South Carolina, said "the decision shows that data matter, and that those data are increasingly available. I'd expect companies to be increasingly attentive to the data they collect and how they present those data publicly and to NHTSA."

But Smith said the investigation is not the final word in determining fault when humans and robots share driving duties. The inquiry "would have looked different if the vehicle in question had [an] automated driving system that promised to replace the human driver rather than … a system that merely promises to supplement that driver."

THE MODEL S sedan involved in the fatal May 2016 crash. With its Autopilot engaged, the car went under a turning big-rig truck, killing the Tesla's driver. (National Transportation Safety Board)
