Los Angeles Times

Tesla partly blamed in fatal crash

NTSB says Autopilot ‘permitted prolonged’ driver disengagement.

By Jim Puzzanghera

WASHINGTON — A fatal 2016 crash involving a Tesla sedan was caused by the driver’s over-reliance on his vehicle’s Autopilot system and by a truck driver’s failure to yield while entering a Florida roadway, a federal panel determined Tuesday.

But the National Transportation Safety Board also laid some blame on Tesla Inc. in the long-awaited findings of an investigation into the first known fatal accident involving semiautonomous driving technology.

The board said Tesla’s Autopilot contributed to the crash. The software of the Tesla Model S “permitted [the driver’s] prolonged disengagement from the driving task” and let him use the Autopilot system on the wrong type of road.

The NTSB also said technology that senses a driver’s hands on the wheel — such as that used by Tesla — is not an effective way to tell whether the driver is paying attention.

“In this crash, Tesla’s system worked as designed, but it was designed to perform limited tasks in a limited range of environments,” the board’s chairman, Robert L. Sumwalt, said after the board voted 4 to 0 on the probable cause of the crash and staff recommendations for avoiding future crashes.

“Tesla allowed the driver to use the system outside of the environment for which it was designed ... and the system gave far more leeway to the driver to divert his attention to something other than driving,” he said. “The result was a collision that, frankly, should have never happened.”

Joshua Brown, 40, died when the Tesla Model S sedan he was driving smashed into the trailer of a big-rig truck that was making a left turn in front of it from a cross street. Brown was traveling 74 mph using the Tesla’s semiautonomous Autopilot feature, which did not identify the truck and stop the vehicle.

Tesla, which is based in Palo Alto and led by Elon Musk, said that “the safety of our customers comes first” and that its Autopilot technology “significantly increases safety” and reduces crashes.

The company cited a January report by the National Highway Traffic Safety Administration that found a 40% drop in crashes after an automatic steering feature was included in Tesla cars. That report also concluded Tesla’s Autopilot had no safety defects and no recall was needed.

“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” the company said in a statement. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

The NTSB’s staff extensively studied the crash and issued its findings as the automotive industry accelerates its development of self-driving vehicles.

The Trump administration said Tuesday that it was loosening rules on the development of driverless cars. The announcement came less than a week after the House passed legislation that would exempt automakers from some safety standards and permit each of them to put as many as 100,000 self-driving vehicles a year on U.S. roads.

The NTSB recommended that makers of semiautonomous vehicles develop a way to prevent the use of Autopilot-style technology when their vehicles are on roads that are not appropriate for the technology.

“A world of perfect self-driving cars would eliminate tens of thousands of deaths and millions of injuries each year,” Sumwalt said. “It is a long route from being partially automated vehicles to self-driving cars. Until we get there, somebody still has to drive.”

The staff said the Autopilot system functioned as designed during the May 2016 crash in Williston, Fla. But the technology was not meant to be used on the type of road on which the crash occurred. The Tesla owner’s manual says Autopilot should be used only on highways or limited-access roads that have onramps and offramps.

The Florida crash took place on a state road that had access from cross streets. NTSB staffers said they thought Brown was very knowledgeable about the vehicle, but that the owner’s manual could be confusing.

“A driver could have difficulties interpreting exactly [on] which roads it might be appropriate” to use Autopilot, said Ensar Becic, an NTSB human performance investigator.

The staff recommended that Tesla and makers of other semiautonomous vehicles use satellite data to determine the type of road their vehicles are on and that they allow Autopilot-style technology to be used only where appropriate.

The Tesla software did not detect the truck crossing in front of Brown’s vehicle. NTSB staff dismissed the idea — floated last year by Tesla — that the car’s sensors were unable to detect the white truck because it was against a bright sky.

Tests by the National Highway Traffic Safety Administration determined that Tesla and other vehicles with semiautonomous driving technology had great difficulty sensing cross traffic.

“The systems don’t detect that type of object moving across their path with any reliability,” said Robert Molloy, director of the NTSB’s Office of Highway Safety.

The staff also recommended the use of vehicle-to-vehicle technology — in effect, vehicles talking to one another — to sense cross traffic.

The NTSB staff also said Tesla’s reliance on sensing a driver’s hands on the wheel was not an effective way of monitoring whether the driver was paying attention.

Tesla has repeatedly called Autopilot an “assist feature.” It has said that while using Autopilot, drivers must keep their hands on the wheel at all times and be prepared to take over if necessary.

“Since driving is a largely visual task and the driver may interact with the steering wheel without assessing the environment, monitoring steering wheel torque is a poor surrogate for monitoring driving engagement,” Becic said.

The NTSB staff recommended the use of a more effective technology to determine whether a driver is paying attention, such as a camera tracking the driver’s eyes. A possible example — not mentioned by the staff — is Cadillac’s Super Cruise steering system, which includes a tiny camera that tracks eye and head movement to make sure the driver is paying attention to the road.

The panel’s review found that Brown’s last interaction with the vehicle as he drove was 1 minute and 51 seconds before the crash, when he set the cruise control function at 74 mph.

There was no indication the driver of the Tesla or the driver of the truck took evasive action before the crash, the NTSB staff said.

The driver of the truck, Frank Baressi, 63, refused to be interviewed by police after the crash, the NTSB said.

A blood test 90 minutes after the crash indicated he had used marijuana, but the NTSB determined that “his level of impairment, if any, at the time of the crash could not be determined from the available evidence.”

Before Tuesday’s meeting, Brown’s family said in a statement that he loved technology and that “zero tolerance for deaths would totally stop innovation and improvements.”

“Joshua loved his Tesla Model S. He studied and tested that car as a passion,” the family said in the statement issued Monday by Cleveland law firm Landskroner Grieco Merriman.

“We heard numerous times that the car killed our son. That is simply not the case,” the statement said.

JOSHUA BROWN, 40, died last year when the Tesla Model S sedan he was driving smashed into the trailer of a big rig while the Autopilot feature was in use. (NTSB)
