Los Angeles Times

Most drive-assist crashes involved Teslas, data show

NHTSA says the carmaker accounted for 273 of 392 reported incidents.

By Russ Mitchell

How safe are automated driving systems? Are some safer than others?

Seven years after Tesla began selling cars equipped with what it calls Autopilot, auto safety regulators are still unable to answer these basic and vital questions.

But they took a step toward being able to do so Wednesday with the National Highway Traffic Safety Administration’s first report on crashes involving advanced driver assistance systems.

The numbers are suggestive, with Tesla accounting for 70% of all crashes involving “Level 2” driving systems, which include adaptive cruise control plus automated lane-keeping and can encompass more advanced features, such as automatic lane changing. That figure is sure to provide ammunition for critics who say Elon Musk’s company has taken a reckless approach to rolling out unproven technology.

But far more detail and context are required before regulators can say definitively whether such systems can outperform human drivers, or one another.

“The data may raise more questions than they answer,” NHTSA head Steven Cliff told reporters.

In June 2021, the agency required carmakers to report serious crashes involving Level 2 systems. The numbers reported Wednesday reflect crashes that occurred from that time through May 15 of this year.

Of all the crashes that occurred over that period, automakers reported that 392 involved automated driver-assist systems.

Of those, 273 were reported by Tesla, 90 by Honda and 10 by Subaru; others reported serious crashes in single digits.

“These data provide limited insight into hundreds of crashes,” said Bryant Walker Smith, a professor who specializes in automated-vehicle law at the University of South Carolina School of Law. “But in the same period there were literally millions of other crashes.”

But no one should conclude that Level 2 systems are safer than cars operated by human drivers alone, he said. They might be, they might not. The NHTSA data are far too broad to reach any such conclusions, he said.

The data don’t include the number of automated systems that each company has on the road or the total vehicle miles traveled with Level 2 systems engaged. NHTSA had no comment on how thorough each company’s reporting procedures might be. The agency plans monthly reports.

Crashes that were prevented by automated systems “are obviously unreported to the extent that they did not occur,” Smith said. A deep look into the cause of reported crashes — the roles played by the system, by the driver, by the system’s driver monitoring system, and other conditions on the roadway — would help safety regulators reach firm conclusions, he said.

“What NHTSA provided was a ‘fruit bowl’ of data with a lot of caveats, making it difficult for the public and experts alike to understand what is being reported,” Jennifer Homendy, chair of the National Transportation Safety Board, said in a statement. “Independent analysis of the data is key to identifying any safety gaps and potential remedies.”

Last year’s crash-data reporting order marked NHTSA’s first attempt to fill a deep deficit in knowledge about the real-life safety implications of automated vehicle technology on public roads.

Any vehicle maker’s automated system could be safer than human drivers. Or less safe. Data rich enough to reach sound conclusions are scant. Crash data collection systems in the U.S. are decades old, inconsistent, still paper-based at many police departments and utterly unequipped to determine the role automated systems play in preventing or causing crashes.

“One would have hoped that NHTSA would ‘do the work’ to make the numbers they publish in summaries really be comparable,” Alain Kornhauser, head of the driverless car program at Princeton University, said in an email.

Apart from collecting crash data, NHTSA is investigating why Tesla’s cars have been crashing into emergency vehicles parked by the roadside, often with their emergency lights flashing.

The investigation was prompted by 11 crashes that led to 17 injuries and one death, including three crashes in Southern California. The number of such crashes has since increased to 16. The technology in about 830,000 cars — all Tesla vehicles sold in the U.S. from 2014 to 2022 — is under investigation.

As part of that investigation, regulators will be looking into the performance of Tesla’s automatic emergency braking systems. As The Times reported last year, Tesla drivers report emergency braking problems at a rate far higher than drivers of other makes.

The emergency vehicle investigation grew more serious earlier this month, when NHTSA upgraded its status to “EA,” for engineering analysis. That category means investigators will be taking a closer look at the technical design and performance of Autopilot. Once an investigation reaches EA, a recall is more likely.

Meanwhile, the California Department of Motor Vehicles continues to investigate whether Tesla is falsely marketing its Full Self-Driving feature, a $12,000 option. Experts in the field overwhelmingly note that the system doesn’t come close to being able to safely drive itself.

The DMV review, however, is more than a year old, and the DMV won’t say when it might be completed.

State legislators are increasingly concerned about the DMV’s seemingly lax approach to Tesla. In December, the chair of the California Senate Transportation Committee, Lena Gonzalez, asked the DMV to provide crash and safety data to the committee. The DMV said it would look into it, and is still looking.

The DMV appears to be allowing Tesla to test self-driving cars on public highways without requiring the company to report crashes or system failures, as is required of competitors such as Waymo, Cruise, Argo and Zoox. DMV head Steve Gordon has declined all media requests to discuss the subject since May 2021.
