The Guardian (USA)

Tesla Autopilot feature was involved in 13 fatal crashes, US regulator says

- Guardian staff and agencies

US auto-safety regulators said on Friday that their investigation into Tesla’s Autopilot had identified at least 13 fatal crashes in which the feature had been involved. The investigation also found the electric carmaker’s claims did not match up with reality.

The National Highway Traffic Safety Administration (NHTSA) disclosed on Friday that during its three-year Autopilot safety investigation, which it launched in August 2021, it identified at least 13 Tesla crashes involving one or more deaths, and many more involving serious injuries, in which “foreseeable driver misuse of the system played an apparent role”.

It also found evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities”, which resulted in a “critical safety gap”.

The NHTSA also raised concerns that Tesla’s Autopilot name “may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation”.

Tesla said in December that its largest-ever recall, covering 2.03m US vehicles – or nearly all of its vehicles on US roads – was to better ensure drivers pay attention when using its advanced driver-assistance system.

After closing the first investigation, regulators opened another, this one into whether that recall to install new Autopilot safeguards was adequate.

The NHTSA said it was opening the second investigation after identifying concerns arising from crashes involving vehicles that had already received the recall software update, “and results from preliminary NHTSA tests of remedied vehicles”.

That recall investigation covers Model Y, X, S and 3 vehicles and Cybertrucks in the US equipped with Autopilot and produced in the 2012 to 2024 model years, the NHTSA said.

The agency said Tesla has issued software updates to address issues that appear related to its concerns but has not made them “a part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk”. The NHTSA also cited Tesla’s statement “that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it”.

Tesla said in December that Autopilot’s software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.

Tesla did not immediately respond to a request for comment.

In February, Consumer Reports, a non-profit organization that evaluates products and services, said its testing of Tesla’s Autopilot recall update found that the changes did not adequately address many of the safety concerns raised by the NHTSA. It urged the agency to require the automaker to take “stronger steps”, saying Tesla’s recall “addresses minor inconveniences rather than fixing the real problems”.

Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while Enhanced Autopilot can assist in changing lanes on highways but does not make vehicles autonomous.

One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.

Tesla said in December it did not agree with the NHTSA’s analysis but would deploy an over-the-air software update that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged”.

The NHTSA’s then top official, Ann Carlson, said in December that the investigation determined that more needed to be done to ensure drivers are engaged when Autopilot is in use. “One of the things we determined is that drivers are not always paying attention when that system is on,” Carlson said.

The NHTSA opened its August 2021 investigation of Autopilot after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles.

Separately, since 2016, the NHTSA has opened more than 40 special Tesla crash investigations in cases where driver systems such as Autopilot were suspected of being used, with 23 crash deaths reported to date.

Tesla’s recall includes making visual alerts more prominent, disengaging Autosteer if drivers do not respond to inattentiveness warnings, and adding checks when drivers engage Autosteer. Tesla said it would restrict Autopilot use for one week if significant improper usage is detected.

Tesla disclosed in October that the US justice department had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot features. Reuters reported in October 2022 that Tesla was under criminal investigation.

Tesla in February 2023 recalled 362,000 US vehicles to update its FSD beta software after the NHTSA said the vehicles did not adequately adhere to traffic safety laws and could cause crashes.

A Tesla Model 3 drives on Autopilot along the 405 highway in Westminster, California, in 2022. Photograph: Mike Blake/Reuters
