The Guardian (USA)

Tesla recalls more than 2m vehicles in US over Autopilot system

- Reuters

Tesla is recalling just over 2m vehicles in the United States fitted with its Autopilot advanced driver-assistance system to install new safeguards, after a safety regulator said the system was open to “foreseeable misuse”.

The National Highway Traffic Safety Administration (NHTSA) has been investigating the electric automaker led by the billionaire Elon Musk for more than two years over whether Tesla vehicles adequately ensure that drivers pay attention when using the driver assistance system.

Tesla said in the recall filing that Autopilot’s software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.

The acting NHTSA administrator, Ann Carlson, told Reuters in August it was “really important that driver monitoring systems take into account that humans over-trust technology”.

Tesla’s Autopilot is intended to enable cars to steer, accelerate and brake automatically within their lane, while enhanced Autopilot can assist in changing lanes on highways but does not make them autonomous.

One component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep a vehicle in its driving lane.

Tesla said it did not agree with NHTSA’s analysis, but would deploy an over-the-air software update that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged”.

The company did not respond to a question on whether the recall would be performed outside the United States.

NHTSA opened an investigation into Autopilot in August 2021 after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles, and upgraded the probe in June 2022.

NHTSA said Tesla issued the recall as a result of its investigation, which found that “Tesla’s unique design of its Autopilot system can provide inadequate driver engagement and usage controls that can lead to foreseeable misuse of the system”.

Separately, since 2016, NHTSA has opened more than three dozen Tesla special crash investigations in cases where driver systems such as Autopilot were suspected of being used, with 23 crash deaths reported to date.

NHTSA said there might be an increased risk of a crash in situations when the system is engaged but the driver does not maintain responsibility for vehicle operation and is unprepared to intervene, or fails to recognize whether it has been canceled.

NHTSA’s investigation into Autopilot will remain open as it monitors the efficacy of Tesla’s remedies. Tesla and NHTSA have held several meetings since mid-October to discuss the agency’s tentative conclusions on potential driver misuse and Tesla’s proposed software remedies in response.

The company will roll out the update to 2.03m Model S, X, 3 and Y vehicles in the United States dating back to the 2012 model year, the agency said.

The update, which will depend on vehicle hardware, will include increasing the prominence of visual alerts on the user interface, simplifying the engagement and disengagement of Autosteer, and additional checks upon engaging Autosteer, as well as “eventual suspension from Autosteer use if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is engaged”, Tesla said.

Tesla did not specify exactly how the alerts and safeguards would change.

Shares in the world’s most valuable automaker were down 1% in premarket trading.

Tesla disclosed in October that the US justice department had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot systems. Reuters reported in October 2022 that Tesla was under criminal investigation over claims the company’s electric vehicles could drive themselves.

Tesla in February recalled 362,000 US vehicles to update its FSD Beta software after NHTSA said the vehicles did not adequately adhere to traffic safety laws and could cause crashes.

NHTSA closed an earlier investigation into Autopilot in 2017 without taking any action. The National Transportation Safety Board (NTSB) has criticized Tesla for a lack of system safeguards for Autopilot, and NHTSA for a failure to ensure the safety of Autopilot.

Photograph: David Zalubowski/AP. Tesla said in the recall filing that Autopilot’s software system controls ‘may not be sufficient to prevent driver misuse’ and could increase the risk of a crash.
