Daily Southtown

Tesla faces federal scrutiny over its Autopilot technology

By Neal E. Boudette

Tesla faced numerous questions about its Autopilot technology after a Florida driver was killed in 2016 when the system of sensors and cameras failed to see and brake for a tractor-trailer crossing a road.

Now the company is facing more scrutiny than it has in the past five years for Autopilot, which Tesla and its chief executive, Elon Musk, have long maintained makes its cars safer than other vehicles. Federal officials are looking into a series of recent accidents involving Teslas that either were using Autopilot or might have been using it.

The National Highway Traffic Safety Administration confirmed last week that it was investigating 23 such crashes. In one accident this month, a Tesla Model Y rear-ended a police car that had stopped on a highway near Lansing, Michigan. The driver, who was not seriously injured, had been using Autopilot, the police said.

In February in Detroit, under circumstances similar to the 2016 Florida accident, a Tesla drove beneath a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not said whether the driver had turned on Autopilot.

NHTSA is also looking into a Feb. 27 crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It is not clear if the driver was using Autopilot. The car did not appear to slow before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.

“We need to see the results of the investigations first, but these incidents are the latest examples that show these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine of the Center for Auto Safety, a group created in the 1970s by the Consumers Union and Ralph Nader.

Tesla, which generally does not respond to inquiries from reporters, did not return phone calls or emails seeking comment. And Musk did not respond to questions sent to him on Twitter.

The company has not publicly addressed the recent crashes. While it can determine if Autopilot was on at the time of accidents because its cars constantly send data to the company, it has not said if the system was in use.

The company has argued that its cars are very safe, claiming that its own data show that Teslas are in fewer accidents per mile driven and even fewer when Autopilot is in use. It has also said it tells drivers that they must pay close attention to the road when using Autopilot and should always be ready to retake control of their cars.

While NHTSA has not forced Tesla to recall Autopilot, the National Transportation Safety Board concluded that the system “played a major role” in the 2016 Florida accident. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached similar conclusions when it investigated a 2018 accident in California.

By comparison, General Motors’ similar system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. It can be used only on major highways.

A man uses Autopilot technology in a Tesla. Federal officials are looking into a string of recent accidents in which the technology was or may have been used. (DREAMSTIME 2019)
