The Punxsutawney Spirit

Tesla recalls ‘Full Self-Driving’ to fix unsafe actions

By Tom Krisher

DETROIT (AP) — U.S. safety regulators have pressured Tesla into recalling nearly 363,000 vehicles with its “Full Self-Driving” system because it can misbehave around intersections and doesn’t always follow speed limits.

The recall, part of a larger investigation by the National Highway Traffic Safety Administration into Tesla’s automated driving systems, is the most serious action taken yet against the electric vehicle maker.

It raises questions about CEO Elon Musk’s claims that he can prove to regulators that cars equipped with “Full Self-Driving” are safer than humans, and that humans almost never have to touch the controls.

Musk at one point had promised that a fleet of autonomous robotaxis would be in use in 2020. The latest action appears to push that development further into the future.

The safety agency says in documents posted on its website Thursday that Tesla will fix the concerns with an online software update in the coming weeks. The documents say Tesla is doing the recall but does not agree with an agency analysis of the problem.

The system, which is being tested on public roads by as many as 400,000 Tesla owners, can take unsafe actions such as traveling straight through an intersection while in a turn-only lane, failing to come to a complete stop at stop signs, or going through an intersection during a yellow traffic light without proper caution, NHTSA said. The problems happen in “certain rare circumstances,” the agency wrote.

In addition, the system may not adequately respond to changes in posted speed limits, or it may not account for the driver’s adjustments in speed, the documents said.

“FSD beta software that allows a vehicle to exceed speed limits or travel through intersections in an unlawful or unpredictable manner increases the risk of a crash,” the agency said in documents.

Musk complained Thursday on Twitter, which he now owns, that calling an over-the-air software update a recall is “anachronistic and just flat wrong!” A message was left Thursday seeking further comment from Tesla, which has disbanded its media relations department.

Tesla received 18 warranty claims from May 2019 through Sept. 12, 2022, that could be attributed to the software, the documents said. But the Austin, Texas, electric vehicle maker told the agency it is not aware of any deaths or injuries.

In a statement, NHTSA said it found the problems during tests performed as part of an investigation into Tesla’s “Full Self-Driving” and “Autopilot” software, which take on some driving tasks. The investigation remains open, and the recall doesn’t address the full scope of what NHTSA is scrutinizing, the agency said.

Despite the names “Full Self-Driving” and “Autopilot,” Tesla says on its website that the cars cannot drive themselves and owners must be ready to intervene at all times.

NHTSA’s testing found that Tesla’s FSD beta “led to an unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws.”

Raj Rajkumar, a professor of computer engineering at Carnegie Mellon University, doubts that Tesla can fix all of the problems cited by NHTSA with a software update. The automaker, he says, relies only on cameras and artificial intelligence to make driving decisions, a system that will make mistakes.

“Cameras can miss a lot of things,” Rajkumar said. “These are not straightforward issues to fix. If they could have fixed it, they would have fixed it a long time back.”

Most other companies with self-driving vehicles use laser sensors and radar in addition to cameras to make sure vehicles see everything. “One sensing modality is not perfect by any metric,” Rajkumar said.

He questioned whether NHTSA will require testing before the software update is sent out to make sure it works. The agency said that it works closely with automakers as they develop recall remedies “to ensure adequacy.”

In documents, NHTSA says that on Jan. 25, as part of regular communications with Tesla, it told the automaker about concerns with FSD, and it asked Tesla to do a recall. On Feb. 7, Tesla decided to do the recall out of an abundance of caution, “while not concurring with the agency’s analysis.”

The recall is the latest in a string of problems Tesla has had with the U.S. government. In January, the company disclosed that the U.S. Justice Department had requested documents from Tesla about “Full Self-Driving” and “Autopilot.”

NHTSA has been investigating Tesla’s automated systems since June 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. A separate probe into Teslas that were using Autopilot when they crashed into emergency vehicles started in August 2021. At least 14 Teslas have crashed into emergency vehicles while using the Autopilot system.
