The Boston Globe

US casts doubt on Tesla Autopilot recall remedies

By Tom Krisher

DETROIT — Federal highway safety investigators want Tesla to tell them how and why it developed the fix in a recall of more than 2 million vehicles equipped with the company’s Autopilot partially automated driving system.

Investigators with the US National Highway Traffic Safety Administration have concerns about whether the recall remedy worked because Tesla has reported 20 crashes since the remedy was sent out as an online software update in December.

The recall fix also was to address whether Autopilot should be allowed to operate on roads other than limited access highways. The fix for that was increased warnings to the driver on roads with intersections.

But in a letter to Tesla posted on the agency’s website Tuesday, investigators wrote that they could not find a difference between warnings to the driver to pay attention before the recall and after the new software was released. The agency said it will evaluate whether driver warnings are adequate, especially when a driver-monitoring camera is covered.

The agency asked for volumes of information about how Tesla developed the fix, and zeroed in on how it used human behavior to test the recall’s effectiveness.

Phil Koopman, a professor at Carnegie Mellon University who studies automated driving safety, said the letter shows that the recall did little to solve problems with Autopilot and was an attempt to pacify NHTSA, which demanded the recall after more than two years of investigation.

“It’s pretty clear to everyone watching that Tesla tried to do the least possible remedy to see what they could get away with,” Koopman said. “And NHTSA has to respond forcefully or other car companies will start pushing out inadequate remedies.”

Safety advocates have long expressed concern that Autopilot, which can keep a vehicle in its lane and a distance from objects in front of it, was not designed to operate on roads other than limited access highways.

Missy Cummings, a professor of engineering and computing at George Mason University who studies automated vehicles, said NHTSA is responding to criticism from legislators for a perceived lack of action on automated vehicles.

“As clunky as our government is, the feedback loop is working,” Cummings said. “I think the NHTSA leadership is convinced now that this is a problem.”

The 18-page NHTSA letter asks how Tesla used human behavior science in designing Autopilot, and for the company’s assessment of the importance of evaluating human factors. It also wants Tesla to identify every job involved in human behavior evaluation and the qualifications of the workers. And it asks Tesla to say whether the positions still exist.

A message was left seeking comment from Tesla about the letter.

Tesla is in the process of laying off about 10 percent of its workforce, about 14,000 people, in an effort to cut costs to deal with falling global sales.

Cummings said she suspects that CEO Elon Musk would have laid off anyone with human behavior knowledge, a key skill needed to deploy partially automated systems like Autopilot, which can’t drive themselves and require humans to be ready to intervene at all times.

“If you’re going to have a technology that depends upon human interaction, you better have someone on your team that knows what they are doing in that space,” she said.

Cummings said her research has shown that once a driving system takes over steering from humans, there is little left for the human brain to do. Many drivers tend to overly rely on the system and check out.

“You can have your head fixed in one position, you can potentially have your eyes on the road, and you can be a million miles away in your head,” she said. “All the driver monitoring technologies in the world are still not going to force you to pay attention.”

In its letter, NHTSA also asks Tesla for information about how the recall remedy addresses driver confusion over whether Autopilot has been turned off if force is put on the steering wheel. Previously, if Autopilot was deactivated, drivers might not notice quickly that they have to take over driving.

The recall added a function that gives a “more pronounced slowdown” to alert drivers when Autopilot has been disengaged. But the recall remedy doesn’t activate the function automatically — drivers have to do it. Investigators asked how many drivers have taken that step.

NHTSA could seek further recall remedies, make Tesla limit where Autopilot can work, or even force the company to disable the system until it is fixed, safety experts said.

GODOFREDO A. VÁSQUEZ/ASSOCIATED PRESS/FILE
Tesla vehicles at a charging station in Emeryville, Calif. In a letter to Tesla posted on the US National Highway Traffic Safety Administration website, investigators wrote that they could not find a difference between warnings to the driver to pay attention before an Autopilot recall and after the new software was sent out.
