San Diego Union-Tribune

SCRUTINY OF TESLA CRASH MAY SIGNAL NEW REGULATION

Federal safety agency probing electronic systems in autonomous cars has been hands-off in the past

- BY TOM KRISHER

The fiery crash of a Tesla near Houston with no one behind the wheel is drawing scrutiny from two federal agencies that could bring new regulation of electronic systems that take on some driving tasks.

The National Highway Traffic Safety Administration and the National Transportation Safety Board said Monday they would send teams to investigate the Saturday night crash on a residential road that killed two men in a Tesla Model S.

Local authorities said one man was found in the passenger seat, while another was in the back. They are issuing search warrants in the probe, which will determine whether Tesla's partially automated Autopilot system was in use. Autopilot can keep a car centered in its lane, keep a distance from cars in front of it, and can even change lanes automatically in some circumstances.

In the past, NHTSA, which has authority to regulate automakers and seek recalls for defective vehicles, has taken a hands-off approach to regulating partial and fully automated systems for fear of hindering development of promising new features.

But since March, the agency has stepped up inquiries into Teslas, dispatching teams to three crashes. It has investigated 28 Tesla crashes in the past few years, but thus far has relied on voluntary safety compliance from auto and tech companies.

“With a new administration in place, we’re reviewing regulations around autonomous vehicles,” the agency said last month.

Agency critics say regulations — especially of Tesla — are long overdue as the automated systems keep creeping toward being fully autonomous. At present, though, there are no specific regulations and no fully self-driving systems available for sale to consumers in the U.S.

At issue is whether Tesla CEO Elon Musk has oversold the capability of his systems by using the name Autopilot or telling customers that “Full Self-Driving” will be available this year.

“Elon’s been totally irresponsible,” said Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University. Musk, he said, has sold the dream that the cars can drive themselves even though in the fine print Tesla says they’re not ready. “It’s not a game. This is serious stuff.”

Tesla, which has disbanded its media relations office, did not respond to requests for comment Monday. Its stock fell 3.4 percent in the face of publicity about the crash.

In December, before former President Donald Trump left office, NHTSA sought public comment on regulations. Transportation Secretary Elaine Chao, whose department included NHTSA, said the proposal would address safety “without hampering innovation in development of automated driving systems.”

But her replacemen­t under President Joe Biden, Pete Buttigieg, indicated before Congress that change might be coming.

“I would suggest that the policy framework in the U.S. has not really caught up with the technology platforms,” he said last month. “So we intend to pay a lot of attention to that and do everything we can within our authorities,” he added, noting that the agency may work with Congress on the issue.

Tesla has had serious problems with Autopilot, which has been involved in several fatal crashes in which it failed to stop for tractor-trailers crossing in front of it, for stopped emergency vehicles, or for a highway barrier.

The NTSB, which can only issue recommendations, asked that NHTSA and Tesla limit the system to roads on which the system can safely operate, and that Tesla install a more robust system to monitor drivers to make sure they’re paying attention. Neither Tesla nor the agency took action, drawing criticism and blame for one of the crashes from the NTSB.

Missy Cummings, an electrical and computer engineering professor at Duke University who studies automated vehicles, said the Texas crash is a watershed moment for NHTSA.

She’s not optimistic the agency will do anything substantial, but hopes the crash will bring change. “Tesla has had such a free pass for so long,” she said.

Frank Borris, a former head of NHTSA’s Office of Defects Investigation who now runs a safety consulting business, said the agency is in a tough position because of a slow, outdated regulatory process that can’t keep up with fast-developing technology.

The systems hold great promise to improve safety, Borris said. But NHTSA is also working with “what is an antiquated regulatory rule promulgating process which can take years.”

Investigators in the Houston-area case haven’t determined how fast the Tesla was driving at the time of the crash, but Harris County Precinct Four Constable Mark Herman said it was a high speed. He would not say if there was evidence that anyone tampered with Tesla’s system to monitor the driver, which detects force from hands on the wheel. The system will issue warnings and eventually shut the car down if it doesn’t detect hands. But critics say Tesla’s system is easy to fool and can take as long as a minute to shut down.

The company has said in the past that drivers using Autopilot and the company’s “Full Self-Driving Capability” system must be ready to intervene at any time, and that neither system can drive the cars itself.

On Sunday, Musk tweeted that the company had released a first-quarter safety report showing that Teslas operating with Autopilot have nearly a 10 times lower chance of crashing than the average vehicle with a human behind the wheel.

But Kelly Funkhouser, head of connected and automated vehicle testing for Consumer Reports, said Tesla’s numbers have been inaccurate in the past and are difficult to verify without underlying data.
