Los Angeles Times

Fatal Tesla crash renews scrutiny of tech

Two deaths in Texas have safety experts wondering if federal regulators will address driverless systems.

- By Russ Mitchell

It’s a 21st century riddle: A car crashes, killing both occupants — but not the driver.

That’s what happened over the weekend in Houston, where a Tesla Model S slammed into a tree and killed the two men inside. According to police, one had been sitting in the front passenger seat, the other in the back of the car.

Although investigators have not said whether they believe Tesla’s Autopilot technology was steering, the men’s wives told local reporters the pair went out for a drive Saturday after talking about the system.

Tesla Chief Executive Elon Musk pushed back on the speculation but stopped short of drawing a conclusion, tweeting Monday that “Data logs recovered so far show Autopilot was not enabled.” The company has resisted sharing data logs for independent review without a legal order.

After Musk’s tweet, a county police official told Reuters that the department would serve a warrant for the data.

Autopilot technically requires the human driver to pay full attention, but it’s easy to cheat the system, and the internet is rife with videos of pranksters sitting in the back while a Tesla cruises down the highway with the driver’s seat empty.

It’s a state of affairs that leaves many auto safety experts and driverless technology advocates wondering just what it will take before regulators end the word games and rule-skirting that have allowed it to continue. Could the crash in Houston provide that impetus?

“I suspect there will be big fallout from this,” said Alain Kornhauser, head of the driverless car program at Princeton University.

Tesla’s Autopilot system has been involved in several fatal crashes since 2016, when a Florida man was decapitated as a Tesla on Autopilot drove him under the trailer of a semi truck.

Less lethally, Teslas have slammed into the back of fire trucks, police cars and other vehicles stopped on highway lanes.

Yet little action has been taken by federal safety officials and none at all by the California Department of Motor Vehicles, which has allowed Tesla to test its autonomous technology on public roads without requiring that it conform to the rules that dozens of other autonomous tech companies are following.

The National Highway Traffic Safety Administration said Monday that it had dispatched a “Special Crash Investigation team” to Texas. The agency, an arm of the U.S. Department of Transportation, said it “will take appropriate steps when we have more information.”

The agency declined to speak with The Times about what those steps might be.

Since 2016, NHTSA has launched investigations into at least 23 crashes involving Autopilot, but if they resulted in any conclusion or action, the agency hasn’t told the public about it.

Jason Levine, executive director of the Center for Auto Safety, thinks it’s about time that changes.

“There doesn’t seem to be much activity coming out of our federal safety administration with respect to what is pretty evidently becoming a public danger,” he said. “You’ve got the market getting ahead of regulators, which isn’t uncommon, but this all didn’t start yesterday.”

Tesla sells an enhanced version of Autopilot called Full Self-Driving Capability for $10,000, although there is no car sold anywhere in the world today that is capable of full self-driving.

Although Tesla technology might well be safe when used as directed, Tesla’s marketing can lead people to believe the car is capable of autonomous driving. NHTSA, Levine points out, has rules against “predictable abuse” in automotive technology.

“It is predictable when you call something Autopilot it means autopilot, and when you call something Full Self-Driving it means full self-driving,” he said.

Incidents such as the fatal Texas crash “are foreseeable incidents,” Levine said, “no matter how many disclaimers Tesla lawyers decide to insert in fine print.”

Musk disbanded the company’s media relations department in 2019. Emails to the company were not returned.

The California DMV is in a position to clarify matters but thus far has not. In previously undisclosed emails to the DMV in recent months, made public by the legal document transparency organization Plainsite, Tesla told the DMV that its system is not autonomous but a so-called Level 2 driver-assist system.

The DMV’s own regulations bar companies from advertising the sale or lease of a vehicle as autonomous if the advertising “will likely induce a prudent person to believe a vehicle is autonomous.”

In public presentations and slideshows, DMV Deputy Director Bernard Soriano described Level 4 automation, which requires no human driver, this way: “Full self-driving.”

In an emailed statement, the DMV suggested that it views what Tesla is selling as a non-autonomous system. It did not address questions about whether the company, in using the term Full Self-Driving, is violating the regulation against misrepresenting driving systems as autonomous.

Adding to the confusion, Musk himself has appeared on “60 Minutes” and Bloomberg TV behind the wheel of a Tesla with his hands in the air. He’s been talking about Tesla’s fully autonomous technology as if it’s imminent since 2016. That year, Tesla posted a video showing one of its cars running in autonomous mode through Palo Alto. “The person in the driver’s seat is only there for legal reasons,” the video said.

The same year, he announced a coast-to-coast test drive of an autonomous Tesla by the end of 2017, which as of April 2021 has not happened. He told a Shanghai conference in 2020 that the “basic functionality” for fully autonomous driving would be complete that year. It wasn’t.

He said the company would have 1 million driverless robotaxis on the road by the end of 2020, which would cause Tesla cars to appreciate in value. So far there are none.

The misleading promises and the confusing nomenclature are beginning to rile other players in the driverless car industry. Several industry executives have told The Times that they fear that Musk’s behavior could disturb the public and cause policymakers to enact restrictive laws and regulations that could unnecessarily delay the introduction of driverless cars.

Now, some are beginning to speak out publicly.

“We’ve had multiple years of claims that ‘by the end of the year it’s going to be magically self-driving by itself without a human in the car,’ ” Ford’s autonomous vehicles head, John Rich, said at a recent Princeton University conference. “It is not helpful, OK? It is confusing the public. Frankly, even the investor community is very, very confused as to what paths are plausible and what the capabilities of the different systems are.”

Musk has long cultivated a maverick approach to robot-car technologies. Other car and tech companies combine radar, lidar and visual sensors in their systems to identify and analyze a robot-car’s surroundings. Musk believes lidar is an unnecessary expense and recently said Tesla would soon stop using radar too, relying solely on visual sensors for the main driving task.

And although other companies with advanced driver-assist systems similar to Autopilot use infrared cameras to make sure a human is in the driver’s seat and paying attention to the road ahead, Musk specifically rejected that technology in favor of a steering wheel sensor that can be easily defeated by hanging a weight off the wheel or jamming an object into it.

General Motors’ Super Cruise system, for example, allows hands-free driving and automated lane changing on interstates and other limited-access highways, but it monitors the driver to ensure they’re paying attention. If not, it deploys warning lights and sounds. If the driver remains inattentive, the car will exit traffic lanes and stop itself.

Ford recently announced a similar product, BlueCruise, expected to become available this year. Neither company refers to the technology as full self-driving.

Photo: Dreamstime/TNS. FEDERAL safety officials have taken little action despite crashes involving Tesla’s Autopilot system.
