Lodi News-Sentinel

Your Tesla could explain why it crashed — good luck getting the Autopilot data

- By Russ Mitchell

On Jan. 21, 2019, Michael Casuga drove his new Tesla Model 3 southbound on Santiago Canyon Road, a two-lane highway that twists through hilly woodlands east of Santa Ana, Calif.

He wasn’t alone, in one sense: Tesla’s semiautonomous driver-assist system, known as Autopilot — which can steer, brake and change lanes — was activated. Suddenly and without warning, Casuga claims in a Superior Court of California lawsuit, Autopilot yanked the car left. The Tesla crossed a double yellow line, and without braking, drove through the oncoming lane and crashed into a ditch, all before Casuga was able to retake control.

Tesla confirmed Autopilot was engaged, according to the suit, but said the driver was to blame, not the technology. Casuga’s attorney, Mike Nelson in New York City, asked Tesla to release the data to show exactly what happened. Tesla refused, the suit claims, and referred Casuga and his lawyer to the car’s event data recorder, known as the black box. But the black box — a common feature in cars since the early 2000s — doesn’t record Autopilot data. Autopilot information is captured and stored separately, often sent over the airwaves to Tesla’s remote cloud computer repository.

Finding out who or what causes a car crash should be easier today. Cars have become computers on wheels, bristling with sensors, data processors and memory chips. Data “is significantly better and more potentially useful than ever,” said Jason Levine, executive director of the Center for Auto Safety advocacy group.

But the ownership and accessibility of much of that data are in flux, as legislators and regulators play catch-up with the fact that human beings and mobile robot systems increasingly share the driving.

Casuga’s lawsuit is an attempt to get a court order for Tesla to turn over his car’s data, Nelson said. If the data were recorded on the car’s black box, Casuga would have legal access. But no laws or regulatory requirements give car owners the right to access operational information, not even basic safety data, if it’s not on the black box. (And in some states, even the black box data doesn’t belong to the car owner.)

“The car manufacturer knows what happened,” Nelson said. But short of a court order, a car maker is not bound to release information on semiautonomous driving systems to a car owner, a lawyer, safety researchers, or even a police department investigating a fatal crash (though The Times found no evidence that Tesla or other companies are resisting police requests). Only federal safety regulators have on-demand rights to car crash data collected onboard by the manufacturer but not on the black box.

A rare public airing of driver-assist technology’s role in traffic crashes will occur Tuesday in a meeting of the National Transportation Safety Board, where two fatal Tesla incidents involving Autopilot will be discussed, including the 2018 Model X crash that killed Apple engineer Walter Huang in Mountain View, Calif. The meeting will be viewable via webcast.

Levine hopes some basic questions will be addressed at the meeting. “How is the technology working? Is it failing? When it’s failing, is it operator’s failure or the technology’s failure?” Levine said. “You can’t always determine all of that from the (black box) event data recorder.”

Tesla did not respond to multiple requests for comment.

At a time when massive data sets can be threaded through sophisticated computers to determine trends in safety performance, robot-car data is locked in manufacturer silos, thus far unobtainable even in anonymized form to be evaluated by independent safety researchers. Autonomous driving is a nascent field crowded with competitors backed by billions of dollars jockeying for market leadership, and they jealously guard proprietary technology. But safety experts say an over-aggressive attitude about intellectual property is getting in the way of basic safety assessment.

“Data associated with a crash, or even near crash, of a vehicle operating under automated control should not be deemed proprietary,” said Alain Kornhauser, professor of operations research at Princeton University, a specialist in driverless technology and policy, and founder of the annual Smart Driving Car summit. “Safety must be a cooperative effort, not a ‘I know something you don’t know’ competitive play.”

It’s fairly safe to assume that, properly designed and planned for, fleets of fully automated robot cars, trucks and buses one day will be safer than vehicles driven by humans. But getting to the point where humans feel comfortable sharing the road with robots will require demonstrating how and how well the technology works.

“There are many places where self-driving cars are going to be safer. I have a driver-assisted Subaru and I love it,” said Madeleine Clare Elish, who studies the confluence of humans and automated systems at the research group Data & Society. “But that doesn’t mean it cancels out all the complications, including when the technologies fail.”

Elish’s recently published academic paper, titled “Moral Crumple Zones,” concludes that when blame is assessed in major accidents involving humans and advanced technology, human operators tend to feel the heat even when the technology is poorly designed.

Technology defects are inevitable, and are not limited to hardware and software glitches. Major gaps in logic and sensing still limit the ability of robot cars to handle the baroque complexities of the real world.

TRIBUNE NEWS SERVICE: Tesla Model X in a storage lot after the fatal 2018 Mountain View freeway crash.
