Los Angeles Times

How safe is driverless tech? The numbers don’t say

NHTSA’s reticence on Tesla crash data raises transparency concerns in a nascent, unevenly regulated industry.

- By Russ Mitchell

It was an extraordinary vote of confidence for autonomous driving by the nation’s top vehicle safety agency. Two years ago, the National Highway Traffic Safety Administration announced crash rates for Tesla cars dropped by almost 40% after installation of a driver-assist technology called Autosteer.

“Forty percent. That was an eye-popper,” said R.A. Whitfield, director of Quality Control Systems Corp. and an expert in statistics. So “breathtaking and remarkable,” he said, that he didn’t quite believe it. “Extraordinary claims ought to be backed by extraordinary evidence.”

But when Whitfield requested the supporting data, he encountered a thick bureaucratic wall at NHTSA, the taxpayer-funded agency primarily responsible for vehicle safety in the United States. On Nov. 27, 2018, after a federal lawsuit and almost two years, NHTSA finally released the data.

Whitfield was shocked. In a detailed, 25-page report issued on Feb. 8, he said the NHTSA study violated basic principles of standard research methodology to the point where no conclusion of any kind could be justified.

Whitfield — whose 32-year-old Maryland-based firm helps companies reduce risk through analysis based on data and statistics — said the agency’s take on Tesla safety was “not well founded.” In scientific circles, that means bunk.

The agency did not dispute Whitfield’s assertions but said it was “reviewing the report released by Quality Control Systems Corp. with interest and will provide comment as appropriate.”

The episode raises questions that go far beyond whether Tesla’s Autopilot is safe. It draws attention to the collection and transparency of data that will be crucial to crafting laws and regulations governing the use of vehicles that, in whole or in part, can drive themselves — and the extent to which driverless-technology companies can win public trust.

“The lesson here is a need for candor,” said Bryant Walker Smith, a driverless vehicle law expert at the University of South Carolina. “What does it mean to be trustworthy in this field?”

Companies, regulatory agencies and politicians, Smith said, need to communicate clearly: “This is what we’re doing, this is why we think it’s safe, and this is why you should believe us.”

The technology behind driverless cars has advanced rapidly. In December, Waymo, the driverless division of Google parent company Alphabet, began offering a commercial robot car taxi service around Phoenix. Cruise Automation, owned by General Motors, is developing a driverless taxi service for San Francisco. Uber, after virtually abandoning its driverless program, has resumed research and development. With hundreds of billions of dollars at stake in a budding driverless car industry, every major auto company and tech company has skin in the game.

But the field won’t advance rapidly until the law catches up. Currently, the federal government barely regulates driverless vehicles. That’s been left to the states, which are creating a patchwork of laws, with no nationwide consistency.

Congress tried to pass federal driverless legislation last year, but a key bill failed in the Senate.

“I think they failed in part because of a lack of trust,” Smith said.

Recent polls show that Americans are leery about driverless car safety; more than half (52%) of those surveyed by Gallup last spring said they never wanted to use a self-driving car.

Lack of company and government transparency will only hurt, Smith said: “When you have a lack of trust in the technology, the companies developing the technology, and the agency regulating those technologies, government will be a lot less willing” to back driverless cars.

The issue of safety-data transparency gained widespread attention last March, after an Uber car in driverless mode — with an apparently distracted test driver at the wheel — killed a woman walking her bicycle across an Arizona highway.

Researchers and the general public alike have little data available to understand what happened in that crash.

Uber’s reaction was to apologize, pay the victim’s family a legal settlement and avoid a public trial.

The National Transportation Safety Board, or NTSB, released a preliminary report last May that noted that the Volvo car’s automatic brakes were turned off and that the woman was wearing dark clothes and was hard to see. But little underlying data about Uber’s driverless system was revealed.

Some states require limited data release, but its usefulness for safety studies can also be limited.

For example, California requires companies that test driverless cars on public roads to provide annual “disengagement” reports to the public — a tally of the number of times a robot system had to turn control over to the human test driver, with a short explanation as to why. The California Department of Motor Vehicles issued 2018 numbers on Wednesday.

But the numbers, by themselves, don’t reveal much. A high number of disengagements might mean a company is pushing the edges of its system and making rapid progress — or that the system isn’t working well at all.

The latest controversy resurfaces an infamous Tesla crash in May 2016, when Tesla’s Autopilot — a so-called Level 2 system that legally requires constant human attention — drove a car under a semi truck crossing the highway, decapitating the driver.

NHTSA investigated the accident. The driver was inattentive, the agency concluded in a January 2017 report, and the Autopilot technology was not to blame — even though the system could not tell the difference between the silvery side of the trailer truck and an overcast sky.

That NHTSA report included the analysis of Autosteer safety. Using Tesla’s own records, collected through its over-the-air software updating system, the agency looked at mileage driven in cars before and after Autosteer was installed. (Autosteer is the sub-system in Autopilot that can automatically steer the car into other lanes, pass other cars and turn onto exit ramps.)

Using air bag deployments as a proxy for crashes, NHTSA concluded that Tesla cars were involved in 40% fewer crashes after Autosteer was installed. If that result were applied to cars in general, thousands of deaths could be prevented.

Just after the report, Tesla Chief Executive Elon Musk tweeted, “The data show that the Tesla vehicle crash rate dropped by almost 40% after Autosteer installation.”

The company used the finding to support its contention that Autopilot made cars safer.

In March 2018, after a Tesla driver was killed when Autopilot drove the car into a concrete barrier, Tesla posted a response on its website that included a reference to the NHTSA study: “Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.”

Whitfield maintains that the data show nothing of the sort. The main problem he identified: NHTSA took air bag deployments before and after Autosteer installation to estimate the number of crashes per million miles. But most of the cars reported by Tesla were missing the miles the car traveled before Autosteer was installed. With no miles at all to add to the equation, but the same number of air bag deployments, any findings would inflate the crash rate for pre-Autosteer cars, he said.
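To see Whitfield’s point in miniature, here is a minimal sketch in Python using hypothetical round numbers (not the actual Tesla or NHTSA figures): if most pre-Autosteer miles are missing from the data while all the air bag deployments are counted, the pre-installation crash rate comes out inflated, and the apparent improvement after installation is exaggerated.

```python
# A minimal sketch of the crash-rate arithmetic Whitfield describes.
# All numbers below are hypothetical, chosen only for illustration.

def crash_rate(airbag_deployments, miles):
    """Air bag deployments per million miles driven."""
    return airbag_deployments / (miles / 1_000_000)

# Suppose a fleet actually logged 10 deployments over 50 million miles
# before Autosteer was installed.
true_pre_rate = crash_rate(10, 50_000_000)        # 0.20 per million miles

# If only 10 million of those pre-Autosteer miles appear in the data set,
# the same 10 deployments are divided by far fewer miles.
inflated_pre_rate = crash_rate(10, 10_000_000)    # 1.00 per million miles

# Post-installation period, fully reported.
post_rate = crash_rate(9, 50_000_000)             # 0.18 per million miles

print(f"true pre-Autosteer rate:     {true_pre_rate:.2f}")
print(f"inflated pre-Autosteer rate: {inflated_pre_rate:.2f}")
print(f"post-Autosteer rate:         {post_rate:.2f}")

# Against the inflated baseline, Autosteer appears to cut the crash rate
# by more than 80%; against the true baseline, the change is about 10%.
```

The sketch does not reproduce NHTSA’s actual calculation; it only illustrates why missing mileage on one side of the comparison can, by itself, manufacture a large apparent safety gain.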

For the small minority of cars for which mileage data were provided both before and after Autosteer was installed, Teslas were involved in 60% more crashes, Whitfield calculated. That could mean cars with Autosteer were more dangerous than cars without.

Whitfield emphasized that he had drawn no such conclusion. “We did not produce this data. We don’t vouch for it. We don’t know if it’s true or not,” he said.

But the exercise makes clear that NHTSA’s conclusions are “not well founded,” he said.

In his report, Whitfield said neither he nor his company had a financial stake in Tesla or any autonomous-vehicle-related company or organization, and it was not paid by anyone.

His interest in the subject is motivated by public concern, he said: “Efforts to hide the crash record will impede progress in achieving whatever safety benefits advanced driver-assistance systems might ultimately bring.”

On Wednesday, Tesla issued a statement that read in part: “Our own vehicle safety data for Q3 and Q4, which includes data from roughly two billion miles driven in Tesla vehicles, shows that drivers using Autopilot were significantly less likely to be involved in an accident than those driving without using Autopilot.”

The company’s statement made only one reference to Whitfield’s paper, saying that “QCS’ analysis dismissed the data from all but 5,714 vehicles of the total 43,781 vehicles in the data set we provided to NHTSA back in 2016.” Whitfield, in turn, said he wasn’t dismissing data, he was pointing out that essential data were not provided.

Alain Kornhauser, who heads Princeton University’s autonomous vehicle engineering program, has another problem with the NHTSA finding: If determining safety was the goal, he said, the agency asked the wrong questions of the data. He notes that the NHTSA study didn’t assess whether Autosteer was turned on or off when the air bags were triggered.

“Isn’t the issue of safety of Autopilot the question of when Autopilot is engaged versus not engaged? The question is not whether Autopilot is available or not,” Kornhauser said. “Maybe we need more transparency. What we probably need is for NHTSA to release all of the data that they were given by Tesla.”

Tesla’s is hardly the only information NHTSA is keeping under wraps. Early last year, when General Motors petitioned NHTSA for a special exemption to deploy driverless cars in the U.S., law professor Smith filed a federal Freedom of Information Act request to see what GM was asking for. NHTSA has ignored the request, Smith said.

Last year, before the underlying data were released, the agency, in a Bloomberg story, called its own Autosteer conclusion “cursory” and said it was not assessing Autosteer’s effectiveness.

Congress will have a chance to revisit the data questions when it again tries to pass driverless legislation later this year.

One of the groups that might be knee-deep in that effort — PAVE (Partners for Automated Vehicle Education) — has identified data transparency as a key issue in building public trust in driverless tech.

The organization was formed last year by car companies, tech companies, safety advocates and disability rights groups to help educate the public.

Tesla and Uber have not been invited to join.

THE COLLECTION and transparency of data will be crucial in winning the public’s trust in driverless vehicles and crafting appropriate regulations. (Gene J. Puskar / Associated Press)

A WAYMO driverless vehicle is displayed at last month’s CES convention in Las Vegas. Waymo, a division of Google parent Alphabet, has begun offering robot car taxi service around the Phoenix area. (David McNew / AFP/Getty Images)

TESLA Chief Executive Elon Musk, shown in 2017, had touted a federal agency’s now-questioned conclusion about the safety of Tesla’s Autosteer technology. (Karim Sahib / AFP/Getty Images)
