Los Angeles Times

Uber reveals failings of its self-driving program

Company details management and tech changes it’s implementing

By Michael Laris

More than seven months after a self-driving Uber SUV killed a pedestrian in Arizona, the company has released safety reports that detail broad technological and management failings and describe efforts since the tragedy to address them.

The findings released Friday reveal shortcomings at the core of Uber’s driverless technology itself, which relies on cameras and sensors to take in the environment and software to process that information and make all the decisions — big and small — needed to drive safely.

A key internal recommendation cited the need for “improving the overall software system design,” which is akin to saying Uber’s robot car needed a better brain with sharper thinking.

In practice, that means that since the fatal crash in Tempe, Ariz., in March, company engineers have worked at “reducing latency,” or the delay between when an initial observation is made and when an action is taken in response, Uber said. “We are now able to detect objects and actors sooner and execute safe reactions faster,” Uber said.

The driverless system also more quickly obtains “accurate velocity measurements for actors moving suddenly or erratically,” it said.
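In a driverless system, that kind of latency is essentially the time from a sensor observation to the resulting control command. As a purely illustrative sketch, not Uber’s code, with function names, loop structure and the reaction budget all assumed for the example, a sense-plan-act loop might be timed like this:

```python
# Purely illustrative sketch of a sense-plan-act loop with a latency check.
# Not Uber's code; all names, thresholds and behaviors are assumptions.
import time

REACTION_BUDGET_S = 0.25  # assumed upper bound on observation-to-action delay


def sense():
    """Stand-in for reading camera/lidar data and detecting objects."""
    return [{"kind": "pedestrian", "distance_m": 40.0}]


def plan(objects):
    """Stand-in for deciding an action from the detected objects."""
    nearest = min(objects, key=lambda o: o["distance_m"], default=None)
    return "brake" if nearest and nearest["distance_m"] < 50.0 else "maintain_speed"


def act(command):
    """Stand-in for sending the command to the vehicle controls."""
    pass


for _ in range(10):  # a few cycles of the loop, for illustration
    t_observed = time.monotonic()            # when the observation is made
    act(plan(sense()))
    latency = time.monotonic() - t_observed  # delay until the action is taken
    if latency > REACTION_BUDGET_S:
        print(f"warning: cycle took {latency:.3f}s, over the reaction budget")
```

Shaving time out of any stage of a loop like this leaves the vehicle more distance in which to brake or steer once an object is detected.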

Uber expressed contrition and an eagerness to work with others in the industry to improve safety across the board.

“The competitive pressure to build and market self-driving technology may lead developers to stay silent on remaining development challenges,” Uber Chief Executive Dara Khosrowshahi wrote as part of a new safety assessment released Friday. He said Uber wants to join competitors in finding ways to “measure and demonstrate” driverless performance and that he hopes to encourage “a culture of transparency rooted in safety.”

Whether Uber can transform itself into a safety leader is far from clear.

“It would be fantastic if they did that. But they’d have to do it seriously,” said Joan Claybrook, who was head of the National Highway Traffic Safety Administration under President Carter. “From when they started, safety was never No. 1 on their list. It was to get the vehicles on the road.”

There are no federal safety standards for driverless vehicles. Claybrook says driverless testing should not be done on public roads, a risk Uber and other major firms say is necessary to provide the diverse, real-world conditions needed to train expert computer drivers and create safer roads in the future.

“I don’t think the public should be guinea pigs,” Claybrook said.

In Arizona, Uber was pushing the limits as it scrambled to catch up with self-driving firms such as Alphabet Inc.’s Waymo, which emerged from Google’s nearly decade-old self-driving car project.

Federal investigators say Uber employees intentionally disabled the automatic-braking features on their specially outfitted Volvo XC90 so that it wouldn’t slow down erratically during the testing in Tempe. The car detected pedestrian Elaine Herzberg six seconds before hitting her but misidentified her as an unknown object, a vehicle and then a bike, according to the National Transportation Safety Board. It also misjudged where she would go.

Uber’s backup safety driver had been streaming “The Voice” on her phone and didn’t start braking until after Herzberg, 49, was struck, the NTSB said.

The San Francisco company pulled its driverless cars from public roads and said they would not return until internal and external safety reviews were completed and the company made necessary improvemen­ts.

Investigators say Herzberg was pushing a bike across a darkened boulevard, outside a crosswalk, when Uber’s Volvo hit her. Onboard video shows Herzberg looking back toward the Volvo only a moment before it reached her. Uber reached an undisclosed financial settlement with Herzberg’s family.

Noah Zych, Uber’s head of system safety, said the company is “raising the bar” for what its self-driving cars must prove before returning to public roads. To do so, he said, Uber is putting them up against extremely challenging scenarios on test tracks.

“What happens if a vehicle pulls out in front of us at the very last minute? Or a bicycle runs a stop sign? Or a person comes out from behind a parked car?” Zych said.

“A lot of human drivers, I think, would struggle with consistently passing those tests. We’re working to make sure our system passes those tests as well,” he said.

Uber is also completing fixes that will allow the automatic emergency braking system on its cars to be used during driverless testing, Zych said. That reverses its approach in Tempe.

Zych said the goal is to earn trust by being transparen­t and making the safety improvemen­ts the company promises.

“But we recognize that just saying that isn’t going to necessarily be compelling,” he said. “Public sentiment and trust is also something that doesn’t come back, or come at all, overnight.”

Overall, Uber said it has incorporated a “new approach to handling uncertainty,” such as whether a vehicle will yield the right of way, “enabling the system to reason over many possible outcomes to ultimately come to a safe response.”
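The idea of reasoning over many possible outcomes can be pictured as checking each candidate maneuver against every behavior another actor might plausibly exhibit, and only committing to a maneuver that stays safe in all of them. A minimal, hypothetical sketch of that pattern follows; none of these names or rules come from Uber’s report.

```python
# Hypothetical sketch of reasoning over possible outcomes before acting.
# Not Uber's system; the actions, outcomes and safety rule are assumptions.

# Candidate maneuvers, ordered from most to least assertive.
ACTIONS = ["proceed", "slow_down", "stop"]

# Possible behaviors of the other actor at an intersection.
OUTCOMES = ["other_car_yields", "other_car_does_not_yield"]


def is_safe(action, outcome):
    """Toy rule: proceeding is only safe if the other car actually yields."""
    if action == "proceed":
        return outcome == "other_car_yields"
    return True  # slowing or stopping is treated as safe either way


def choose_action(actions, outcomes):
    """Pick the most assertive maneuver that stays safe under every outcome."""
    for action in actions:
        if all(is_safe(action, outcome) for outcome in outcomes):
            return action
    return "stop"  # fall back to the most conservative option


print(choose_action(ACTIONS, OUTCOMES))  # -> "slow_down"
```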

There’s also improved “object and actor detection” for ambiguous situations in which there’s low visibility or views are blocked.

But the company says it still has a long way to go. As for the management and oversight problems that allowed Uber to deploy flawed or not-fully-formed technology on public roads, the company says it has made progress.

It has upped the “cadence of executive-level reviews,” meaning senior leaders from Khosrowshahi on down are paying closer attention to self-driving safety, according to the company. Uber executives say that safety is now their core value, and that they have established an independent safety team “to provide appropriate checks and balances.”

Uber said it has created an anonymous reporting system so employees can raise safety concerns, and employees have already been using it.

It also laid off hundreds of backup drivers who were supposed to be the human safety net protecting the public from the risks of the developing technology. They have been replaced with “mission specialists” with more rigorous training, the company said.

But creating a deeper “safety culture” in an organization is time-consuming, according to an external inquiry commissioned by Uber and led by the law firm LeClairRyan and former NTSB Chairman Christopher Hart.

“A safety culture is not something that springs up ready-made from the organizational equivalent of a near-death experience,” the report concluded, quoting a prominent British expert in human error, James Reason. Instead, it emerges gradually from “practical and down-to-earth measures” and from “a process of collective learning.”

Uber has filed to resume testing its driverless vehicles in Pittsburgh.

Angelo Merendino AFP/Getty Images: A key internal recommendation in Uber’s disclosures cites the need for “improving the overall software system design,” akin to saying Uber’s robot car needed a better brain with sharper thinking.

Eric Risberg Associated Press: An Uber driverless car was involved in a fatal crash in Arizona this year. Above, one of the company’s cars in San Francisco in 2016.

Justin Sullivan Getty Images: Uber has applied to the Pennsylvania Department of Transportation to resume testing its driverless vehicles in Pittsburgh. Above, an Uber car in 2017.
