San Francisco Chronicle

Self-driving Uber saw pedestrian but didn’t brake

By Carolyn Said

An Uber self-driving car that struck and killed a woman in Arizona in March saw her in the roadway but did not brake, according to a preliminary report on the crash released Thursday.

Nor did the autonomous system alert the vehicle’s operator to the danger. The operator was behind the wheel but looking down shortly before the crash.

While the report by the National Transportation Safety Board did not assign blame for the March 18 death of Elaine Herzberg, experts said the conclusions are stark.

“Uber’s systems were clearly at fault,” said Raj Rajkumar, a Carnegie Mellon professor and leader of its autonomous vehicle research. “They had problems on the technical side, they had not adequately tested, and they had problems in training the (human) operators.”

The report found that while Uber’s automated driving system could handle regular braking, it lacked emergency braking abilities, a major gap. Instead, Uber relied on a human backup driver to take over if needed. The catch, however, was that “the system is not designed to alert the operator,” according to the report.

In other words, Uber’s autonomous systems saw Herzberg but had no way of slamming on the brakes or alerting the driver.

Uber’s reasoning for barring the autonomous system from hitting the emergency brakes, according to the safety board, was that it made for “erratic vehicle behavior.”

Dashcam video released in March showed that the backup driver was looking down until a split-second before the impact. The driver told the NTSB that she was “monitoring the self-driving interface,” not using her cell phone.

Uber suspended all self-driving tests after the crash, the first pedestrian fatality caused by a self-driving car. This week the company said it will cease robot-vehicle operations in Arizona while resuming its self-driving taxi service in Pittsburgh this summer. Its self-driving engineers work in Pittsburgh, San Francisco and Toronto.

Uber’s modified Volvo XC90 sport utility vehicle struck and killed Herzberg as she wheeled a bicycle across an unlit section of the road in Tempe. The report said Herzberg tested positive for methamphetamine and marijuana.

The self-driving systems, aided by radar and lidar, observed Herzberg six seconds before impact but struggled to figure out what they were seeing, classifying her “as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path,” the report said.

At 1.3 seconds before impact, the system “determined that emergency braking was needed to mitigate a collision,” the report said, but that information went nowhere.

“According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control to reduce the potential for erratic vehicle behavior,” the report said.

Applying the brakes even at that last second could have saved Herzberg’s life, outside observers said.

At the car’s speed of 38 mph (about 56 feet per second), it was roughly 72 feet from Herzberg when the system determined, 1.3 seconds before the fatal collision, that braking was needed. That is close to the standard braking distance for a vehicle traveling at that speed.

“It turns out that this (72 feet) is exactly the stopping distance at that speed,” said Brad Templeton, a Silicon Valley consultant and entrepreneur who writes a blog on robot cars. “If the car applied hard brakes 1.3 seconds in advance, it would have gently nudged her with the bumper, sad to say.”
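Templeton’s figure can be checked with a quick back-of-the-envelope calculation. The sketch below assumes a hard-braking deceleration of about 0.7 g, a typical value for dry pavement that does not come from the NTSB report itself:

```python
# Check the stopping-distance arithmetic cited in the article.
# Assumption: a constant deceleration of roughly 0.7 g, a common
# hard-braking figure on dry pavement (not a number from the report).

MPH_TO_FPS = 5280 / 3600       # 1 mph = ~1.467 ft/s
G = 32.2                       # gravitational acceleration, ft/s^2

speed_fps = 38 * MPH_TO_FPS    # ~55.7 ft/s

# Distance the car covers in the final 1.3 seconds with no braking.
travel_dist = speed_fps * 1.3  # ~72.5 ft, matching the article's 72 feet

# Hard-braking stopping distance at 0.7 g: v^2 / (2 * a).
stop_dist = speed_fps ** 2 / (2 * 0.7 * G)  # ~69 ft

print(round(travel_dist, 1), round(stop_dist, 1))
```

The two numbers nearly coincide, which is why Templeton concluded that braking at the 1.3-second mark would have brought the car almost to a stop by the point of impact.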

Uber said in a statement that it has “initiated our own safety review of our self-driving vehicles program” and has hired former NTSB Chairman Christopher Hart to advise on safety. “We look forward to sharing more on the changes we’ll make in the coming weeks,” it said.

Uber’s decision to disable the autonomous car’s emergency braking “implies that they must have a lot of false alarm problems, which means that the system is far from maturity,” Steven Shladover, a UC Berkeley research engineer and pioneer in self-driving research, said in an email.

The report raises numerous questions, he said, including why the system didn’t at least start to slow down once it detected an obstacle in its path, and why it was not programmed to alert the test driver.

Uber had also turned off the Volvo’s factory-equipped features, including collision avoidance with automatic emergency braking and driver-alertness detection.

“It’s disturbing that they did not take advantage of the Volvo safety systems ... which are among the most sophisticated on any production vehicle,” Shladover said.

Jim McPherson, a Benicia lawyer who runs the Safe Self Drive consultancy, said Uber appears to rely too heavily on its backup drivers. While most self-driving companies have two people in their test cars, Uber has only one.

“Recognizing, catching and intervening in automation mistakes demands constant vigilance, instant reaction times and ability to know when automation is failing,” he said. “It may be more than we should expect of human safety drivers.”

Uber “had two fail-safes, one of which they disabled, and one of which (the driver) was lulled into a sense of not acting,” McPherson said. “She was asked to step in when automation failed, but how was she to know when that happened? The system as a whole is faulty.”

Michael Ramsey, research director at Gartner, said it is a sad irony that existing, noncomplex technology such as automatic braking systems could have prevented the collision.

“The report gives a template to what to do differently — the opposite of what Uber did,” Ramsey said.
