
Autonomous car crashes: who — or what — is to blame?


THE PROMISE of driverless vehicle technology to reduce road fatalities hangs in the balance now as never before. Two recent deaths involving Uber and Tesla vehicles using driverless systems have raised the debate on safety to levels that threaten to significantly delay or derail adoption of the technology.

Uber has temporarily halted tests of self-driving cars after the latest crash, and so have Toyota and graphics chip manufacturer Nvidia, whose artificial intelligence technology helps power driverless cars. Arizona, where the Uber vehicle had its crash, has banned the company from testing its driverless cars in the state. And even before the latest crashes, California had introduced a permit process for autonomous vehicles with elaborate requirements.

The publicly available information on the two accidents does not clearly place the blame on either human error or technology. On the night of March 18, as Elaine Herzberg crossed a six-lane road in Tempe, Arizona, pushing a bicycle, she was fatally struck by a Volvo SUV that had been modified to use driverless technology. In what is believed to be the first pedestrian fatality involving autonomous vehicle technology, the sensors in the SUV failed to detect Herzberg in time to slow the vehicle from its 38 mph speed, and police videos showed that the safety driver in the car was apparently distracted.

Five days later, a Tesla SUV with its driverless technology in Autopilot mode crashed into a road divider in Mountain View, Calif., killing its driver, Apple engineer Walter Huang. Incidentally, Huang had earlier complained to a Tesla dealership that the vehicle in Autopilot mode had veered towards the same barrier on multiple occasions, according to an ABC report. Tesla blamed the severity of the crash on a missing piece of the road divider. In a later update, Tesla seemed to blame human error as well.

Wharton management professor John Paul MacDuffie, who is also director of the Program on Vehicle and Mobility Innovation at the school’s Mack Institute for Innovation Management, put the accidents in the context of the evolution curve of driverless technology. “We’re early days yet, and there have been very few of these accidents,” he said. “[The Uber crash] may have been only the second fatal accident and the first of a pedestrian [involving driverless vehicles].”

Mr. MacDuffie observed that while each such death is shocking and tragic, such incidents will become more common. “We’ve all been telling ourselves that inevitably in the testing and improvement of autonomous vehicles there are going to be people injured and killed, and we know that human drivers kill other humans all the time.”

WHAT WENT WRONG?

“It seems like everything that could go wrong went wrong” in the Uber case, said Constantine (Costa) Samaras, assistant professor at Carnegie Mellon University and director of the university’s Center for Engineering and Resilience for Climate Adaptation. He noted that the sensors on the vehicle should have seen the pedestrian, that the backup or safety driver did not have hands on the wheel, and that no brakes were applied by either the driver or the car. Both he and Mr. MacDuffie said the final government investigation report on the accident will clarify what precisely caused it.

In the Tesla crash, Autopilot had been engaged and gave the driver warnings of a potential collision, but the driver failed to take control of the vehicle. Mr. Samaras said that drivers who use a robotic system or artificial intelligence to improve their driving may be able to prevent crashes. “But when humans are the backup systems, we’re pretty bad at doing that,” he said. “This is a challenge for this transition to automation, where there’s this muddled mixture of human responsibility and robot responsibility.”

Messrs. MacDuffie and Samaras charted the path ahead for the development of autonomous vehicle technology on the Knowledge@Wharton show on SiriusXM channel 111.

A HUMAN PROBLEM

Distracted driving is already showing up tellingly in the statistics. Mr. MacDuffie pointed out that deaths from vehicle accidents had consistently decreased from the post-war period until recently, but began increasing after 2015. US roads saw 37,461 fatalities in 2016, up 5.6% from 2015, according to a report from the National Highway Traffic Safety Administration. He said indications are that fatalities increased further in 2017, for which data is not yet available.

Added Mr. Samaras: “The more than 37,000 road fatalities last year would be the same if a fully loaded 747 plane [were to] crash every couple of days.”

Automated driving technology has been expected to help reduce the incidence of those accidents, Mr. MacDuffie said. However, “any situation where you’re expecting the human and the computer algorithms to share control of the car, it is very tricky to hand that control back and forth.” He noted that Waymo, the Alphabet subsidiary pursuing driverless technology, has consistently argued against such systems where control of a vehicle is handed back and forth between the driver and the algorithms. The company has instead pushed for a perfected automation technology that totally eliminates the role of a human driver.

SPEEDING PAST REGULATION

Mr. MacDuffie wondered if, in the case of Uber, a flawed corporate culture was responsible in some part. “It probably could have happened to anyone, but some of the ways Uber has approached this fits other parts of their narrative recently, probably unfortunately for them,” he said. He noted that Uber began testing its driverless vehicles in San Francisco without the requisite permits in December 2016, before California halted them a week later. Uber then took its trials to next-door Arizona, which promised less regulation and a more business-friendly environment, he added.

Mr. MacDuffie pointed to other changes Uber made that he found disconcerting: in tests, it replaced the practice of having two drivers with one driver; it turned off the safety equipment in the cars while testing software; and it had the LiDAR (Light Detection and Ranging) sensors installed only on the top of the vehicle and not on the sides as well, unlike other vehicle manufacturers. Even though full details will emerge only after the investigation report, “there are some aspects of the story that make it look like Uber rushing in and cutting corners may have been part of why they had a failure in this particular incident,” he added.

BEING COMPASSIONATE AND RESPONSIBLE

According to Mr. Samaras, Uber has taken the right step in suspending tests with driverless cars until more information is available. “This is a business risk to both Uber and Tesla,” he said. “If they are seen to be not compassionate but also not responsible in making sure that these risks are reduced, the future of their business in automation is in question.”

Not surprisingly, the Uber and Tesla crashes presented an opportunity for rivals to promote themselves. Mr. MacDuffie noted that Waymo CEO John Krafcik claimed after the Uber crash that his company’s technology would have spotted the pedestrian and averted the accident. Similarly, the CEO of Mobileye, an Intel-owned company that makes sensors for autonomous vehicles, claimed in a blog post that his company’s technology was superior.

Waymo went further to demonstrate that it fully believed in the safety of its technology by announcing plans to order 20,000 electric Jaguars for its forthcoming launch of a robotaxi service in the US. “I’m sure there was a strategic calculus there of making sure people didn’t automatically think all automated vehicles were dangerous and to make a bold claim that ‘Ours is safer, and we’re moving ahead quickly,’” said Mr. MacDuffie.

THE ROAD AHEAD

Mr. Samaras called road fatalities “a public health crisis” that automation could help address. “The challenge here is — how do we muddle through this transition period?” The latest incidents have made the road ahead for driverless vehicle technology far less clear. “It is important that these companies test in the real world as well as in simulation,” said Mr. Samaras. “We can’t just not test on the streets if we want to have this technology for the benefit of society. [However,] the challenge here is that you have PR, engineering, policy, regulation and risk – all kind of coming together on this [project] on public streets.” The solution, he said, lies in making the data on the accidents publicly available and learning from it.

Mr. MacDuffie agreed that it is important that the technology be tested in real-world conditions. “It’s no coincidence that all these companies are finding Arizona and the suburbs around Phoenix very good places to test because they are flat, nice wide roads, with simple intersections, dry weather, sunny weather, a little rain and a little fog,” he said. “Those are very good conditions. The tougher conditions that exist in other places will need to be tested out.”

REGULATORY MOVES

Well before the latest crashes, state regulators had been progressively tightening rules on testing of automated vehicle technologies. Mr. MacDuffie pointed out that California, which had been criticized for a relatively lax regulatory regime governing autonomous vehicles, now requires companies conducting the tests to report every time there is a disengagement of the automated controls for the driver to take over. Records of such disengagement incidents are made public, he noted.

Amid the moves on the regulatory front, a bill stalled in Congress would exempt companies testing automated vehicles from federal vehicle safety standards, Mr. Samaras said. “The claim is that to outfit these vehicles with the proper safety features [such as] putting a car seat in the back might not be needed if it’s going to just be a test vehicle,” he added. “It’s a pretty laissez-faire kind of bill and in the name of innovation it’s saying, ‘Let’s move forward quickly with this and let’s not slow it down.’ What happens in the regulatory discussion is worth watching.”

