National Post (National Edition)
‘Were there oversights in the bigger picture?’
The U.S. National Transportation Safety Board and National Highway Traffic Safety Administration are investigating the incident.
Bryant Walker Smith, a professor at the University of South Carolina who studies legal aspects of autonomous driving, wrote shortly after Uber halted its self-driving program that the circumstances of the crash “suggested that something went wrong” and that “the lawfulness of the victim’s actions is only marginally relevant to the technical performance of Uber’s testing system.”
“What will be very important to understand is not just why this particular crash happened in terms of the interactions in the road, but also why it happened in terms of Uber’s design processes. Were there oversights in the bigger picture that led to this?” Smith said in an interview.
“If there are serious reasons to be concerned that this reflected a broader failure, then there will be pressure on the states that have been permissive (of AV testing) to think through that.”
While Smith wrote that developers and regulators are aware that tragedies may occur with automated driving systems, the Arizona fatality “was uncomfortably soon in the history of automated driving.”
The fatality also raises the question of whether governments are generally prepared for the testing of the still-developing technology, said Michael Ramsey of research firm Gartner Inc.
“Right now, I think the biggest impact of this is that it’s probably going to make governments assess whether they are actually ready to handle a similar situation like this,” he said.
“In situations where there is no safety driver to be interviewed, how are they going to deal with this and are they really ready to call a vehicle safe? How are they going to determine that? It reaffirms that this is a real concern, not just something theoretical.”
But Ramsey also expects that the fatality will not have a major impact on long-term development. He points to a 2016 fatality involving a Tesla Model S operating on Autopilot, which was investigated by U.S. authorities but ultimately did not impede development.
“I don’t think in the long term it’s going to have any effect. We already saw after the Tesla accident, which was not the same by any stretch of the imagination, there was no sign that it led to a reduction in investment or speed of development,” Ramsey said.
“I think that’s probably a good indicator of what’s going to happen here. This will be used as a proof point for people who have legitimate concerns, and it will cause a temporary pause in development.”
Barrie Kirk, executive director of the Canadian Automated Vehicles Centre of Excellence (CAVCOE), also believes the setbacks for autonomous vehicle development will be limited to the short term, with some developers hitting the pause button while waiting for results of Uber’s investigation.