Lethbridge Herald

SELF-DRIVING REALITY CHECK

AT SOME POINT, AUTONOMOUS VEHICLES BECOME PART OF THE REAL WORLD, BUT WHAT HAPPENS WHEN THEY CROSS THAT THRESHOLD?

- Tom Jensen, WHEELBASE MEDIA

Autonomous vehicles have one main purpose: to reduce or eliminate collisions caused by human error, which is behind most collisions on the road.

Ironically, a pedestrian in the Phoenix suburb of Tempe, Ariz., has died after being hit by a self-driving car, the first reported instance of such a tragedy.

Tempe Police said that Elaine Herzberg, 49, of Mesa was crossing a Tempe street when she was struck and killed by an Uber self-driving Volvo, which was travelling at about 40 mph (65 km/h).

Herzberg was not in a crosswalk and was walking a bicycle across the dimly lit road when she was killed. The collision occurred at about 10 p.m. on March 18.

At the time, the Uber car was in self-driving mode, but had a backup driver, who was identified as Rafaela Vasquez, 44.

The Tempe police department posted a video of the collision on its Twitter account, @TempePolice, and said it was actively investigating the incident. The video came from cameras mounted in the car and clearly shows the contact between the car and the victim, as well as the reaction of the driver when the impact occurs.

In the video, Vasquez did not appear to attempt to brake or swerve to avoid the contact, but also had little opportunity to react.

Tempe Police Chief Sylvia Moir told the San Francisco Chronicle daily newspaper that the victim came out of the shadows directly in front of the Uber Volvo.

“It's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she (Herzberg) came from the shadows right into the roadway,” Moir told the newspaper.

The U.S. National Transportation Safety Board is also investigating. The federal agency sent a three-person team to Tempe, and in a news release said their “investigation will address the vehicle’s interaction with the environment, other vehicles and vulnerable road users such as pedestrians and bicyclists.

The team … will examine vehicle factors, human performance and electronic recorders.”

After the incident, Uber CEO Dara Khosrowshahi tweeted, “Some incredibly sad news out of Arizona. We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.”

San Francisco-based Uber temporarily suspended testing of its self-driving cars in Phoenix, Pittsburgh, Pa., Toronto, Ont., and San Francisco, Calif., following the incident. The ride-hailing company also issued a statement saying it would cooperate fully with local authorities investigating the incident.

Prior to the Tempe crash, the most publicized incident involving a self-driving car occurred on May 7, 2016, when former U.S. Navy SEAL Joshua Brown, 40, of Canton, Ohio, died in a passenger car in “Autopilot” mode. Brown’s Tesla Model S hit a tractor-trailer after he left a family outing at Walt Disney World in Orlando, Fla.

But the Arizona crash was the first to claim the life of a pedestrian. Predictably, the Tempe incident set off a firestorm of controversy about the safety of autonomous cars, a debate likely to last for months, if not years.

Akshay Anand, an analyst at Kelley Blue Book, told USAToday.com, “There will no doubt be an exhaustive investigation. It’s clear that this has the potential to severely impact public perceptions of autonomous technology, and should be handled with utmost prudence by regulators, authorities and the industry alike.”

What’s next in this story? The conversation has already begun to swirl around the risks of testing on public roads and whether it should be relegated to closed courses in controlled environments. But at some point, autonomous cars will have to drive on public roads, where life is unpredictable and can throw curve balls that can’t be dreamed up in a lab. What will happen then?
