Montreal Gazette

The big moral dilemma facing self-driving cars

No technology is without shortcomings, potentially fatal consequences

- STEVEN OVERLY

How many people could self-driving cars kill before we would no longer tolerate them?

This once-hypothetical question is now taking on greater urgency, particularly among policymakers in Washington. The promise of autonomous vehicles is that they will make our roads safer and more efficient, but no technology is without its shortcomings and unintended consequences — in this instance, potentially fatal consequences.

“What if we can build a car that’s 10 times as safe, which means 3,500 people die on the roads each year. Would we accept that?” asks John Hanson, a spokesman for the Toyota Research Institute, which is developing the automaker’s self-driving technology.

“A lot of people say, ‘If I could save one life it would be worth it.’ But in a practical manner, though, we don’t think that would be acceptable,” Hanson added.

Members of Congress are beginning to consider legislation that enables the broader adoption of self-driving technology without compromising safety. At a House subcommittee hearing last week, for example, lawmakers and industry leaders alike grappled with the question of whether machines need only drive better than humans to win our trust.

The Department of Transportation, for its part, published its first guidelines for self-driving vehicles last year in an effort to keep pace with automakers that hope to unleash the cars on the road in the next several years. Ford, for example, has set a goal of releasing an autonomous vehicle fleet by 2021.

More than 35,000 people were killed in car collisions in the United States in 2015, according to the National Highway Traffic Safety Administration. The agency estimates 94 per cent of those wrecks were the result of human error and poor decision-making, including speeding and impaired driving.

Self-driving enthusiasts assert that the technology could make those deaths a misfortune of the past. But humans are not entirely rational when it comes to fear-based decision-making. It’s the reason people are afraid of shark attacks or plane crashes, when the odds of either event are exceptionally low.

Harvard University professor Calestous Juma draws a parallel between self-driving cars and home refrigerators, which gained popularity in U.S. households in the 1920s and ’30s. Although scientists understood that cold storage could cut down on food-borne illnesses, reports of refrigeration equipment catching fire or leaking toxic gas made the public wary.

Americans eventually adopted the now-ubiquitous household appliance thanks in large part to the U.S. Department of Agriculture — which advocated for the health benefits of refrigeration and educated against unfounded concerns about the technology’s safety, Juma writes in his book, Innovation and Its Enemies: Why People Resist New Technologies.

People are also more inclined to forgive mistakes made by humans than machines, Gill Pratt, the chief executive of the Toyota Research Institute, told lawmakers on Capitol Hill last week.

“The artificial intelligence systems on which autonomous vehicle technology will depend are presently and unavoidably imperfect,” Pratt told lawmakers at a House subcommittee hearing. “So, the question is ‘how safe is safe enough’ for this technology to be deployed.”

As a society, we understand human limitations because we live with them daily, said Iyad Rahwan, an associate professor at the Massachusetts Institute of Technology Media Lab who has studied the social dilemmas presented by autonomous vehicles. While we may assign blame or seek retribution — by sending a drunk driver to prison, for example — the capacity for human failure is not hard to understand or empathize with. The same is not true for machines, he said.

“We penalize them and distrust them more when they make mistakes,” Rahwan said. “It comes down to us not having proper mental models of what machines can and cannot do.”

JARED WICKERHAM/THE ASSOCIATED PRESS — A self-driving Ford Fusion hybrid model is test driven last year in Pittsburgh, Pa. Ford has set a goal of releasing an autonomous vehicle fleet by 2021.
