Windsor Star

When your brakes fail, who lives and who dies?

On the verge of autonomous cars, moral questions arise, Lorraine Sommerfeld says.

- Driving.ca

Way back in 2016, Science magazine published a study about how fully autonomous cars would deal with the biggest question of all: Who dies?

“Most people want to live in a world where cars will minimize casualties,” said co-author and MIT professor Iyad Rahwan. “But everybody wants their own car to protect them at all costs.” Engineering, meet ethics. Oh, I’m not saying engineering has been without ethics all this time. It’s just that this is the first time in my memory that ordinary people will have to contemplate over and over a question I previously only thought about when an old university professor of mine asked us to consider who to throw overboard when a boatful of people was doomed. Morals and Ethics at McMaster was a fun class.

Now Nature magazine has published the results of MIT’s Moral Machine, an online brain twist you might not want to play with anyone looking over your shoulder. They put little skulls on the proposed dead things; it’s creepy and awesome. It’s a version of the Trolley Problem — scenarios where you choose who lives and dies when the brakes fail.

It forces you to quantify humans by age, gender, and general worth to society. Pregnant women and children, pedestrians crossing against the light, whether they’re athletes or criminals, cats or dogs, fat people, old people, the homeless; it’s a fairly horrible quiz. People seem to have a general squick factor about killing pedestrians — killing one to save 10, my man overboard scenario — but ultimately, if the brakes fail and that someone is in the car, they opt to save themselves, with old people being the most expendable. Told you it was horrible, but 40 million decisions can’t be wrong. Can they?

As we approach the intersection of autonomous cars as a goal and autonomous cars as an actual thing, manufacturers are being forced to deal with what people want their cars to do. Or maybe, what people say they want their cars to do. It’s going to be a bit of a U-turn in some ways; we’ve been sold for decades on the promise that improved safety features in cars will save the lives of occupants. What if the better solution is to let the driver die? Someone who just paid tens of thousands of dollars for a vehicle might hesitate. A lot.

The Moral Machine results show exactly why Mercedes-Benz executive Christoph von Hugo landed in hot water at the 2016 Paris Motor Show. Car and Driver quoted him saying Mercedes’ autonomous vehicles would save the occupants first — “save the ones you know you can save,” he said. And he knows his cars can save the occupants. He also added that because of the vast increase in overall road safety — 96 per cent of crashes are the result of driver error — this point would become almost moot.

We’re picturing our current chaotic road conditions and imagining a computer deciding who lives and who dies, and asking who programmed it to make that decision. Fully autonomous cars would eliminate the vast majority of crashes, period — when we get to that point. Mercedes-Benz walked back the statement — the optics are, admittedly, less than ideal — stating afterward that “neither programmers nor automated systems are entitled to weigh the value of human lives,” and that the company is “not legally allowed to favour one life over another in Germany and other nations.” Except that is exactly what we are asking manufacturers to do. Engineers now have to be philosophers, moralists and ethicists.

In whatever imaginary year every car on the road is autonomous and communicating with every other car, and with all the infrastructure, we’ll be at the blessed near-negligible crash situation. Who is going to argue that cleaning up the carnage on our roads wouldn’t be a good thing? Certainly nobody who has ever lost a loved one or had their lives forever altered by an impaired driver, a stupid one, or someone having a heart attack. There are 160,000 vehicle crashes in Canada each year, and more than 2,800 deaths. Look at that number again.

No, the problem isn’t what a great thing it would be to eradicate the human and social costs. It’s about deciding who’s deciding until we get there.

Take the Moral Machine online quiz (moralmachine.mit.edu). I saved a lot of dogs. And I don’t even like dogs very much.

Should an autonomous vehicle prioritize safeguarding its occupants, or those outside the vehicle?
