The Jerusalem Post

Could self-driving vehicles make ethical decisions?


Jerusalem-based Mobileye and other companies developing self-driving vehicles may have to cope with the eventuality that robots could make ethical decisions on whom to save and whom to sacrifice in a car accident.

This conclusion about autonomous vehicles comes from a study by German researchers at the Institute of Cognitive Science at the University of Osnabrück, just published in Frontiers in Behavioral Neuroscience. Described by the authors as “groundbreaking” research, the study has strong implications if human ethical decisions can actually be made by machines.

Contrary to previous thinking, the researchers found for the first time that human morality can be modeled – meaning that machine-based moral decisions are, in principle, possible – by using immersive virtual reality to study human behavior in simulated road traffic scenarios.

The study participants were asked to drive a car through a typical suburban neighborhood on a foggy day, then were unexpectedly presented with unavoidable dilemmas involving inanimate objects, animals and humans, and had to decide which of them should be spared from injury or death.

Cognitive scientist Leon Sütfeld, the study’s lead author, wrote that until now it had been assumed that moral decisions are strongly dependent on context and therefore could not be modeled or described algorithmically. “But we found quite the opposite.

“Human behavior in dilemma situations can be modeled by a rather simple value-of-life-based model that is attributed by the participant to every human, animal, or inanimate object.” This, he continued, implies that human moral behavior can be well described by algorithms that could be used by machines as well.
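
To make the idea concrete, the sketch below shows what such a value-of-life model might look like in code. The names and numbers are illustrative assumptions, not figures from the study: each potential victim is assigned a single scalar value, and the simulated car steers toward the side whose total value is lowest.

    # A minimal, hypothetical sketch of a value-of-life model as the study
    # describes it: every human, animal, or object carries one scalar value,
    # and the car steers toward the lane whose total value is lowest.
    # All names and numbers are illustrative assumptions, not the authors'
    # actual parameters.

    LIFE_VALUES = {
        "adult": 1.0,       # assumed baseline value for an adult pedestrian
        "child": 1.2,       # assumed: a participant might weight children higher
        "dog": 0.3,
        "trash_can": 0.01,
    }

    def lane_cost(obstacles):
        """Total value of life lost by steering into this lane."""
        return sum(LIFE_VALUES[o] for o in obstacles)

    def choose_lane(left, right):
        """Pick the lane with the smaller total value-of-life cost."""
        return "left" if lane_cost(left) < lane_cost(right) else "right"

    # Example dilemma: a dog in the left lane, an adult in the right lane.
    print(choose_lane(left=["dog"], right=["adult"]))  # -> "left"

Whether a single number per life is an adequate summary of human moral judgment is, of course, exactly what the ensuing debate is about.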

The study’s findings may have major implications in the debate around the “behavior” of self-driving cars and other machines in unavoidable situations. Prof. Gordon Pipa, another senior author of the study, said that since it now seems to be possible that machines can be programmed to make human-like moral decisions, it is crucial that society engages in an urgent and serious debate.

“We need to ask whether autonomous systems should adopt moral judgments,” Pipa insisted.

For example, a child running onto the road would be classified as significantly involved in creating the risk, and thus as less deserving of being saved than an uninvolved bystander on the footpath. “But is this a moral value held by most people, and how large is the scope for interpretation?” the researchers asked.
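
Under such a rule, involvement in creating the risk would act as a discount on a person’s protected value. The sketch below extends the hypothetical model above with an assumed penalty factor, only to show how much hinges on that scope for interpretation.

    # Hypothetical extension of the sketch above: discount the protected
    # value of anyone who helped create the danger. The 0.5 factor is an
    # assumption chosen for illustration, not a figure from the study.

    RISK_DISCOUNT = 0.5

    def adjusted_value(base_value, created_risk):
        """Lower the value of someone who contributed to the risk."""
        return base_value * RISK_DISCOUNT if created_risk else base_value

    child = adjusted_value(1.2, created_risk=True)       # 0.6
    bystander = adjusted_value(1.0, created_risk=False)  # 1.0
    # The rule now favors the bystander over the child - precisely the
    # kind of judgment the researchers say society must examine.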

The issue could also be relevant to autonomous weapons systems deployed by robots in war. The study’s authors say that autonomous cars are just the beginning, as robots in hospitals and other artificial intelligence systems become more common.

They warn that we are now at the beginning of a new era that needs clear rules; otherwise, machines may start making decisions without us.
