The Scotsman

Driverless cars could decide who gets hit in crashes, lawyers warn

● Owners could choose their own ‘ethical system’ for autonomous cars

By SARAH WARD newsdeskts@scotsman.com

Driverless cars could be programmed to make “moral” decisions about who gets hit in a collision, lawyers have warned.

With autonomous vehicles expected to be on the roads within two years, legal experts have raised questions about the technology behind them.

And some have suggested that an “ethical system” could become as routine as choosing a paint colour.

Cars in the future may be programmed to weigh up the value of individual lives, according to a report from the Faculty of Advocates.

The submission was part of a consultation by the Scottish Law Commission and the Law Commission of England and Wales, which are conducting a three-year review of the laws governing self-driving cars.

One theoretical circumstance would be whether the car should hit Albert Einstein or a group of criminals.

The report said: “Persons generally are entitled to expect that a self-driving vehicle will not collide with and injure them. However, in reality, the situation is much more nuanced.”

The Faculty of Advocates envisages a future in which owners may even be able to choose the “morality” of their car.

The report added: “The purchaser might be able to specify the ethical system with which the car is programmed… as well as specifying the paint colour and interior trim.”

The legal implications of driverless technologies are considerable, the report claims.

Photo caption: Driverless cars – here being tested in Singapore – could be programmed to make ‘moral’ decisions over who to hit in an accident.

But while they could be governed by automated programmes based on predictable algorithms, artificial intelligence experts are developing neural networks, systems which make their own decisions.

Experts believe that new offences are likely to be necessary to cover the systems set up by companies to control driverless vehicles, and to hold them to account in the case of errors, malfunctions and accidents.

The Faculty of Advocates said it is even feasible that judgements could be made about the value of individual lives.

The “trolley problem” asks whether a person at the controls of a runaway trolley-car heading for five people on the tracks should pull the lever to switch on to a track where only one person will be hit.

An autonomous car might not have enough information to make a choice between a scientist and criminals. But the report said that could change in a country such as China, where the government is establishing a “social credit” score for its citizens.

It added: “We cannot conceive of any circumstances whatever where such a system could be regarded as acceptable in a free, open and democratic society.”

