The Scottish Mail on Sunday

‘Huge risk’ to public from driverless cars

Police warn tech firms could be liable for crashes

- By Georgia Edkins

DRIVERLESS cars pose ‘a huge risk to a huge number of people’, warns one of Scotland’s most senior police officers.

Autonomous vehicles, including some buses and taxis, could be on our roads in only two years.

But Scotland’s head of road policing warns that driverless technology will mean profound practical and legal consequences.

In a consultation document, Chief Superintendent Stewart Carle questioned whether the vehicles’ computers will be able to make appropriate and safe decisions.

He also warned that if a driverless car breaks road traffic laws, the software manufacturers may be liable for any criminal offence.

Police are particularly concerned about the limits of the cars’ human recognition technology.

Serious accidents could be caused if a car mistakes a waving binman wearing a high-vis jacket for a police officer trying to direct traffic. To deal with crashes, Police Scotland said the force may have to set up a new investigative ‘black box’ unit to understand who – or what – is to blame.

Officers are also considering the possibility that software firms may be charged with causing some collisions, rather than blaming the person in the car.

Meanwhile, people riding in autonomous vehicles will no longer be called ‘drivers’ but ‘users-in-charge’, as they will neither fully drive the vehicle nor be simply a passenger. ‘Users-in-charge’ will need a full driving licence and must stick to legal alcohol limits.

The police concerns were submitted to the Scottish Law Commission’s consultation on driverless cars. Mr Carle recommended that the technology be developed so that it can react quickly to external changes and internal faults. A car must understand when a crash has happened and send a signal to a control room before pulling over.

To make sure passengers are as safe as possible, the car must also assess the weight of a passenger so that it can provide appropriate seating and restraints. When there is an internal fault, the car must recognise it and refuse to move.

The submission read: ‘Mirrors and windscreens allow drivers to see and gather information about their surroundings upon which to base their decisions; automated vehicles use other sensors. If those sensors are obscured or faulty, then the vehicle itself should recognise this and take appropriate action, [such as] fail to move.’

Mr Carle, who said software failures have ‘the potential to pose a huge risk to a huge number of people’, also warned of teething problems with recognising humans.

He wrote: ‘[There is] probably not an easy solution to identify a police officer, bearing in mind the multiple variables in size, shape and clothing. Not all police officers use the same hand signals.

‘Also, some other individuals wear similar reflective jackets and headwear – a binman waving to a colleague could be misinterpreted by the vehicle. This appears to be a major technological challenge.’

The consultation added that the car and its creators could be subject to improvement notices, fines, and suspension and withdrawal of their automated driving system (ADS) approval.

‘A binman waving could be misinterpreted’