‘Huge risk’ to public from driverless cars
Police warn tech firms could be liable for crashes
DRIVERLESS cars pose ‘a huge risk to a huge number of people’, warns one of Scotland’s most senior police officers.
Autonomous vehicles, including some buses and taxis, could be on our roads in only two years.
But Scotland’s head of road policing warns that driverless technology will mean profound practical and legal consequences.
In a consultation document, Chief Superintendent Stewart Carle questioned whether the vehicles’ computers will be able to make appropriate and safe decisions.
He also warned that if a driverless car breaks road traffic laws, the software manufacturers may be liable for any criminal offence.
Police are particularly concerned about the limits of the cars’ human recognition technology.
Serious accidents could be caused if a car mistakes a waving binman wearing a high-vis jacket for a police officer trying to direct traffic.
To deal with crashes, Police Scotland said the force may have to set up a new investigative ‘black box’ unit to understand who – or what – is to blame.
Officers are also considering the possibility that software firms may be charged with causing some collisions, rather than blaming the person in the car.
Meanwhile, people riding in autonomous vehicles will no longer be called ‘drivers’ but ‘users-in-charge’, as they will neither fully drive the vehicle nor be simply passengers. ‘Users-in-charge’ will need a full driving licence and must stick to legal alcohol limits.
The police concerns were submitted to the Scottish Law Commission’s consultation on driverless cars. Mr Carle recommended that technology be developed so that it can react quickly to external changes and internal faults. A car must understand when a crash has happened and send a signal to a control room before pulling over.
To make sure passengers are as safe as possible, the car must also assess the weight of a passenger so it can provide appropriate seating and restraints. When there is an internal fault, the car must recognise it and refuse to move.
The submission read: ‘Mirrors and windscreens allow drivers to see and gather information about their surroundings upon which to base their decisions, automated vehicles use other sensors. If those sensors are obscured or faulty, then the vehicle itself should recognise this and take appropriate action, [such as] fail to move.’
Mr Carle, who said software failures have ‘the potential to pose a huge risk to a huge number of people’, also warned of teething problems with recognising humans.
He wrote: ‘[There is] probably not an easy solution to identify a police officer, bearing in mind the multiple variables in size, shape, clothing. Not all police officers use the same hand signals.
‘Also, some other individuals wear similar reflective jackets and headwear – a binman waving to a colleague could be misinterpreted by the vehicle. This appears to be a major technological challenge.’
The consultation added that the car and its creators could be subject to improvement notices, fines, suspension and withdrawal of their automated driving system (ADS) approval.