Daily Mail

Could hackers take control of your wheel while you’re driving?

- By Ray Massey

YOU are driving down the motorway in your brand new Mercedes when it suddenly swerves. Before you know it the car is veering out of control and there is nothing you can do.

This nightmare scenario is one of the great fears with the imminent launch of driverless vehicles. But could someone really hack into your car and send it careering across the road?

British spy chiefs warned just this year that the roll-out of next-generation super-fast 5G mobile and internet services — central to driverless cars — could be hacked by ‘terrorists, hostile states and serious criminals’.

GCHQ head Jeremy Fleming said Chinese companies are set to play a leading role in providing the technology. And some senior and respected motor industry chiefs have warned of the danger of hacked autonomous cars becoming ‘bombs on wheels’.

In 2015, a pair of cyber-security experts were able to remotely hack into a 2014 Jeep Cherokee.

By going via the entertainment system, they disabled some of the 4x4’s engine functions and interior controls such as air con, locks and radio. Parent company Fiat Chrysler later recalled 1.4 million of its vehicles.

But supporters say such concerns are being over-egged — for now.

Much of the technology needed to make cars drive themselves is already incorporated in many of the cars we use now.

Cars can park, steer to stay within a lane, and even overtake a car in front all by themselves — though warnings sound if the driver’s hands are off the steering wheel for too long.

I have driven ‘hands free’ BMWs, Mercedes-Benzes, Volvos and even a Nissan Qashqai on test tracks and private land, letting the self-drive technology do the work.

The motor industry cites five ‘levels’ of autonomy through which it is progressing. We’re currently around Level 3, and moving towards Level 4 by 2021 — when drivers really can lie back, take their mind off the driving and let the car take control.

The holy grail is a car that can be switched to ‘fully autonomous’ and drive itself all the time, until the driver decides to resume control. That’s Level 5.

Government ministers have boasted how they want to make the UK the ‘self-driving capital’ of Europe, and even the world.

Maverick entrepreneur Elon Musk has spearheaded the drive for electric and autonomous cars with his Tesla range. But mainstream manufacturers are already advancing quickly.

Self-driving trials are going on now in Coventry, Milton Keynes and London involving the likes of Volvo and Jaguar Land Rover. Apple and Google are also testing the water.

Supporters point out that most car accidents are due to human error and the most dangerous part of any car is ‘the nut behind the steering wheel’ — the driver.

They can be tired, inattentive, drunk or just not very good.

An autonomous car, by contrast, is programmed to get it right every time.

Set against that are a series of high-profile accidents involving self-driving technology, including the first motorist killed in a self-driving car in 2016, when the Tesla Model S he was travelling in, operating in ‘Autopilot’ mode, ploughed into a truck. And in March this year cyclist Elaine Herzberg was killed in a collision with an autonomous Uber taxi in Arizona.

Motoring pundit Jeremy Clarkson has challenged the bosses of the German car giant Audi, whose A8 features self-drive technology, to try it along a precipice with a 1,000ft drop: ‘You drive it over the Death Road in Bolivia and I’ll buy one.’

Even authoritative bodies such as the UK’s Institute of the Motor Industry (IMI) have warned about ‘the potentially apocalyptic ramifications of automation’.

A spokesman for the IMI said: ‘It’s a serious situation. Cars could be turned into bombs on wheels that are remotely controlled by terrorists.

‘You don’t have to have a terrorist on board. It’s risk-free to them.’

The fatal lorry hijackings in France and Germany, and the car attacks on Parliament, showed how vehicles could be used ‘to terrible and lethal effect’ as ‘deadly weapons’, he said.

Professor Jim Saker at Loughborough University warned in a study commissioned by the IMI: ‘The prospect that an autonomous car can be driven loaded with explosives at a target raises major concerns for counter-terrorism.’

There are moral and legal dilemmas to reconcile.

Faced with hitting a mother pushing a pram, swerving left into the path of a pensioner, or right over a cliff edge, what should the self-driving car be programmed to do? Should it also seek to protect the occupants at all cost?

And who is legally responsible if it crashes?
