Irish Daily Mail

I would rather take my chances in a computer-driven car than take a lift from Ant McPartlin

- THE MATT COOPER COLUMN

TWO stories involving car crashes leapt off the pages of this newspaper yesterday. One involved a fatality – an American woman who was struck by an autonomous (or so-called self-driving) Uber car.

The other involved a drunken celebrity, Ant McPartlin, who was lucky he didn’t kill his own mother or himself in his own car, or a family of three, including a three-year-old girl, in one of the vehicles he hit.

The reaction has been very interesting. Uber has had to take its test models off the roads amid howls of outrage about the supposedly excessive dangers of allowing cars onto the roads with artificial intelligence, instead of humans, in charge. McPartlin, correctly, is being castigated for his irresponsibility, but nobody is suggesting that humans not be allowed to drive cars as a result of his example – one which is far from unusual.

It is estimated, with reasonable certainty through the collation of statistics, that more than one million people across the world die each year as a result of cars colliding with each other, with people or with hard objects. Many times that number suffer serious injury. Nobody suggests banning people-driven cars as a consequence.

Attention

In the US, on any given day, an average of 16 pedestrians die on the roads. In other words, last Sunday there were likely to have been 15 other pedestrian deaths caused by human-driven cars, besides the one that has attracted global attention and the immediate reaction of regulators.

Admittedly, the number of self-driving cars on the road is a very tiny fraction of all traffic, and the distance they have covered is a tiny proportion of total miles driven. Uber has said its self-driving vehicles have together driven more than three million autonomous miles since the start of its first passenger-carrying pilot programme in September 2016, but it still has little more than 200 automated vehicles.

Waymo, the self-driving car company owned by Alphabet (better known as the parent of Google), recently surpassed five million autonomous road miles. However, the industry’s accumulated real-world driving experience for all of the available cars falls far short of 100 million miles, and that was the symbolic milestone many developers of self-driving cars were quietly hoping to reach before any fatal collisions.

‘The fact that this has happened well in advance of 100 million miles does not tell us anything statistically,’ said Bryant Walker Smith, assistant professor at the University of South Carolina’s law school and a legal expert on autonomous vehicles.

‘But it is early, particularly in light of everything that these systems already have going for them,’ he added.

Maybe so, but did anyone expect the early prototypes to be faultless? That nobody would be killed, ever? To think so – however desirable it would be to avoid casualties – would be ridiculous. The issue is reducing fatalities and injuries, not eliminating them. Cars with an extensive array of cameras, sensors, mapping equipment and navigation systems are far more likely to anticipate and avoid danger on the roads. Human instinct, except in the case of the excellent driver, is overrated.

The Institution of Mechanical Engineers claimed two years ago that, by eliminating human error, driverless technology would reduce road accidents by 95%. Even if that estimate is over-optimistic, it would still be a massive improvement on what happens now.

Self-driving vehicles are expected to be statistically safer than those controlled by humans because they obey traffic laws and will not be tired, drunk or distracted.

Is it any real surprise that programmers are struggling to prepare the on-board systems for unpredictable human behaviour around them?

So what happened in this case, given the self-driving technology is supposed to detect pedestrians, cyclists and others, and prevent crashes? Police in Tempe – where the accident occurred – said Uber’s modified Volvo XC90 was travelling at about 65kph and did not slow before hitting 49-year-old Elaine Herzberg as she stepped into the road, pushing a bicycle. Uber’s human driver (who was not in control of the car) told investigators his ‘first alert to the collision was the sound of the collision’.

Some people have rushed to judgment, suggesting that the electronic brain cells of the driverless Uber car in Arizona did not behave in the way humans think they ought to have done. But that ignores the possibility that any car, driven by a human or not, might have hit the woman in the same circumstances. The reaction to the woman’s demise has been extraordinary. Local police are investigating, while two US federal safety regulators, the National Transportation Safety Board and the National Highway Traffic Safety Administration, sent their own investigators to Tempe.

A major consumer lobby group, Consumer Watchdog, called for a national moratorium on autonomous car testing in the wake of the deadly collision.

‘The robot cars cannot accurately predict human behaviour, and the real problem comes in the interaction between humans and the robot vehicles,’ said its spokesman.

Legal experts say the investigators’ lines of inquiry are likely to focus on whether a faulty sensor or other system failure contributed to the accident; whether the car ‘saw’ the pedestrian and how that person behaved; whether the automated driving system should or could have handed control to the human behind the wheel; and what kind of evasive action it took.

Collisions

Almost 60 collisions involving self-driving cars have been reported since 2014 in California, where almost 400 such vehicles have now been given permission to operate.

Such setbacks have allowed officials, such as Linda Bailey, executive director of the National Association of City Transportation Officials (Nacto) in the US, to claim there has not been enough regulatory oversight of testing and that some governments are overwhelmed trying to understand autonomous technology and its limitations. It was too easy a charge to make.

What happened on Sunday evening was not the death-knell for the driverless car – the hysterical reaction will be overcome – but it may have delayed its delivery somewhat.

Of course, there will be those who will never trust these cars, believing that they are always better off in a car being driven by a human. If there are enough of them, the self-driving car may never attract sufficient support.

But who knows how attentive and skilled that driver is, how likely he or she is to be distracted, or whether they are operating under the influence of mood changers such as alcohol or drugs, even if they are not supposed to be.

Every time you get in a car and are not the driver, you are taking a chance, putting your fate in the hands of the person who is driving, known to you or otherwise. You could find an Ant McPartlin driving you.

I’d rather take my chances with artificial intelligence than with an addled human like him… and, believe me, we have enough of them, sober or not.
