Gulf News

Tesla crash shows man and machine must cooperate

The potential for a dangerous gap between perception and reality of autonomous-drive technology remains

Almost as soon as news broke of a fatal crash involving Tesla’s Autopilot last year, fans and detractors of the electric-car manufacturer have been clear on the tragedy’s causes. Tesla’s supporters and investors never doubted that the system improves safety, so the driver must have failed to heed Tesla’s warnings and remain attentive. Detractors and short investors are all but certain that Autopilot somehow failed to protect the car’s driver, allowing him to drive directly into a semi at 74mph (119km/h).

After more than a year of debate, a conclusive answer is finally at hand, courtesy of a National Transportation Safety Board (NTSB) investigation in the United States, whose final results were presented last week. But the board’s findings aren’t likely to leave either side happy: Rather than blaming man or machine alone, it seems that both the human driver and the Autopilot system — specifically the complex relationship between the two — contributed to the deadly event.

At the heart of the matter is a dangerous dynamic: With billions at stake in the frantic race to develop self-driving car technology, there are huge incentives for carmakers to create the impression that vehicles for sale today are “autonomous”. But as the NTSB made clear, no vehicle now on the market is capable of safe autonomous driving. When consumers take high-tech hype at face value, a lethal gap between perception and reality can open.

Tesla reaped months of laudatory coverage and billions worth of market cap by presenting its Autopilot system as being more autonomous than any other advanced driver-assist system, even as it warned owners that they must remain attentive and in control at all times. Though Autopilot did offer better performance than other advanced driver assistance systems, the key to its success was the lack of limitations Tesla put on its use. Because Autopilot allows owners to drive hands-free anywhere, even on roads where Tesla has warned that such use would not be safe, the company has been able to profit off the perception that its system was more autonomous than others.

But Autopilot was actually designed for use on well-marked, protected highways with no chance of cross-traffic. So when the tractor-trailer turned across Florida’s Highway 27 last May and the Tesla slammed directly into it without triggering any safety systems, Autopilot was working exactly as designed. The problem was that it was being used on a road with conditions it wasn’t designed to cope with, and the driver had apparently been lulled into complacency. Far from failing, Autopilot was actually so good that it led the driver to believe it was more capable than it really was.

Learning from aviation

This complex failure, which both man and machine contributed to, sounds an important warning about autonomous-drive technology: Until the systems are so good they need no human input, the human driver must remain at the centre of “semi-autonomous” drive system design. Engineers must assume that if there’s a way for people to misuse these systems, they will. Just as important, companies need to understand that if they over-promote a semi-autonomous drive system’s capabilities in hopes of pulling ahead in the race to autonomy, they run the risk of making the technology less safe than an unassisted human driver.

There’s a lesson to be learnt here from aviation. As computers and sensors improved in the 1980s, aircraft manufacturers began to automate more and more of the controls simply because they could. Only later did the industry realise that adding automation for the sake of automation actually made aircraft less safe, so they re-oriented autopilot development around the principle of “human-centric” automation. Only when automation is deployed in ways that are designed to improve pilot performance does safety actually improve.

If anything, this dynamic will be more pronounced with automobiles, which are used in much higher numbers than planes by people with much less training. But unlike aircraft companies, which join forces to improve safety across the industry, automakers and tech start-ups are in intense competition for the real or perceived lead in the race to autonomy.

As long as consumers care more about the futuristic cool factor of hands-free operation than using technology to become safer drivers, the potential for a dangerous gap between the perception and reality of autonomous-drive technology remains. And what a shame it would be if this technology, which has the potential to someday save tens of thousands of lives every year, actually made cars less safe in the short term.

Edward Niedermeyer, an auto-industry analyst, is the co-founder of Daily Kanban and the former editor of the blog The Truth About Cars.
