Safety lesson from Autopilot tragedy

The human driver must remain central until systems are deemed good enough to need no human input

Pretoria News Weekend – OPINION – EDWARD NIEDERMEYER

ALMOST as soon as news broke of a fatal crash involving Tesla's Autopilot last year, fans and detractors of the electric-car manufacturer were clear on the tragedy's causes.

Tesla's supporters and investors never doubted that the system improved safety, so the driver must have failed to heed Tesla's warnings and remain attentive.

Detractors and short investors are all but certain that Autopilot somehow failed to protect the car's driver.

After more than a year of debate, a conclusive answer is finally at hand, courtesy of a National Transportation Safety Board (NTSB) investigation whose final results were presented this week.

But the board's findings aren't likely to leave either side happy: rather than blame man or machine alone, it seems both human drivers and the Autopilot system – specifically the complex relationship between the two – contributed to the deadly event.

At the heart of the matter is a dangerous dynamic: with billions at stake in the frantic race to develop self-driving car technology, there are huge incentives for carmakers to create the impression that vehicles for sale today are "autonomous".

But as the NTSB made clear, no vehicle now on the market is capable of safe autonomous driving. When consumers take hi-tech hype at face value, a lethal gap between perception and reality can open.

Tesla reaped months of laudatory coverage and billions in market cap by presenting its Autopilot system as more autonomous than any other advanced driver-assist system, even as it warned owners that they must remain attentive and in control at all times. Though Autopilot did offer better performance than other advanced driver-assistance systems, the key to its success was the lack of limitations Tesla put on its use.

Because Autopilot allows owners to drive hands-free anywhere, even on roads where Tesla has warned that such use would not be safe, the company has been able to profit off the perception that its system was more autonomous than others.

But Autopilot was designed for use on well-marked, protected highways with no chance of cross-traffic. So when the tractor-trailer turned across Florida's Highway 27 last May and the Tesla slammed directly into it without triggering any safety systems, Autopilot was working exactly as designed.

The problem was that it was being used on a road with conditions it wasn't designed to cope with, and the driver had apparently been lulled into complacency. Far from failing, Autopilot was actually so good that it led the driver to believe it was more capable than it really was.

This complex failure, which both man and machine contributed to, sounds an important warning about autonomous-drive technology: until the systems are so good they need no human input, the human driver must remain at the centre of "semi-autonomous" drive system design.

Engineers must assume that if there's a way for people to misuse these systems, they will. Just as important, companies need to understand that if they over-promote a semi-autonomous drive system's capabilities in hopes of pulling ahead in the race to autonomy, they run the risk of making the technology less safe than an unassisted human driver.

There's a lesson to be learnt here from aviation. As computers and sensors improved in the 1980s, aircraft manufacturers began to automate more and more of the controls simply because they could. Only later did the industry realise that adding automation for the sake of automation made aircraft less safe, so they re-oriented autopilot development around the principle of "human-centric" automation. Only when automation is deployed in ways that are designed to improve pilot performance does safety actually improve.

If anything, this dynamic will be more pronounced with cars, which are used in much higher numbers than planes by people with much less training. But unlike aircraft companies, which join forces to improve safety across the industry, automakers and tech start-ups are in intense competition for the real or perceived lead in the race to autonomy. As long as consumers care more about the futuristic cool factor of hands-free operation than about using technology to become safer drivers, the potential for a dangerous gap between the perception and reality of autonomous-drive technology remains. And what a shame it would be if this technology, which has the potential to someday save tens of thousands of lives every year, made cars less safe in the short term.

• Niedermeyer, an auto-industry analyst, is the co-founder of Daily Kanban and former editor of the blog The Truth About Cars.

For more columns from Bloomberg View, visit view.

Both man and machine were to blame for this fatal crash, according to findings by the National Transportation Safety Board.
