Self-driving cars will need people

Manteca Bulletin - On The Road - Lafayette College

Self-driving cars are expected to revolutionize the automobile industry. Rapid advances have led to working prototypes faster than most people expected. The anticipated benefits of this emerging technology include safer, faster and more eco-friendly transportation.

Until now, the public dialogue about self-driving cars has centered mostly on technology. The public's been led to believe that engineers will soon remove humans from driving. But researchers in the field of human factors, experts on how people interact with machines, have shown that we shouldn't ignore the human element of automated driving.

High expectations for removing human drivers

Automation is the technical term for when a machine, here a complex array of sensors and computers, takes over a task that was formerly accomplished by a human being. Many people assume that automation can replace the person altogether. For example, Google, a leader in the self-driving car quest, has removed steering wheels from prototype cars. Mercedes-Benz promotional materials show self-driving vehicles with rear-facing front seats. The hype on self-driving cars implies that the driver will be unneeded and free to ignore the road.

The public also has begun to embrace this notion. Studies show that people want to engage in activities such as reading, watching movies, or napping in self-driving cars, and also that automation encourages these distractions. A study in France even indicated that riding while intoxicated was a perceived benefit.

Automation still requires people

Unfortunately, these expectations will be difficult to fulfill. Handing control of a process to a computer rarely eliminates the need for human involvement. The reliability of automated systems is imperfect. Tech innovators know from experience that automation will fail at least some of the time. Anticipating inevitable automation glitches, Google recently patented a system in which the computers in "stuck" self-driving cars will contact a remote assistance center for human help.

Yet the perception that self-driving cars will perform flawlessly has a strong foothold in the public consciousness already. One commentator recently predicted the end of automotive deaths. Another calculated the economic windfall of "free time" during the commute. Self-driving technologies will undoubtedly be engineered with high reliability in mind, but will it be high enough to cut the human out of the loop entirely?

A recent example was widely reported in the media as an indicator of the readiness of self-driving technology: a Delphi-engineered self-driving vehicle completed a cross-country trip, with the technology driving 99% of the way without any problems. This sounds impressive. Still, the human engineers watching at the wheel during the journey took emergency control of the vehicle in a handful of instances, such as when a police car was present on the shoulder or a construction zone was painted with unusual line markings.

These scenarios are infrequent, but they're not especially unusual for a long road trip. In large-scale deployment, however, a low individual automation failure rate multiplied by hundreds of millions of vehicles on US highways will result in a nontrivial number of problems. Further, today's most advanced prototypes are supported by teams of engineers dedicated to keeping a single vehicle safely on the road. Individual high-tech pit crews won't be possible for every self-driving car on the road of the future.

People need to be able to take control

How will flaws in automation technology be addressed? Despite Google's remote assistance center patent, the best option remains intervention by the human driver. But engineering human interactions with self-driving cars will be a significant challenge.

We can draw insights from aviation, as many elements of piloting planes already have been taken over by computers. Automation works well for routine, repetitive tasks, especially when the consequences of automation mistakes are minor (think automatic sewing machines or dishwashers). The stakes are higher when automation failures can cause harm. People may rely too much on imperfect automation or become out of practice, unable to perform tasks the old-fashioned way when needed.

Several recent plane accidents have been attributed to failures in the ways pilots interact with automation, such as pilots responding inappropriately to automation failures in otherwise correctable situations. A term, "automation surprises," has even been coined to describe moments when pilots lose track of what the automation is doing.
