The greatest menace to self-driving cars is human drivers

The run-ins highlight an emerging culture clash between people and law-abiding autonomous cars

Toronto Star - Business - Ryan Beene, Bloomberg

As auto accidents go, it wasn't much: 12 minutes before noon on a cool June day, a Chevrolet Bolt was rear-ended as it crawled from a stoplight in downtown San Francisco.

What made this fender bender noteworthy was the Bolt's driver: a computer.

In California, where companies such as Cruise Automation Inc. and Waymo LLC are ramping up testing of self-driving cars, human drivers keep running into them in low-speed fender benders.

The run-ins highlight an emerging culture clash between humans, who often treat traffic laws as guidelines, and autonomous cars that refuse to roll through a stop sign or exceed the speed limit.

"They don't drive like people. They drive like robots," said Mike Ramsey, an analyst at Gartner Inc. who specializes in advanced automotive technologies.

Companies are testing autonomous vehicles from Phoenix to Pittsburgh, and developers are closely watching how they interact with their human-driven counterparts as they prepare for a future in which they will be sharing the road.

What they've found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law, unlike humans.

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy Inc.

"If the cars drive in a way that's really distinct from the way that every other motorist on the road is driving, there will be, in the worst case, accidents and, in the best case, frustration," he said. "What that's going to lead to is a lower likelihood that the public is going to accept the technology."

Sensors embedded in autonomous cars allow them to "see" the world with far more precision than humans, but the cars struggle to translate visual cues on the road into predictions about what might happen next, Iagnemma said. They also struggle to handle new scenarios they haven't encountered before.

California is the only state that specifically requires reports when an autonomous vehicle is involved in an accident. The records show vehicles in autonomous mode have been rear-ended 13 times in the state since the beginning of 2016, out of 31 collisions involving self-driving cars in total, according to the California Department of Motor Vehicles.

The collisions also almost always occur at intersections rather than in free-flowing traffic. A Cruise autonomous vehicle was rear-ended last month, for example, while braking to avoid a vehicle drifting into its lane from the right as traffic advanced from a green light.

Waymo's now-retired Firefly autonomous vehicle prototypes were rear-ended twice at the same intersection in Mountain View, Calif., in separate instances less than a month apart in 2016. In both cases, the Waymos were preparing to make a right-hand turn, stopped to yield to oncoming traffic, and were hit from behind.

Another time, a truck racing to pass a slow-moving self-driving vehicle before a stop sign clipped it as it scooted back to the right.

The state's crash reports don't assign blame and provide only summaries of the incidents, but a few themes emerge. They're almost always low-speed fender benders with no injuries; the Bolt was travelling at less than 2 km/h when it was rear-ended. And while they represent a minuscule share of crashes in the state, autonomous vehicles are also a very small share of the vehicles on the road.

"You put a car on the road, which may be driving by the letter of the law, but compared to the surrounding road users, it's acting very conservatively," Iagnemma said.

"This can lead to situations where the autonomous car is a bit of a fish out of water."

Waymo's autonomous prototypes were rear-ended twice at the same intersection. ERIC RISBERG/THE ASSOCIATED PRESS FILE PHOTO
