Waymo submitted a 43-page safety report to the U.S. Transportation Department, the most detailed description yet of its driverless cars’ testing.

The Washington Post - FRONT PAGE - BY MICHAEL LARIS michael.laris@washpost.com

To help keep tabs on the safety of driverless cars rolling around U.S. cities, the federal government last year, and again last month, suggested that tech firms and car companies submit safety checklists.

None of the companies rushed to meet Washington’s wishes.

Now, Waymo, formerly Google’s self-driving car project, has submitted a 43-page safety report to the Transportation Department, offering the most detailed description yet of how it equips and programs vehicles to avoid the range of mundane and outrageous problems that are part of driving in America.

“We’ve staged people jumping out of canvas bags or porta-potties on the side of the road, skateboarders lying on their boards, and thrown stacks of paper in front of our sensors,” according to the report, which was submitted Thursday and describes how company engineers use a 91-acre California test facility mocked up like a city, as well as computer simulations covering hundreds of thousands of variations of possible road scenarios.

The National Highway Traffic Safety Administration (NHTSA) has suggested a set of 28 “behavioral competencies,” or basic things an autonomous vehicle should be able to do. Some are exceedingly basic (“Detect and Respond to Stopped Vehicles,” “Navigate Intersections and Perform Turns”) and others are more intricate (“Respond to Citizens Directing Traffic After a Crash”).

Waymo lists an extra 19 examples of challenges it uses for testing, including that its cars must be able to “detect and respond” to animals, motorcyclists, school buses, slippery roads, unanticipated weather, and faded or missing road signs.
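The report itself contains no code, but a checklist like this is easy to picture as plain data, so that every staged test or simulation can be tagged with the competency it exercises. The sketch below is purely illustrative: the identifiers and the `coverage` helper are invented, and only a few of the 28-plus-19 items are spelled out.

```python
# Hypothetical sketch: a behavioral-competency checklist kept as plain data,
# so each staged test or simulation can be tagged with what it exercises.
NHTSA_COMPETENCIES = {
    "detect_and_respond_to_stopped_vehicles",
    "navigate_intersections_and_perform_turns",
    "respond_to_citizens_directing_traffic_after_a_crash",
    # ...the rest of NHTSA's 28 suggested competencies
}

WAYMO_ADDITIONS = {
    "detect_and_respond_to_animals",
    "detect_and_respond_to_motorcyclists",
    "detect_and_respond_to_school_buses",
    "handle_slippery_roads",
    "handle_unanticipated_weather",
    "handle_faded_or_missing_road_signs",
    # ...the rest of Waymo's 19 additional examples
}

ALL_COMPETENCIES = NHTSA_COMPETENCIES | WAYMO_ADDITIONS

def coverage(tested_competencies: set) -> float:
    """Fraction of the full checklist exercised by a given test suite."""
    return len(tested_competencies & ALL_COMPETENCIES) / len(ALL_COMPETENCIES)
```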

The company says it has used federal data on human crashes to focus its efforts on improving its software-and-sensor drivers. Top problem scenarios for flesh-and-blood drivers include rear-end crashes, turning or crossing at intersections, running off the edge of the road, and changing lanes. So those “figure prominently in the evaluation of our vehicles,” according to the report.

And then numerous permutations are generated from those scenarios. “We can multiply this one tricky left turn to explore thousands of variable scenarios and ‘what ifs?’” the report says. “The scene can be made busier and more complex by adding . . . joggers zigzagging across the street.”
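A rough, purely illustrative sketch of what “multiplying” one scenario might look like: sweep a few parameters of a base left-turn setup and enumerate every combination. The class name, parameters and values below are invented, not taken from Waymo’s tooling.

```python
# Illustrative only: fan one base scenario (a tricky left turn) out into
# many variants by sweeping a handful of parameters.
import itertools
from dataclasses import dataclass

@dataclass(frozen=True)
class LeftTurnVariant:
    oncoming_speed_mph: int   # how fast oncoming traffic approaches
    gap_seconds: float        # the gap the turning car must accept
    jogger_count: int         # joggers zigzagging across the street
    visibility_m: int         # sensor visibility (weather, glare, dusk)

def generate_variants():
    speeds = [25, 35, 45, 55]
    gaps = [2.0, 3.0, 4.5, 6.0]
    joggers = range(0, 5)
    visibility = [40, 80, 150]
    for s, g, j, v in itertools.product(speeds, gaps, joggers, visibility):
        yield LeftTurnVariant(s, g, j, v)

variants = list(generate_variants())
print(f"{len(variants)} variations of one tricky left turn")  # 240 combinations here
```

Sweeping even a few more dimensions (pedestrian timing, parked cars blocking sight lines, time of day) pushes the count from hundreds into the thousands of variations the report describes.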

NHTSA said in a statement Thursday that Waymo is “the first company to make a voluntary safety self-assessment public.” While such reports are now voluntary, the House and Senate each passed bills that would require companies to submit safety assessments in the coming years.

Some road safety advocates argue that driverless cars should be required to pass specific safety tests before being put on the roads, just like human drivers. And they say the federal government has taken a dangerously laissez-faire approach to the burgeoning industry.

But with tens of thousands of people killed each year on U.S. roads, driverless-vehicle firms promise big improvements overall. Waymo executives say their safety report is part of an effort to be more transparent about their experiences, which they hope will be good for public understanding — and business.

“This overview of our safety program reflects the important lessons learned through the 3.5 million miles Waymo’s vehicles have self-driven on public roads, and billions of miles of simulated driving, over the last eight years,” Waymo chief executive John Krafcik wrote in a letter Thursday to Transportation Secretary Elaine Chao.

The report offered a view into how Waymo’s software breaks down the 360 degrees of data constantly pouring in from radar, laser sensors, high-definition cameras, GPS and an audio detection system the company says can hear sirens hundreds of feet away.

First is perception, which is where the vehicle classifies objects and stitches them into a “cohesive real-time view of the world,” the company said. That means distinguishing between cars and people, and between bicycles and motorcycles.

Next is modeling and predicting the behavior of the objects it encounters. So, for example, the software knows that walkers move more slowly than bikers of either variety, but also that pedestrians can change direction abruptly.

Then the pieces come together in what the company calls its “planner,” which figures out where the car actually will go and is imbued with a “defensive driving” sensibility. It keeps the car out of the blind spots of nearby human drivers, gives cyclists extra room and games out what is coming several steps ahead of time.
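Read as a toy sketch, that three-stage flow (perceive, predict, plan) might be structured like the code below. Every class name, threshold and return value here is invented for illustration; the report describes the stages only in prose.

```python
# Toy illustration of the perceive -> predict -> plan structure described
# in the report. Classes, thresholds and return values are invented.
from dataclasses import dataclass
from typing import List

@dataclass
class TrackedObject:
    kind: str          # "pedestrian", "cyclist", "motorcycle", "car"
    lateral_m: float   # sideways offset from the vehicle's path, in meters
    speed_mps: float

def predict(obj: TrackedObject) -> dict:
    """Model likely behavior: walkers are slow but can turn abruptly."""
    if obj.kind == "pedestrian":
        return {"max_speed_mps": 3.0, "may_change_direction_abruptly": True}
    if obj.kind in ("cyclist", "motorcycle"):
        return {"max_speed_mps": 15.0, "may_change_direction_abruptly": False}
    return {"max_speed_mps": 40.0, "may_change_direction_abruptly": False}

def plan(world: List[TrackedObject]) -> str:
    """A 'defensive driving' planner: give cyclists extra room."""
    for obj in world:
        if obj.kind == "cyclist" and abs(obj.lateral_m) < 1.5:
            return "nudge_away_for_extra_clearance"
    return "continue_in_lane"

# Example: a perception stage (not shown) has already fused radar, lidar,
# camera and audio data into tracked objects.
world = [TrackedObject("cyclist", 1.0, 5.0),
         TrackedObject("pedestrian", -4.0, 1.2)]
print(plan(world))  # -> "nudge_away_for_extra_clearance"
```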

But cars, like humans, cannot think of everything. How well they manage that reality — and deal with the unexpected — will help determine how good they really are.

“You can’t expect to program the car for everything you’re possibly going to see,” said Ron Medford, Waymo’s safety director and a former senior NHTSA official. Extensive driving experiments feed simulations that essentially provide the car with experience, which helps greatly, and what it learns is passed on to the entire fleet.

And if it really doesn’t know what to do, it can pull over safely, he said.

Waymo chief executive John Krafcik displays a Chrysler Pacifica hybrid with Waymo’s suite of sensors in Detroit in January. (Paul Sancya/Associated Press)
