Toronto Star

BIPEDS BEWARE

Do autonomous vehicles require more careful pedestrians?

- JEREMY KAHN

You’re crossing the street wrong.

That is essentially the argument some self-driving car boosters have fallen back on in the months after the first pedestrian death attributed to an autonomous vehicle and amid growing concerns that artificial intelligence capable of real-world driving is further away than many predicted just a few years ago.

In a line reminiscent of Steve Jobs’s famous defence of the iPhone 4’s flawed antenna, “Don’t hold it like that,” these technologists say the problem isn’t that self-driving cars don’t work, it’s that people act unpredictably.

“What we tell people is, ‘Please be lawful and please be considerate,’ ” says Andrew Ng, a well-known machine learning researcher who runs a venture fund that invests in AI-enabled companies, including self-driving startup Drive.AI. In other words: no jaywalking.

Whether self-driving cars can correctly identify and avoid pedestrians crossing streets has become a burning issue since March, when an Uber self-driving car killed a woman in Arizona who was walking a bicycle across the street at night outside a designated crosswalk. The incident is still under investigation, but a preliminary report from federal safety regulators said the car’s sensors had detected the woman but its decision-making software discounted the sensor data, concluding it was likely a false positive.

Google’s Waymo has promised to launch a self-driving taxi service, starting in Phoenix later this year, and General Motors has pledged a rival service, using a car without a steering wheel or pedals, some time in 2019. But it’s unclear whether either will be capable of operating outside designated areas or without a safety driver who can take over in an emergency.

Meanwhile, other initiatives are losing steam. Elon Musk has shelved plans for an autonomous Tesla to drive across the U.S. Uber has axed a self-driving truck program to focus on autonomous cars. Daimler Trucks, part of Daimler AG, now says commercial driverless trucks will take at least five years. Others, including Musk, had previously predicted such vehicles would be road-ready by 2020.

With these timelines slipping, driverless proponents like Ng say there’s one surefire shortcut to getting self-driving cars on the streets sooner: persuade pedestrians to behave less erratically. If they use crosswalks, where there are contextual clues such as pavement markings and stoplights, the software is more likely to identify them.

But to others, the very fact that Ng is suggesting such a thing is a sign that today’s technology simply can’t deliver self-driving cars as originally envisioned.

“The AI we would really need hasn’t yet arrived,” says Gary Marcus, a New York University professor of psychology who researches both human and artificial intelligence. He says Ng is “just redefining the goalposts to make the job easier,” and that if the only way to achieve safe self-driving cars is to segregate them completely from human drivers and pedestrians, we already have such technology: trains.

Rodney Brooks, a well-known robotics researcher and an emeritus professor at the Massachusetts Institute of Technology, wrote in a blog post critical of Ng’s sentiments that “the great promise of self-driving cars has been that they will eliminate traffic deaths. Now (Ng) is saying that they will eliminate traffic deaths as long as all humans are trained to change their behaviour? What just happened?”

Ng argues that humans have always modified their behaviour in response to new technology.

“If you look at the emergence of railroads, for the most part people have learned not to stand in front of a train on the tracks,” he says. Ng also notes that drivers have learned that school buses are likely to make frequent stops, and that when they do, small children may dart across the road in front of the bus, so they proceed more cautiously. Self-driving cars, he says, are no different.

In fact, jaywalking became a crime in most of the U.S. only because automobile manufacturers lobbied intensively for it in the early 1920s, in large measure to head off strict speed limits and other regulation that might have affected car sales, according to Peter Norton, a history professor at the University of Virginia who wrote a book on the topic. So there is a precedent for regulating pedestrian behaviour to make way for new technology.

And while Ng may be the most prominent self-driving proponent calling for training humans, as well as vehicles, he’s not alone.

“There should be proper education programs to make people familiar with these vehicles, the ways to interact with them and to use them,” says Shuchisnigdha Deb, a researcher at Mississippi State University’s Center for Advanced Vehicular Systems. The U.S. Department of Transportation has stressed the need for such consumer education in its latest guidance on autonomous vehicles.

Maya Pindeus, the co-founder and chief executive officer of Humanising Autonomy, a London startup working on models of pedestrian behaviour and gestures that self-driving car companies can use, likens such lessons to public awareness campaigns Germany and Austria instituted in the 1960s following a spate of jaywalking fatalities. Such efforts helped reduce pedestrian road fatalities in Germany from more than 6,000 deaths in 1970 to fewer than 500 in 2016, the last year for which figures are available.

The industry is understandably keen not to be seen off-loading the burden onto pedestrians. Uber and Waymo both said in emailed statements that their goal is to develop self-driving cars that can handle the world as it is, without being dependent on changing human behaviour.

One challenge for these and other companies is that driverless cars are such a novelty right now that pedestrians don’t always act the way they do around regular vehicles. Some people just can’t suppress the urge to test the technology’s artificial reflexes. Waymo, which is owned by Alphabet Inc., routinely encounters pedestrians who deliberately try to “prank” its cars, continually stepping in front of them, moving away and then stepping back in front of them, to impede their progress.

The assumption seems to be that driverless cars are designed to be extra cautious, so the practical joke is worth the risk. “Although our systems do have superhuman perception, sometimes people seem to think Newton’s laws no longer apply,” says Paul Newman, the co-founder of Oxbotica, a U.K. startup making autonomous driving software, who recalls the time a pedestrian ran up behind a self-driving car and jumped suddenly in front of it.

Over time, driverless cars will become less fascinating, and people will presumably be less likely to prank them. In the meantime, the industry is debating what steps companies should take to make humans aware of the cars and their intentions.

Drive.AI, which was co-founded by Ng’s wife, Carol Reiley, has made a number of modifications to the self-driving cars it’s road testing in Frisco, Texas.

They’re painted a distinctive day-glo orange, increasing the chance that people will notice them and recognize them as self-driving. Drive.AI also pioneered the use of an external LED display screen, similar to the ones many city buses use to show their destination or route number, that can convey the car’s intentions to humans. For instance, a car stopped at a crosswalk might display the message: “Waiting for you to cross.”

Uber has taken this idea further, filing patents for a system that would include a variety of flashing external signage and holograms projected in front of the car to communicate with human drivers and pedestrians. Google has also filed patents for its own external signage. Oxbotica’s Newman says he likes the idea of such external messaging, as well as distinctive sounds, much like the beeping noise large vehicles make when reversing, to help ensure safe interactions between humans and autonomous vehicles.

Deb says her research shows that people want external features and audible communication or warning sounds of some kind. But so far, apart from Drive.AI’s, the cars these companies are using in road tests don’t include such modifications. It’s also not clear how pedestrians or other human drivers could communicate their intentions to self-driving vehicles.

Pindeus’s company wants those building self-driving cars to focus more on understanding the non-verbal cues and hand gestures people use to communicate. The problem with most of the computer vision systems that self-driving cars use, she says, is that they simply draw a bounding box around an object and apply a label (parked car, bicycle, person) without the ability to analyze anything happening inside that box.

Eventually, better computer vision systems and better AI may solve this problem. Cities will probably remake themselves for an autonomous age with “geofencing,” creating separate zones and designated pickup spots for self-driving cars and taxis. In the meantime, your parents’ advice probably still applies: don’t jaywalk, and look both ways before crossing the street.


Pilot programs for self-driving vehicles in Arizona have revealed one big challenge: persuading pedestrians to behave less erratically. (Ross D. Franklin/The Associated Press)
This video grab made from dashcam footage shows the moment before the fatal collision between an Uber self-driving vehicle and a pedestrian in Arizona. (AFP/Getty Images)
