Waterloo Region Record

How driverless cars see the world around them

- CADE METZ

On Sunday night, a woman died after she was hit by a self-driving car operated by Uber in Tempe, Ariz. The car was operating autonomously, although a safety driver was behind the wheel, according to a statement from the local police.

Uber is one of many companies testing this kind of vehicle in Arizona, California and other parts of the country. Waymo, the self-driving car company owned by Google’s parent company, Alphabet, has said it is operating autonomous cars on the outskirts of Phoenix without a safety driver behind the wheel. On Monday, Uber said it was halting tests in Tempe, Pittsburgh, Toronto and San Francisco.

Here is a brief guide to the way these cars operate.

How do these cars know where they are?

When designing these vehicles, companies like Uber and Waymo begin by building a three-dimensional map of a place. They equip ordinary automobiles with lidar sensors — “light detection and ranging” devices that measure distances using pulses of light — and as company workers drive these cars on local roads, these expensive devices collect the information needed to build the map.

Once the map is complete, cars can use it to navigate the roads on their own. As they do, they continue to track their surroundings using lidar, and they compare what they see with what the map shows. In this way, the car gains a good idea of where it is in the world. Lidar also alerts the cars to nearby objects, including other cars, pedestrians and bicyclists.
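
To make that comparison step concrete, here is a toy sketch in Python. The map landmarks, scan points and grid search are all invented for illustration; real scan-matching systems are far more sophisticated.

    # Toy map-based localization: try candidate positions and keep the
    # one where the current lidar scan best matches a prebuilt map of
    # landmark points. A sketch of the idea only, not a real algorithm.
    import math

    # Prebuilt "map": landmark positions in world coordinates (hypothetical).
    MAP_LANDMARKS = [(5.0, 0.0), (10.0, 3.0), (12.0, -2.0)]

    # Current lidar scan: landmarks seen relative to the car (hypothetical).
    SCAN = [(2.0, 0.0), (7.0, 3.0), (9.0, -2.0)]

    def match_error(car_x, car_y):
        """Sum of distances from each scan point, shifted into world
        coordinates, to its nearest landmark on the map."""
        total = 0.0
        for sx, sy in SCAN:
            wx, wy = car_x + sx, car_y + sy
            total += min(math.hypot(wx - mx, wy - my)
                         for mx, my in MAP_LANDMARKS)
        return total

    # Grid-search candidate positions and keep the best match.
    best = min(
        ((x * 0.5, y * 0.5) for x in range(0, 20) for y in range(-10, 10)),
        key=lambda p: match_error(p[0], p[1]),
    )
    print("Estimated car position:", best)  # Expect roughly (3.0, 0.0)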

Is that the only important technology?

Lidar works pretty well, but it can’t do everything. It provides information only about objects that are close, which limits how fast cars can drive. Its measurements are not always sharp enough to distinguish one object from another. And when multiple autonomous vehicles drive the same road, their lidar signals can interfere with one another.

Even in situations where lidar works well, these companies want backup systems in place. So most driverless cars are also equipped with other sensors.

Like what?

Cameras, radar and global positioning system antennas, the kind of GPS hardware that tells your smartphone where it is.

With the GPS antennas, companies like Uber and Waymo are providing cars with even more information about where they are. With cameras and radar sensors, they can gather information about nearby pedestrians, bicyclists, cars and other objects. Cameras also provide a way to recognize traffic lights, street signs, road markings and other signals.
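
Here is a minimal sketch of how two noisy position estimates might be combined. The readings are hypothetical, and production systems typically rely on Kalman filters rather than the simple inverse-variance weighting shown below, which is the most basic version of the idea.

    # Minimal sensor fusion: combine two noisy position estimates,
    # say GPS and lidar localization, weighted by how much each is trusted.

    def fuse(estimate_a, var_a, estimate_b, var_b):
        """Inverse-variance weighted average of two 1-D position estimates."""
        w_a = 1.0 / var_a
        w_b = 1.0 / var_b
        fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
        fused_var = 1.0 / (w_a + w_b)
        return fused, fused_var

    # Hypothetical readings: GPS says 101.2 m along the road (noisy),
    # lidar localization says 100.4 m (more precise).
    position, uncertainty = fuse(101.2, 4.0, 100.4, 0.25)
    print(f"Fused position: {position:.2f} m (variance {uncertainty:.2f})")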

How do the cars use all that informatio­n?

That is the hard part. Sifting through all that data and responding to it require a system of immense complexity.

In some cases, engineers will write specific rules that define how a car should respond in a particular situation. A Waymo car, for example, is programmed to stop if it detects a red light.
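
A hand-written rule of that sort might look, in deliberately simplified Python, something like this. The light states and speeds are invented for illustration; real control software is vastly more involved.

    # A simple sketch of an explicitly programmed driving rule,
    # along the lines of "stop if the light is red."

    def respond_to_traffic_light(light_state, current_speed):
        """Return a target speed for a detected traffic light (toy states)."""
        if light_state == "red":
            return 0.0                       # full stop
        if light_state == "yellow":
            return min(current_speed, 15.0)  # slow down
        return current_speed                 # green: carry on

    print(respond_to_traffic_light("red", 40.0))     # 0.0
    print(respond_to_traffic_light("yellow", 40.0))  # 15.0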

But a team of engineers could never write rules for every situation a car could encounter. So companies like Waymo and Uber are beginning to rely on “machine learning” systems that can learn behaviour by analyzing vast amounts of data.

Waymo now uses a system that learns to identify pedestrians by analyzing thousands of photos that contain people walking or running across or near roads.
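
As a toy stand-in for that idea, the sketch below labels a detection by its nearest labelled example. The feature vectors are made up; actual pedestrian detectors are deep neural networks trained on enormous sets of camera images, not anything this small.

    # Toy "learning from examples": classify a detection by its nearest
    # labelled training example. Stands in for the idea that behaviour is
    # learned from data rather than hand-coded.
    import math

    # Hypothetical training data: (height/width ratio, motion score) -> label.
    TRAINING = [
        ((2.8, 0.9), "pedestrian"),
        ((3.1, 0.7), "pedestrian"),
        ((0.6, 0.1), "car"),
        ((0.5, 0.4), "car"),
    ]

    def classify(features):
        """Label a detection by its nearest labelled training example."""
        def distance(example):
            (f1, f2), _ = example
            return math.hypot(features[0] - f1, features[1] - f2)
        return min(TRAINING, key=distance)[1]

    print(classify((2.9, 0.8)))  # "pedestrian"
    print(classify((0.4, 0.2)))  # "car"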

Is that the kind of thing that broke down in Tempe?

It is unclear what happened in Tempe. But these cars are designed so that if one system fails, another will kick in.
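
A bare-bones sketch of that kind of fallback, with hypothetical sensor names and readings, might look like this. Real safety architectures are far more elaborate.

    # Redundancy sketch: query sensors in order of preference and fall
    # back when one returns nothing.

    def detect_obstacle(lidar_reading, radar_reading, camera_reading):
        """Return the first available detection, preferring lidar."""
        for source, reading in (("lidar", lidar_reading),
                                ("radar", radar_reading),
                                ("camera", camera_reading)):
            if reading is not None:
                return source, reading
        return None, None  # no sensor produced a detection

    # Lidar has failed (None); radar still reports an object 12 m ahead.
    print(detect_obstacle(None, 12.0, 14.5))  # ('radar', 12.0)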

Self-driving cars can have difficulty duplicating the subtle, nonverbal communication that goes on between pedestrians and drivers. An autonomous vehicle, after all, can’t make eye contact with someone at a crosswalk.

“It is still important to realize how hard these problems are,” said Ken Goldberg, a professor at the University of California, Berkeley, who specializes in robotics. “That is the thing that many don’t understand, just because these are things humans do so effortlessly.”
