Toronto Star

LIDAR, the key to autonomous vehicles

For cars to drive themselves, they need all the senses that human drivers have


Autonomous cars are coming. Of that there is no doubt. But there are still many challenges to overcome for it to happen.

It’s not a simple task developing a car that can drive itself anywhere, any time, in any traffic, road and weather conditions, with no more direction given than its ultimate destination.

This is proving to be far more difficult, for example, than the development of automatic pilots for aircraft, which have far fewer variables to deal with. Fixed variables such as roads and traffic laws and random variables such as bicyclists and pedestrians — just to name a few — add to the puzzle.

Much of the technology needed to make autonomous vehicles a reality already exists. The vehicle-control part — making them accelerate, brake and steer as instructed — is well in hand. It’s the knowing-what-to-instruct part that’s still to be fully sorted.

In essence, the vehicle’s computer system not only has to have the intelligence of a human driver in terms of decision-making, it has to have equivalent sensory inputs to know where it is, where the road is, what the surrounding environment is and what’s going on around it.

Some of the sensors for doing so are already familiar components in many modern cars: ultrasonic and infrared sensors, sonar, radar and cameras. Lots and lots of cameras!

The use of backup cameras is now common — they’ll be standard on all models in 2018. And they’re at the heart of more sophisticated driver aids such as Subaru’s EyeSight and Nissan’s Surround View systems, and others like them.

They’re also integral to the autonomous driving technologies being developed by most automakers.

Cameras have their limitations, however. If you have ever tried to rely on one after driving in heavy snow conditions, or just on wet salt-and-slush-covered roads, you’ll understand. If the lens is covered with crud, the image it transmits is unintelligible.

There are solutions for keeping the lenses clear, of course, but they tend to be either complex or expensive, or both.

In addition, even with clear lenses, cameras can for the most part only see what you see. It’s one thing to apply all your experience and perhaps local road knowledge to judging where the road’s edge is when it’s totally covered with snow. But how comfortable would you be letting your car make that decision based on nothing more than a monochrome camera image?

Gill Pratt, CEO of the Toyota Research Institute, summarized the challenge succinctly a year ago: “Most of what has been collectively accomplished (to that time) has been relatively easy, because most driving is easy,” he said. “Where we need autonomy to help us is when the driving is difficult.”

That’s why autonomous drive systems must incorporate multiple sensory inputs to provide at least a couple of layers of redundancy. They must be certain of what they’re “seeing” before committing your life to a decision based on that input.
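In software terms, the simplest form of that redundancy is a quorum: act on a detection only when at least two independent sensors agree. The sketch below is a toy illustration of the idea — the sensor names, readings and `obstacle_confirmed` function are hypothetical, not any automaker’s actual system.

```python
# Toy sketch of sensor redundancy: require agreement from at least
# `quorum` independent sensors before treating a detection as real.
# All names and readings here are made up for illustration.

def obstacle_confirmed(detections: dict, quorum: int = 2) -> bool:
    """Return True only if at least `quorum` sensors report an obstacle."""
    return sum(detections.values()) >= quorum

# The camera lens is blinded by slush, but radar and LIDAR still agree.
readings = {"camera": False, "radar": True, "lidar": True}
print(obstacle_confirmed(readings))  # True: two sensors corroborate
```

Real systems fuse probabilities and full sensor models rather than booleans, but the principle is the same: no single blinded sensor should get the final word.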

That’s where LIDAR comes in. It’s now being incorporated in the autonomous drive systems under development by several major automakers and other players, including Apple, Google and Uber.

That’s what’s housed beneath the “bubblegum machine” topper common on many such experimental models, reminiscent of those on police cars of the past.

LIDAR stands for “light detection and ranging” — though it’s sometimes expanded as “light imaging, detection, and ranging,” or treated as simply a contraction of light and radar. The origins of the name are not entirely clear.

Whatever the case, it works much like radar, except that it uses laser light rather than radio waves.

By continually rotating, it’s able to create a 360-degree, three-dimensional map of its surroundings that is far more accurate than what a camera can see.
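Each laser pulse comes back as an angle and a distance; the map is built by converting those returns into points in space as the unit spins. A minimal sketch of that conversion, using standard spherical-to-Cartesian trigonometry with made-up sample values:

```python
# Minimal sketch: turn one spinning-LIDAR return (range + angles)
# into a Cartesian point. Sample ranges and angles are illustrative.
import math

def to_point(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one laser return to (x, y, z) coordinates in metres."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One full rotation sweeps the azimuth through 360 degrees; every
# return along the way becomes one point in the 3-D cloud.
cloud = [to_point(10.0, az, 0.0) for az in range(0, 360, 90)]
```

Production units fire dozens of laser beams at once and do this millions of times per second, which is what turns a single spinning sensor into a dense 3-D picture of the road.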

One of the negatives of LIDAR is that, while those spinning car-top devices are effective, they’re ugly. In addition, they substantially upset vehicle aerodynamics, which has a negative effect on fuel economy.

It’s hard to imagine that customers, let alone the automakers themselves, would readily accept their appearance on every car.

Which is why much effort is being made to reduce their size and perhaps install them in multiple locations around the vehicle, rather than just on the roof. Ford, for example, has demonstrated a LIDAR sensor, developed by supplier Velodyne, that is about twice the size of a hockey puck.

Perhaps convinced that LIDAR will be a key component of autonomous cars, Ford has also invested $75 million (U.S.) in Velodyne.

Closer to home, Magna has partnered with Israeli company Innoviz Technologies Ltd. “to deliver LIDAR remote sensing solutions for the implementation of autonomous driving features and full autonomy in future vehicles.”

So add one more acronym to your vocabulary. Chances are you’ll be hearing it a lot in the near future.

TORONTO STAR FILE PHOTO: Getting cars to “drive” themselves is much harder than developing an automatic pilot for aircraft.
