LIDAR, the key to autonomous vehicles
For cars to drive themselves, they need all the senses that human drivers have
Autonomous cars are coming. Of that there is no doubt. But there are still many challenges to overcome for it to happen.
Developing a car that can drive itself anywhere, at any time, in any traffic, road and weather conditions, with no more direction than its ultimate destination, is no simple task.
This is proving to be far more difficult, for example, than the development of automatic pilots for aircraft, which have far fewer variables to deal with. Fixed variables such as roads and traffic laws and random variables such as bicyclists and pedestrians — just to name a few — add to the puzzle.
Much of the technology needed to make autonomous vehicles a reality already exists. The vehicle-control part — making them accelerate, brake and steer as instructed — is well in hand. It’s the knowing-what-to-instruct part that’s still to be fully sorted.
In essence, the vehicle’s computer system not only has to have the intelligence of a human driver in terms of decision making, it has to have equivalent sensory inputs to know where it is, where the road is, what the surrounding environment is and what’s going on around it.
Some of the sensors for doing so are already familiar components in many modern cars. Things such as ultrasonic and infrared sensors, sonar, radar and cameras. Lots and lots of cameras!
The use of backup cameras is now common — they’ll be standard on all models in 2018. And they’re at the heart of more sophisticated driver aids such as Subaru’s EyeSight and Nissan’s Surround View systems, and others like them.
They’re also integral to the autonomous driving technologies being developed by most automakers.
Cameras have their limitations, however. If you have ever tried to rely on one after driving in heavy snow conditions, or just on wet salt-and-slush-covered roads, you’ll understand. If the lens is covered with crud, the image it transmits is unintelligible.
There are solutions for keeping the lenses clear, of course, but they tend to be either complex or expensive, or both.
In addition, even with clear lenses, cameras can for the most part only see what you see. It’s one thing to apply all your experience, and perhaps local road knowledge, to judging where the road’s edge is when it’s totally covered with snow. But how comfortable would you be letting your car make that decision based on a monochromatic camera image alone?
Gill Pratt, CEO of the Toyota Research Institute, summarized the challenge succinctly a year ago: “Most of what has been collectively accomplished (to that time) has been relatively easy, because most driving is easy,” he said. “Where we need autonomy to help us is when the driving is difficult.”
That’s why autonomous drive systems must incorporate multiple sensory inputs to provide at least a couple layers of redundancy. They must be certain of what they’re “seeing” before committing your life to a decision based on that input.
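In principle, that redundancy can be as simple as requiring agreement among independent sensors before acting. Here is a minimal sketch of that idea; the function and sensor names are illustrative, not taken from any automaker’s actual system:

```python
def confirmed_by_redundancy(detections, required=2):
    """Return True only if enough independent sensors agree.

    `detections` maps a sensor name to whether that sensor currently
    reports an obstacle ahead. Requiring two or more agreeing inputs
    means no single fouled lens or noisy return triggers a decision.
    """
    agreeing = sum(1 for seen in detections.values() if seen)
    return agreeing >= required

# Camera and radar both see the obstacle; a crud-covered LIDAR does not.
confirmed_by_redundancy({"camera": True, "radar": True, "lidar": False})
```

Real systems fuse probabilistic estimates rather than booleans, but the principle — never trust one sensor alone — is the same.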
That’s where LIDAR comes in. It’s now being incorporated in the autonomous drive systems under development by several major automakers and other players, including Apple, Google and Uber.
That’s what’s housed beneath the “bubblegum machine” topper common on many such experimental models, reminiscent of those on police cars of the past.
LIDAR stands for “Light Imaging, Detection, And Ranging.” Or maybe it’s just a contraction of light and radar. The origins of the name are not entirely clear.
Whatever the name’s origin, it works much like radar except that it uses laser light rather than radio waves.
By continually rotating, it’s able to create a 360-degree, three-dimensional map of its surroundings that is far more accurate than what a camera can see.
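The map-building step boils down to geometry: each laser pulse returns an angle pair and a distance, which convert to a 3-D point by the standard spherical-to-Cartesian formulas. A minimal sketch, with illustrative names only:

```python
import math

def lidar_to_points(returns, sensor_height=0.0):
    """Convert raw LIDAR returns into 3-D Cartesian points.

    Each return is a (horizontal_angle_deg, vertical_angle_deg, range_m)
    tuple: the direction the pulse was fired and the distance measured
    from its time of flight. Accumulating these as the unit spins yields
    the 360-degree point cloud described above.
    """
    points = []
    for h_deg, v_deg, r in returns:
        h, v = math.radians(h_deg), math.radians(v_deg)
        # Spherical-to-Cartesian conversion.
        x = r * math.cos(v) * math.cos(h)
        y = r * math.cos(v) * math.sin(h)
        z = r * math.sin(v) + sensor_height
        points.append((x, y, z))
    return points

# A single level pulse straight ahead at 10 m lands at roughly (10, 0, 0).
lidar_to_points([(0.0, 0.0, 10.0)])
```

Production units fire many beams at once and apply per-laser calibration, but the core conversion is this simple.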
One of the negatives of LIDAR is that, while those spinning car-top devices are effective, they’re ugly. In addition, they substantially upset vehicle aerodynamics, which has a negative effect on fuel economy.
It’s hard to imagine that customers, let alone the automakers themselves, would readily accept their appearance on every car.
Which is why much effort is being made to reduce their size and perhaps install them in multiple locations around the vehicle, rather than just on the roof. Ford, for example, has demonstrated a LIDAR sensor, developed by supplier company Velodyne, that is about twice the size of a hockey puck.
Perhaps convinced that LIDAR will be a key component of autonomous cars, Ford has also invested $75 million (U.S.) in Velodyne.
Closer to home, Magna has partnered with Israeli company Innoviz Technologies Ltd. “to deliver LIDAR remote sensing solutions for the implementation of autonomous driving features and full autonomy in future vehicles.”
So add one more acronym to your vocabulary. Chances are you’ll be hearing it a lot in the near future.