The Asian Age

New driverless car tech can ‘gaze’ at the world like humans

AEye’s new technology can help autonomous cars prioritise obstacles the way human eyes do, potentially making them faster and more efficient


You have probably heard that the human eye is the best image-capturing technology in the world. Camera manufacturers are therefore always trying to tweak their products to match the way our eyes behave in real-world scenarios. The best of these cameras are now joining the efforts of technology and automobile firms to make autonomous mobility a reality. However, autonomous cars are more than just a few sensors and a computer mounted on top of a car.

The concept behind driverless cars is fairly simple: the computers need to ‘see’ (analyse) the road and use clever algorithms to drive along it in a way that is considered safe by human standards. Until now, nearly every autonomous vehicle you have seen has been equipped with lidar sensors to get a view of the road ahead. However, lidar quality varies; some expensive units can scan the surroundings across 320 degrees, whereas affordable solid-state lidar units are only suitable for slow urban driving.

However, cheaper technology can be put to good use through clever implementation. An athlete and an ordinary person are gifted with similar eyes, but the two use them in completely different ways: the former uses them in fractions of a second to clear hurdles, whereas the latter may use them at leisure to look around for an alternate route. AEye, an AI startup, is following the athlete’s way, using affordable sensors while letting the vehicle’s computer focus on the road ahead.

While driving, the human mind focuses on the road ahead while staying alert to dangers looming in from the surroundings. AEye’s system follows a similar approach: its algorithms are designed so that the computer focuses its ‘laser vision’ on a particular angle of the road while staying aware of the surroundings for pedestrians and roadside obstacles. The company says its AI-enabled system can see as far as 300 metres with an angular resolution as fine as 0.1 degrees. The system can scan certain areas of its field of vision in high resolution while the ‘not-so-important’ bits are scanned in lower resolution, thereby using the hardware effectively. “You can trade resolution, scene revisit rate, and range at any point in time,” AEye’s Luis Dussan said in an interview with MIT Technology Review. “The same sensor can adapt.” The system can also add colour to raw lidar images, which helps it understand key human visual signals such as brake lights and indicators.
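For readers curious about what such ‘foveated’ scanning might look like in practice, here is a minimal sketch in Python. It assumes a hypothetical sensor that accepts a list of azimuth angles to sample; the function and parameter names are illustrative and are not AEye’s actual interface.

```python
# Illustrative sketch of 'foveated' lidar scanning, loosely inspired by
# the approach described above. Hypothetical API, not AEye's real one.

def build_scan_pattern(fov_deg=70.0,
                       roi=(-10.0, 10.0),  # high-priority angular window
                       fine_step=0.1,      # fine resolution inside ROI (deg)
                       coarse_step=1.0):   # coarse resolution elsewhere (deg)
    """Return the azimuth angles the sensor should sample.

    Angles inside the region of interest (roi) are sampled at fine_step;
    the rest of the field of view is sampled at coarse_step. Spending
    fewer samples outside the ROI frees up the laser's time budget,
    which can be traded for a higher revisit rate or longer range.
    """
    angles = []
    a = -fov_deg / 2
    while a <= fov_deg / 2:
        step = fine_step if roi[0] <= a <= roi[1] else coarse_step
        angles.append(round(a, 3))
        a += step
    return angles

highway = build_scan_pattern(roi=(-5.0, 5.0))               # stare down the lane
urban = build_scan_pattern(fine_step=0.3, coarse_step=0.3)  # even coverage
print(len(highway), len(urban))  # fewer samples = faster revisits on highway
```

Because the coarse regions consume far fewer laser pulses, the time saved can be spent revisiting the region of interest more often or ranging farther, which is the trade-off Dussan describes.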

Therefore, while driving on highways, the system can focus on cars, trucks and oncoming traffic while staying alert to roadside obstacles. In urban areas, the system balances its focus across the complete field of vision, shifting occasionally to scan for any obstacles it may have missed. Will this make autonomous technology more affordable? Luis Dussan says that “if you compare true apples-to-apples, we’re going to be the lowest-cost system around.” AEye’s system will require multiple 70-degree field-of-view sensors to be fitted on a car to cover a 360-degree view, which is bound to raise costs. However, with AI on board, autonomous cars can perceive like humans and make themselves more efficient in the real world. Will they be safer? Humans are known to make mistakes on the road, and we can only expect such systems to perform the way they are programmed to.
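As a back-of-the-envelope check on that sensor count, tiling a full 360-degree view with 70-degree sensors needs at least six units, as the short calculation below shows; the caveat is that real installations would overlap their fields of view, pushing the count (and cost) higher still.

```python
import math

# Minimum number of 70-degree sensors to tile a full 360-degree view,
# assuming edge-to-edge placement with no overlap (a simplification).
SENSOR_FOV_DEG = 70
print(math.ceil(360 / SENSOR_FOV_DEG))  # -> 6
```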

PHOTO: PIXAR
