Infrared and iris sensors, the future of road safety
Spain-based SEAT, a member of the Volkswagen Group, says its Smart Quality team is using eye-tracker glasses that follow the driver's gaze through infrared sensors, cameras and algorithms.
It says that knowing where users are looking helps achieve a more intuitive and safer interaction with devices such as the infotainment system.
“Infrared light sensors, high-resolution images and a sophisticated algorithm: all this technology is used to find out exactly where people are looking.
“As we drive, the road must obviously be the main focus. That’s why it’s key to safety to be able to locate everything we’re looking for on the central console of the infotainment system at a glance, from the navigation system to the air conditioning or the radio,” the company says.
“We must guarantee the minimum interaction time with the screen, and to do this the information must be where users intuitively and naturally look for it,” says Rubén Martínez, head of SEAT’s Smart Quality department.
To accomplish this, the team is turning to an eye-tracking system.
Eye tracking is a technology that enables a computer to know where a person is looking. It does so through glasses with infrared sensors in the lenses and a camera in the centre of the frame. “The sensors detect the exact position of the iris at every moment, while everything the user sees is recorded,” Martínez explains. A complex 3D eye model algorithm interprets all this data and obtains the exact viewing point.
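The article notes that a complex 3D eye model maps iris positions to a viewing point. As a toy illustration only, and not SEAT's actual method, the underlying calibration idea can be sketched as fitting a simple affine mapping from detected pupil coordinates to known on-screen calibration targets (all names and the sample format here are hypothetical):

```python
# Hypothetical sketch: mapping a detected pupil position to a point on a
# screen via a simple affine calibration with three targets. Real trackers
# use a far richer 3D eye model; this only illustrates the calibration idea.

def _solve3(m, v):
    """Solve a 3x3 linear system m @ x = v using Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for i in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][i] = v[r]
        out.append(det(mi) / d)
    return out

def calibrate(pupil_pts, screen_pts):
    """Fit screen = A @ pupil + b from three calibration pairs."""
    m = [[px, py, 1.0] for px, py in pupil_pts]
    ax = _solve3(m, [sx for sx, _ in screen_pts])  # x-coordinate coefficients
    ay = _solve3(m, [sy for _, sy in screen_pts])  # y-coordinate coefficients
    return ax, ay

def gaze_point(params, pupil):
    """Map a new pupil position to an estimated screen coordinate."""
    ax, ay = params
    px, py = pupil
    return (ax[0] * px + ax[1] * py + ax[2],
            ay[0] * px + ay[1] * py + ay[2])
```

In practice a wearable tracker would use many more calibration points and account for head pose, but the principle of anchoring sensor readings to known gaze targets is the same.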
This technology makes it possible to carry out very precise studies of human interaction with all kinds of devices. For example, it will serve to analyse the usability of mobility apps. “We can know where users expect to find information such as battery level or remaining range in kilometres,” he says.
The team is now working on a pilot test to introduce the eye-tracker glasses into the testing of new models. It selects users with different profiles who, while wearing them, get behind the wheel of the SEAT Leon.
“We’ll ask them, for example, to turn up the temperature or change the radio station and we’ll analyse which part of the screen they’ve directed their gaze at first, how long it takes them to do so and how many times they look at the road while interacting with the device.”
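The metrics described in that quote, time until the gaze first reaches the target area and the number of glances back at the road, can be computed from a stream of labelled gaze samples. The sketch below is illustrative only (the sample format and area-of-interest labels such as "road" and "climate" are assumptions, not SEAT's tooling):

```python
# Illustrative sketch: computing task metrics from labelled gaze samples.
# Each sample is (timestamp_seconds, area_of_interest), in time order.

def analyse_task(samples, target):
    """Return (time until gaze first lands on `target`,
    number of separate glances at the road during the task)."""
    start = samples[0][0]
    time_to_target = None
    road_glances = 0
    prev_aoi = None
    for t, aoi in samples:
        # Record the first moment the gaze reaches the target area.
        if aoi == target and time_to_target is None:
            time_to_target = t - start
        # Count a road glance each time the gaze re-enters the road area.
        if aoi == "road" and prev_aoi != "road":
            road_glances += 1
        prev_aoi = aoi
    return time_to_target, road_glances

# Example: a driver asked to turn up the temperature ("climate").
samples = [(0.0, "road"), (0.4, "road"), (0.8, "nav"),
           (1.2, "climate"), (1.6, "road"), (2.0, "climate")]
# analyse_task(samples, "climate") -> first fixation after 1.2 s, 2 road glances
```

A real pipeline would also classify raw gaze points into areas of interest and filter out blinks, but the aggregation step looks broadly like this.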
Previously, these tests were done by asking people questions, but “the brain often misleads, and where you think you’re looking is not always where you’re actually looking,” he adds. Now the team will have accurate data.
The company says all of these usability patterns will be key in developing the central consoles of tomorrow’s cars, determining the location, size and distribution of information that is most comfortable for users. “This technology will help us humanise the interfaces, improving the user experience. With it we’ll certainly go a step further in the quality of the infotainment console of the future,” he says.