The Mail on Sunday

Amy the Audi, the self-drive car that can read your mind

. . . and can even have your lunch delivered

- By Ben Oliver

MEET Amy, the Audi that can look deep into your eyes and guess what you’re thinking. And no, I’m not joking. This concept car really can work out which option you’re looking at on its huge, 55in display by tracking the movement of your eyes.

I tried it myself, ordering lunch from an on-screen menu. The car then shared my location and ETA with a food delivery service, so my takeaway was delivered just as this autonomous car got me ‘home’.

And this Audi really is called Amy: although it’s spelled AI:ME, reflecting how it aims to use artificial intelligence (AI) to meet the personal needs of its owner.

Audi has made four AI concepts designed to explore different aspects of the future of motoring, including a sports car, an off-roader and a long-distance cruiser. The MoS was given the first chance to be driven by AI:ME, which is an autonomous city car with a cabin that explores how we might be cosseted and entertained once we no longer have to drive. I couldn’t give the self-driving tech much of a test, though. AI:ME was confined to a deserted rooftop car park in Las Vegas, so there was no traffic to deal with. But unusually for a concept car not designed to be driven on public roads, AI:ME is a pretty competent driver. I soon forgot that the car was driving, and relaxed enough to explore the tech.

Just as well. One of the cabin’s most striking features is a steering wheel that flips horizontal and retracts into the dashboard, so I couldn’t take the wheel if I wanted to. Audi’s designers have been exploring how to make a cabin that functions like a normal car’s when a human is driving, but can then transform into a lounge when the car takes over. A much simpler innovation is AI:ME’s seats, which are broad and flat, so you can turn to your fellow passengers for a more natural conversation.

AI:ME also features Audi’s Holoride tech. All four occupants have a virtual-reality headset which can transport you to a more scenic location than the rush-hour M25. And then it was time to order lunch. It was incredible how accurately the hidden cameras trained on my face could track my eye movement. They can even do it when you’re wearing sunglasses by analysing the movement of your facial muscles. Each food item on the on-screen menu enlarged as I looked at it, and I confirmed with a voice command.

But are we likely to see any of this tech on cars we can buy any time soon? Eye-tracking is already used in our cars to monitor tiredness and advise us to take a break. But eye-control of our car’s functions would require us to take our eyes off the road for too long. So although the tech already works, we’ll have to wait for fully self-driving cars to arrive before we can use it, and that’s probably still a decade away.

And the VR? Passengers could use that now, of course. It’s fun, but it feels like a gimmick.

A TASTE OF THE FUTURE: Ben explores the car and, left, collects his lunch