Other guide dog replacements
There are other ways for blind people to get around without a robot dog – or an actual dog – leading the way. The use of GPS for blind navigation goes back to the 1990s with the Arkenstone Strider, but the advent of smartphones made it possible to add such features with an app rather than a standalone device.
One recent technology is RouteNav, an iOS app designed by researchers at the University of California, Santa Cruz, to help blind travellers navigate a transport hub. Rather than use Bluetooth beacons – which ping devices to help pinpoint locations – RouteNav uses GPS. But as the signal can be weak indoors, the researchers paired GPS with a dead reckoning system that breaks the space up into “tiles” to plan a route and track progress.
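The tile idea can be sketched in code. The snippet below is a toy pedestrian dead reckoning loop, not RouteNav's actual implementation: it integrates step events from an assumed pedometer and compass into a continuous position, then snaps that position to a discrete tile grid so progress along a planned route can be tracked even when GPS drops out. The tile size and step length are invented for illustration.

```python
import math

TILE_SIZE = 1.0  # metres per tile (illustrative; not RouteNav's real value)

class DeadReckoner:
    """Toy dead reckoning: integrate step events into tile coordinates."""

    def __init__(self, start_xy, step_length=0.7):
        # Start position would come from the last good GPS fix.
        self.x, self.y = start_xy
        self.step_length = step_length  # assumed average stride in metres

    def step(self, heading_deg):
        # Advance one stride along the compass heading (0 deg = north, +y).
        rad = math.radians(heading_deg)
        self.x += self.step_length * math.sin(rad)
        self.y += self.step_length * math.cos(rad)

    def tile(self):
        # Snap the continuous position to the discrete tile grid.
        return (int(self.x // TILE_SIZE), int(self.y // TILE_SIZE))

dr = DeadReckoner(start_xy=(0.5, 0.5))
for _ in range(4):
    dr.step(heading_deg=90)  # four strides due east
print(dr.tile())  # → (3, 0)
```

A real system would also correct the accumulated drift whenever a GPS fix or known landmark becomes available, since dead reckoning error grows with every step.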
In tests at the Palo Alto Transit Center in California, blind participants navigated three routes between trains and buses, taking in an underground tunnel, boarding areas and a gate in a fence. Similar apps include Identifi, which uses AI image recognition to help users navigate indoors, and BlindSquare, which announces road names and shops as you walk past them.
Microsoft Soundscape was another phone-based tool. It used sound to describe what a person was passing, helping them find their way – perhaps calling out a saved location as the user walked by. However, it was discontinued in 2023.
Researchers at the University of Technology Sydney here in Australia also looked to sound to help visually impaired people navigate the world. They built smart glasses that play a noise when the onboard computer vision system recognises specific objects, with AI processing the camera data to identify whether it is looking at a bowl, cup, book or bottle. When the glasses “see” a book, for example, the wearer hears the sound of a page turning.
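The glasses pair a generic object detector with a lookup from label to audio cue. A minimal sketch of that mapping step is below; the labels, filenames and confidence threshold are all invented for illustration, and a real system would feed in frames from an actual detector and play the files through the headphones.

```python
# Hypothetical mapping from detected object labels to audio cues,
# in the spirit of the UTS smart glasses (filenames are invented).
SOUND_FOR_LABEL = {
    "book": "page_turn.wav",
    "cup": "cup_clink.wav",
    "bowl": "spoon_tap.wav",
    "bottle": "bottle_pop.wav",
}

def cues_for_detections(detections, min_confidence=0.6):
    """Return the audio cues to play for one frame of detector output.

    `detections` is a list of (label, confidence) pairs, as a generic
    object detector might emit; only confident, known labels get a cue.
    """
    cues = []
    for label, confidence in detections:
        if confidence >= min_confidence and label in SOUND_FOR_LABEL:
            cues.append(SOUND_FOR_LABEL[label])
    return cues

# book is confident and known; chair has no cue; cup is below threshold
print(cues_for_detections([("book", 0.91), ("chair", 0.88), ("cup", 0.42)]))
# → ['page_turn.wav']
```

Using distinctive sounds rather than speech keeps the feedback fast and unobtrusive, at the cost of only supporting a small vocabulary of objects.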
Be My Eyes is an app that allows blind users to ask for help from volunteers via photos or video calls. So far, the app has been used by 19,000 blind and partially sighted people, assisted by 6.9 million volunteers. A new version, Be My AI, is set to arrive soon using OpenAI’s GPT-4 to understand requests and describe a picture to a blind user.
The developers say the AI-powered version could be used to read text – such as instructions, magazine articles or WhatsApp messages – to choose the right colour combinations for an outfit, to select the right buttons on appliances such as a dishwasher, or to navigate using signs in transport hubs. One similar app is TapTapSee, which uses cloud-based image recognition to identify objects, while Ask Envision also uses OpenAI’s technology to describe images and text and read them aloud for blind people.
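Under the hood, tools like these send a photo and a question to a multimodal model and read back its answer. The sketch below builds such a request in the shape used by OpenAI's chat completions API with image inputs; the model name and question are assumptions, nothing is actually sent over the network, and Be My AI's real integration may differ.

```python
import base64

def build_describe_request(image_bytes, question="What is in this picture?"):
    """Build a chat-completion style request body asking a multimodal
    model to describe a photo for a blind user.

    The payload shape follows OpenAI's chat completions API for image
    inputs; the model name is an assumption for illustration.
    """
    data_url = "data:image/jpeg;base64," + base64.b64encode(image_bytes).decode()
    return {
        "model": "gpt-4o",
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url", "image_url": {"url": data_url}},
            ],
        }],
    }

req = build_describe_request(b"fake-jpeg-bytes",
                             question="Which button starts the dishwasher?")
print(req["messages"][0]["content"][0]["text"])
# → Which button starts the dishwasher?
```

The same request shape covers all the uses the developers describe – reading a label, matching an outfit, finding a sign – because the only thing that changes is the photo and the question.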
And then there’s Ximira’s PHINIX (Perceptive Helper with Intelligent Navigation and Intuitive eXperience), unveiled at CES 2024. Worn as a backpack, this system uses cameras and headphones powered by a laptop motherboard to analyse the world around the wearer with AI and describe it aloud – it even has facial recognition to help spot friends.
In short, technologies such as smartphones and AI are being used to help blind people better understand the world around them. And soon they might have the option of a robot dog, too.