AI backpack aims to assist visually impaired
Technology designers are beginning to consider everyone when building new products; for years, people with disabilities were left behind.
Among the latest such products is a backpack powered by Intel’s artificial intelligence software. It’s designed to audibly alert wearers when they’re approaching potentially hazardous situations, such as crosswalks or strangers.
The backpack, which has yet to be named, was revealed on Wednesday but could face years of development before a consumer-ready version is launched. Still, the product offers a glimpse of what the future could look like as progress in AI and machine learning increasingly helps people with vision issues better perceive their environments and, therefore, live more independently. The backpack was created by researchers at the University of Georgia in the US, who combined existing computer vision techniques into a system that seeks to replace the need for a cane or guide dog.
In a demonstration video, the user wears a vest with tiny holes concealing an embedded AI camera. Connected to the computing unit in the backpack, the 4K camera captures depth and colour information used to help the wearer avoid obstacles like hanging branches. The camera can also be embedded in a fanny pack or other waist-worn pouch.
The spatial camera, built by the computer vision company Luxonis, can read signs, detect crosswalks and spot upcoming changes in elevation.
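The obstacle-alert idea can be sketched in a few lines: given a per-pixel depth map of the kind a spatial camera produces, a warning fires when anything in the central field of view comes within a threshold distance. This is an illustrative assumption, not the researchers' actual code; the function name, the 1.5-metre threshold and the central-third crop are all invented for the sketch.

```python
import numpy as np

ALERT_DISTANCE_M = 1.5  # warn about obstacles closer than 1.5 metres (assumed)

def check_for_obstacle(depth_map: np.ndarray):
    """Return a warning message if something is too close dead ahead."""
    h, w = depth_map.shape
    # Look only at the central third of the frame (the walking path).
    centre = depth_map[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3]
    nearest = centre.min()
    if nearest < ALERT_DISTANCE_M:
        return f"Obstacle ahead, about {nearest:.1f} metres away"
    return None

# Simulated 6x6 depth frame in metres: mostly clear, one close object.
frame = np.full((6, 6), 5.0)
frame[3, 3] = 0.9  # e.g. a hanging branch 0.9 m away
print(check_for_obstacle(frame))  # → Obstacle ahead, about 0.9 metres away
```

In a real system the depth frames would stream from the camera and the warning string would be spoken through the earphones rather than printed.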
Bluetooth earphones allow the user to communicate with the system and vice versa. The wearer can ask out loud for location information, and the system will tell them where they are. If the camera spots a threat, such as an oncoming pedestrian, it can warn the wearer. It’s too soon to know how much such a device would cost consumers. WeWalk’s smart cane with obstacle detection sells for $600, 10 times as much as an ordinary white cane. OrCam MyEye Pro, a wireless smart camera that reads what’s in front of you, runs $4,250.
Researchers at the University of Georgia went with a backpack design because it would help visually impaired people avoid unwanted attention. They used Intel’s Movidius computing chip because it was small and powerful enough to run advanced AI functions with low latency.
The next step is to raise funds and expand testing. The researchers hope to one day release an open-source, AI-based visual-assistance system. They have formed a team called Mira, which includes visually impaired volunteers.