Mind-reading AI helps paralysed man talk
A MIND-READING device that can peer into the brain and pluck out language has been developed to help paralysed people communicate.
Neuroscientists in the United States tested the machine on a man who could not talk because of severe vocal and limb paralysis.
By silently saying the phonetic-alphabet code word for each letter in his head, he was able to produce sentences at a rate of seven words per minute and master a vocabulary of more than 1,000 words.
“So if he was trying to say ‘cat’, he would say ‘charlie-alpha-tango’,” Sean Metzger, of the University of California San Francisco, said.
A spelling interface then applied language modelling to the decoded signals in real time, inferring the most likely words and flagging probable errors. When the man finished a sentence, he was asked to squeeze his hand to indicate a full stop.
Beyond deciphering the brain signals themselves, the device uses deep-learning algorithms and natural-language models to predict the intended sentence and insert spaces between words.
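The spelling scheme described above can be sketched in miniature. This is an illustrative toy only, not the researchers' method: the study decoded code words from neural activity with deep-learning and large language models, whereas the sketch below assumes the code words are already perfectly recognised and uses a tiny hypothetical vocabulary with greedy longest-match segmentation to insert spaces.

```python
# Toy analogue of the spelling pipeline: NATO code words -> letters -> words.
# Everything here (the vocabulary, the greedy segmenter) is an assumption
# made for illustration, not part of the published system.

NATO = {
    "alpha": "a", "bravo": "b", "charlie": "c", "delta": "d", "echo": "e",
    "foxtrot": "f", "golf": "g", "hotel": "h", "india": "i", "juliett": "j",
    "kilo": "k", "lima": "l", "mike": "m", "november": "n", "oscar": "o",
    "papa": "p", "quebec": "q", "romeo": "r", "sierra": "s", "tango": "t",
    "uniform": "u", "victor": "v", "whiskey": "w", "xray": "x",
    "yankee": "y", "zulu": "z",
}

# Hypothetical stand-in for the language model's vocabulary.
VOCAB = {"the", "cat", "sat", "on", "mat"}

def decode_letters(code_words):
    """Turn a sequence of silently 'spoken' code words into a letter stream."""
    return "".join(NATO[w] for w in code_words)

def insert_spaces(letters):
    """Segment the letter stream into vocabulary words (greedy longest match)."""
    words, i = [], 0
    while i < len(letters):
        for j in range(len(letters), i, -1):
            if letters[i:j] in VOCAB:
                words.append(letters[i:j])
                i = j
                break
        else:
            raise ValueError(f"no vocabulary word matches at position {i}")
    return " ".join(words)

stream = ["tango", "hotel", "echo", "charlie", "alpha", "tango",
          "sierra", "alpha", "tango"]
print(insert_spaces(decode_letters(stream)))  # the cat sat
```

In the real system, a statistical language model scores many candidate segmentations and corrects decoding errors; the greedy matcher above stands in for that step only to make the letter-to-word idea concrete.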
The team’s early attempt to decode the brain activity behind entire words stalled as the researchers found they could only interpret about 50 words – and only if the participant attempted to speak the word out loud, which took significant effort. Researchers were also concerned that it would not be useful for people who were not able to make any sounds at all.
The scientists believe the system could eventually be trained to understand 9,000 common English words, which exceeds the threshold for basic fluency and would enable general communication.
It could help those who struggle to communicate because of conditions such as amyotrophic lateral sclerosis, stroke, a spinal cord injury or muscular dystrophy.
At present, people with severe paralysis must communicate laboriously, pointing to letters or words on a touchscreen or using head or eye movements to control a laser pointer or stylus.
The research was published in the journal