Science Illustrated

You can hear it in your voice

Just as your voice can reflect your mood, it can also reveal when you’re not well. In the future, artificial intelligence on your mobile phone could analyse voice samples to make a diagnosis and provide an early warning.


You are feeling rather unwell, but don’t fancy making a trip to the doctor. Instead you pick up your mobile phone, saying: “Hi Siri, is anything the matter with me?” Moments later, the personal assistant responds, telling you whether you have COVID-19, are depressed, or even have early signs of Parkinson’s disease.

Your voice can reveal what is wrong with you. This isn’t a new realisation: doctors have known this for many years. A hoarse and nasal voice is a clear indication of a sore throat or a cold. One of the criteria for making a diagnosis of bipolar disorder is that manic patients speak a lot, and fast.

However, the voice has many more aspects that can change in connection with a long series of diseases, and some of the changes are so slight that doctors can’t reliably tell the difference. But they can be assisted by sophisticated algorithms that use artificial intelligence to analyse a voice, listening for abnormal voice patterns, a change in register, the rate of speech, even the choice of words. By comparing voice patterns from thousands of healthy and sick people, the algorithms can learn to tell the difference and accurately identify the details that reveal whether the speaker suffers from heart disease, migraine, or depression.

Artificial intelligence offers help

As early as 2017, Amazon applied for a patent on a technology that could use voice analysis via the company’s personal assistant, Alexa, to determine whether a person has a sore throat. According to the patent, Amazon’s plan was to show advertisements for cold and flu medication to any users so identified, and to offer delivery of the medication within one hour.

Thankfully it is not only commercial interests that are driving research into the connection between your voice and your health. When people call 000 (or 911 in the United States), they often panic and sound so confused that the emergency call centre has difficulties understanding the subject and severity of the situation.

A Danish start-up company, Corti, has now developed an algorithm that can help. It monitors emergency calls and searches in real time for patterns of words and expressions that indicate cardiac arrest. The combination of machine and human responder can identify 95% of calls about cardiac arrest – as compared to only 73% when the emergency crew is not assisted by AI.
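In spirit, this kind of real-time monitoring resembles scanning a live transcript for tell-tale phrases. The sketch below is purely illustrative: the phrase list and scoring are invented for this example, and Corti's actual model is far more sophisticated than simple keyword matching.

```python
# Illustrative sketch: flagging phrases in a live call transcript that
# often accompany cardiac arrest. The phrase list and scoring here are
# invented for illustration; they are not Corti's method.
ALERT_PHRASES = ["not breathing", "no pulse", "unconscious",
                 "turning blue", "collapsed"]

def arrest_score(transcript: str) -> int:
    """Count how many alert phrases appear in the transcript so far."""
    text = transcript.lower()
    return sum(phrase in text for phrase in ALERT_PHRASES)

call = "He just collapsed, I think he's not breathing!"
print(arrest_score(call))  # 2 matched phrases
```

A production system would combine such word-level cues with acoustic features and a learned model rather than a fixed list.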

One syllable reveals Parkinson’s

Whereas Corti’s system for identifying cardiac arrest is based on the choice of words, and hence reflects the caller’s state of mind, other systems concentrate on the voice itself, in principle uninterested in what is said. Many diseases influence the ability to pronounce words in some way. A cold inflames the vocal cords, causing hoarseness, whereas frostbite in the lips causes lisping. Similarly, Parkinson’s disease alters many aspects of the ability to speak.

The motions of Parkinson’s patients are weaker, slower, and jerkier; the same is true of the tongue. Speech becomes slower and quieter, the voice starts to tremble, and the patient pronounces some syllables differently. Neither lab analyses such as blood samples nor brain scans can reliably diagnose Parkinson’s, so some scientists have begun to research whether voice analysis could solve the problem.

In 2012, scientists from the University of Oxford in the UK recorded 10 healthy people and 33 Parkinson’s patients saying “ahhh”. With the assistance of voice analysis computer programs, the scientists identified no fewer than 132 different details in the way this simple syllable can be pronounced. When the scientists compared the 132 voice details of all 43 test subjects, they found 10 phonetic details of the syllable in which healthy people and Parkinson’s patients differed. The difference was so marked that the scientists could correctly classify 99% of the test subjects as healthy or as Parkinson’s patients simply from their way of saying ‘ahhh’.
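The general recipe here – measure many acoustic features per recording, then train a classifier to separate the two groups – can be sketched in a few lines. The feature values below are synthetic stand-ins; the Oxford study's actual 132 acoustic measures and its data are not reproduced, and the classifier choice is an assumption for illustration.

```python
# Illustrative sketch: classifying speakers from acoustic features of a
# sustained "ahhh". All feature values are synthetic; only the cohort
# sizes (10 healthy, 33 patients) and feature count (132) echo the study.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_features = 132

# Synthetic cohort: 10 healthy speakers and 33 patients.
healthy = rng.normal(0.0, 1.0, size=(10, n_features))
patients = rng.normal(0.0, 1.0, size=(33, n_features))
patients[:, :10] += 2.0            # pretend 10 features shift with disease
X = np.vstack([healthy, patients])
y = np.array([0] * 10 + [1] * 33)  # 0 = healthy, 1 = Parkinson's

# Standardise the features, fit a linear classifier, and estimate
# accuracy with cross-validation.
model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real recordings, the feature matrix would come from acoustic measures such as jitter and shimmer rather than random numbers.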

Speech reveals Alzheimer’s

Alzheimer’s turns out to be another brain disease that could be revealed by the voice. Patients have difficulties finding the correct words, and pauses emerge between them. The patients also tend to use pronouns such as ‘he’ and ‘she’ instead of people’s names, and general words such as ‘house’ instead of more specific designations such as ‘villa’ or ‘terraced house’.

In 2016, scientists from the University of Toronto in Canada undertook an experiment with 167 Alzheimer’s patients and 97 healthy control subjects. The test subjects were given 45 seconds to describe as many details as possible from a drawing in which children steal cakes from a cupboard while their mother is doing the dishes. The scientists identified 400 different parameters in the descriptions given by the test subjects, such as whether they had used the word ‘mother’, how many verbs they used, the average word length, how quickly syllables followed each other, and more. A computer was then used to compare the parameter values of healthy and sick people, searching for patterns or relationships in the data that set the two groups of test subjects apart. When the learning process had been completed, the artificial intelligence could identify 82% of the Alzheimer’s patients by looking at just 35 of the 400 parameters. Since 2016, the scientists have improved their algorithms, and the artificial intelligence can now identify 92% of the Alzheimer's patients.
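Some of these linguistic parameters are simple enough to compute directly from a transcript. The sketch below extracts a handful of such measures; the feature list, pronoun set, and sample sentence are invented for illustration and do not reproduce the Toronto study's 400 parameters.

```python
# Illustrative sketch: extracting a few simple linguistic measures from a
# transcript, in the spirit of the Toronto study's parameters. The exact
# features and the sample text are invented for this example.
import re

PRONOUNS = {"he", "she", "it", "they", "him", "her", "them"}

def linguistic_features(transcript: str) -> dict:
    words = re.findall(r"[a-z']+", transcript.lower())
    n = len(words)
    return {
        "word_count": n,
        "avg_word_length": sum(len(w) for w in words) / n,
        "pronoun_ratio": sum(w in PRONOUNS for w in words) / n,
        "mentions_mother": "mother" in words,
    }

sample = ("She is doing the dishes while he takes it from the cupboard "
          "and she does not see them.")
print(linguistic_features(sample))
```

In the study's setup, hundreds of such values per description would feed a learned model, which then picks out the handful that best separate patients from controls.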

COVID-19 revealed by cough

Ever since the COVID-19 epidemic emerged in early 2020, scientists around the world have not only been working on vaccine development; they have also been competing to design the first mobile phone app that can identify COVID-19 via speech or coughing into a microphone.

Among the participants in the race are scientists from the Massachusetts Institute of Technology (MIT) in Cambridge, USA. In September 2020, they introduced a system that had learned how to tell the difference between healthy people and people with COVID-19 from a cough into a microphone.

The neural networks used in the artificial intelligence system had been trained by listening to coughing from 4256 people, and being informed which were healthy and which were sick. Subsequently, the scientists fed the system coughing sounds from 1064 new people, and the neural network was able to make the correct diagnosis in 97% of cases.
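The train-then-test workflow described above can be sketched with a small neural network. Everything below is synthetic: the "coughs" are random feature vectors standing in for audio features, and the network is a minimal stand-in, not MIT's actual model or data.

```python
# Illustrative sketch: training a small neural network to separate two
# classes of cough recordings, represented here by synthetic feature
# vectors (the MIT system's real audio pipeline and data are not shown).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

def fake_cough_features(n, sick):
    base = rng.normal(0.0, 1.0, size=(n, 64))  # stand-in for audio features
    if sick:
        base[:, :8] += 1.5                     # pretend some bands shift
    return base

# "Training set": labelled coughs from healthy and sick speakers.
X_train = np.vstack([fake_cough_features(400, False),
                     fake_cough_features(400, True)])
y_train = np.array([0] * 400 + [1] * 400)

net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
net.fit(X_train, y_train)

# "Held-out set": new speakers the network has never heard, mirroring
# how the system was evaluated on coughs from people outside training.
X_test = np.vstack([fake_cough_features(100, False),
                    fake_cough_features(100, True)])
y_test = np.array([0] * 100 + [1] * 100)
print(f"held-out accuracy: {net.score(X_test, y_test):.2f}")
```

The key point the evaluation step illustrates is that accuracy is measured on people the network never trained on, which is what justifies quoting it as a diagnostic rate.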

Computer scientists at Australia’s RMIT University are also working on an AI model that can hear the effects of COVID. They believe it outperforms other systems because the algorithm was trained using unlabelled cough data from different countries, genders and ages, including crowd-sourced recordings. The team is now looking for partners to develop the technology further.

The coughing tests are almost as reliable as the PCR tests used in testing centres for COVID-19, and now usually required before international travel. But whereas a PCR test for travel can cost A$150, requires a personal appearance and takes hours or days for the results, a coughing test via app would be free, could be carried out at home, and would give you the answer right away. Other apps will surely follow, allowing self-diagnosis of all kinds of diseases from your phone.
