Science Illustrated

Scientists Can Read Your Mind... Almost

A brain scanner and newly developed artificial intelligence have allowed American scientists to decode complex thoughts. Soon, the method could be used to control computers by the power of thought alone.


“The young girl was playing soccer.” Pause. “The lawyer was drinking coffee.” Pause. “The witness shouted during the trial.” Pause. A young man slowly and methodically reads out 240 short sentences while a large fMRI device scans his brain activity.

The scanner data is converted into detailed scan images with coloured 3D spots that mark the exact brain activity. The scientists upload all but one set of scan images and their accompanying sentences to a computer, which is to analyse the relations between the sentences and the activated parts of the brain. The computer runs artificial intelligence that can learn from data without being explicitly programmed to do so – a technique known as machine learning. The computer has two tasks: to predict the brain activity of the left-out scan based only on its sentence, and to decode the missing sentence based only on the detailed scan image of the brain activity.

Based on four days of scans from seven test subjects, the scientists ran the test 240 times, leaving out a different sentence every time. The results of their efforts are ground-breaking: with a success rate of 87 %, the AI has decoded complex thoughts in a human brain for the very first time.
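For readers curious about what such a leave-one-out test looks like in practice, here is a minimal sketch in Python. It uses toy data in which each sentence is reduced to a vector of semantic features and each scan to a vector of voxel activities; the ridge-regression model, the cosine-similarity matching, and all names and numbers are illustrative assumptions, not the researchers' actual code or data.

```python
# Minimal sketch of a leave-one-out "mind reading" evaluation.
# Toy data only: 240 sentences as semantic feature vectors, and the
# corresponding brain scans flattened into voxel-activity vectors.
import numpy as np

rng = np.random.default_rng(0)
n_sentences, n_features, n_voxels = 240, 42, 500

sentence_vecs = rng.normal(size=(n_sentences, n_features))   # semantic features per sentence
true_mapping = rng.normal(size=(n_features, n_voxels))       # hidden sentence-to-brain relation
brain_scans = sentence_vecs @ true_mapping + 0.5 * rng.normal(size=(n_sentences, n_voxels))

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: learn weights mapping X to Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

correct = 0
for i in range(n_sentences):
    train = np.arange(n_sentences) != i          # leave sentence i out
    W = ridge_fit(sentence_vecs[train], brain_scans[train])
    predicted_scan = sentence_vecs[i] @ W        # task 1: predict the missing scan

    # task 2 (simplified): decode by checking whether the prediction is
    # closer to the held-out scan than to any other scan
    sims = [cosine(predicted_scan, brain_scans[j]) for j in range(n_sentences)]
    if int(np.argmax(sims)) == i:
        correct += 1

print(f"decoding accuracy: {correct / n_sentences:.0%}")
```

Scoring a prediction by whether it matches the correct scan better than all the alternatives is one common way to turn "how close is the guess?" into a single success rate, which is roughly the spirit of the 87 % figure; the real study's procedure was considerably more sophisticated.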

Hammer paved the way

The scientists behind the 2017 breakthrough are Marcel Adam Just, Jing Wang, and Vladimir L. Cherkassky from Carnegie Mellon University in the US. The team has carried out mind reading experiments before. The three scientists have shown that when we think of objects we already know – such as a hammer – the brain does not treat hammer merely as a word. The word also causes activity in areas at the centre of the brain's frontal lobe that are related to motor functions, visual representations, and the like. When we think about the word hammer, we therefore also associate specific actions and concepts with the object, such as how we hold it or use it to build things.

The building blocks that the brain uses to think about individual words can be identified in specific regions of the brain. That discovery prompted the scientists to develop the new mind reading AI. If the brain links specific words with specific brain areas, it should theoretically be possible to decode any thought, no matter how complex the sentence, based only on the activity patterns that the thought causes.

Thoughts are invisible waves

The new scientific result is a milestone in more than 100 years of efforts to develop a technology that can disclose people's innermost thoughts. The first attempt was made in the late 1800s by US scientist Julius Emmner.

He was inspired by a new invention, the phonograph, which demonstrated what sound waves looked like on paper. According to Emmner, thoughts – just like sound – emit invisible waves, and he tried to build a machine that could measure them. Emmner's experiment never left his lab, but it triggered a wave of similar experiments.

In 1924, German scientist Hans Berger carried out the very first EEG reading. EEG is short for electroencephalography, a method that measures the brain's electric activity via electrodes placed over central areas of the cerebral cortex. Some 50 years later, scientist Lawrence Pinneo tried to use the method for mind reading. In 1973, he designed a mind reading helmet fitted with numerous electrodes, which transmitted the brain's electric activity to a computer that displayed a small dot on a screen. If the computer recognized the words “up”, “down”, “left”, and “right” in the test subject's thoughts, the dot moved accordingly across the display.

Pinneo's helmet had major limitations, but it paved the way for a fusion of brain and computer – known as a brain-computer interface. In 2010, scientists from the University of Utah translated brain signals into words by means of electrodes on the speech centre of a patient with locked-in syndrome, in which the patient is fully aware but totally paralysed. During the experiment, the scientists read out 10 words such as “yes” and “no” to the patient while measuring the brain activity. A computer linked the measured brain activity patterns with the words, matching activity and words with a success rate of up to 90 %.

Brain to control computer

Carnegie Mellon University's mind reading experiments show that a thought is not made up merely of the activity that individual words such as “yes” and “no” cause in the brain's language centres. Thoughts consist of more complex mental concepts that are linked with the words.

Based on the experiment, the scientists identified 42 building blocks that cover all 240 sentences. These 42 Neurally Plausible Semantic Features – NPSFs – fall into four main groups: people, places, emotions, and actions. The sentence “the witness shouted during the trial” activates 9 NPSFs, with “the witness” accounting for four of the mental building blocks: social norms, knowledge, person, and communication.
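As an illustration of how such semantic building blocks might be represented in software, the short sketch below maps words to NPSF labels and combines them into a sentence-level feature set. Only the four features listed for “witness” come from the article; the other word-to-feature assignments are invented for the example.

```python
# Illustrative sketch: representing a sentence as the set of semantic
# building blocks (NPSFs) activated by its words. Only the features for
# "witness" are from the article; the rest are assumed for the example.
NPSF = {
    "witness": {"social norms", "knowledge", "person", "communication"},
    "shouted": {"communication", "action", "high arousal"},   # assumed
    "trial":   {"social norms", "place", "event"},            # assumed
}

def sentence_features(words):
    """Union of the semantic features activated by each content word."""
    feats = set()
    for w in words:
        feats |= NPSF.get(w, set())
    return feats

print(sentence_features(["witness", "shouted", "trial"]))
```

The real study used 42 such features; representing a sentence as the combination of its words' features is what lets the same small vocabulary of building blocks describe hundreds of different sentences.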

The experiment also showed that the sentences triggered the same brain activity in all test subjects, i.e. the mind reading model is universal – and that is useful for technology companies such as Intel. In 2009, the company began to develop computer chips that rest on the same principle as the scientists' mind reading experiment and can control computers and smartphones by the power of thought. The computer chip is to be implanted in the user's brain to function as a sensor, registering brain activity and converting the thoughts into control signals. If the user thinks “delete document” or “call mother”, the computer or smartphone carries out the task. Other companies aim to make the technology control small private aircraft and cars.

In spite of the major breakthrough, the Carnegie Mellon University researchers are not yet satisfied with the mind reading technology, so they are still developing the AI. The next step is to learn how to decode brain activity associated with abstract concepts such as skateboarding or geology, and chief researcher Marcel Adam Just hopes that in the long term, the technology can result in a complete map of how all knowledge is represented in the brain.

MARCEL JUST/CARNEGIE MELLON UNIVERSITY: Marcel Just (left) heads a team of brain researchers who have developed a new, accurate mind reading technology.
