Scientists Can Read Your Mind... Almost

A brain scanner and newly developed artificial intelligence have allowed American scientists to decode complex thoughts. Soon, the method could be used to control computers by the power of thought alone.

Science Illustrated – HUMANS / THE BRAIN

“The young girl was playing soccer.” Pause. “The lawyer was drinking coffee.” Pause. “The witness shouted during the trial.” Pause. A young man slowly and methodically reads out 240 short sentences, as a large fMRI device scans his brain activity.

The scanner data is converted into detailed scan images with coloured 3D spots that indicate the exact brain activity. The scientists upload all but one set of the scan images and their accompanying sentences to a computer, which is to analyse the relations between the sentences and the activated parts of the brain. The computer runs artificial intelligence that can learn from data without being explicitly programmed to do so – a technique known as machine learning. The computer has two tasks: to predict the brain activity of the left-out scan based only on the sentence, and to decode the missing sentence based only on the detailed scan image of the brain activity.

Based on four days of scans from seven different test subjects, the scientists have run the test 240 times, leaving out a different sentence every time. And the results of their efforts are ground-breaking. With a success rate of 87%, the AI has decoded complex thoughts in a human brain for the very first time.
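A minimal sketch of how such a leave-one-out evaluation might look, assuming each sentence is summarised as a numeric semantic feature vector and each scan as a vector of voxel activities. The linear regression, the cosine-similarity decoding step, and all names below are illustrative assumptions, not the researchers' actual pipeline, and the random arrays are placeholders for real data:

```python
import numpy as np

# Hypothetical setup: 240 sentences described by 42 semantic features,
# and 240 scans of 5,000 voxels each. The random arrays below stand in
# for real data, so the printed accuracy will be near chance (1/240);
# the actual study reported 87%.
n_sentences, n_features, n_voxels = 240, 42, 5000
rng = np.random.default_rng(0)
X = rng.normal(size=(n_sentences, n_features))  # sentence feature vectors
Y = rng.normal(size=(n_sentences, n_voxels))    # brain activity per scan

hits = 0
for held_out in range(n_sentences):
    train = np.delete(np.arange(n_sentences), held_out)
    # Learn a linear map from semantic features to voxel activity.
    W, *_ = np.linalg.lstsq(X[train], Y[train], rcond=None)
    # Task 1: predict brain activity from the sentence alone.
    predicted = X @ W  # predicted activity for every sentence
    # Task 2: decode the held-out sentence -- pick the sentence whose
    # predicted activity best matches the real scan (cosine similarity).
    sims = (predicted @ Y[held_out]) / (
        np.linalg.norm(predicted, axis=1) * np.linalg.norm(Y[held_out]))
    hits += int(np.argmax(sims) == held_out)

print(f"decoding accuracy: {hits / n_sentences:.1%}")
```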

Hammer paved the way

The scientists behind the 2017 breakthrough are Marcel Adam Just, Jing Wang, and Vladimir L. Cherkassky from Carnegie Mellon University in the US. The team has carried out mind reading experiments before. The three scientists have shown that when we think of objects we already know – such as a hammer – the brain does not treat hammer merely as a word. The word also causes activity in areas at the centre of the brain’s frontal lobe, which are related to visual representations, motor functions, and more. When we think about the word hammer, we hence also associate specific actions or concepts with the object, such as how we hold a hammer or use it to build things.

The building blocks that the brain uses to think about individual words can be identified in specific regions of the brain. And that was the discovery that made the scientists develop the new mind reading AI. If the brain links specific words with specific brain areas, it will theoretically be possible to decode all thoughts, no matter how complex the sentences, based only on the activity patterns that the thoughts cause.

Thoughts are invisible waves

The new scientific result is a milestone in more than 100 years of efforts to develop a technology that can disclose people’s innermost thoughts. The first attempt was made in the late 1800s by US scientist Julius Emmner.

He was inspired by a new invention, the phonograph, that demonstrated what sound waves looked like on paper. According to Emmner, thoughts – just like sound – emit invisible waves, and he tried to build a machine that could measure them. Emmner’s experiment never left his lab, but it triggered a wave of similar experiments.

In 1924, German scientist Hans Berger carried out the very first EEG reading. EEG is short for electroencephalography, a method that measures the brain’s electric activity via electrodes picking up signals from the cerebral cortex. 50 years later, scientist Lawrence Pinneo tried to use the method for mind reading. In 1973, he designed a mind reading helmet with lots of electrodes, which transmitted the brain’s electric activity to a computer that showed a small dot on a display. If the computer recognized the words “up”, “down”, “left”, and “right” in the test subject’s thoughts, the dot moved accordingly across the display.

Pinneo’s helmet involved major limitations, but it paved the way for a fusion of brain and computer – known as a brain-computer interface. In 2010, scientists from the University of Utah translated brain signals into words by means of electrodes on the speech centre of a patient with locked-in syndrome, in which the patient is fully conscious but totally paralysed. During the experiment, the scientists read out 10 words such as “yes” and “no” to the patient while measuring the brain activity. A computer linked the measured brain activity patterns with the words, matching activity and words with a success rate of up to 90%.
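A minimal sketch of how such pattern-to-word matching could work, assuming each measurement arrives as a numeric activity vector. The template-averaging approach and every name below are illustrative assumptions, not the Utah team's published method:

```python
import numpy as np

def build_templates(recordings):
    """Average the activity vectors recorded for each spoken word."""
    return {word: np.mean(trials, axis=0) for word, trials in recordings.items()}

def classify(activity, templates):
    """Return the word whose template correlates best with the measurement."""
    return max(templates,
               key=lambda word: np.corrcoef(activity, templates[word])[0, 1])

# Made-up three-channel example measurements:
recordings = {
    "yes": [np.array([1.0, 0.1, 0.0]), np.array([0.9, 0.2, 0.1])],
    "no":  [np.array([0.1, 1.0, 0.2]), np.array([0.0, 0.9, 0.1])],
}
templates = build_templates(recordings)
print(classify(np.array([0.8, 0.2, 0.1]), templates))  # -> "yes"
```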

Brain to control computer

Carnegie Mellon University’s mind reading experiments show that a thought is not just made up of the activity that individual words such as “yes” and “no” cause in the brain’s language centres. Thoughts consist of more complex mental concepts, which are linked with the words.

Based on the experiment, the scientists identified 42 building blocks covering all 240 sentences. The 42 Neurally Plausible Semantic Features – NPSFs – are divided into four main groups: people, places, emotions, and actions. The sentence “The witness shouted during the trial” activates nine NPSFs, with “the witness” accounting for four of the mental building blocks: social norms, knowledge, person, and communication.
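A rough illustration of the idea of representing a sentence by its mental building blocks. The article names only four of this sentence's nine features, so the dictionary layout, the binary encoding, and everything beyond those four names are hypothetical stand-ins:

```python
# Illustrative only: the real inventory has 42 NPSFs in four main groups
# (people, places, emotions, actions); only the four features named in
# the article are listed here.
ALL_NPSFS = [
    "social norms", "knowledge", "person", "communication",
    # ...38 more features, not listed in the article
]

SENTENCE_FEATURES = {
    "The witness shouted during the trial.": {
        "social norms", "knowledge", "person", "communication",
        # plus five more features for "shouted" and "the trial"
    },
}

def to_feature_vector(sentence):
    """Encode a sentence as a 0/1 vector marking its active building blocks."""
    active = SENTENCE_FEATURES.get(sentence, set())
    return [int(feature in active) for feature in ALL_NPSFS]

print(to_feature_vector("The witness shouted during the trial."))  # [1, 1, 1, 1]
```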

The experiment also showed that the sentences triggered the same brain activity in all test subjects, i.e. the mind reading model is universal – and that is useful for technology companies such as Intel. In 2009, the company began to develop computer chips that rest on the same principle as the scientists’ mind reading experiment and can control computers and smartphones by the power of thought. The computer chip is to be implanted in the user’s brain to function as a sensor, registering brain activity and converting the thoughts into control signals. If the user thinks “delete document” or “call mother”, the computer or smartphone carries out the task. Other companies aim to make the technology control small private aircraft and cars.

In spite of the major breakthrough, the Carnegie Mellon University researchers are not yet satisfied with the mind reading technology, so they are still developing the AI. The next step is to learn how to decode brain activity connected with abstract concepts such as skateboarding or geology, but chief researcher Marcel Adam Just hopes that in the long term, the technology can result in a complete mapping of what all knowledge looks like in the brain.

MARCEL JUST/CARNEGIE MELLON UNIVERSITY

Marcel Just (left) heads a team of brain researchers who have developed a new, accurate mind reading technology.
