‘THE AGE OF EMOTIONAL MACHINES IS COMING’

Robots that anticipate our desires, apps that read our feelings, wristbands to help autistic children communicate: ‘affective computing’ is about to change our lives. By Madhumita Murgia. Illustration by Pierluigi Longo


In a quiet breakfast cafe, on a sunny October morning in Boston, I am watching a gang of five animated emotions control the thoughts of a little girl called Riley. On an iPad screen, the green character called Disgust gears into action, making Riley overturn her plate of broccoli in a fit of revulsion, and I gasp. When Riley’s father tries to pacify her by pretending her spoon is an aeroplane, I giggle. All the while, the iPad is reading my emotions.

‘Emotional engagement: HIGH’, the screen reads, once the 30-second clip of Pixar’s film Inside Out has ended. On a scale of one to 100, I mostly registered high levels of enjoyment, according to the iPad. During the bit where broccoli goes flying everywhere, my surprise levels go through the roof, mixed in with a little bit of dislike.

‘I didn’t see your face register any dislike; that must be a mistake,’ says my companion, the inventor of the emotion-reading app.

‘I don’t like broccoli, so I may have grimaced,’ I say, surprised that the app could pick up my micro-expressions.

‘Aha!’ she says, pleased. ‘That’s what it’s really looking for.’

Showing off her invention, which has been 10 years in the making, is Rana el Kaliouby, an Egyptian-born computer scientist. El Kaliouby studied human-computer interaction in Cairo in 1993, before it became fashionable to analyse our relationships with our devices. ‘We used to talk about social robots that could respond to your emotions and it all seemed so far out. Computer cameras were massive webcams. But it only took about 10 years for it all to become real,’ she says.

The emotion-sensing app was built by her start-up Affectiva, which was spun out of the Massachusetts Institute of Technology’s (MIT) maverick Media Lab – a place where designers, computer scientists, artists, architects and neuroscientists pool ideas. Its ‘anti-disciplinary’ collaborations have led to products that belong firmly in the future – from foldable cars to social robots – and resulted in much-loved spin-offs such as Guitar Hero and the Kindle.

The idea behind Affectiva was to create a computer that could recognise a range of subtle human emotions, based on facial expressions. The company’s work is part of a now-growing field of research known as ‘affective computing’, the scientific effort to give electronic devices emotional intelligence so that they can respond to our stubbornly human feelings and make our lives better.

Currently the big hype in computer science is around artificial intelligence – imbuing computers with the ability to learn from data and make rational decisions in areas such as financial trading or healthcare. From September to December 2014, just nine AI companies raised $201.6 million from Silicon Valley investors who all want in on the gold rush. But scientists like El Kaliouby think emotion-sensing is as important for a machine’s intelligence as data-driven rationality. ‘It’s not just about human-computer interaction. I realised that by making machines have emotional intelligence, our own communication could become better,’ she says.

Today the idea has started to take root in the public imagination. Another Media Lab roboticist, Cynthia Breazeal, has built Jibo, a Disney-cartoon-like family robot that can perform simple tasks such as reading a story to a child at bedtime or giving voice reminders from a to-do list. It recognises faces and can have simple conversations, and its emotions are powered by Affectiva software.

There is also Pepper, the Japanese robot companion that can tell apart feelings such as joy, sadness and anger, and respond accordingly – by playing you a song, for instance. Even Microsoft released a public tool earlier this year that could reveal a person’s emotions based only on their photos.

Scientists all over the world, including physiologists, neurologists and psychologists, have joined forces with engineers to find measurable indicators of human emotion that they can teach computers to look out for. Projects have attempted to decode facial expressions, biometric data such as heart rate or electrodermal activity on the skin, the pitch and timbre of our voices, and even our body language and muscle movements.

The unexpected source of this rich new field of invention is Rosalind Picard, a petite 53-year-old computer scientist at MIT. Picard, who calls herself ‘the chief troublemaker’, coined the term ‘affective computing’. The field now has its own academic journal and groups devoted to its study around the world.

‘Have you seen that Facebook has released new “empathy” buttons? I think that’s really smart,’ Picard says, referring to the recent announcement that the social-media company would add a range of emojis such as ‘yay’, ‘sad’ and ‘angry’ to sit alongside its iconic ‘like’ button.

Describing her early-1990s self, Picard says she was a young, blonde woman trying to make it in the male-dominated world of electrical engineering. She was trying to give computers better perception by helping them process visual and auditory cues. One day she stumbled across the role of emotion in human intelligence while reading The Man Who Tasted Shapes, neurologist Richard Cytowic’s book about synaesthesia – the condition whereby people’s senses are crossed, so they can taste shapes or see letters as colours. When Picard dug into it, she found emotion was one of the key ingredients of intelligent perception – it tells humans what to pay attention to and what to ignore. But she was determined never to study feelings – they were too irrational and ‘girly’. ‘How to sabotage your career in one easy step? Start working on emotion!’ she says, laughing. ‘I was afraid people wouldn’t take me seriously.’

But in her quest to build an artificially intelligent computer, the scientist, now a professor at MIT, kept running across emotions. ‘I became convinced you couldn’t build a truly intelligent computer without having emotional capabilities like humans do,’ she says.

Once Picard had decided to found her lab on this principle, she began to measure heart fluctuations, skin conductance, muscle tension, pupil dilation and facial muscles in order to figure out which changes in our body consistently relate to emotions. ‘We started wiring ourselves up with all these electrodes, pretty hideous-looking, then taking all our data and crunching it,’ she recalls.

But it was worth it. ‘Lo and behold, we found that within a person over a long period of time there were consistent patterns that related to several emotions,’ she says. ‘We could teach their wearable computer to recognise those patterns.’ In other words, a computer with a camera could start to learn how to take lots of different data points from your face, and map them to a smile or a frown.

This was the first step towards the product built by Affectiva. Co-founded by Picard and El Kaliouby, who was a researcher in her lab, Affectiva is one of the most successful companies in facial-expression analysis – it is backed by $20 million and has customers ranging from the BBC to Disney. Picard has since left to work on a new emotional computer that focuses on medical conditions such as autism and epilepsy, while El Kaliouby has taken over the reins as Affectiva’s chief scientific officer.


El Kaliouby has that rare quality of putting you at ease instantly. Her face is open and warm, with a dazzling smile, and she is happy to share details of her private life within minutes of meeting: she had a long-distance marriage for many years, and is currently divorced with two kids – one plays the harp and the other is at tae kwon do. She checks that I’ve eaten breakfast. Her emotion-reading software would probably rate her emotional intelligence ‘high’. ‘I was always particularly interested in the face, because I’m very expressive and Egyptians in general are very expressive people,’ she explains.

The software she uses now is called Affdex, an evolved version of what Picard and she had been building for years. When the software scans my face, it covers my image with a sprinkling of green dots. It has never seen me before, but it traces my eyebrows, lips, nose and eyes instantly. Based on a database of 3.4 million unique facial expressions sourced from 75 countries, it can pick up micro-expressions. ‘We have 45 different facial muscles, and when they contract they convert to facial movements, and that’s what the algorithm is really looking for,’ El Kaliouby explains to me, while I amuse myself by alternately smiling and glowering, the software recording my expressions as spikes on an emotions bar chart.

‘When you furrow your eyebrows, it’s looking for little wrinkles. When you smile, it’s looking for whether the shape of your mouth has changed, and whether your teeth are showing,’ she says.
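The mechanics she is describing – trace the facial landmarks, then turn their geometry into expression scores – can be sketched with open-source tools. Affdex’s own pipeline is proprietary; the toy version below instead uses dlib’s freely available 68-point face-landmark model, and the two scores and their geometry are simplifying assumptions, not Affectiva’s method.

```python
# A minimal sketch of landmark-based expression scoring, in the spirit of
# what El Kaliouby describes. Not Affdex: this uses dlib's open 68-point
# landmark model, and the geometric 'scores' are invented simplifications.
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def expression_scores(gray_image):
    """Return crude smile and brow-furrow scores for the first face found."""
    faces = detector(gray_image)
    if not faces:
        return None
    landmarks = predictor(gray_image, faces[0])
    p = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(68)]

    face_width = p[16][0] - p[0][0]        # jaw corner to jaw corner
    mouth_width = p[54][0] - p[48][0]      # mouth corner to mouth corner
    inner_brow_gap = p[22][0] - p[21][0]   # distance between inner brow ends

    return {
        # a mouth stretched wide relative to the face suggests a smile
        "smile": mouth_width / face_width,
        # inner brows pulled together suggest a furrow
        "brow_furrow": 1.0 - inner_brow_gap / face_width,
    }
```

Run once per video frame, scores like these would rise and fall much as the bar chart on El Kaliouby’s iPad does.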

When El Kaliouby was building Affdex at the MIT Media Lab, she would constantly get emails from lab sponsors – who included Google, Samsung, Toyota and Unilever – asking when they could test it out. So when Affectiva was created, it began to address the commercial market in earnest.

Currently it is used by television producers such as CBS and the BBC to test audience reactions to new shows, companies like Sony to assess movie trailers, and ad agencies such as Millward Brown to trial advertisements for Fortune 500 clients including Coca-Cola and Intel. Tens of thousands of volunteers are recruited to watch clips via webcam, and their emotional responses are aggregated to pick out the overall trends: was that joke funny? Who is the best-loved character?
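The aggregation step itself is conceptually simple. A rough sketch follows, assuming an invented file of per-viewer, per-second scores; Affectiva’s real data schema is not public.

```python
# Hypothetical sketch of aggregating per-viewer webcam scores into
# moment-by-moment audience trends. The file and column names are invented.
import pandas as pd

# one row per viewer per second: viewer_id, second, joy, surprise, dislike
scores = pd.read_csv("viewer_scores.csv")

# average each emotion across all viewers for every second of the clip
trend = scores.groupby("second")[["joy", "surprise", "dislike"]].mean()

# seconds where mean joy spikes well above its norm are candidate
# 'the joke landed' moments for the producers to inspect
spikes = trend[trend["joy"] > trend["joy"].mean() + 2 * trend["joy"].std()]
print(spikes)
```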

‘One particular sitcom for CBS had six characters who were all supposed to be funny. But there was this one couple who, every time they showed up, annoyed people; they were just not funny,’ El Kaliouby says. ‘CBS ended up swapping the characters out.’

With its huge database, Affdex has its finger on the pulse of universal human emotion. It has found that women are more expressive in general than men – their smile intensities are larger, and they also smile for longer. And older people are more expressive than the young. Smiles also vary by culture. ‘In the US, women smile 40 per cent more than men, in France and Germany it’s 25 per cent more, and in the UK there is no difference. We don’t know why!’ El Kaliouby laughs.

Affectiva is now focusing on avenues beyond advertising and television. It has been working with a ‘very large Japanese car company’ (Toyota used to sponsor El Kaliouby’s lab at MIT) on building an in-car emotion sensor that knows when you’re drowsy or distracted, and can take action in an emergency situation by calling 999 or alerting a friend or family member.

Her software also powers a new live-streaming app called Chubble – similar to the Twitter-owned Periscope, but with an emotional component. It allows you to stream, say, a live concert to a bunch of friends, but they don’t have to actually appear on video. Their emotions are conveyed back to you via little real-time emotional avatars.

While Affectiva has been focused on commercial applications, Picard decided to go back to the area that most fascinated her: emotion-sensing wearables for healthcare. In the early days of her research one of Picard’s neighbours was asking about her work, and she explained it to him as ‘teaching computers to recognise facial expressions, to try and understand emotion’. He asked, ‘Could you help my brother? He has autism and he has the same difficulties.’ The more Picard read about autism, the more she began to realise that an emotion-decoder could help autistic people interact better with others.

Meanwhile, El Kaliouby was still finishing her PhD at the University of Cambridge, where she, too, had come across the strange parallels between people with autism and computers. She began building a system, which she called Mind Reader, that could recognise emotions and act as an emotional crutch for people with autism by giving them feedback. When she joined Picard’s lab, they put the software into a pair of glasses with a little in-built camera. ‘It looked a lot like Google Glass, which came much later,’ El Kaliouby laughs.

The glasses were tested on children with varying degrees of autism, ranging from highly functional to non-verbal, at the Groden Center in Rhode Island. The glasses worked by looking at the face of the person you were speaking to, and mapping their emotions to a little LED bulb. ‘The light would glow green if the person was interested or agreeing with you, yellow would mean slow down or repeat, and red would mean they were confused or looking away,’ Picard explains.
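The feedback logic amounts to a traffic-light mapping from the listener’s inferred state to a colour. A toy sketch – the state labels here are invented stand-ins for whatever the Mind Reader software actually outputs; only the colour scheme comes from Picard’s description:

```python
# Toy version of the glasses' LED feedback as Picard describes it.
# The state labels are hypothetical; the colours match the article.
def led_colour(state: str) -> str:
    if state in ("interested", "agreeing"):
        return "green"    # keep going: your listener is engaged
    if state in ("bored", "distracted"):
        return "yellow"   # slow down or repeat yourself
    return "red"          # confused or looking away
```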

In her colourful, busy office at the MIT Media Lab, corners are stuffed with weird and wonderful objects such as a Teenage Mutant Ninja Turtle wrist-cuff that picks up stress levels and a beautiful abstract painting made for her by a non-verbal autistic girl with whom she had tested some devices. A large TV screen, labelled MIT Mood Meter, picks up the expressions of anyone who stops and looks up at it. I smile tentatively at it, and instantly a yellow smiley face is superimposed on mine.

Picard’s group has designed a range of wearable devices to pick up emotions (some, precursors to the new wave of wearables such as the Jawbone, Fitbit or Apple Watch) – wristbands and cuffs for daily use that can track biometric data such as your pulse or electrodermal activity and motion. The newest device is known as the E4, designed in collaboration with an Italian start-up called Empatica that is focused on medical-grade wearables. The $1,690 device, which has recently gone on sale to the public, has already been used to study stress, autism, epilepsy, PTSD and depression in clinical studies with Nasa, Intel, Microsoft and MIT among others.

As I wrap it round my wrist tightly, it buzzes when it connects to an app on Picard’s iPhone and starts streaming my biometric data: my temperature, blood-volume pulse, plus electrodermal activity that could indicate stress.

One of the primary uses of the E4 is to predict dangerous epileptic seizures at home. ‘It was a complete accidental finding,’ Picard says. Over Christmas in 2011, one of her undergraduate students took two autism wristbands home for his little brother, who couldn’t speak. He wanted to know what was stressing him out. ‘Over the holiday I was looking at his data on my screen and every day looked normal, but suddenly one of the wristbands went through the roof and the other didn’t respond at all. I thought, that’s too high, it must be broken. So I gave up and called my student,’ Picard says. It turned out the wristband had spiked right before the little brother had a grand-mal seizure.

Picard followed this up by performing a large-scale clinical study on children with grand-mal seizures and found that the wristbands ‘had a whopper of a response’. She discovered that the sensor doesn’t actually predict the seizure but can warn in advance if it becomes life-threatening. ‘Our sensor is still peaking after the seizure has ended, and that’s a danger sign you shouldn’t leave the person alone,’ she says.
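In code, that danger sign is a simple rule. The sketch below is an illustration of the logic Picard describes, not Empatica’s actual algorithm; the threshold factor is an invented placeholder, not a clinical value.

```python
# Illustrative only: flag electrodermal activity (EDA) that is still
# elevated after a seizure has ended - the danger sign Picard describes.
def eda_alert(eda_samples, seizure_end_index, resting_baseline, factor=3.0):
    """Return True if EDA is still far above baseline after the seizure.

    eda_samples: skin-conductance readings, oldest first
    seizure_end_index: index of the sample where the seizure ended
    resting_baseline: this wearer's typical resting skin conductance
    """
    post_seizure = eda_samples[seizure_end_index:]
    if not post_seizure:
        return False
    # 'still peaking after the seizure has ended' -> don't leave them alone
    return max(post_seizure) > factor * resting_baseline
```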

Here in the UK, affective computing has infiltrated the labs of researchers who are developing a range of unique emotion-sensing devices spanning areas from pain to depression. For Nadia Berthouze, an Italian-born computer scientist at University College London (UCL), her chosen area is pain. This is a notoriously difficult feeling to measure – only the person experiencing it really knows how bad it is. ‘Chronic pain can change in seconds, minutes, and there is no way to measure it except through questionnaires that ask you to rank it on a scale of one to 10,’ Berthouze tells me in her lab at UCL.

Her aim is to create sensors that can read their users’ levels of pain and use that information to tailor a therapy. ‘Our work here focuses on recognising the pain emotion using body movements and muscle activity.’ With a motion-capture system, similar to the Xbox’s Kinect, Berthouze and her students can recreate an animated version of a patient’s movements: standing upright, reaching forwards, bending to touch the ground and straightening up again. They also use two sensors to measure muscle activity in the back and neck.

By comparing these models to those of healthy people’s movements, Berthouze can create computer algorithms to differentiate levels of pain. ‘Ultimately we want to develop a low-cost wearable system that could be embedded in trousers, in shoes or a jacket to monitor pain levels, and help people feel better by recommending physiotherapy exercises,’ she says.
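In machine-learning terms this is a supervised classification problem: movement and muscle-activity features in, a pain level out. A hedged sketch – the feature names, numbers and labels below are invented for illustration, not Berthouze’s data:

```python
# A toy pain-level classifier in the spirit of Berthouze's approach.
# Features and training rows are invented; a real study would use
# motion-capture and EMG recordings from many patients and controls.
from sklearn.ensemble import RandomForestClassifier

# each row: [trunk_flexion_deg, reach_speed, back_emg_rms, neck_emg_rms]
X_train = [
    [35.0, 0.9, 0.12, 0.08],  # free, fluid movement: low pain
    [28.0, 0.7, 0.18, 0.11],  # slightly guarded: moderate pain
    [18.0, 0.4, 0.31, 0.22],  # stiff, protective movement: high pain
]
y_train = ["low", "moderate", "high"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# classify a newly recorded movement from a patient
print(model.predict([[22.0, 0.5, 0.27, 0.19]]))  # e.g. ['high']
```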

Affective scientists such as El Kaliouby, Picard and Berthouze all agree that emotionally intelligent devices will soon become a part of our daily lives. Already, wearables such as the Apple Watch can do rudimentary ‘emotion’ measurements of your heart rate. And examples of emotionally aware devices are popping up in unexpected places. ‘Even my toothbrush actually smiles at me if I brush for two minutes,’ Picard says, laughing. ‘I know it’s just a little algorithm with a timer, but I still think, I can brush another 15 seconds to get that smile!’

Next, your smartphone could come with a little emotion chip, just like the GPS chip that provides a real-time location service. It might tell you to avoid scheduling an important meeting when you seem tired, or suggest taking a break when your attention wanders. At home, your emotion-sensing refrigerator could tell you to resist the ice cream today, based on your stress levels, or your car could warn you to drive slowly this morning because you seem upset.

‘We are going to see an explosion of richness in this area,’ Picard says. ‘The age of the emotional machines – it’s coming.’

Photographs by Rania Matar

Above and below: the affective-computing pioneer Professor Rosalind Picard in her laboratory at the Massachusetts Institute of Technology.
