‘THE AGE OF EMOTIONAL MACHINES IS COMING’
Robots that anticipate our desires, apps that read our feelings, wristbands to help autistic children communicate: ‘affective computing’ is about to change our lives. By Madhumita Murgia. Illustration by Pierluigi Longo
In a quiet breakfast cafe, on a sunny October morning in Boston, I am watching a gang of five animated emotions control the thoughts of a little girl called Riley. On an iPad screen, the green character called Disgust springs into action, making Riley overturn her plate of broccoli in a fit of revulsion, and I gasp. When Riley’s father tries to pacify her by pretending her spoon is an aeroplane, I giggle. All the while, the iPad is reading my emotions.
‘Emotional engagement: HIGH’, the screen reads, once the 30-second clip of Pixar’s film Inside Out has ended. On a scale of one to 100, I mostly registered high levels of enjoyment, according to the iPad. During the bit where broccoli goes flying everywhere, my surprise levels go through the roof, mixed in with a little bit of dislike.
‘I didn’t see your face register any dislike; that must be a mistake,’ says my companion, the inventor of the emotion-reading app.
‘I don’t like broccoli, so I may have grimaced,’ I say, surprised that the app could pick up my micro-expressions.
‘Aha!’ she says, pleased. ‘That’s what it’s really looking for.’
Showing off her invention, which has been 10 years in the making, is Rana el Kaliouby, an Egyptian-born computer scientist. El Kaliouby studied human-computer interaction in Cairo in 1993, before it became fashionable to analyse our relationships with our devices. ‘We used to talk about social robots that could respond to your emotions and it all seemed so far out. Computer cameras were massive webcams. But it only took about 10 years for it all to become real,’ she says.
The emotion-sensing app was built by her start-up Affectiva, which was spun out of the Massachusetts Institute of Technology’s (MIT) maverick Media Lab – a place where designers, computer scientists, artists, architects and neuroscientists pool ideas. Its ‘anti-disciplinary’ collaborations have led to products that belong firmly in the future – from foldable cars to social robots – and resulted in much-loved spin-offs such as Guitar Hero and the Kindle.
The idea behind Affectiva was to create a computer that could recognise a range of subtle human emotions, based on facial expressions. The company’s work is part of a now-growing field of research known as ‘affective computing’, the scientific effort to give electronic devices emotional intelligence so that they can respond to our stubbornly human feelings and make our lives better.
Currently the big hype in computer science is around artificial intelligence – imbuing computers with the ability to learn from data and make rational decisions in areas such as financial trading or healthcare. From September to December 2014, just nine AI companies raised $201.6 million from Silicon Valley investors who all want in on the gold rush. But scientists like El Kaliouby think emotion-sensing is as important for a machine’s intelligence as data-driven rationality. ‘It’s not just about human-computer interaction. I realised that by making machines have emotional intelligence, our own communication could become better,’ she says.
Today the idea has started to take root in the public imagination. Another Media Lab roboticist, Cynthia Breazeal, has built Jibo, a Disney cartoon-like family robot that can perform simple tasks such as reading a story to a child at bedtime or giving voice reminders from a to-do list. It recognises faces and can have simple conversations, and its emotions are powered by Affectiva software.
There is also Pepper, the Japanese robot companion that can tell apart feelings such as joy, sadness and anger, and respond accordingly – by playing you a song, for instance. Even Microsoft released a public tool earlier this year that could reveal a person’s emotions based only on their photos.
Scientists all over the world, including physiologists, neurologists and psychologists, have joined forces with engineers to find measurable indicators of human emotion that they can teach computers to look out for. Projects have attempted to decode facial expressions, biometric data such as heart rate or electrodermal activity on the skin, the pitch and timbre of our voices, and even our body language and muscle movements.
The unexpected source of this rich new field of invention is Rosalind Picard, a petite 53-year-old computer scientist at MIT. Picard, who calls herself ‘the chief troublemaker’, coined the term ‘affective computing’. The field now has its own academic journal and groups devoted to its study around the world.
‘Have you seen that Facebook has released new “empathy” buttons? I think that’s really smart,’ Picard says, referring to the recent announcement that the social-media company would add a range of emojis such as ‘yay’, ‘sad’ and ‘angry’ to sit alongside its iconic ‘like’ button.
Describing her early-1990s self, Picard says she was a young, blonde woman trying to make it in the male-dominated world of electrical engineering. She was trying to give computers better perception by helping them process visual and auditory cues. One day she stumbled across the role of emotion in human intelligence while reading The Man Who Tasted Shapes, neurologist Richard Cytowic’s book about synaesthesia – the condition whereby people’s senses are crossed, so they can taste shapes or see letters as colours. When Picard dug into it, she found emotion was one of the key ingredients of intelligent perception – it tells humans what to pay attention to and what to ignore. But she was determined never to study feelings – they were too irrational and ‘girly’. ‘How to sabotage your career in one easy step? Start working on emotion!’ she says, laughing. ‘I was afraid people wouldn’t take me seriously.’
But in her quest to build an artificially intelligent computer, the scientist, now a professor at MIT, kept running across emotions. ‘I became convinced you couldn’t build a truly intelligent computer without having emotional capabilities like humans do,’ she says.
Once Picard had decided to found her lab on this principle, she began to measure heart fluctuations, skin conductance, muscle tension, pupil dilation and facial muscles in order to figure out which changes in our body consistently relate to emotions. ‘We started wiring ourselves up with all these electrodes, pretty hideous-looking, then taking all our data and crunching it,’ she recalls.
But it was worth it. ‘Lo and behold, we found that within a person over a long period of time there were consistent patterns that related to several emotions,’ she says. ‘We could teach their wearable computer to recognise those patterns.’ In other words, a computer with a camera could start to learn how to take lots of different data points from your face and map them to a smile or a frown.
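To make that mapping concrete, here is a minimal sketch in Python of the kind of supervised learning involved, assuming the facial measurements have already been extracted. The feature names, numbers and labels are invented for illustration; Affectiva’s real system learns from millions of labelled video frames.

```python
# Minimal sketch of mapping facial measurements to expression labels.
# This is NOT Affectiva's pipeline: the features and values below
# are invented for illustration only.
from sklearn.ensemble import RandomForestClassifier

# Each row: [mouth_corner_lift, brow_furrow_depth, eye_openness]
features = [
    [0.9, 0.1, 0.8],   # broad smile, relaxed brow
    [0.8, 0.0, 0.7],
    [0.1, 0.9, 0.5],   # furrowed brow, flat mouth
    [0.0, 0.8, 0.4],
]
labels = ["smile", "smile", "frown", "frown"]

# Learn the consistent patterns Picard describes.
model = RandomForestClassifier(random_state=0).fit(features, labels)

# A new face, measured by the camera, gets mapped to an expression.
print(model.predict([[0.85, 0.05, 0.75]]))  # -> ['smile']
```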
This was the first step towards the product built by Affectiva. Co-founded by Picard and El Kaliouby, who was a researcher in her lab, Affectiva is one of the most successful companies in facial-expression analysis – it is backed by $20 million and has customers ranging from the BBC to Disney. Picard has since left to work on a new emotional computer that focuses on medical conditions such as autism and epilepsy, while El Kaliouby has taken over the reins as Affectiva’s chief scientific officer.
El Kaliouby has that rare quality of putting you at ease instantly. Her face is open and warm, with a dazzling smile, and she is happy to share details of her private life within minutes of meeting: she had a long-distance marriage for many years, and is currently divorced with two kids – one plays the harp and the other is at tae kwon do. She checks that I’ve eaten breakfast. Her emotion-reading software would probably rate her emotional intelligence ‘high’. ‘I was always particularly interested in the face, because I’m very expressive and Egyptians in general are very expressive people,’ she explains.
The software she uses now is called Affdex, an evolved version of what she and Picard had been building for years. When the software scans my face, it covers my image with a sprinkling of green dots. It has never seen me before, but it traces my eyebrows, lips, nose and eyes instantly. Based on a database of 3.4 million unique facial expressions sourced from 75 countries, it can pick up micro-expressions. ‘We have 45 different facial muscles, and when they contract they convert to facial movements, and that’s what the algorithm is really looking for,’ El Kaliouby explains to me, while I amuse myself by alternately smiling and glowering, the software recording my expressions as spikes on an emotions bar chart.
‘When you furrow your eyebrows, it’s looking for little wrinkles. When you smile, it’s looking for whether the shape of your mouth has changed, and whether your teeth are showing,’ she says.
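Once a face tracker has placed its landmark points (the ‘green dots’), those two cues reduce to simple geometry. The toy sketch below assumes named (x, y) landmarks; the names, coordinates and thresholds are all illustrative, not Affdex’s real values.

```python
# Toy sketch of reading a smile or brow furrow from landmark
# geometry. Landmark names and thresholds are invented.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def detect_expression(landmarks):
    """landmarks: dict of named (x, y) points in image coordinates."""
    # A smile widens the mouth relative to the face.
    mouth_width = dist(landmarks["mouth_left"], landmarks["mouth_right"])
    face_width = dist(landmarks["jaw_left"], landmarks["jaw_right"])
    # A furrow pulls the inner eyebrows together, making wrinkles.
    brow_gap = dist(landmarks["brow_inner_left"], landmarks["brow_inner_right"])

    if mouth_width / face_width > 0.45:
        return "smile"
    if brow_gap / face_width < 0.12:
        return "brow furrow"
    return "neutral"

example = {
    "mouth_left": (120, 210), "mouth_right": (190, 210),
    "jaw_left": (90, 160), "jaw_right": (220, 160),
    "brow_inner_left": (140, 90), "brow_inner_right": (170, 90),
}
print(detect_expression(example))  # -> 'smile'
```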
When El Kaliouby was building Affdex at the MIT Media Lab, she would constantly get emails from lab sponsors – who included Google, Samsung, Toyota and Unilever – asking when they could test it out. So when Affectiva was created, it began to address the commercial market in earnest.
Currently it is used by television producers such as CBS and the BBC to test audience reactions to new shows, companies like Sony to assess movie trailers, and ad agencies such as Millward Brown to trial advertisements for Fortune 500 clients including Coca-Cola and Intel. Tens of thousands of volunteers are recruited to watch clips via webcam, and their emotional responses are aggregated to pick out the overall trends: was that joke funny? Who is the best-loved character?
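The aggregation step is easy to picture: average every viewer’s second-by-second emotion scores and look for the peaks. A toy version, with invented numbers standing in for thousands of real webcam traces:

```python
# Toy sketch of aggregating viewers' emotion traces to find the
# moments an audience found funniest. All data is invented.
import statistics

# Joy score (0-100) per second of a clip, one list per viewer.
viewers = [
    [5, 10, 60, 80, 20],
    [0, 15, 70, 90, 10],
    [10, 5, 55, 75, 25],
]

# Average across viewers at each second to get the overall trend.
trend = [statistics.mean(scores) for scores in zip(*viewers)]
funniest_second = trend.index(max(trend))
print(f"Biggest laugh at second {funniest_second}")  # -> second 3
```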
‘One particular sitcom for CBS had six characters who were all supposed to be funny. But there was this one couple who, every time they showed up, annoyed people; they were just not funny,’ El Kaliouby says. ‘CBS ended up swapping the characters out.’
With its huge database, Affdex has its finger on the pulse of universal human emotion. It has found that women are more expressive in general than men – their smile intensities are larger, and they also smile for longer. And older people are more expressive than the young. Smiles also vary by culture. ‘In the US, women smile 40 per cent more than men, in France and Germany it’s 25 per cent more, and in the UK there is no difference. We don’t know why!’ El Kaliouby laughs.
Affectiva is now focusing on avenues beyond advertising and television. It has been working with a ‘very large Japanese car company’ (Toyota used to sponsor El Kaliouby’s lab at MIT) on building an in-car emotion sensor that knows when you’re drowsy or distracted, and can take action in an emergency by calling 999 or alerting a friend or family member.
Her software also powers a new live-streaming app called Chubble – similar to Twitter-owned Periscope, but with an emotional component. It allows you to stream, say, a live concert to a group of friends; they never appear on video themselves, but their reactions are conveyed back to you via little real-time emotional avatars.
While Affectiva has been focused on commercial applications, Picard decided to go back to the area that most fascinated her: emotion-sensing wearables for healthcare. In the early days of her research one of Picard’s neighbours was asking about her work, and she explained it to him as ‘teaching computers to recognise facial expressions, to try and understand emotion’. He asked, ‘Could you help my brother? He has autism and he has the same difficulties.’ The more Picard read about autism, the more she began to realise that an emotion-decoder could help autistic people interact better with others.
Meanwhile, El Kaliouby was still finishing her PhD at the University of Cambridge, where she, too, had come across the strange parallels between people with autism and computers. She began building a system, which she called Mind Reader, that could recognise emotions and act as an emotional crutch for people with autism by giving them feedback. When she joined Picard’s lab, they put the software into a pair of glasses with a little in-built camera. ‘It looked a lot like Google Glass, which came much later,’ El Kaliouby laughs.
The glasses were tested on children with varying degrees of autism, ranging from highly functional to non-verbal, at the Groden Center in Rhode Island. The glasses worked by looking at the face of the person you were speaking to, and mapping their emotions to a little LED bulb. ‘The light would glow green if the person was interested or agreeing with you, yellow would mean slow down or repeat, and red would mean they were confused or looking away,’ Picard explains.
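The feedback logic Picard describes boils down to a small lookup from the recognised state to a traffic-light colour. A toy sketch, using the article’s three categories and assuming some classifier supplies the listener’s state:

```python
# Toy sketch of the glasses' feedback loop: recognised listener
# state in, LED colour out. Category names follow the article;
# the classifier producing the state is assumed, not shown.
LED_FOR_STATE = {
    "interested": "green",   # interested or agreeing with you
    "confused": "red",       # confused or looking away
}

def led_colour(listener_state):
    # Anything else means 'slow down or repeat'.
    return LED_FOR_STATE.get(listener_state, "yellow")

for state in ("interested", "distracted", "confused"):
    print(state, "->", led_colour(state))
```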
In her colourful, busy office in the MIT Media Lab, corners are stuffed with weird and wonderful objects such as a Teenage Mutant Ninja Turtle wrist-cuff that picks up stress levels and a beautiful abstract painting made for her by a non-verbal autistic girl with whom she had tested some devices. A large TV screen, labelled MIT Mood Meter, picks up the expressions of anyone who stops and looks up at it. I smile tentatively at it, and instantly a yellow smiley face is superimposed on mine.
Picard’s group has designed a range of wearable devices to pick up emotions (some, precursors to the new wave of wearables such as the Jawbone, Fitbit or Apple Watch) – wristbands and cuffs for daily use that can track biometric data such as your pulse, electrodermal activity and motion. The newest device is known as the E4, designed in collaboration with an Italian start-up called Empatica that is focused on medical-grade wearables. The $1,690 device, which has recently gone on sale to the public, has already been used to study stress, autism, epilepsy, PTSD and depression in clinical studies with Nasa, Intel, Microsoft and MIT, among others.
As I wrap it tightly round my wrist, it buzzes, connects to an app on Picard’s iPhone and starts streaming my biometric data: my temperature, blood-volume pulse, plus electrodermal activity that could indicate stress.
One of the primary uses of the E4 is to predict dangerous epileptic seizures at home. ‘It was a completely accidental finding,’ Picard says. Over Christmas in 2011, one of her undergraduate students took two autism wristbands home for his little brother, who couldn’t speak. He wanted to know what was stressing him out. ‘Over the holiday I was looking at his data on my screen and every day looked normal, but suddenly one of the wristbands went through the roof and the other didn’t respond at all. I thought, that’s too high, it must be broken. So I gave up and called my student,’ Picard says. It turned out the wristband had spiked right before the little brother had a grand-mal seizure.
Picard followed this up by performing a large-scale clinical study on children with grand-mal seizures and found that the wristbands ‘had a whopper of a response’. She discovered that the sensor doesn’t actually predict the seizure but can warn in advance if it becomes life-threatening. ‘Our sensor is still peaking after the seizure has ended, and that’s a danger sign you shouldn’t leave the person alone,’ she says.
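That danger sign translates naturally into a simple heuristic: check whether electrodermal activity (EDA) is still far above the wearer’s pre-seizure baseline once the seizure has ended. The sketch below is an assumption on my part, not Empatica’s algorithm; the threshold and readings are invented.

```python
# Illustrative heuristic only - not Empatica's detection algorithm.
# Flags when skin conductance stays well above baseline after a
# seizure ends, the danger sign Picard describes.
import statistics

def still_peaking(eda, seizure_start, seizure_end, sigma=3.0):
    baseline = eda[:seizure_start]   # quiet period before onset
    after = eda[seizure_end:]        # readings once the seizure ends
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return any(x > mean + sigma * sd for x in after)

# Skin-conductance readings (microsiemens); seizure spans indices 4-6.
trace = [0.4, 0.5, 0.4, 0.5, 2.0, 3.5, 3.0, 2.8, 2.9]
if still_peaking(trace, seizure_start=4, seizure_end=7):
    print("EDA still elevated - do not leave the person alone")
```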
Here in the UK, affective computing has infiltrated the labs of researchers who are developing a range of unique emotion-sensing devices spanning areas from pain to depression. Nadia Berthouze, an Italian-born computer scientist at University College London (UCL), has chosen pain – a notoriously difficult feeling to measure, because only the person experiencing it really knows how bad it is. ‘Chronic pain can change in seconds, minutes, and there is no way to measure it except through questionnaires that ask you to rank it on a scale of one to 10,’ Berthouze tells me in her lab at UCL.
Her aim is to create sensors that can read their users’ levels of pain and use that information to tailor a therapy. ‘Our work here focuses on recognising the pain emotion using body movements and muscle activity.’ With a motion-capture system, similar to Microsoft’s Kinect for the Xbox, Berthouze and her students can recreate an animated version of a patient’s movements: standing upright, reaching forwards, bending to touch the ground and straightening up again. They also use two sensors to measure muscle activity in the back and neck.
By comparing these models to those of healthy people’s movements, Berthouze can create computer algorithms to differentiate levels of pain. ‘Ultimately we want to develop a low-cost wearable system that could be embedded in trousers, in shoes or a jacket to monitor pain levels, and help people feel better by recommending physiotherapy exercises,’ she says.
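In outline, that means summarising each recorded movement as a few features and classifying it against labelled examples from pain-free and in-pain movers. A hedged sketch of that comparison, with invented feature names and values:

```python
# Hedged sketch of classifying pain level from movement features.
# Feature names and numbers are invented; the UCL system uses full
# motion capture plus back and neck muscle sensors.
from sklearn.neighbors import KNeighborsClassifier

# Each row: [bend range (degrees), movement speed, neck-muscle tension]
movements = [
    [85, 1.0, 0.2],   # full, fluid bend - healthy reference
    [80, 0.9, 0.3],
    [40, 0.4, 0.8],   # guarded, slow bend with tense muscles
    [35, 0.3, 0.9],
]
pain_levels = ["low", "low", "high", "high"]

model = KNeighborsClassifier(n_neighbors=1).fit(movements, pain_levels)

# Classify a new patient's recorded bend.
print(model.predict([[45, 0.5, 0.7]]))  # -> ['high']
```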
Affective scientists such as El Kaliouby, Picard and Berthouze all agree that emotionally intelligent devices will soon become a part of our daily lives. Already, wearables such as the Apple Watch can do rudimentary ‘emotion’ measurements via your heart rate. And examples of emotionally aware devices are popping up in unexpected places. ‘Even my toothbrush actually smiles at me if I brush for two minutes,’ Picard says, laughing. ‘I know it’s just a little algorithm with a timer, but I still think, I can brush another 15 seconds to get that smile!’
Next, your smartphone could come with a little emotion chip, just like the GPS chip that provides a real-time location service. It might tell you to avoid scheduling an important meeting when you seem tired, or suggest taking a break when your attention wanders. At home, your emotion-sensing refrigerator could tell you to resist the ice cream today, based on your stress levels, or your car could warn you to drive slowly this morning because you seem upset.
‘We are going to see an explosion of richness in this area,’ Picard says. ‘The age of the emotional machines – it’s coming.’