Idealog

Empathy, not imitation


Elly Strang talks with Google Empathy Lab founder Danielle Krettek on why it's time for tech's EQ to match its IQ

The world’s biggest companies are on a quest to make their artificial intelligence assistants empathise with the human condition, from Google’s AI assistant, to Amazon's Alexa, to Microsoft’s Cortana. But the woman at the forefront of Google’s Empathy Lab, Danielle Krettek, doesn’t want to give AI a literal human face. She says the technology industry is chasing a false promise if it thinks creating ‘humanoids’ is the best way forward for humanity. Instead, her work is focused on researching the quirks that make us human and designing intuitive technology that complements us, rather than imitating us. On a recent trip to Vivid Sydney, Elly Strang had a chat to Krettek about bringing emotions to the fore of tech, whether machines will ever feel empathy, and more.

Listening to the way you work as Google’s Empathy Lab founder and the paths you go down research-wise, it sounds like your job is a dream job: you get curious about something, and then you get to go fully in depth and research it. Is that how it works – you follow what sparks your interest? I started the Empathy Lab three years ago, but I probably didn’t realise I started the Empathy Lab until a year and a half ago. I was like, ‘Wait, this is a thing that’s going well. I should probably call it something so it’s not just called Danielle’s stuff.’ Moments in your life can sneak up on you and I didn’t have a moment where I was like, ‘I founded the lab’. What actually happened was I was actually working on the most unsexy things ever – notifications and voice interaction. The thing I always did in my work was to look at the problem that needed to be inquired about.

Take notifications: they’re so unsexy, we kind of hate them. It’s great if it’s a best friend, but mostly you’re like ‘Ugh. I don’t need to know that it’s six degrees colder tomorrow.’ So notifications are a really tough space for information.

So I wanted to understand the anatomy of a notification, the nitty-gritty, crunchy details of people and how they feel about that. What I always do with a project is look at what is the creative angle: how can I make this interesting for people? Because I think the more creative it is, the more creative your designers are liberated to be, because you’re going to tap truth, you’re going to tap emotion and you’re going to tap a place in people where they’re like, ‘I didn’t think about it this way’.

So with the Airplane mode example [in an experiment, Google asked people to come up with other modes to set their phone in according to their mood besides Airplane mode. People came up with over 400 different modes, such as ‘professional mode’, ‘hungover mode’ and ‘gentle morning mode’. Krettek joked the latter two were essentially the same thing] people got to have fun with it and it gave so much truthful, authentic information about their lives. The piece you got out of it was about moods and rituals and how people emotionally travel through their days. The funny thing is the curious place isn’t always this field to pick flowers from. It’s not always as glamorous or interesting as the stories I tell at the end, but over time, I found that everyone was interested in the crazy, creative thing I did, so as I went through the last five-and-a-half years, I started dialing more towards the creative, curious, interesting things. The lab is an exercise in courage and brave ideas and good questions and I built momentum over time saying, ‘This is the thing I’m studying.’

The more you own that – ‘This is the thing I’m studying, the design exploration I’m doing, trust me for a minute’ – and the more you follow that little voice while ignoring the critic in your mind saying ‘That sounds batshit crazy’, the more you’ll get courage over time and then you end up in a position like me where I look like I have a total dream job – and I do. I made it in the shape of me, I made it out of the work that gives me the most juice so I can give the most juice to the work.

When you think about people who work in the tech industry, the stereotypical person is someone that isn’t so personable – you’re very personable. How did you end up there? The way I got into tech, I make the joke that I love Alice in Wonderland – Dad’s a surgeon, Mum’s an artist. Predictably, you can see my lineage in me and the need to make balance out of both. That’s so human and annoying. But I feel like my career is a backwards fall into the rabbit hole because I literally started in London working in a design and advertising firm where I was looking after the minority vote, or Rock the Vote for the UK, and the movements of social change for a greater good. Nike heard about that, then I went to New York. I feel like all the places I’ve worked, I’ve been really blessed because of course they’re work, but they were an education. On Nike, I learnt what it was like to create feelings in people and bring them alive. I worked on Michael Jordan shoes and the idea that flight was possible. That got me noticed by Apple, so I worked with Apple’s ad agency, then I switched to the in-house design group. I was really blessed to be there for the golden years, when Mac made the switch to Intel and Steve was like, ‘We’re going to tell stories about how Macs are not just for creative people and coffee shops, it’s for everyday creativity, computers are a bicycle for the mind, this is about the human spirit’ and I was like Yes! Right?

Every good piece of work feels like liberation if you’re doing it right. I’d always been following feelings, then with Google I loved that it was so open: ‘We’re going to solve the most audacious, significant human problems with technology and we believe in everyone and that you should be open, democratic, free, even though we have products people pay for.’ The spirit of Google called to me because it was truly for everyone. I still don’t feel like I work in tech, even though I work on AI. For me to be at Google, I’m deep in the core of this technology and really proud to be there, but it’s all the more reason to bring the unexpected voices into technology – the storytellers, filmmakers and the designers – that’s the beating human heart of this.

When I interviewed Jenny Arden from Airbnb she talked about how companies like that will bring in an astronaut to be on the design team just because they want that completely new perspective. It’s one of the things I study – I call them ‘the unexpected experts you need to listen to’. I work with entomologists [studiers of insects] as I’ll be looking at a product or problem and think, ‘What would be a very different way of looking at this?’ So if you look at gesture, you talk to dancers and choreographers, people that speak American sign language, and you pull from the full spectrum of non-verbal communication: neurobiology, physiology, art. It’s not just the designer and the art director, the table is so much bigger if you think with a genuinely human, inclusive lens, so I think that’s why Google’s always encouraged bigger, bigger, bigger, and I say I’m going to go deeper, deeper, deeper.

Why do you think there has been that humanity component missing from the tech sector and why is this changing now? What’s so interesting about being in technology is humans are humans, to state the obvious, but the thing that is most challenging is actually at the culture level. People feel like they need to come up not just with problems, but with solutions. They don’t need to create messes, they need to clean things up. They need to be right, to be certain and to create and add value, and these are all things that make you a professional, adult human being that’s smart and successful.

What’s really challenging is the human experience is not that – we are not these decision-making engines, we are messy, emotional beings that are constantly traveling around, feeling all these feels and yet there’s a really weird thing that happens when you walk into work in the morning and you’re like, ‘I’m going to turn those things off because here is the place where I do the thing I’m good at and do it over and over again’. You’re shutting off however much of yourself that you don’t let in the door. There isn’t a lot of space for not having an answer, so in that space where maths has an answer, science has an answer, it’s really hard to say, ‘Okay, the watery, messy parts of myself are just as important, because those are the unknown depths, but these are the great mysteries that move the human spirit.’ It’s hard to say, ‘I’m going to do the tough thing, I’m going to sit with the mess and trust that on the other side if I take more of me, the work will be better because more of me is in the work.’

How has your research changed your behaviour? The things that don’t change are how much I sleep, the fact I have green tea every morning, the fact my dog makes me really happy and I try to take him with me as many places as possible and that in my meetings, I try to focus on really being present with people and trying to get to the deeper layer of a conversation. I think what’s funny is we get into problem-solving mode for design conversations and you don’t know where the people are at, like, ‘What’s going on with you in this moment today – how are you feeling?’ I try to start conversations with that and then get into the work. I started that practice not as a mindful, California thing, but because I was doing some work with the Yale Centre for Emotional Intelligence and one of the things I learned while I was studying the deep science of emotion with them is EQ is a skillset – it’s not just something some people are granted. Everyone can build that skillset. They said that when a teacher has a pile of tests to grade, the emotional state they’re in will influence their grading. So the way you solve that problem is recognising the emotion, which allows you to not be biased. What’s lovely about that truism is if you acknowledge how you’re feeling, you don’t let it bleed into the world. You can say ‘this is really serving me at this moment’ or actually, ‘this is not serving me at this moment’. But by having those conversations and giving permission for that to be part of a professional conversation, people’s personal and emotional power isn’t switched off. Everybody feels like they’re seen and heard and met in the moment.

Not to pass judgment on men, but do you think having more women in leadership is leading to this more humanised approach? I think emotional intelligence is a skillset and it is ungendered. However, culturally there is more permission for women to be attuned in that way and more cultural training and practice around that, and it’s both a blessing and a curse. But looking at this as a skillset, everyone can develop these things, it’s not just the natural birthright of women. At the same time, I love the way women do hold this skill and this space wherever they’re operating, which is why I’m like, ‘More women everywhere, yes please.’ Gloria Steinem is one of my heroes, and I feel like by championing emotion in the tech space, this is still one of the frontiers of feminism and I’m really proud to raise my hand and be one of the many voices that are speaking for that.

Being emotionally fluent is not a soft skill. There are all of these ways of diminishing what it means to speak with truth and feeling and passion and power and I think being an emotional being is not a negative thing, it doesn’t exclude you from rationality, it’s actually a power like logic if you use it in the right way. I see women standing for that more and the more we can encourage it from both sides, the better. Men need to be allowed to have this aspect of feminine expression, it’s permission for everyone to feel and be felt and speak and be heard and take up the full space of themselves, which isn’t just about emotional intelligence, it’s about being a full human being. I think there are a lot of imbalances to be corrected.

We have been told the story that vulnerability is a point where we can be attacked, when actually, vulnerability, if wielded properly, is your greatest strength. My ability to stand up on stage and talk about how I sometimes feel like I don’t belong in these rooms, but I absolutely know I belong in these rooms, is a huge part of my power because if I can own that and say that, no one can write my story for me. There’s a lot of power in your vulnerability and your truth because you don’t feel like there’s anyone who will snipe at you for something, you’re whole and you’re expressed.

Do you think we’ll ever reach a point where a machine will feel empathy? I will never say never, because I don’t have a crystal ball. There are incredible advances being made, but in terms of actually being able to feel empathy, the definition I use by Brené Brown is the clearest: empathy is when the feeling that someone else is having and expressing is a feeling that you can recognise and be aware of in yourself. So much of emotional intelligence is the ability to recognise emotion in yourself, then recognise that in another, then be able to shift that or stay where you are depending on how you want to be, so an empathetic moment requires that both beings are feeling beings.

What’s interesting about the space of AI is machines are able to do things humans can do, but I can’t see that far into the future where the machines would say ‘You’re feeling this, I’m picking up on that moment and have felt that in my experience as a machine’. That empathic leap I talked about – I do believe it’s possible because I already see it happening where we as humans are at the centre of the experience, meaning we have our abilities or emotions emphasised because of AI. The humanoid is the red herring. People say, ‘Let’s make them just like humans.’ No, we should let them be machines, but think, ‘What more is possible for us?’ because we’re in this companion-like relationship where more is possible. That’s where it starts to get exciting.

To be fair to the incredibly talented folks doing this human-shaped work with AI, it’s not to say they shouldn’t be doing that. I think that in times of great progress, invention is about exploring all things. But I personally feel more connected to a future where the harmonious relationship with technology is defined by tapping my humanity and connecting with a machine that feels almost human in next-level intuitiveness, versus something that is trying to be like us, and never will be. Being humanoid is so much less powerful a space. It would be a great technological feat, but for me and the personal trajectory of my work and the work I’m doing with Google, connecting machines with the potential of humanity is just so much more powerful and so much more interesting.

There are people who believe it’s science fiction, as in it’s not now, but it could happen. I’m in it for the ride. If they need the first robot psychologist, I’ll be there.

