Jihee Junn learns about Soul Machines' 'digital humans'


New Zealand company Soul Machines is on a mission to reverse-engineer the brain and humanise AI interactions. And it’s making very good progress. Jihee Junn explores the rise of – and potential uses for – its ‘digital humans’.

In Spike Jonze’s 2013 film Her, Joaquin Phoenix plays Theodore Twombly, a lonely man on the brink of divorce whose job involves dictating other people’s handwritten letters. But when Samantha, an artificially intelligent voice operating system, enters his life, Theodore finds his emotional desolation offset by Samantha’s remarkably lifelike personality. Pithy, humorous, empathetic and even embarrassed at times, Samantha’s range of emotions enthrals Theodore, who admits to her that “I don’t feel like I can say [things] to anybody, but I feel like I can say [things] to you”.

While the film goes on to explore Samantha and Theodore’s blossoming (albeit doomed) romantic relationship, Samantha’s scope for emotional response speaks to an oft-explored question in science fiction: what if computers were capable of not just thinking, but feeling as well? In Ridley Scott’s 1982 film Blade Runner, android ‘replicants’ are so advanced they’re practically indistinguishable from humans, requiring a fine-tuned Voight-Kampff test (that world’s equivalent of today’s Turing test) to determine who’s who. Similar sentient beings can be seen in recent hits like HBO’s TV series Westworld, Alex Garland’s film Ex Machina, and the latest instalments of the Alien franchise, all of which depict technologically advanced androids capable of feeling joy, grief, anger, hope and even a desire for violent retaliation.

Although not quite at the hyperconscious level depicted in these science fiction classics, the technology developed by Auckland-based company Soul Machines has arguably come closer to making such machines a reality than anyone else’s. As the name suggests, Soul Machines creates emotionally intelligent, lifelike avatars (or, as it prefers to call them, ‘digital humans’) that act as a visual interface for customer service chatbots, virtual assistants and a host of other practical uses.

While artificial intelligence (AI) has become a term even the most technologically inept among us are familiar with, emotional intelligence (EI) – the capacity to identify and manage one’s own emotions and the emotions of others – has been applied more commonly among psychologists than in computer programming circles. But as robotics and automation become increasingly ingrained in the workings of society, experts have realised that to extend the possibilities of AI, they must equip these technologies with the capability to form engaging interactions with humans. In fact, the inclusion of EI is what distinguishes Soul Machines from the rest of the pack: its avatars can recognise emotions by analysing an individual’s facial and vocal expressions in real time, while reciprocating these reactions with an unprecedented level of humanlike response. Like AI, EI develops through experience – the more it interacts with you, the more emotionally sentient it gets.

These lifelike interactions can most notably be seen in several demonstrations of BabyX run by Soul Machines CEO and co-founder Dr. Mark Sagar. After a career as Weta Digital’s special projects supervisor on blockbusters like Avatar, King Kong and Rise of the Planet of the Apes, Dr. Sagar joined the University of Auckland’s Laboratory for Animate Technologies in 2012, where he began to develop the BabyX technology that now underpins Soul Machines. BabyX, an interactive virtual infant prototype, appears on screen as a rosy-cheeked, strawberry-blonde, doe-eyed toddler. Just like a real child, BabyX whimpers and cries when it’s insulted or ignored, and smiles and coos when it’s encouraged or entertained.

While the technology behind Soul Machines has been a project several years in the making, the company is still a newcomer to the commercial realm, having only formally launched in 2016 after receiving a $7.5 million investment from Hong Kong-based Horizons Ventures. From the start, the company has attracted a huge amount of attention. Elon Musk’s biographer Ashlee Vance visited Sagar as part of his technology show Hello World; Bill Reichert, entrepreneur and managing director of Garage Technology Ventures, listed Soul Machines as one of the startups that impressed him most during a recent visit to New Zealand; and in PwC’s 2017 Commercialising Innovation Report, Soul Machines was again cited as a prime example of “leading the way in the AI space”. And the hype appears to be warranted. In February this year, Soul Machines unveiled ‘Nadia’ to the public, a virtual assistant developed for the NDIS (National Disability Insurance Scheme) in Australia. Designed to better help disabled people who traditionally struggle with technology interfaces, Nadia, whose voice was recorded by none other than actress Cate Blanchett, astounded many with her remarkably detailed physiology and astutely aware interactions.

July will mark one year since the company spun out of the University of Auckland, eventually trading its academic headquarters for an office in Auckland’s historic Ferry Building. With just nine full-time employees at the time of its commercial launch, Soul Machines now boasts more than 40 people on its burgeoning staff roster. And while the company already has plenty to pride itself on, it certainly isn’t resting on its laurels just yet. Having recently returned from debuting Soul Machines at the Cannes Lions Festival in France, chief business officer Greg Cross says that he and Dr. Sagar conducted a total of 28 presentations in four days, demonstrating their offerings to marketing officers from companies like Mazda, Subaru, Airbnb and Booking.com.

Cross, who has been part of the Soul Machines team since the company launched last year, has had a big hand in commercialising Dr. Sagar’s remarkable innovation. A serial tech entrepreneur who’s helped build companies all over the world, Cross is palpably excited when I speak to him at the company’s Ferry Building office, admitting that in his 30 years in the tech industry, he’s never had so much fun in his life.

“The cool thing about this technology is that it’s only really limited by your imagination,” he says. “[Nadia] was an amazing first project for us because you’re providing services to people that have historically not been very well serviced. You’re providing many of them the ability to be more independent and get information directly rather than have to work through third parties or have to wait for hours or even days to get someone to talk to.”

“You can imagine building digital teachers to provide education to kids who don’t have access to teachers. You can imagine providing digital service agents for refugees, where governments can interface and interact with them in a simple and easy manner. This is what’s really exciting: every time you sit down and talk with somebody, you come up with a different use case.”

Nor is it just speculation fuelling some of these use cases. Although Cross is tight-lipped about the specific companies involved, he says Soul Machines is currently building another female digital human for a big software company in Silicon Valley, as well as developing its first AR/VR project for a media company in the UK. And perhaps indicative of its impending launch into the financial sector, Soul Machines introduced its latest digital human, ‘Rachel’, on stage at the LendIt Conference in New York City. Powered by IBM Watson’s AI and Soul Machines’ EI (as was Nadia), Rachel demonstrated to an audience of FinTech executives how she could help customers pick out the ideal credit card not just efficiently, but conversationally as well.

A FACE IN THE CROWD

Perhaps one of the most extraordinary things about Rachel (other than the complex neural network platforms that support her) is that her appearance is based on a real-life person. And not just any person, but a Soul Machines employee sitting just three metres away from where Cross and I converse.

“Real Rachel is actually an avatar engineer. She spends half the day talking to herself. She’s got the weirdest job on the planet,” he remarks.

Part of what makes Soul Machines’ digital humans so visually lifelike is that, like Rachel, they’re all based on real-life people. Its most recent digital human, for example, is based on Filthy Rich star Shushila Takao, making her the first professional actress to have her ‘likeness’ licensed for use as an avatar.

Building a digital human is a three-stage process that takes approximately eight weeks. The first stage is visual, starting with a 3D scan of the individual candidate that is used to build out the graphics for the face. The second stage involves the character component, where a personality is built and the set of emotional states the avatar is allowed to express is defined. Finally, in the third stage, the avatar is brought to life using the company’s core computing technology before it’s ready to be used.

With the rise of the internet of things (IoT) and the proliferation of technologies like Amazon’s Alexa, Apple’s Siri, Google’s ‘OK Google’ and Microsoft’s Cortana, interactive AI has already become somewhat ubiquitous. But our interactions with these programmes have so far been confined to a voice emanating from an inanimate object. Dr. Sagar and his company believe that talking to something that looks a lot like a human is far more likely to encourage individuals to be more open about their thoughts and expressive with their faces, allowing a company to pick up additional information about what drives its customers.

“The human face is incredibly engaging. We’re naturally programmed to look at and interact with them,” says Cross. “The way we look at it is that over the next period of time, we’re going to be spending a lot more time interacting with machines and AI. Whether it’s a virtual assistant on a website or a concierge that sits inside your self-driving car, the more we can humanise computing, the more useful it’s going to be for us.”

While many may doubt that an artificially rendered face could elicit such a genuine response from human beings, numerous cases have proven otherwise. In 2015, Japanese researchers found that when subjects were exposed to images of robot hands and human hands being sliced with a pair of scissors, EEG scans showed that images of both types of hands elicited the same neuropsychological response. Even non-humanlike robots subjected to violence can generate a strong sense of empathy. In one MIT experiment, participants were asked to play with small, mechanised dinosaurs called Pleos. When they were eventually asked to torture their Pleos, many refused and even found the exercise unbearable to watch. And when a bomb-defusing robot was left crippled, burnt and brutalised during a routine military test in the USA, an army colonel brought the test to a halt, charging that the exercise was “inhumane”.

If this type of emotional response can be goaded from humans by non-humanlike robots, it would be natural to assume that humanoid machines with hyper-realistic features can make an even deeper, more meaningful connection with those who interact with them on a regular basis. After all, when participants in the pilot of Nadia were asked if they’d use her again, 74 percent responded positively, indicating they’d be happy to use a digital human as their primary means of interaction with the government.

“A lot of focus is on moving to that voice interface,” says Cross. “But our view is that voice only takes you so far. The analogy we talk about is what happened to radio when television came along. Television was a much more engaging, entertaining and interactive experience. Just talking to a voice can get irritating at times.”

BUILDING THE DNA FACTORY

For businesses and brands looking into Soul Machines’ offering, the excitement derives not just from the potential increase in efficiency and customer satisfaction, but also from the fact that they could be employing some of the very first digital employees in the world. Because both character and physical appearance can be customised, each digital human that’s created exhibits its own unique set of personal traits.

Nadia, who was designed by people with disabilities for people with disabilities, is relatively “conservative, very empathetic and not overly emotionally expressive at this point in time”. Due to the nature of Nadia’s role, it was important she didn’t end up expressing an inappropriate emotion in reaction to something she saw from someone with cerebral palsy or autism, for example.

At the other end of the scale, a digital human developed in the form of Nathan Drake – a character in Sony PlayStation’s Uncharted series – is much more outgoing, humorous and full of bravado, while BabyX, being an infant, is much more spontaneous in her reactions than her adult counterparts. Rachel, meanwhile, as a virtual customer service agent, exhibits more breadth in her personality, with her emotional states ranging anywhere from sassy to conservative depending on whether she’s talking to a 50-something businessperson or a 20-something college student.

While it currently takes about eight weeks for Soul Machines to build a digital human according to its customers’ wants and needs, Cross says the company is hoping to streamline its avatars by creating a “DNA factory”, reducing the process from weeks to days and, eventually, from days to hours.

“By capturing somewhere between 20 and 30 digital humans of different age groups, ethnicities and genders, we’ll be able to create digital humans from that digital DNA without having to start from scratch,” he says.

“When we’re working with big corporates, they often have quite strong views on the design phase. But my personal view is that in the long term, they’ll move away from [the idea of having] a digital brand representative and instead have 20 to 30 digital employees their consumers can choose to interact with. Do you want to talk to someone who speaks Chinese? Do you want to interact with a male or a female? Or would you prefer a cartoon character because digital humans aren’t your thing? I think people’s approach to this will change quickly, but it’s still very early days.”

THE SHOCK OF THE NEW

While sentient beings have long featured in modern-day science fiction, it goes without saying that cultural instances of emotional and artificial intelligence coming together have, for the most part, exhibited a cynically dystopian slant. In Ex Machina, the humanoid robot Ava manages to escape the locked-down facility by emotionally manipulating a young programmer, leaving him trapped in a room, presumably to starve to death. In Stanley Kubrick’s 1968 epic 2001: A Space Odyssey, the ship’s computer, HAL, famously goes rogue, taking control of the pods and turning off the life support systems of the crew on board. Even as far back as the 1860s, essays like Samuel Butler’s ‘Darwin among the Machines’ argued that mechanical inventions were undergoing constant evolution, and that eventually, the human race would be supplanted as the dominant species.

The examples are endless when it comes to showing how technology could turn from a state of benevolent subservience to malevolent self-interest. Given that we’ve been conditioned by this recurring narrative over the years, it’s no surprise that apprehension has been the prevailing reaction among those exposed to AI/EI beings. Add the natural technophobia that arises when new technologies are introduced, and the relative lack of understanding around AI in New Zealand, and highly advanced companies like Soul Machines have a lot to contend with.

“Very few people on the planet have actually had a chance to have a live interaction with one of our digital humans, so it’s an intellectual thing,” says Cross. “I think it’s one of those things that you just have to experience. There’s always that percentage of people who will be completely turned off. It’s like all new technology. There’s a percentage of people who don’t like Facebook, there’s people who don’t like voicemail, and a lot of people don’t use Siri, and that’s their choice.”

Cross underscores that despite people’s fears that robots will reduce the number of paid employment opportunities in the near future (a suspicion that dates back to the first Industrial Revolution), technology like Soul Machines’ is on course to enable humans to do more with their lives, rather than less.

“We’ve had this concept of a 40-hour work week for a very long time now. But what happens if we only had to work 20 hours? We can spend more time in our communities, more time with our families. Is that such a bad thing for mankind? I don’t think so.”

And while science fiction’s cynical narrative prevails in most instances in pop culture, it’s important to note that not all fictional computers have made it their covert mission to annihilate the human race. Just ask David Hasselhoff’s favourite pal on four wheels.

“Think KITT, the car in the TV show Knight Rider,” says Cross. “It was a personality in a car. It had flashing lights, but it didn’t have a face. Now, we actually have the opportunity to give KITT a face, or put a face inside a luxury car. We can see how we can make that science fiction a reality.”

Let’s just hope that science fiction is a little more Knight Rider and a little less Westworld.


Greg Cross and Dr. Mark Sagar
