Machine language


One startup’s quest to use AI to bring game dialogue to life

Of their two biggest promises, videogames have pretty much delivered on one. Huge and diverse worlds filled with detail to discover and things to do are now common, even expected. But the other promise, that of getting to interact with characters that respond naturalistically to your every word and action, is still lagging behind.

The NPCs you meet in games are the same scripted talking heads that they’ve always been. Some games are written better than others, but in comparison to the visually opulent and systemically deep worlds in which they stand, NPCs are wooden, their various conversational gambits constricted into a series of dialogue trees in which you lose all of the freedoms you enjoyed in the wider world.

One new tech startup is hoping to use AI to help solve the problem of NPC dialogue. SpiritAI’s aim is to create dynamic conversations that feel like speaking to autonomous characters, responsive to what you express and ask, and willing to offer their own points of view. SpiritAI has developed a suite of technologies called Character Engine, which combines natural-language classifiers, speech-to-text analysis and keyword examination to interpret what players are trying to say, and to construct responses by modelling emotion and the character’s knowledge of the world.

“OK, the far end of this is passing the Turing test, right? If it’s done and complete and perfect, then it talks like a person,” says Emily Short, who manages Character Engine, having long been a leading figure in interactive fiction as a writer and co-developer of various text-adventure engines. “And no, we’re not there yet.” But, she says, Character Engine is standing on a road of iterative development that will figure out how to create naturalistic NPCs that are more fun and interesting to encounter in a game.

For SpiritAI co-founder and chief creative officer Mitu Khandaker, the challenge comes down to how much data the system has to work with. “How big are the natural-language classifiers? It’s mostly just a data problem, and how you’re authoring the responses.” Khandaker was previously the indie developer behind Redshirt, a sim of a social network aboard a space station that tasked players with moving up the social ladder, before becoming an assistant arts professor of game design at New York University. For her, Character Engine is about allowing writers and narrative designers to craft specific stories with autonomous characters acting within them.

Its first demo was revealed at GDC earlier this year. Called The Interrogation, it has players talking to a Scottish robot, trying to figure out whether it’s guilty of a murder. Co-developed with Bossa Studios, the London-based developer of Surgeon Simulator and the forthcoming MMO Worlds Adrift, the demo lets you type – or, in the VR version, simply speak – your questions, and she’ll respond, revealing details about other characters, their relationships, and the events during the killing. As you delve, the robot’s emotional state changes, expressed in the UI and by its voice inflection and stutters, and you can manipulate it to provoke different responses. You can make it anxious or angry by moving the viewpoint closer or using threatening or insulting language, and you can put it at ease by moving away and being kinder.
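The proximity-and-tone mechanic described above can be pictured as a simple state update. This is a rough sketch only; the state variable, inputs and weights here are all invented for illustration, not SpiritAI’s model.

```python
def update_emotion(anxiety: float, distance: float, hostile_words: int) -> float:
    """Return the robot's new anxiety level, clamped to [0, 1].

    Invented heuristic: closeness and hostile language raise anxiety;
    backing away lowers it.
    """
    delta = 0.0
    if distance < 1.0:            # viewpoint moved uncomfortably close
        delta += 0.2
    elif distance > 3.0:          # player backed off
        delta -= 0.1
    delta += 0.15 * hostile_words  # threatening or insulting language
    return max(0.0, min(1.0, anxiety + delta))
```

The resulting value could then drive the UI readout and the voice inflection and stutters the demo surfaces.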

The Interrogation plays a little like a dynamically driven Her Story – or, more directly, Galatea, a celebrated piece of interactive fiction that Short wrote in 2000 which presents a surprisingly naturalistic conversation with a statue. “We were looking to dig into whether we could create the sense of a continuous conversation in which new information is unfolding and you get a sense of making an emotional difference to a character if you’re mean or nice to them,” says Short. “One of the big objectives I had was to let the player participate in a conversation. I feel your standard dialogue tree is constraining, right? You have this choice of two-to-four options and it’s not fun to replay. There’s no room for style or personality.”

Character Engine works by having writers give NPCs two sources of information: a ‘script space’ of words and phrases the NPC can say, and a knowledge model, which is information the NPC knows about the world. In The Interrogation, the robot knows the height and weight of the characters she mentions; if you ask her whether Alicia is strong, then the system can use that information to surmise an answer without the writer needing to specifically note it.
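The knowledge-model idea – raw facts the writer records once, from which the system surmises answers to questions that were never explicitly scripted – can be sketched roughly like this. The names, facts and threshold are invented for illustration.

```python
# Hypothetical knowledge model: the writer notes only height and weight.
CHARACTER_FACTS = {
    "Alicia": {"height_cm": 180, "weight_kg": 85},
    "Tom": {"height_cm": 165, "weight_kg": 60},
}

def is_strong(name: str) -> bool:
    """Derive an unscripted answer from stored facts (invented heuristic)."""
    facts = CHARACTER_FACTS[name]
    return facts["weight_kg"] >= 75 and facts["height_cm"] >= 170

def answer(subject: str) -> str:
    # The NPC composes a line from the derived fact rather than a canned reply.
    if is_strong(subject):
        return f"{subject}? Aye, strong enough."
    return f"{subject} is no stronger than I am."
```

The point is that “is Alicia strong?” never appears in the script space; the answer is inferred at runtime from the facts.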

In many ways, the magic in Character Engine lies in design and writing, rather than the technology itself. “The demo is a difficult design problem because it’s a scenario where the robot is trying to be evasive, but at the same time, players don’t know what to ask and how to interact with it,” says Khandaker. The system therefore has to seed the ongoing conversation with pointers and clues as to what to ask.

On its surface, the demo seems utterly freeform, but underneath it’s carefully structured using Character Engine’s authoring tools, which can make particular pieces of dialogue available or unavailable depending on the scene or the stage of the conversation the player has reached. The tools let writers give NPCs certain narrative beats to hit or facts they have to reveal, and set triggers for points at which the scene will end, whether through timers, reaching certain emotional states or relating certain bits of information.
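The authoring structure described above – dialogue gated by scene, required beats, and triggers that end the scene – might look something like the following. The data shapes and names are assumptions for this sketch, not SpiritAI’s actual format.

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    available_lines: set        # dialogue unlocked in this scene
    required_beats: set         # facts the NPC must reveal before the scene ends
    revealed: set = field(default_factory=set)
    turns: int = 0
    max_turns: int = 20         # timer-style end trigger

    def reveal(self, beat: str) -> None:
        """Record one conversational turn and any beat it lands."""
        self.turns += 1
        if beat in self.required_beats:
            self.revealed.add(beat)

    def is_over(self) -> bool:
        # Scene ends when every beat has landed or the timer runs out.
        return self.revealed >= self.required_beats or self.turns >= self.max_turns
```

A writer could author several such scenes, each unlocking different dialogue, with the emotional-state triggers the article mentions added as further end conditions.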

For Short, authoring a Character Engine NPC is a little like scriptwriting, in the sense that it makes a writer think first about motivation and dramatic structure before the words themselves. But the tools are being developed to be highly adaptable, so developers can be as rigid or free as they want to be, and to use whichever components they like. It can even generate on-the-fly multiple-choice dialogue options rather than rely on players inputting natural language.

“But the idea is making it so the moments of constraint are the rarity and most of the time the player has more flexibility,” Short says, though she’s not aiming for Character Engine to live up to the dream of the holodeck – of entirely immersive virtual worlds that correctly interpret and respond to players’ every interaction. “Fundamentally I don’t believe in that,” she says. “Not even in the sense of whether we can actually do that, but in a design sense. Would that be a satisfying and enjoyable thing?”

For her, Character Engine is for games, and when players don’t have direction they get paralysed with uncertainty. “It’d be like constantly being forced to be on an improv stage without being trained. I’m not so much interested in making it so you can never find the boundaries, it’s more like, can we make nice, smooth edges so when you encounter them they redirect you in a way that feels natural? As a player, if you run out of ideas, the NPC pulls you back into the storyline, but you still have a level of freedom you don’t get in a lot of current game structures. That’s what I see as the sweet spot.”

For SpiritAI, Character Engine is a foundation, but its technology can do more. As she began to explore the potential of what it means for a bot to understand what a player is saying, Khandaker realised that it could also be used to address another issue facing videogames: a rather more pressing one, of online harassment. If it can examine what players are saying, what they’ve previously said, and also watch player-to-player interaction, it can identify toxic player behaviour.

Ally is a set of tools that does just this, which SpiritAI has already begun releasing to beta partners to begin using with their live data. One demonstration, built only to test the SDK, shows a player following another and bombarding them with party requests. Ally notices the number of requests and the proximity of their avatars and chats to the potential victim: “You seem to be having a problem; are you OK?” That person can then respond in natural language to say yes or no, whereupon the bot can take action against the offending player, muting them or banning them as appropriate to the game’s policies, or simply shutting up.
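A heuristic in the spirit of that demo – repeated requests at close range inside a short window – could be sketched as below. The event shape, thresholds and field order are invented for illustration; a live system would presumably learn and tune these per game.

```python
def looks_like_harassment(events, window=60.0, max_requests=3, max_dist=5.0):
    """events: list of (timestamp, action, distance_between_avatars).

    Flags a player who sends more than max_requests party requests
    within `window` seconds while hovering within max_dist of the target.
    """
    if not events:
        return False
    latest = events[-1][0]
    recent = [e for e in events if latest - e[0] <= window]
    requests = [e for e in recent
                if e[1] == "party_request" and e[2] <= max_dist]
    return len(requests) > max_requests
```

A flag like this would only open the conversation – Ally still asks the targeted player whether they are OK before acting, since the same behaviour can be harmless between friends.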

“There are lots of parameters,” says Khandaker, acknowledging the fuzzy and inconsistent nature of online interactions. “Are they upset by anyone doing this, or just this particular person? It asks questions to understand, and it then learns from that for the future. Our boundaries are very different in different situations, right? We are OK with dodgy language when it’s people we know sometimes, but not when it’s a stranger.”

The aim is to support moderators and GMs, who can be dealing with thousands of support tickets a day, and the level of engagement is up to the developer, whether the bot talks directly with players or merely flags up potential issues for human mods to look at. Developers are also able to write a character for the bot, just as they might an NPC, so its interactions fit into the game’s world.

And it doesn’t have to be an enforcer. It can also identify positive behaviour, rewarding or supporting helpful players. Either way, Ally could help shape healthy player communities so they become safer places to play, in whatever way devs deem sensible for their game.

SpiritAI’s focus on natural language and bots follows an explosive growth in the field. Bots on Tencent’s WeChat social network, for example, dominate the way people in China manage social services and query information on their phones, and they’re growing on Facebook, Skype and other platforms in the west, too. SpiritAI is therefore following a general trend in developing natural-language technology, but is focused on using it to make games better, richer and safer places to play. Once upon a time, Edge asked, “What if you could talk to the monsters?” That question, it seems, is finally close to being answered.

Character Engine’s scripting system allows writers and narrative designers to set up a ‘script space’ of information and dialogue, such as a timeline of events a character knows about. The system then allows a character to improvise within that space, responding to the player’s input in a natural, organic way

Emily Short (top) heads SpiritAI’s Character Engine; Mitu Khandaker is the firm’s co-founder and creative director

Ally, SpiritAI’s online safety service, is being trialled; here its helper bot is integrated with an off-the-shelf Unity MMO package to demonstrate how it recognises potential social problems
