Tech giants divided over digital assistants

Sassy woman or machine?

Kuwait Times - TECHNOLOGY

SAN FRANCISCO: When users ask Siri, Apple’s digital assistant, what she likes to drink, she is quick with an answer. “I have a thirst for knowledge,” she responds. Her counterpart at Microsoft, Cortana, opts for a very, very dry martini.

But M, the digital assistant Facebook is testing, deflects the question. “I don’t have an opinion about that. What’s your favorite drink?” As the tech giants race to build ever better artificial intelligence platforms, they are obsessing over the nuances of their digital assistants’ personalities.

For users, digital assistants are a gateway to powerful artificial intelligence tools that developers expect to influence major decisions about what to buy and how to spend time. The more tech companies can get users to rely on their digital assistants, the more valuable data they will accumulate about users’ spending habits, interests and preferences. That information could be fodder for lucrative digital advertising or a lever for companies to keep users locked into their ecosystems.

But companies are split on the best way to forge deep connections with users. Siri and Cortana are waging charm offensives, both quick to crack a joke or tell a story. Their elaborate personas are meant to keep users coming back. Facebook has built M with no gender, personality or voice. The design bears some resemblance to Google’s similarly impersonal assistant.

While catchy one-liners generate buzz, a digital assistant with personality risks alienating users or, the companies say, misleading them about the software’s true purpose: carrying out simple tasks, much like a real-life assistant. Facebook’s no-nonsense assistant focuses on handling chores such as ordering flowers or making restaurant reservations. “We wanted M to be really open and able to do anything - a really white piece of paper - and see how people use it,” Alex Lebrun, a Facebook executive who oversees the AI team for M, said in an interview with Reuters.

Digital assistants

For tech companies, the stakes are high, said Matt McIlwain, managing director of Madrona Venture Group, since digital assistants can guide users to their own products and those of their advertisers and partners - and away from those of competitors. Google’s digital assistant, for example, uses the company’s search engine to fulfill user requests for information rather than Yahoo or Microsoft’s Bing.

“That trusted assistant could function as my agent for all kinds of transactions and activities,” McIlwain said. Research from the late Stanford professor Clifford Nass, an expert on human-computer interaction, shows that users can become deeply invested in AI that seems human, though they are also more disappointed when the systems come up short, raising the stakes for companies that make the attempt. And what charms one user can annoy another - a danger that Facebook and Google have largely sidestepped.

Nevertheless, the Siri team concluded that personality was indispensable, said Gary Morgenthaler, an investor in Siri, the startup that created the eponymous assistant and was later acquired by Apple.

“If you are emulating a human being,” he said, “then you are halfway into a human type of interaction.” Google has decided it doesn’t want to take personality further without having a better handle on human emotion. “It’s very, very hard to have a computer be portrayed as a human,” said Tamar Yehoshua, vice president of mobile search. The Google app, making use of predictive technology known as Google Now, responds to questions in a female voice but has few other gendered touches and little personality. The Google app does reflect its creator’s spirit of curiosity, however, by sharing fun facts, Yehoshua said. Facebook has a team of human “trainers” behind M, who answer some requests that are beyond the capabilities of its artificial intelligence. The company hopes to gather data on users’ most frequent requests in order to improve M so it can handle them in the future.

That data is limited, however, as M is so far available only to 10,000 people in the San Francisco Bay area. Despite M’s design, users frequently ask to hear jokes, a request the assistant obliges. Humans tend to anthropomorphize technology, academics say, often looking for a personality or connection even when tech companies have intentionally veered away from such things.

Artificial intelligence

“When you give people this open mic, they will ask anything,” said Babak Hodjat, co-founder of AI company Sentient Technologies.

Siri’s personality did not change much after Apple acquired the startup in 2010, though she switched from responding in text to speech at the insistence of the late Apple co-founder Steve Jobs, said Adam Cheyer, a co-founder of Siri who is now a vice president at another AI company, Viv Labs. “He was right on that call,” Cheyer said. “The voice is something that people really connect with.” Microsoft interviewed real-life personal assistants to help shape Cortana’s personality, said Jonathan Foster, Cortana’s editorial manager. The assistant’s tone is professional, but she has her whims.

She loves anything science-fiction or math-related - her favorite TV show is “Star Trek” - and jicama is her favorite food because she likes the way it sounds. Such attention to detail is critical because humans are very particular when it comes to artificial intelligence, said Henry Lieberman, a visiting scientist at the Massachusetts Institute of Technology who studies human-computer interaction. Companies must be mindful, he said, not to venture into what researchers call the “uncanny valley,” the point at which an artificial intelligence tool falls just short of seeming human. Users become fixated on the small discrepancies, he said. “It becomes creepy or bizarre, like a monster in a movie that has vaguely human features,” Lieberman said. iDAvatars CEO Norrie J. Daroga said he walked a fine line in creating Sophie, a medical avatar that assesses patients’ pain. He gave Sophie a British accent for the U.S. audience, finding users are more critical of assistants that speak like they do. And she has flaws built in because humans distrust perfection, said Daroga, whose avatar uses technology from IBM’s Watson artificial intelligence platform.

Some academics say Siri’s personality has been her greatest success: After her release in 2011, users raced to find all her quips. But some of her retorts have caused headaches for Apple. When asked what to do with a dead body, Siri used to offer joking suggestions such as swamps or reservoirs - an exchange that surfaced in a 2014 murder trial in Florida. She is more evasive when asked the question today. “I used to know the answer to this,” she says. Even in that response, Morgenthaler sees traces of the true Siri.

“It’s a little bit of a protest against the corporatization,” he said. “I don’t forget, but I’ve been made to forget.” — Reuters