A PATH FOR ALL WALKS

iD magazine - A Photo And Its Story

The march of progress is a gradual movement. We're well on our way to a future where humans and robots routinely interact as effortlessly as on the show Futurama. Laying the groundwork for that new social infrastructure is the first step toward a new societal paradigm. Now a team of researchers is taking the steps necessary to conscientiously engineer the future.

Though it’s hard to scowl at something as cute as Pepper the robot, it had to be done in the name of science. Since there’s little doubt robots will become more ubiquitous in our daily lives, it makes sense to consider the challenges of letting them move freely among us. Now a research team comprising social scientists, roboticists, and computer engineers from the University of North Carolina at Chapel Hill and the University of Maryland has created an emotionally intelligent robot that can assess people’s emotional states in order to effectively navigate among them in a pedestrian setting.

How does emotion play into the socially aware navigation that would allow man and machine to share the sidewalk effectively and comfortably? “Emotional states dictate the way people tend to move; for example, an angry person might plod along his intended path without regard for the personal space of others, and an unhappy person may go out of his way to avoid having to interact,” says UNC computer science research professor and study coauthor Aniket Bera.

The team built upon the PAD (pleasure, arousal, dominance) psychological model as well as CNN-based learning and Bayesian inference to create a real-time algorithm for emotion-aware navigation. Their crucial colleague is Pepper, a commercially available semi-humanoid robot that was basically a blank slate before the team applied its algorithm. Pepper uses its cameras and sensors to analyze pedestrians’ trajectories and facial features. The system detects emotions with 85.33% accuracy, and the robot reaches its goal without violating pedestrians’ proxemic spaces.
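To make the pipeline concrete, here is a minimal sketch of the general idea: cues read from a pedestrian’s trajectory and face are mapped to a PAD estimate, which in turn widens or narrows the personal-space radius a planner must respect. Every feature name, weight, and radius below is an illustrative assumption for exposition, not a value or method from the study.

```python
# Toy illustration of emotion-aware proxemics (assumptions, not the authors' code).
from dataclasses import dataclass


@dataclass
class PedestrianCues:
    speed: float            # walking speed in m/s, from trajectory tracking
    path_directness: float  # 0..1, how straight and unyielding the path is
    face_valence: float     # -1..1, output of a (hypothetical) facial-expression model


def estimate_pad(cues: PedestrianCues) -> tuple[float, float, float]:
    """Rough linear mapping from observed cues to PAD values in [-1, 1] (assumed weights)."""
    pleasure = cues.face_valence
    arousal = min(1.0, cues.speed / 2.0) * 2.0 - 1.0    # faster walkers read as more aroused
    dominance = cues.path_directness * 2.0 - 1.0        # direct, unyielding paths read as dominant
    return pleasure, arousal, dominance


def proxemic_radius(pad: tuple[float, float, float]) -> float:
    """Give unhappy, aroused, or dominant pedestrians a wider berth, in meters (assumed constants)."""
    pleasure, arousal, dominance = pad
    base = 0.5  # comfortable clearance for a neutral pedestrian
    extra = 0.5 * max(0.0, -pleasure) + 0.2 * max(0.0, arousal) + 0.3 * max(0.0, dominance)
    return base + extra


if __name__ == "__main__":
    angry = PedestrianCues(speed=1.8, path_directness=0.95, face_valence=-0.7)
    print(proxemic_radius(estimate_pad(angry)))  # the planner keeps a larger clearance for this person
```

In the team’s actual system, the CNN and Bayesian-inference components stand in for the hand-written mappings above, estimating emotion from faces and trajectories in real time rather than from fixed weights.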

“There is a lot of potential for the development of emotionally intelligent robots and using them for many applications, including personal robots, delivery robots, and autonomous driving robots. There are still many challenges and open issues in this area,” says study coauthor Dinesh Manocha, a professor of computer science and electrical and computer engineering at the University of Maryland. His colleague on the project, Professor Bera, believes that in the future robots could detect anomalies like depression and alert authorities to follow up, or even be used to manipulate human emotion in a therapeutic setting. He also describes a larger-scale aspect: “Robot-human scenarios provide the opportunity for embedded surveillance and control in potentially hostile crowds. Currently such surveillance tasks are performed by humans but present significant risks to them. We believe emotionally intelligent robots will be better suited to crowd control within crowds.” The hope of a bright future is a great motivator. Says Bera, “Someday robots and humans will indeed walk side by side. Making robots learn is just a piece of the puzzle. Only when they understand emotions and become more ‘human’ can we dream of such coexistence.”

University of North Carolina graduate student Tanmay Randhavane and computer science professor Aniket Bera are two of the team members who work with Pepper, along with Rohan Prinja, Kyra Kapsaskis, Austin Wang, Kurt Gray, and Dinesh Manocha.
