iD magazine

A PATH FOR ALL WALKS


The march of progress is gradual. We're well on our way to a future where humans and robots routinely interact as effortlessly as on the show Futurama. Laying the groundwork for that new social infrastructure is the start of a new societal paradigm, and a team of researchers is now taking the steps necessary to conscientiously engineer it.

Though it’s hard to scowl at something as cute as Pepper the robot, it had to be done in the name of science. Since there’s little doubt robots will become more ubiquitous in our daily lives, it makes sense to consider the challenges of letting them move freely among us. Now a research team comprising social scientists, roboticists, and computer engineers from the University of North Carolina at Chapel Hill and the University of Maryland has created an emotionally intelligent robot that can assess people’s emotional states in order to effectively navigate among them in a pedestrian setting.

How does emotion play into the socially aware navigation that would allow man and machine to share the sidewalk effectively and comfortably? “Emotional states dictate the way people tend to move; for example, an angry person might plod along his intended path without regard for the personal space of others, and an unhappy person may go out of his way to avoid having to interact,” says UNC computer science research professor and study coauthor Aniket Bera.

The team built upon the PAD (pleasure, arousal, dominance) psychological model, combined with CNN-based learning and Bayesian inference, to create a real-time algorithm for emotion-aware navigation. Their test platform is Pepper, a commercially available semi-humanoid robot that was essentially a blank slate before the team applied its algorithm. Pepper uses its cameras and sensors to analyze pedestrians’ trajectories and facial features. In the team’s experiments, the system detected emotion with 85.33% accuracy, and the robot reached its goal without violating pedestrians’ proxemic spaces.
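The core idea above — mapping an estimated PAD emotional state to the personal space a robot should respect — can be sketched in a few lines. This is a minimal illustrative heuristic, not the team's actual algorithm: the `PADState` class, the `comfort_distance` function, and every numeric value in it are assumptions made up for this sketch.

```python
# Hedged sketch: turning PAD (pleasure, arousal, dominance) estimates
# into a proxemic comfort radius for emotion-aware navigation.
# All names and constants here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class PADState:
    pleasure: float   # -1 (displeasure) .. 1 (pleasure)
    arousal: float    # -1 (calm) .. 1 (excited)
    dominance: float  # -1 (submissive) .. 1 (dominant)


def comfort_distance(pad: PADState, base: float = 1.2) -> float:
    """Return a personal-space radius (meters) to keep around a pedestrian.

    Heuristic, following the article's intuition: an angry pedestrian
    (unhappy, dominant) ploughs ahead and should be given a wide berth;
    a sad pedestrian (unhappy, submissive) avoids interaction, so extra
    clearance also helps. `base` is an assumed neutral proxemic radius.
    """
    widen = 0.0
    if pad.pleasure < 0:
        widen += 0.4 * -pad.pleasure          # unhappier -> more space
        widen += 0.3 * max(pad.dominance, 0)  # unhappy + dominant ~ angry
    return base + widen


# Illustrative emotional states
angry = PADState(pleasure=-0.8, arousal=0.7, dominance=0.6)
sad = PADState(pleasure=-0.5, arousal=-0.6, dominance=-0.4)
calm = PADState(pleasure=0.3, arousal=-0.2, dominance=0.0)

for label, pad in [("angry", angry), ("sad", sad), ("calm", calm)]:
    print(f"{label}: keep {comfort_distance(pad):.2f} m")
```

A planner could then treat each pedestrian as an obstacle inflated to this radius, so the robot's path naturally gives agitated pedestrians more room.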

“There is a lot of potential for the development of emotionally intelligent robots and using them for many applications, including personal robots, delivery robots, and autonomous driving robots. There are still many challenges and open issues in this area,” says study coauthor Dinesh Manocha, a professor of computer science and electrical and computer engineering at the University of Maryland. His colleague on the project, Professor Bera, believes that in the future robots could detect anomalies like depression and alert authorities to follow up, or even be used to influence human emotion in a therapeutic setting. He also describes a larger-scale aspect: “Robot-human scenarios provide the opportunity for embedded surveillance and control in potentially hostile crowds. Currently such surveillance tasks are performed by humans but present significant risks to them. We believe emotionally intelligent robots will be better suited to crowd control within crowds.” The hope of a bright future is a great motivator. Says Bera, “Someday robots and humans will indeed walk side by side. Making robots learn is just a piece of the puzzle. Only when they understand emotions and become more ‘human’ can we dream of such coexistence.”

University of North Carolina graduate student Tanmay Randhavane and computer science professor Aniket Bera are two of the team members who work with Pepper, along with Rohan Prinja, Kyra Kapsaskis, Austin Wang, Kurt Gray, and Dinesh Manocha.
