Dataquest

Emotions in AI

AI is learning to recognise human emotions and is using that knowledge to improve everything, from marketing campaigns to healthcare. Here, Ranjan Kumar, founder and CEO, Entropik Tech, explains how machines are replicating the way humans think.

Elaborate on Emotion AI. For what purposes can it be used?

AI is a complex web of logical decision-making: an intelligence based on 'if-else' rules crafted to decipher an outcome from a vast set of logical permutations of choices. Emotion AI, however, also understands the emotional context of that logic. It is an AI system that is not just artificially intelligent but also emotionally perceptive.

Avenues for applications of Emotion AI are plentiful, including recruitment, medical diagnosis and assistance, loan evaluation, customer service, automobile passenger safety, optimisation of RoI for marketers and many more. Entropik Tech is dedicated to building Emotion AI.

What technologies are you using to capture emotion and expressiveness? Do you also map facial expressions?

For the Emotion AI platform, Affect Lab uses proprietary technologies such as brainwave mapping, facial coding and eye tracking to decipher the cognitive and emotional responses of consumers as they watch an ad, experience a product or make a purchase in a retail store.

• For brainwave mapping, we use special hardware that is worn like a pair of headphones. It monitors the electrical activity of the brain while the user is experiencing a product or watching content. The raw data collected by the EEG (electroencephalogram) headset is then interpreted by algorithms to calculate behavioural and cognitive metrics, such as the user's attention level and the mental effort applied to completing a task.

• The facial coding system developed by the team at Entropik tracks users' facial expressions by identifying facial landmarks. The data generated from the way you grin, roll your eyes or smirk is fed into our deep learning algorithm to produce emotion metrics.

• Visual activity is monitored using a standard webcam and screen-based eye tracking software. Image processing algorithms calculate the user's 'point of gaze' to create heat maps and gaze plots in real time (a simple version of this aggregation is sketched below).
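As a rough illustration of that last step, here is a minimal, hypothetical Python sketch of how point-of-gaze samples can be binned into a heat map. The grid size, coordinates and simulated gaze data are assumptions for illustration only, not Affect Lab's actual implementation.

```python
import numpy as np

# Hypothetical sketch: bin screen-based point-of-gaze samples into a heat map.
# Coordinates are normalised to [0, 1]; grid size and gaze data are illustrative.
def gaze_heatmap(gaze_points, grid=(8, 5)):
    """Aggregate (x, y) gaze samples into a fixation-density grid."""
    xs, ys = gaze_points[:, 0], gaze_points[:, 1]
    heat, _, _ = np.histogram2d(xs, ys, bins=grid, range=[[0, 1], [0, 1]])
    return heat / heat.max()  # normalise so the hottest cell equals 1.0

# Simulated gaze samples clustered around a product shot at roughly (0.7, 0.4)
rng = np.random.default_rng(seed=0)
samples = rng.normal(loc=[0.7, 0.4], scale=0.08, size=(500, 2)).clip(0, 1)
print(gaze_heatmap(samples).round(2))
```

In a real deployment the gaze points would come from the eye tracking software frame by frame rather than from a simulated distribution, but the roll-up into a density grid is the same idea.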

What is Affect Lab? Elaborate.

The online SaaS platform Affect Lab is a one-stop Emotion AI platform that brings together EEG, facial coding and eye tracking with integrated workflows to support end-to-end consumer research. It is designed to help consumer brands decode consumers' subconscious emotional responses to watching media content and ad commercials, experiencing a product, using UX/UI (user experience/user interface) platforms, driving an automobile and so on, in order to gain insights into what drives their purchase decisions.

Combining our emotion recognition technology and AI models, we capture behavioural and cognitive parameters such as attention, appreciation and attentiveness, along with emotional parameters such as happiness, boredom and familiarity.
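Purely as an illustrative sketch, a per-respondent record of the parameters mentioned above might be structured along these lines; the field names and 0-1 scales are assumptions, not Affect Lab's actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch only: one possible shape for a per-session metrics record.
@dataclass
class SessionMetrics:
    respondent_id: str
    stimulus_id: str       # the ad, product or UX screen being tested
    attention: float       # behavioural/cognitive parameters, scaled 0-1
    mental_effort: float
    engagement: float
    happiness: float       # emotional parameters, scaled 0-1
    boredom: float
    familiarity: float

record = SessionMetrics("r-001", "ad-42", 0.82, 0.35, 0.74, 0.61, 0.12, 0.44)
print(record.attention)
```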

How can features such as eye tracking and facial coding help your clients?

As many as 46% of ad launches are unsuccessful, and that is a huge loss for any company. 82% of the time, the single biggest reason is that brands fail to understand the connection with, and preferences of, their target audience. Affect Lab helps brands measure consumer preference at a subconscious level, allowing them to optimise every aspect of the product or ad experience using emotion recognition techniques like facial coding and eye tracking.

• Media Research

For example, before an ad launch, brands can have the ad or its variants tested by a group of users who represent their target audience. Using our facial coding and eye tracking software, we capture the facial expressions and eye movements of users while they watch the content. A second-by-second analysis is provided to the brand, including heat maps, gaze trails and data on which ad segments evoke the highest attention, engagement or boredom, as well as the overall emotion score and predicted conversion based on industry norms.

Marketers can use this data to optimise their media plans by tailoring them to the audience segments that were most receptive to the content. Eye tracking data can help establish whether brand prominence was as expected and whether the product showcase was effective. We have seen brands realise up to 4x RoI on their marketing spends using our testing platform.
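A simplified, hypothetical illustration of that second-by-second roll-up: per-second attention scores are grouped into ad segments to see which part of the ad held viewers best. The segment names, timings and scores are invented for the example.

```python
# Hypothetical sketch: roll a second-by-second attention trace up into ad
# segments. All timings, segment names and scores are illustrative only.
attention_trace = {0: 0.55, 1: 0.60, 2: 0.72, 3: 0.80, 4: 0.78,
                   5: 0.40, 6: 0.35, 7: 0.65, 8: 0.70, 9: 0.68}  # second -> score
segments = {
    "intro": range(0, 3),
    "product_demo": range(3, 7),
    "call_to_action": range(7, 10),
}

segment_scores = {
    name: sum(attention_trace[s] for s in seconds) / len(seconds)
    for name, seconds in segments.items()
}
best = max(segment_scores, key=segment_scores.get)
print(segment_scores)
print("Highest attention segment:", best)
```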

• UX Research

The same approach holds true for UX testing across mobile apps, websites and chatbots. Emotion and eye tracking help UX designers understand a user's navigation flow, along with their enjoyment, frustration and mental effort levels on a page-by-page basis.

Knowing how website or app visitors feel as they interact with these digital assets enables brands to optimise their UX across navigation, content, presentation and interaction, resulting in disproportionate RoI. For example, for a consumer app, a 10% improvement in checkout drop-off leads to a 2x improvement in cost per acquisition (CPA).
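To show the relationship being described rather than the exact figures, which depend on the app, here is a small hypothetical calculation of how a reduction in checkout drop-off feeds through to cost per acquisition. All numbers are assumptions for illustration.

```python
# Hypothetical sketch: CPA = spend / acquisitions, so cutting checkout drop-off
# raises acquisitions and lowers CPA. All numbers below are illustrative only.
spend = 10_000.0           # marketing spend for the period
checkout_visitors = 5_000  # users who reach the checkout page

def cpa(drop_off_rate):
    acquisitions = checkout_visitors * (1 - drop_off_rate)
    return spend / acquisitions

before, after = 0.90, 0.80  # a 10-point improvement in checkout drop-off
print(f"CPA before: {cpa(before):.2f}")  # 20.00
print(f"CPA after:  {cpa(after):.2f}")   # 10.00, roughly the 2x effect described
```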

Is this based on sentiment analysis? If not, how is it different from sentiment analysis?

Sentiment analysis classically involves:

• Mining the massive amount of chatter generated online by consumers expressing their feelings and attitudes about brands, products or services they have used, across social media sites, review portals, websites and so on.

• Using NLP (natural language processing) to read and interpret the emotions expressed, along with the overall mood around the topic (a minimal sketch of this text-based approach follows below).
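For contrast with the physiological approach described next, a toy sketch of the classical, text-based scoring might look like the following; the lexicon and posts are invented examples, and real systems use far richer NLP pipelines.

```python
# Hypothetical sketch of classical, lexicon-based sentiment scoring of online
# chatter. The lexicon and posts are toy examples only.
LEXICON = {"love": 1, "great": 1, "happy": 1, "poor": -1, "hate": -1, "broken": -1}

def sentiment_score(post: str) -> int:
    return sum(LEXICON.get(word.strip(".,!?").lower(), 0) for word in post.split())

posts = [
    "Love the new phone, great camera!",
    "Battery life is poor and the app feels broken.",
]
for post in posts:
    score = sentiment_score(post)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    print(f"{label:8s} ({score:+d}): {post}")
```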

Our platform uses:

• Emotion recognition technologies and AI modules to read the physiological and neural responses of selected test users, in order to measure emotional, behavioural and cognitive data points.

• Emotion analytics data collected from user responses while the user is experiencing a product or watching content.

While both fall under affective computing, they are very different means of understanding the market and the consumer.

Who are your clients and how are they using this technology for their business?

Our clientele includes GroupM, HSBC, CITI, CHUBB, Born Group, ITC, Myntra, IPSOS, IMRB, L Brands, Xiaomi, TATA, UB Group, Viacom18 and TAM Media Research, among others, based out of India, the USA, Australia, Indonesia and Singapore.

Entropik is a part of several accelerator programs, including Accenture Ventures, Viacom18 VStEP, Intel AI Builders, Oracle Cohort, SAP and Plug and Play.

We help brands by optimising various consumer touch points:

• Making their audio, video, digital and print advertising emotionally efficient.

• Testing their products before and after launch.

• Improving conversions and user experience on their digital assets through our UI/UX testing module, which is used for website and app audits and competitive benchmarking.

• Tracking customer satisfaction in their retail stores using facial coding, via our C-SAT tracking suite.

• Auditing their chatbots with our Human Chatbot Index, while voice- and text-based sentiment analysis is used by brands to improve the efficiency of their call centres.
