The Malta Independent on Sunday

How to safeguard trust in the digital age

Data analytics, AI, and other tools of the digital age can help organisations improve customer experience, but they may also raise concerns about data usage and privacy.

- For more information, please visit www.deloitte.com/mt/consulting

Advanced technologies that gather and apply customer data can be significant differentiators for an organisation – and their use is becoming ubiquitous. Social media sites grant consumers “free” usage of services and, in return, give advertisers access to users’ personal information and activity. Grocery stores hand out customer discount cards to capture and leverage purchase history, often in partnership with outside vendors. Online retailers use and sometimes sell data to build predictive recommendation engines for shoppers. In an era of connected technology and big data, companies have powerful tools to personalise marketing and improve the customer experience (CX).

Yet with great power comes great responsibility. Companies are expected to be trusted stewards of consumers’ information, accessing, using, and storing their data in a manner that maintains – and builds – trust with the customer. Similarly, organisations have a responsibility to use advanced technologies such as AI in a way that supports the mission of the organisation and enhances its relationship with consumers. Indeed, fostering trust is a top marketing trend for this year, according to Deloitte’s 2020 Global Marketing Trends.

Big Data Needs Big Trust

As companies collect more customer data and leverage that data through increasingly sophisticated AI technologies, they face complex choices about how to handle the information they gather. Deloitte’s 2019 Global Consumer Pulsing Survey in the United States, United Kingdom, China, and Brazil shows that consumers have a strong aversion to companies profiting from the sale of their data – even as many are unaware of how pervasive the practice is. Among 4,000 global consumers surveyed, 53% said they would never use a company’s products if the company were selling consumer data for profit, while 40% believed none of an organisation’s profits should be derived from selling data. However, 27% of respondents acknowledged that they never consider how a company uses their data while they are making purchase decisions. (Conversely, 19% always consider company data usage.)

Transparency about how an organisation uses customer data can benefit companies and consumers alike. In one study, 86% of surveyed customers indicated they would be more likely to trust companies with their information if they knew how it would be used to provide a better CX. The takeaway: It’s increasingly important to get the messaging right when communicating how data and AI strategies provide a fair exchange of value for customers.

Align Data Policies to Purpose

Organisations looking to foster trust with consumers can begin by ensuring that data capture and usage align with the core company mission – and, by extension, support the brand’s relationship with the customer. As more data protection and privacy regulations emerge – such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act in the United States – companies have an opportunity to build public trust by proactively reviewing their own consumer data processes.

Data policies become even more complicated when organisations acquire data from second- and third-party vendors. In these cases, it can be difficult to fully understand how the data was acquired and what customers were told about how it would be used. Nonetheless, it’s important to consider what consumers would expect from those possessing their data and to ensure it is managed in a way that is congruent with the company’s purpose.

Build Toward Trusted AI

With data as the foundation, many organisations are leveraging AI to identify and segment audiences, optimise performance, and create a better CX. Yet using AI for these purposes raises two concerns. One is that human bias can lead to flawed methodology assumptions, biased data for training models, or incorrect interpretations of outputs. Another concern relates to transparency: Because AI uses complex modelling that progressively improves without human intervention, it can be difficult to sufficiently explain the methodology when seeking consumer consent.

Regulations like the GDPR incorporate clauses about the use of AI, such as the need to explain to consumers the logic behind automated decision-making. In addition, organisations might consider the following measures to maintain consumer trust regarding AI practices:

• Partner in developing an AI strategy. As AI trust issues escalate, new institutions are forming to help businesses with best practices. For instance, the Algorithmic Justice League partnered with the Center on Privacy and Technology to launch the Safe Face Pledge. Organisations can use this platform to publicly commit to not abusing facial analysis technology. Further, the league offers to assess code to minimise the opportunity for bias and provides instruction on inclusive algorithmic design.

• Design for relevance, not personalisation. Sometimes, special incentives that are meant to be personal can feel invasive. Take, for example, the use of algorithms that infer a woman is pregnant based on web search or purchase history, then serve targeted ad offers. Instead, organisations can pivot algorithms to provide relevant recommendations based on circumstance, not personal history – say, by offering an umbrella on a rainy day rather than an umbrella after someone buys a raincoat.
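The umbrella example above can be made concrete with a minimal sketch. The mapping, function name, and context labels here are hypothetical illustrations, not any vendor’s actual system: the point is that the recommendation is driven entirely by a shared, situational signal (the weather), so no individual’s identity or purchase history is ever consulted.

```python
# Hypothetical sketch: recommend based on the current context (e.g. weather),
# not on any shopper's personal purchase history.

CONTEXT_OFFERS = {
    "rain": ["umbrella", "waterproof jacket"],
    "heatwave": ["sunscreen", "bottled water"],
    "cold": ["gloves", "thermal flask"],
}

def contextual_recommendations(context: str) -> list[str]:
    """Return offers tied to a shared situational signal only.

    No user identifier or history is consulted, so every shopper in the
    same circumstance sees the same suggestion - relevant, not invasive.
    """
    return CONTEXT_OFFERS.get(context, [])

print(contextual_recommendations("rain"))  # → ['umbrella', 'waterproof jacket']
```

Because the function takes no customer argument at all, there is nothing personal for it to leak: the design choice enforces the policy.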

• Explain how consumers benefit. Companies can demonstrate how AI leads to better innovation and, therefore, better experiences for the customer. For instance, Amazon gleans insights from its purchase data to better build its supplier network to match consumer demand. Notably, this data is used at an aggregate rather than at an individual level.
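The aggregate-versus-individual distinction in the bullet above can also be sketched in a few lines. The records and field names here are invented for illustration (this is not Amazon’s actual pipeline): the idea is simply that customer identifiers are dropped before counting, so only product-level demand totals survive to be shared downstream.

```python
# Hypothetical sketch: reduce purchase records to aggregate demand counts,
# discarding the customer identifier in the process.
from collections import Counter

purchases = [
    {"customer_id": "c1", "product": "umbrella"},
    {"customer_id": "c2", "product": "umbrella"},
    {"customer_id": "c1", "product": "raincoat"},
]

def aggregate_demand(records: list[dict]) -> Counter:
    """Count units sold per product; no per-customer data survives."""
    return Counter(record["product"] for record in records)

print(aggregate_demand(purchases))  # → Counter({'umbrella': 2, 'raincoat': 1})
```

Planning against these totals lets a retailer match supply to demand while keeping each individual’s purchase history out of the shared dataset.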

As the use of data analytics and AI continues to expand, so does its potential impact on trust. Companies are likely to feel increasing pressure to show they are good stewards of customer data. In this light, they would do well to build a high level of trust with stakeholders by proactively and transparently demonstrating good behaviour.

