Fiji Sun

Facebook is killing democracy with its personality profiling data

The information that you shared on Facebook exposed your hopes and fears. That innocent-looking Facebook quiz isn’t so innocent

- TIMOTHY SUMMERS | The Conversation

Which state should you move to based on your personality? What character on television would you be? What breed of dog is best for you? Some enormous percentage of Facebook’s 2.13 billion users must have seen Facebook friends sharing the results of various online quizzes.

They are sometimes annoying, senseless and a total waste of time. But they are irresistible. Besides, you’re only sharing the results with your family and friends. There’s nothing more innocent, right? Wrong.

Facebook is in the business of exploiting your data. The company is worth billions of dollars because it harvests your data and sells it to advertisers. Users are encouraged to like, share and comment their lives away in the name of staying connected to family and friends. However, as an ethical hacker, security researcher and data analyst, I know that there is a lot more to the story. The bedrock of modern democracy is at stake.

You are being psychographically profiled

Most people have heard of demographics – the term used by advertisers to slice up a market by age, gender, ethnicity and other variables to help them understand customers. In contrast, psychographics measure people’s personality, values, opinions, attitudes, interests and lifestyles. They help advertisers understand the way you act and who you are.

Historically, psychographic data were much harder to collect and act on than demographics. Today, Facebook is the world’s largest treasure trove of this data. Every day billions of people give the company huge amounts of information about their lives and dreams.

This isn’t a problem when the data are used ethically – like when a company shows you an ad for a pair of sunglasses you recently searched for. However, it matters a lot when the data are used maliciously – segmenting society into disconnected echo chambers, and custom-crafting misleading messages to manipulate individuals’ opinions and actions. That’s exactly what Facebook allowed to happen.

Quizzes, reading your mind and predicting your politics

Recent reports have revealed how Cambridge Analytica, a UK-based company owned by an enigmatic billionaire and led at the time by candidate Donald Trump’s key adviser Steve Bannon, used psychographic data from Facebook to profile American voters in the months before the 2016 presidential election. Why? To target them with personalised political messages and influence their voting behaviour. A whistleblower from Cambridge Analytica, Christopher Wylie, described in detail how the company exploited Facebook users by harvesting their data and building models to “target their inner demons.”

How did Facebook let this happen?

The company does more than just sell your data. Since the early 2000s, Facebook has provided access to academic researchers seeking to study you. Many psychologists and social scientists have made their careers analysing ways to predict your personality and ideologies by asking simple questions. These questions, like the ones used in social media quizzes, do not appear to have obvious connections to politics. Even a decision like which web browser you are using to read this article is filled with clues about your personality.

In 2015, Facebook gave permission to academic researcher Aleksandr Kogan to develop a quiz of his own. Like other quizzes, his was able to capture all of your public information, including name, profile picture, age, gender and birthday; everything you’ve ever posted on your timeline; your entire friends list; all of your photos and the photos you’re tagged in; education history; hometown and current city; everything you’ve ever liked; and information about the device you’re using, including your web browser and preferred language. Kogan shared the data he collected with Cambridge Analytica, which was against Facebook policy – but apparently the company rarely enforced its rules.

Going shopping for impressionable users

Analysing these data, Cambridge Analytica determined the topics that would intrigue users, what kind of political messaging users were susceptible to, how to frame the messages, the content and tone that would motivate users, and how to get them to share it with others. It compiled a shopping list of traits that could be predicted about voters.

Then the company was able to create websites, ads and blogs that would attract Facebook users and encourage them to spread the word. In Wylie’s words: “they see it … they click it … they go down the rabbit hole.” This is how American voters were targeted with fake news, misleading information and contradictory messages intended to influence how they voted – or whether they voted at all. This is how Facebook users’ relationships with family and friends are being exploited for monetary profit, and for political gain.

Facebook could have done more to protect users

The company encouraged developers to build apps for its platform. In return, the apps had access to vast amounts of user data – supposedly subject to those rules that were rarely enforced.

But Facebook collected 30 per cent of payments made through the apps, so its business interest made it want more apps, doing more things. And people who didn’t fill out quizzes were vulnerable, too.

Timothy Summers is Director of Innovation, Entrepreneurship and Engagement, University of Maryland, USA
