Otago Daily Times

How Facebook influences elections

Facebook is tilting the political playing field more than ever, and it’s no accident, writes Michael Brand.

- Michael Brand is an adjunct associate professor of data science and artificial intelligence, Monash University, Melbourne.

AS the US presidential election polling day draws close, it’s worth recapping what we know about how Facebook has been used to influence election results.

The platform is optimised for boosting politically conservative voices calling for fascism, separatism and xenophobia. It’s also these voices that tend to generate the most clicks.

In recent years, Facebook has on several occasions been made to choose between keeping to its community standards or taking a path that avoids the ire of conservatives. Too many times, it has chosen the latter.

The result has been an onslaught of divisive rhetoric that continues to flood the platform and drive political polarisation in society.

How democracy can be subverted online

According to The New York Times, earlier this year US intelligence officials warned that Russia was interfering in the 2020 presidential campaign, with the goal of getting President Donald Trump re-elected.

This was corroborated by findings from the US Brennan Center for Justice. A research team led by journalism and communications professor Young Mie Kim identified a range of Facebook troll accounts deliberately sowing division ‘‘by targeting both the left and right, with posts to foment outrage, fear and hostility’’.

Most were linked to Russia’s Internet Research Agency (IRA), the company also behind a 2016 US election influence campaign. Prof Kim wrote the troll accounts seemed to discourage certain people from voting, with a focus on swing states.

Earlier this month, Facebook announced a ban (across both Facebook and Instagram, which Facebook owns) on groups and pages devoted to the far-right conspiracy group QAnon. It also removed a network of fake accounts linked to a conservative US political youth group, for violating rules against ‘‘coordinated inauthentic behaviour’’.

However, despite Facebook’s repeated promises to clamp down harder on such behaviour — and occasional efforts to actually do so — the company has been widely criticised for doing far too little to curb the spread of disinformation, misinformation and election meddling.

According to a University of Oxford study, 70 countries (including Australia) practised either foreign or domestic election meddling in 2019. This was up from 48 in 2018 and 28 in 2017. The study said Facebook was ‘‘the platform of choice’’ for this.

The Conversation approached Facebook for comment regarding the platform’s use by political actors to influence elections, including past US elections. A Facebook spokesman said:

‘‘We’ve hired experts, built teams with experience across different areas, and created new products, policies and partnerships to ensure we’re ready for the unique challenges of the US election.’’

When Facebook favoured one side

Facebook has drawn widespread criticism for its failure to remove posts that clearly violate its policies on hate speech, including posts by President Trump.

The company openly exempts politicians from its fact-checking program and knowingly hosts misleading content from politicians, under its ‘‘newsworthiness exception’’.

When Facebook tried to clamp down on misinformation in the aftermath of the 2016 presidential election, ex-Republican staffer turned Facebook executive Joel Kaplan argued doing so would disproportionately target conservatives, The Washington Post reported.

The Conversation asked Facebook whether Mr Kaplan’s past political affiliations indicated a potential for conservative bias in his current role. The question wasn’t answered.

Facebook’s board also now features a major Trump donor and vocal supporter, Peter Thiel. Facebook’s chief executive, Mark Zuckerberg, has himself been accused of getting ‘‘too close’’ to Trump.

Moreover, when the US Federal Trade Commission investigated Facebook’s role in the Cambridge Analytica scandal, it was Republican votes that saved the company from facing antitrust litigation.

Overall, Facebook’s model has shifted towards increasing polarisation. Incendiary and misinformation-laden posts tend to generate clicks.

As Zuckerberg himself notes, ‘‘when left unchecked, people on the platform engage disproportionately’’ with such content.

Over the years, conservatives have accused Facebook of anti-conservative bias, accusations over which the company has faced financial penalties from the Republican Party. This is despite research indicating no such bias exists on the platform.

Fanning the flames

Facebook’s addictive news feed rewards us for simply skimming headlines, conditioning us to react viscerally.

Its sharing features have been found to promote falsehoods. They can trick users into attributing news to their friends, leading them to place trust in unreliable sources and providing a breeding ground for conspiracy theories.

Studies have also shown social media to be an ideal environment for campaigns aimed at creating mistrust, which helps explain eroding trust in science and expertise.

Worst of all are Facebook’s ‘‘echo chambers’’, which convince people that only their own opinions are mainstream. This encourages hostile ‘‘us versus them’’ dialogue, which leads to polarisation. This pattern suppresses valuable democratic debate and has been described as an existential threat to democracy.

Facebook’s staff haven’t been shy about skewing liberal, even suggesting in 2016 that Facebook should work to prevent Trump’s election. Around 2017, they proposed a feature called ‘‘Common Ground’’, which would have encouraged users with different political beliefs to interact in less hostile ways.

Mr Kaplan opposed the proposal, according to The Wall Street Journal, due to fears it could trigger claims of bias against conservatives. The project was shelved in 2018.

Facebook’s track record isn’t good news for those who want to live in a healthy democratic state. Polarisati­on certainly doesn’t lead to effective political discourse.

While several blog posts from the company outline measures being taken to supposedly protect the integrity of the 2020 US presidential election, it remains to be seen what this means in reality. — theconversation.com

PHOTO: REUTERS — Partisan tendencies . . . Facebook’s chief executive Mark Zuckerberg has been accused of getting ‘‘too close’’ to United States President Donald Trump.
