The Guardian (USA)

Facebook isn't going to influence the next election – until it does

- Alex Hern

If you live in the US or UK, Facebook wants you to know that it isn’t going to influence who wins the next election.

But the company also wants you to know that it’s going to do its best to make sure that no one else exploits its platform to tilt the elections through attacks on democracy, deliberate misinformation or foreign interference.

Unless the deliberate misinformation comes from a political candidate – in which case Facebook’s not going to do anything, because, again, it doesn’t want to influence who wins the next election.

If this all seems a bit confusing, you aren’t alone. Facebook has set itself an impossible challenge. It needs to reassure everyone – of every political persuasion, in every country it operates in – that it’s not going to do anything to harm the democratic process. And it wants to convey this message without reminding everyone of the elephant in the room: that, if it wanted, Facebook could almost certainly choose the winner of the next election, completely legally and without anyone knowing.

Mark Zuckerberg doesn’t want that power. He’s said so, again and again. Last week, in a speech at Georgetown University, he defended the company’s laissez-faire approach to political misinformation, saying: “As a principle, in a democracy, I believe people should decide what is credible, not tech companies. Of course there are exceptions, and even for politicians we don’t allow content that incites violence or risks imminent harm – and of course we don’t allow voter suppression.”

That same desire, to absent himself and his company from any responsibility to shepherd the democratic process, is also what lies behind Facebook’s tortuous third-party factchecking programme, introduced in 2016. To wit: the company selects professional factchecking organisations, ranging from respected impartial nonprofits (FullFact in the UK, FactCheck.org in the US) to … others (Tucker Carlson’s Daily Caller).

It pays those organisations to focus some of their time and attention on viral posts on Facebook and Instagram. If they flag a specific post or claim as false, Facebook then marks it as such on its own sites, and makes unspecified tweaks to its undocumented algorithms to reduce how many people see the posts. On Monday, the company announced that it would also start hiding false pictures and videos behind a clickthrough link declaring them incorrect, and warning users if the post they are about to share is untrue.

All of which, three years after the programme was launched, adds up to a vaguely reasonable system to keep some of the clearest falsehoods from going mega-viral on Facebook’s social networks. But the bizarre hoops the company has jumped through to achieve those ends have nothing to do with whether or not they are the best way to actually fight misinformation, and everything to do with Facebook’s desire to be seen as a company that is powerless over the content it hosts.

It’s this same factchecking programme that Facebook recently exempted political candidates from, in a move that had the knock-on effect of explicitly allowing those candidates to lie in adverts on the site. Not content with tying itself to the mast, Facebook bound the factcheckers alongside it: it wasn’t going to decide what was credible, but neither were they. Never mind that that’s their job; Facebook needs to be seen to have its hands off politics.

But Facebook’s problem is that politics is always in its hands. There’s a longstanding myth about the 1960 presidential election: that Richard Nixon came off better among voters who listened to the debates on radio, but that John F Kennedy won among those who watched on TV. The story goes that the shifting technological base of the US helped the eventual winner shore up his vote.

Sixty years on, we have another shifting technological base. God help a candidate who comes off worse in vertical video, the format favoured by so many mobile apps. But not every change in the media environment is purely technological. Some are deliberate. Over the past few years, for instance, Facebook has artificially intervened in its timeline to boost video, then live video, then quality news, then posts from your friends.

Suppose one of the presidential candidates in the 2020 election comes off better in livestreams, while the other looks more natural in slickly edited videos. What would Facebook do in response? Would it commit to boosting both types of content equally? Would it freeze its algorithm in place for six months? Or declare that it doesn’t want to pick which candidate gets an advantage, so it’s ditching videos altogether until after the election? Of course not. But if Facebook did decide to boost livestreams over the following period, it would be hard to escape the conclusion that, just like in 1960, the medium had defined the message.

Perhaps that’s too far-fetched. Suppose, instead, that one candidate was extremely good at wielding falsehoods for political gain, while the other had built a reputation for their deep knowledge of arcane policy issues. What would we say, then, about a decision to crack down on misinformation in politics? Or, conversely, a decision to explicitly allow any and all falsehoods to gain the full weight of the viral machine?

Facebook is like a giant that wanders into a busy town, then closes its eyes and declares that if it can’t see where it’s going, whoever gets crushed underfoot isn’t its fault. Ultimately there are only two ways to stop it changing everything: keep it away entirely – or bring it down to size.

• Alex Hern is the UK technology editor for the Guardian

‘If it wanted, Facebook could almost certainly choose the winner of the next election, completely legally and without anyone knowing.’ Mark Zuckerberg speaks at Georgetown University. Photograph: Nick Wass/AP
‘Facebook’s problem is that politics is always in its hands.’ Facebook and Instagram ads that appeared during the 2016 presidential campaign. Photograph: Jon Els
