Facebook won’t unfriend lying politicians
Social media giant will be even bigger target on Capitol Hill if it vets election ads, says Laurence Dodds in San Francisco.
Facebook does not want to be the arbiter of truth. That, at least, is what Mark Zuckerberg keeps telling us. In advance of a recent speech at Georgetown University in Washington DC, defending his decision not to fact-check political advertisements, he said: ‘‘I don’t think people want to live in a world where you can only say things that tech companies decide are 100 per cent true.’’
It’s a superficially plausible argument for anyone who regards Facebook’s enormous powers of censorship – covering 2.45 billion people at the last count – with alarm.
The problem is that it contradicts what Zuckerberg himself has been saying for the last three years.
The firestorm over Facebook’s new policy has now been burning for more than a month. Numerous deliberately false ads have been uploaded in order to test it, in some cases forcing Facebook to break the spirit of its own policy and ban them anyway.
Elizabeth Warren, the US presidential candidate who wants to break up tech giants, has made antagonising Facebook a key plank of her campaign. A survey by Loup Ventures, a venture capital firm, found that 46 per cent of US liberals and 47 per cent of conservatives disagree with the policy, while Marissa Mayer, the former chief executive of Yahoo, has called for social networks to ban political ads entirely.
Facebook’s rivals have also turned up the heat. On Tuesday, Snapchat said that it would fact-check the political ads shown to its mostly young users. And on Friday last week, Twitter’s new political advertising ban came into force, forbidding candidates to promote their tweets and imposing strict targeting restrictions on ‘‘issue’’ ads.
‘‘We believe political message reach should be earned, not bought,’’ Twitter’s chief executive, Jack Dorsey, said pointedly, citing the dangers of ‘‘unchecked misleading information’’.
Zuckerberg has resisted such a ban, saying social media ads level the playing field between rich, established candidates and insurgent outsider campaigns.
There’s truth to that: social media has crashed the price of advertising, undercutting traditional media by an estimated 40 per cent.
Research suggests that, compared to television, Facebook ads come from a wider range of candidates and are less likely to be hostile ‘‘attack ads’’ (although they are also more fragmented, showing different messages from the same campaigns to different people).
Twitter’s new policy may also create its own problems. The company has already been accused of letting big companies lobby for their interests while banning the activists who point out their problems.
Its many carve-outs and exemptions could easily become loopholes for partisan media outfits or party outriders to launder their allies’ message.
Twitter will certainly be forced to make daily judgment calls – an ability many users have utterly lost faith in. As has been pointed out, identifying candidate ads is simple, but identifying political issue ads can be far more complex.
When it comes to fact-checking, however, Facebook’s arguments also don’t add up. Consider what Zuckerberg said at Georgetown: that Facebook shouldn’t decide what politicians can say. Yet only six days later, in testimony before the Senate, Zuckerberg contradicted himself.
‘‘Actually, Facebook itself does not fact-check,’’ he told senators. ‘‘When content is getting a lot of distribution, it’s flagged by members of our community, or our technical systems, [and] it goes into a queue to be reviewed by independent fact-checkers.’’
That’s true enough. After the 2016 US election, in which Facebook’s algorithms were accused of systematically promoting hyper-partisan fake news, Facebook was understandably leery of simply deleting such content.
Instead it outsourced the decision to independent, accredited fact-checkers, applying a warning label to anything rated ‘‘false’’ and instructing its algorithms to show it to fewer people. It did so precisely to avoid becoming an ‘‘arbiter of truth’’.
But now Zuckerberg is saying that applying this programme to politicians would put him in that very position.
Zuckerberg seems to be trying to have it both ways, describing the same system as hands-on censorship or hands-off prudence depending on whom it is applied to. That’s not to say Facebook is neutral in either scenario. It is the company’s choice to accept fact-checkers’ judgment, and the company’s choice to enforce that judgment on billions of people.
When member of Congress Alexandria Ocasio-Cortez asked Zuckerberg if she’d be allowed to run ads falsely claiming Republicans backed the Green New Deal, Zuckerberg said: ‘‘I think probably.’’
‘‘Do you see a potential problem here with a complete lack of fact-checking on political advertisements?’’ Ocasio-Cortez responded.
Zuckerberg’s other argument is that politicians are already uniquely scrutinised, and that if their speech is taken down or ‘‘downranked’’ before people get to see it then the electorate will be less informed. That may be so, but all politicians’ ads will still be visible in Facebook’s highly useful ad archive, and in any case applying this policy
would not have to mean deleting any false ads, as Zuckerberg has implied. Instead it could mean simply hitting them with the same warning labels and the same algorithmic distribution penalties that Facebook applies to other content.
It could mean watering down those penalties, or applying only a warning label and no change in ranking. Even those measures would be less controversial than a total exemption.
Nevertheless, Facebook is unlikely to do any of that. To understand its reluctance, simply consider what each group of people can do to Facebook if it censors them. Ordinary users can quit one at a time or sign a petition, whereas politicians – especially American politicians – can destroy it.
During the Obama years, Facebook courted the White House extensively, hiring numerous former officials from the Obama administration. Today, the company shows special deference to the US Republican Party, which has frequently accused it of systematically suppressing conservative voices.
There isn’t much evidence for this: research suggests Right-wing pages continue to thrive on Facebook, with the top-performing news stories often dominated by Right-leaning media sources such as Fox and Breitbart.
Still, Facebook has responded to these accusations by donning a hair shirt, even commissioning an audit by former Republican senator Jon Kyl, which came to no firm conclusion but aired the fears, suspicions and anecdotes of 133 unnamed respondents.
Moreover, the company now goes to great lengths to maintain good relationships with American conservatives. It has hired public relations and strategy firms linked to the Republican Party, and the top three operatives in its Washington DC office are veteran Republicans.
Joel Kaplan, Facebook’s vice-president of public policy, is also a lifelong Republican who served in the administration of President George W Bush. According to reports, he shot down a new fact-checking initiative on the basis it would unfairly target the Right.
Whatever the truth of that, it’s not hard to imagine what would happen if Facebook began demoting ads. Allies of Donald Trump, who has gone further than most in broadcasting blatantly false claims, would kick off, and Facebook would spend every week from now until the 2020 election fighting an even bigger firestorm.
If Trump won a second term, Facebook would become a target; if not, his party would be back in power within a decade or so.
Why would Facebook take that chance? Every time it chooses to exercise its power, it creates expectations that it will do so in future, plunging it into an eternal tug-of-war between political opponents who want it to rule in their favour. Over the past 15 years it has bitten that bullet for nudity, terrorist content, hate speech and fake news – but none of those were paid for directly by politicians.
So don’t expect Facebook to change course, despite the firestorm. To do so would be, if not exactly biting the hand that feeds, at least biting the hand that holds the slipper aloft.