Instead of trying to protect democracy, Facebook passes the buck
Last week’s announcements from Facebook aimed at protecting the integrity of Canadian elections miss the key issue.
Having admitted its platform was used by Russians to try to influence the U.S. election, and with those methods now a public playbook for any and all potential meddlers, the company offered the bare minimum: a handy 21-page guide and a training regime on choosing better passwords for Canadian politicians and political staffers.
Alongside this sits some money for a Facebook-funded media literacy effort, and a special email hotline for politicians whose accounts have been compromised. The message from Facebook is: “Dear users, it’s your inability to spot fake news that’s the problem — not our platform’s publishing and dissemination of it.”
Facebook needs to take a global approach, reforming its platform across the world, instead of making tokenistic efforts in Canada alone. If it looked within, and acknowledged that some common-sense regulation is needed in this space, Facebook could actually make a dent in the issues that are shaking confidence in elections. And not just that: Facebook could shine a light on the more pernicious practice of micro-targeting, or the slicing and dicing of the electorate through highly personalized ads, that is making our politics more divided.
What’s the recipe Facebook needs to follow to take its responsibilities in a democracy more seriously?
First, Facebook should describe the measures it plans to take to bring transparency to political ads. CEO Mark Zuckerberg recently announced that the company will soon start to publish so-called “dark ads” (that is, ads that only the person being targeted, and no one else, is able to see). But the company has yet to tell us exactly what this entails. At the very least, it should include detailed targeting information and the money spent (or still to be spent) on an ad. It should declare how many people have seen the paid component of the ad, and how much engagement resulted from its promotion. As a Facebook user, you should be able to quickly see the ads that have appeared in your feed, and those that have appeared in everyone else’s.
This information should be made available to researchers, journalists and regulators so they can work together to understand what’s going on as it happens — not months or years later, and only when the company is under severe political pressure.
Second, Facebook should require anyone wishing to place a political ad to declare it — as is generally the case with political ads in print, radio or TV. By checking a box to mark an ad as political, campaigns and candidates signal to Facebook that their ads require human verification. This declaration would give the company permission to display these ads differently, perhaps showing a disclaimer and additional information about the ad’s targeting and ultimate backer. Political advertisements failing to make this declaration could be reported and sent for further checks, creating a backup plan for any that slip through the initial vetting.
Third, posts deemed to be fake news — posts that are demonstrably and verifiably false, and not just coverage from media outlets that Donald Trump doesn’t like — should simply be deleted. And users who interacted with these posts by sharing or commenting on them should be informed that they have helped spread fake news.
Finally, the company should publish clear rules of engagement with political campaigns. Currently, it offers help and support to parties in the same way it does to many of its corporate clients. Facebook isn’t being clear about the actual degree of support it offers to campaigns in using its platform and targeting its ads.
In the absence of Facebook taking meaningful steps, political parties should take the threats to free and fair elections from harmful misinformation seriously, and work together to adopt a higher standard of transparency. They should publish all of their ads, and the associated targeting and spending, on their own websites. They should declare their meetings with Facebook and the other giant internet companies. And they should stay away from dark ads and the excessively negative content frequently associated with them.
The public knows online politics is a wild west. But they have almost no tools to find out what is really going on. Politicians, and Facebook as a company, rely on more than the public’s data — they rely on public trust. Meaningful transparency in online political ads is the only way to preserve trust, and keep our democracy healthy.