Toronto Star

Instead of trying to protect democracy, Facebook passes the buck

- SAM JEFFERS

Last week’s announcements from Facebook aimed at protecting the integrity of Canadian elections miss the key issue.

Having admitted its platform was used by Russians to try to influence the U.S. election, and with those methods now a public playbook for any and all potential meddlers, the company offered the bare minimum: a handy 21-page guide and training regime for Canadian politicians and political staffers on choosing better passwords.

Alongside this sits some money for a Facebook-funded media literacy effort, and a special email hotline for politicians whose accounts have been compromised. The message from Facebook is: “Dear users, it’s your inability to spot fake news that’s the problem, not our platform’s publishing and dissemination of it.”

Facebook needs to take a global approach, reforming its platform across the world, instead of making tokenistic efforts in Canada alone. If it looked within, and acknowledged that some common-sense regulation is needed in this space, Facebook could actually make a dent in the issues that are shaking confidence in elections. And not just that: Facebook could shine a light on the more pernicious practice of micro-targeting, or the slicing and dicing of the electorate through highly personalized ads, which is making our politics more divided.

What’s the recipe Facebook needs to follow to take its responsibilities in a democracy more seriously?

First, Facebook should describe the measures it plans to take to bring transparency to political ads. CEO Mark Zuckerberg recently announced that the company will soon start to publish so-called “dark ads” (that is, ads that only the person being targeted, and no one else, is able to see). But the company has yet to tell us exactly what this entails. At the very least, it should include detailed targeting information and the money spent (or still to be spent) on an ad. It should declare how many people have seen the paid component of the ad, and how much engagement resulted from its promotion. As a Facebook user, you should be able to quickly see the ads that have appeared in your feed, and those that have appeared in everyone else’s.

This information should be made available to researchers, journalists and regulators so they can work together to understand what’s going on as it happens, not months or years after the fact and only when the company is under severe political pressure.

Second, Facebook should require anyone wishing to place a political ad to declare it, as is generally the case with political ads in print, radio or TV. By checking a box to mark an ad as political, campaigns and candidates signal to Facebook that their ads require human verification. This declaration would give the company permission to display these ads differently, perhaps showing a disclaimer and additional information about the ads’ targeting and ultimate backer. Political advertisements failing to make this declaration can be reported and sent for further checks, creating a backup plan for any that slip through the initial vetting.

Third, posts deemed to be fake news (demonstrably and verifiably false posts, not just the media that Donald Trump doesn’t like) should simply be deleted. And users who interacted with these posts by sharing or commenting on them should be informed that they have been spreading fake news.

Finally, the company should publish clear rules of engagement with political campaigns. Currently, it offers help and support to parties in the same way it does to many of its corporate clients. Facebook isn’t being clear about the actual degree of support it offers to campaigns in using its platform and targeting their ads.

In the absence of meaningful steps from Facebook, political parties should take the threats to free and fair elections from harmful misinformation seriously, and work together to adopt a higher standard of transparency. They should publish all of their ads, and the associated targeting and spending, on their own websites. They should declare their meetings with Facebook and the other giant internet companies. And they should stay away from dark ads and the excessively negative content frequently associated with them.

The public knows online politics is a wild west. But they have almost no tools to find out what is really going on. Politicians, and Facebook as a company, rely on more than the public’s data: they rely on public trust. Meaningful transparency in online political ads is the only way to preserve that trust and keep our democracy healthy.

Sam Jeffers is a Visiting Global Fellow at the Ryerson Leadership Lab and the cofounder of Who Targets Me?, a crowd-sourced platform to bring transparency to Facebook political advertising.
