Facebook’s election plan falls short, experts say
Ottawa urged to crack down on fake news targeting voters
Facebook Canada’s election integrity plan fails to get at the heart of the problem — and government should step in, say experts in digital politics.
The social media network released a cybersecurity guidebook for politicians last week and launched an exclusive email helpline for politicians whose accounts are hacked by malicious players. It’s also partnering with non-profit MediaSmarts to run a news literacy campaign that will help citizens spot political misinformation ahead of the 2019 federal election.
The plan follows a report from Canada’s electronic spy agency in June that said the election could be vulnerable to cyberhackers and adversaries sowing fake news online.
Taylor Owen, an assistant professor of digital media and global affairs at the University of British Columbia, said Facebook’s response misses the mark and it’s up to Ottawa to fill gaps in policy.
“The economics and the functioning of the platform bump right up against our ability to govern our elections — and we’re not going to solve that through news literacy. We can only solve that by much more dramatic policy measures from governments, not from Facebook,” he said.
Because Facebook uses algorithms and profile data to tailor users’ news feeds, advertisers are able to strategically and directly target audiences based on their interests and political leanings — making it an effective campaign tool.
That can also potentially incentivize the dissemination of propaganda and misinformation in elections.
Facebook’s plan is “relying on this omni-competent citizen to be able to know that there’s something being run against them . . . It’s not actually addressing some of the root systemic causes — which is producing the possibility of fake news being lucrative, or the accountability issues in their ad targeting system,” said Fenwick McKelvey, an assistant professor of information and technology policy at Concordia University.
In the United States, the company has faced mounting pressure to shed more light on micro-targeted advertising since revealing Russian-linked groups placed approximately 3,000 ads to disrupt the 2016 presidential election.
Facebook agreed to hand over the posts to congressional investigators and the company’s lawyer is slated to testify in November about possible Russian interference in last year’s election.
Google and Twitter, also facing flak for enabling foreign influence in the election, will testify too.
Facebook previously vowed to make so-called dark advertising on its platform less opaque with a new tool — expected to be operational in time for Canada’s 2019 election — that would show users who ran a particular ad and all the other ads that organization is running. Dark advertising allows organizations to promote their message only to an intended recipient, meaning a politically charged group can direct messages with deceptive content on hot-button issues, or false election promises, and that user is the only one to see it.
The pressure culminated last week in U.S. senators introducing a bipartisan bill that calls for more transparency in online advertising.
Canadian policymakers must follow suit, said Elizabeth Dubois, an assistant professor at the University of Ottawa who focuses on digital democratic accountability and engagement.
“The laws are there for a reason. Right now, we don’t have the data or the ability to enforce them properly, and we need the platforms to be on board,” Dubois said. Owen and McKelvey echoed the sentiment.
“They need to demand access to these ads . . . The first step to accountability has to be at least having access to the data,” Owen said.
All three want Elections Canada’s mandate expanded to fully cover digital political campaigns, including forcing platforms to disclose all information on targeted ads posted during the campaign, such as where they are placed, who sees them, who purchased them and for how much.
Owen suggested empowering Elections Canada to impose fines on platforms that don’t promptly remove politically motivated hate speech.
The voter contact registry could also be expanded to cover social-media bots, networks of computer accounts run by one user to amplify a message, which would lower the potential for vote-suppression tactics or a robocall-esque debacle, Dubois said. The registry requires political entities that make automated phone calls to voters to register. It was established after thousands of people received robocalls with false instructions on where to vote in the 2011 election.
“Does (the initiative) go to what the biggest threats are in our democracy? I would say no, I don’t think it is solving the major problems,” she said.
Elections Canada requires political and third parties to disclose who paid for an ad, but if there’s not enough space to do that directly in the post, it’s acceptable for the disclosure to be made on the page the ad links to. There’s no guarantee, however, that someone scrolling through their feed would click through. Political ad campaigns are also subject to spending caps, but filing requirements don’t ask for specifics on digital promotions.
Kevin Chan, Facebook Canada’s head of public policy, acknowledged that the measures in the company’s plan are not a “silver bullet” and said there are many actors involved in strengthening digital democracy.
“That’s not to say as a platform we don’t take our responsibility seriously and we don’t want to do what we can,” Chan said.
“We are very lucky and fortunate, in a way, that we have the luxury of thinking about this two years in advance.”
He stressed that the initiative is a direct response to the spy agency’s report, the first of its kind, and that there is potential for further action closer to the federal vote. That includes cracking down on fake accounts that create mischief in elections, as Facebook has done elsewhere, most recently in the French election, where it said it targeted 30,000 inauthentic accounts.
The platform also introduced technical safeguards to make it more difficult to post clickbait, with an eye to quashing the financial incentive to mislead.
“We have to be very careful of this challenge of removing inauthentic content, but not removing things that may be legitimate free speech,” Chan said.
Democratic Institutions Minister Karina Gould, who was front and centre for Facebook’s announcement, said cybersecurity is a “responsibility we all share.” She lauded the initiative but said it is only a first step.
“Social media platforms have become the new arbiters of information, and have an important responsibility to facilitate respectful and informed public discourse,” Gould said.
“It is important that we have conversations with social media platforms to ensure the continued protection of Canada’s democratic process.”