Truth and Cyber Security
Facebook against Fake News
On September 25, Kevin Chan, Facebook Canada’s global director and head of public policy, visited McGill University’s Max Bell School of Public Policy to speak about the rise of fake news and Facebook’s fight against misinformation. “I’m attending this seminar because Facebook has filled a role in society where it has come to have as much influence as the New York Times or any journalistic publication. If you polled people, the majority would probably say they get their news from Facebook,” said U2 International Development student Brandon Heiblum.
Although social media does make information more easily accessible to the public, it has also enabled the spread of “fake news,” or misinformation. Fake news, Chan acknowledged, is a significant issue in today’s world, and Facebook has been making efforts to address it.
Following the 2016 US presidential election, Facebook faced widespread criticism for the negative impact its platform had on election integrity. The website was a major conduit for articles containing false information.
Chan admitted that in 2016 Facebook was slow to act on this issue, but said the company wants to rectify this moving forward. He explained that the platform is meant to be a forum for sharing different voices and opinions, and quoted Mark Zuckerberg, Facebook’s founder: “I don’t want anyone to use our tools to undermine democracy because that’s not what we stand for.”
Chan stated that his team is doing everything it can to keep Facebook safe, including using artificial intelligence to find and delete fake Facebook accounts and stop the spread of misinformation. The company has also introduced new ad transparency features and fact-checking partnerships with leading journalistic publications. “We are committed to making Facebook a force for good for democracy,” said Chan.
He opened the seminar with a discussion of Facebook’s role in a current issue: keeping provincial elections in Quebec safe from interference. The Communications Security Establishment of the Canadian government had told Facebook that misinformation and account hacking were the biggest threats to election integrity. In response, Facebook created a five-point plan called the Canadian Election Integrity Initiative. First, it included a two-year program with Media Smarts, Canada’s centre for digital and media literacy, to help Canadians learn how to detect a fake news article for themselves. Second, Facebook released its own “cyber hygiene guide” for party members and politicians to learn better cyber security practices and protect against account hacking. Third, an emergency Facebook cyber hotline was created for political parties to address issues such as suspected hacks. Fourth, a cyber hygiene training program was opened to all political parties. Lastly, Facebook implemented an advertising transparency initiative that allows users to view all ads being run by a particular page whenever they see an advertisement.
Facebook has also taken additional measures to end the spread of misinformation. It has paired up with Agence France-Presse (AFP), a news agency, to hire fact checkers to review content in both French and English. Stories that AFP has flagged receive significantly fewer shares, slowing the spread of misinformation. Users are also notified of “false information” before they share any flagged content. Chan specified that Facebook has chosen to work with independent, third-party fact checkers because it believes these third parties are more qualified to identify misinformation.
A program called “Reality Check” comes as a result of Facebook’s partnership with Media Smarts. This initiative releases videos and tip sheets to help users stay informed. Their most recent video, Authenticity 101, lists five steps people can take to make sure the content they share is accurate.
Chan said he is frequently asked whether things are getting better or worse in terms of the spread of misinformation. In response, he stated that he truly believes Facebook is doing everything it can to move in the right direction. “You can never fully solve a security problem; threats will constantly find new ways to cause harm. But our goal is to make it much harder for actors to operate across our platforms,” said Chan. “Of course, our work can never be done and we remain vigilant to defeat bad actors and emerging cyber risks. We expect to add additional election integrity measures in the months to come leading up to the 2019 federal election,” he continued.
This past July, after intense investigation, 32 Facebook and Instagram accounts were removed for demonstrated inauthentic behavior. Facebook has doubled the personnel working on the issue and now has close to 20,000 members on its security team. Additionally, Facebook is doing what it can to disable fake accounts whose sole purpose is to spread misinformation. In the first quarter of 2018, it disabled over 583 million fake Facebook accounts. The majority were taken down minutes after their creation, before any human user could report them. As Chan explained, in the week prior to the seminar, two fake accounts relating to the Vancouver municipal elections were deactivated.
McGill students who attended the seminar said they walked away with new perspectives on the way Facebook is preventing misinformation. “I learned that they’re having this internal debate about how to regulate it themselves. In public discourse we don’t necessarily see that. It’s nice to see that they’re actually doing something even if we don’t see the effects right away,” stated Bryan Buraga, a U1 Arts and Sciences student.
For students, social media has a huge impact on daily life and on the information they have access to. It is the fastest and most effective way to spread information. For Kevin Chan and Facebook, making sure that this user experience, and this information, remains safe is a top priority.