The McGill Daily

Truth and Cyber Security

Facebook against Fake News

Claudia Kitchen, Sci+Tech Writer

On September 25, Kevin Chan, Facebook Canada’s global director and head of public policy, visited McGill University’s Max Bell School of Public Policy to speak about the rise of fake news and Facebook’s fight against misinformation. “I’m attending this seminar because Facebook has filled a role in society where it has come to have as much influence as the New York Times or any journalistic publication. If you polled people, the majority would probably say they get their news from Facebook,” said U2 International Development student Brandon Heiblum.

Although social media does make information more easily accessible to the public, it has also resulted in the spread of “fake news,” or misinformation. Fake news, Chan acknowledged, is a significant issue in today’s world, and Facebook has been making efforts to address it.

Following the 2016 US presidential election, Facebook faced widespread criticism for the negative impact its platform had on election integrity. The website was a major channel for the propagation of articles containing false information.

Chan admitted that in 2016 Facebook was slow to act on this issue, but said the company wants to rectify that moving forward. He explained that the platform is meant to be a forum for sharing different voices and opinions, and quoted Mark Zuckerberg, Facebook’s founder: “I don’t want anyone to use our tools to undermine democracy because that’s not what we stand for.”

Chan stated that his team is doing everything it can to keep Facebook safe, including using artificial intelligence to find and delete fake Facebook accounts and stop the spread of misinformation. Facebook has also introduced new ad transparency features and fact-checking partnerships with leading journalistic publications. “We are committed to making Facebook a force for good for democracy,” said Chan.

He opened the seminar with a discussion of Facebook’s role in a current issue: keeping Quebec’s provincial elections safe from interference. The Communications Security Establishment, part of the Canadian government, had told Facebook that misinformation and account hacking were the biggest threats to election integrity. In response, Facebook created a five-part plan called the Canadian Election Integrity Initiative. First, it included a two-year program with MediaSmarts, Canada’s centre for digital and media literacy, to help Canadians learn how to detect fake news articles for themselves. Second, Facebook released its own “cyber hygiene guide” for party members and politicians to learn better cyber security practices and protect against account hacking. Third, an emergency Facebook cyber hotline was created for political parties to report issues such as suspected hacks. Fourth, a cyber hygiene training program was opened to all political parties. Lastly, Facebook implemented an advertising transparency initiative that allows users to view all the ads being run by a particular page whenever they see one of its advertisements.

Facebook has also taken additional measures to end the spread of misinformation. The company has partnered with Agence France-Presse (AFP), an international news agency, to hire fact checkers who review content in both French and English. Stories that the AFP has flagged receive significantly fewer shares, slowing the spread of misinformation. Users are also notified that content has been flagged as “false information” before they share it. Chan specified that Facebook has chosen to work with independent, third-party fact checkers because Facebook believes these third parties are more qualified to identify misinformation.

A program called “Reality Check” is the result of Facebook’s partnership with MediaSmarts. The initiative releases videos and tip sheets to help users stay informed. Its most recent video, Authenticity 101, lists five steps people can take to make sure the content they share is accurate.

Chan said he is frequently asked whether things are getting better or worse in terms of the spread of misinformation. In response, he stated that he truly believes Facebook is doing everything it can to move in the right direction. “You can never fully solve a security problem; threats will constantly find new ways to cause harm. But our goal is to make it much harder for actors to operate across our platforms,” said Chan. “Of course, our work can never be done, and we remain vigilant to defeat bad actors and emerging cyber risks. We expect to add additional election integrity measures in the months to come leading up to the 2019 federal election,” he continued.

This past July, after an intensive investigation, 32 Facebook and Instagram accounts were removed for demonstrated inauthentic behavior. Facebook has doubled its personnel working on the issue and now has close to 20,000 people on its security team. Additionally, Facebook is doing what it can to disable fake accounts whose sole purpose is to spread misinformation. In the first quarter of 2018, the company disabled over 583 million fake Facebook accounts, the majority of which were taken down within minutes of their creation, before any human user could report them. As Chan explained, in the week prior to the seminar, two fake accounts relating to the Vancouver municipal elections were deactivated.

McGill students who attended the seminar said they walked away with new perspectives on the way Facebook is preventing misinformation. “I learned that they’re having this internal debate about how to regulate it themselves. In public discourse we don’t necessarily see that. It’s nice to see that they’re actually doing something even if we don’t see the effects right away,” stated Bryan Buraga, a U1 Arts and Sciences student.

For students, social media has a huge impact on our daily lives and the information we have access to; it is the fastest and most effective way to spread information. For Kevin Chan and Facebook, making sure that this user experience (and this information) remains safe is a top priority.
