Toronto Star

WAR ON FAKE NEWS

Facebook has ramped up its fight against misinformation ahead of U.S. elections.

- GEORGIA WELLS AND LUKAS I. ALPERT

At Facebook Inc. headquarters in Silicon Valley this week, engineers and researchers huddled around computers in a newly configured “war room” to fight misinformation ahead of the mid-terms. Almost 3,000 miles away, in Philadelphia, the fact-checkers hired to be on the front lines haven’t received fresh marching orders.

The disconnect highlights how Facebook’s efforts to combat fake news are playing out differently this election cycle than many expected. Although the company has touted its partnerships with organizations including Factcheck.org in Philadelphia that provide human fact-checkers to vet possibly phoney posts, those groups are playing a limited role.

The vast majority of Facebook’s efforts against fake news are powered by artificial intelligence, not humans.

Factcheck.org is one of five domestic groups hired by Facebook to deploy human fact-checkers to help prevent a repeat of 2016, when the social-media giant’s platform was flooded with misinformation aimed at sowing divisions ahead of the presidential election.

On one recent morning, a Factcheck reporter reviewed a dubious Facebook post in which Democratic Rep. Nancy Pelosi purportedly praised U.S. President Trump’s tax cuts, but other staffers busied themselves with workaday tasks such as vetting traditional political advertisements and reviewing the public statements of elected officials.

Out of Factcheck’s full-time staff of eight people, two focus specifically on Facebook. On average, they debunk fewer than one Facebook post a day. Some of the other third-party groups reported similar volumes. None of the organizations said they had received special instructions from Facebook ahead of the mid-terms, or perceived a sense of heightened urgency.

ABC News, which was part of the fact-checking effort when it began early last year, has dropped out. “We did a review, and we couldn’t tell if it was really making any difference, so we decided to reallocate the resources,” said a person familiar with ABC’s decision.

Facebook says fact-checkers were always expected to play a supporting role, and the reality is that humans can’t move quickly enough to identify and act on misinformation before it goes viral on a platform the scale of Facebook’s, with billions of posts produced each day.

“Fact-checking has taken up a disproportionate amount of the conversation” around fake news, said Tessa Lyons, product manager at Facebook focused on the integrity of information in the news feed.

The most important function of human fact-checkers is to contribute to Facebook’s understanding of the sites that share false news and provide feedback that helps machine learning become more effective, Ms. Lyons said.

Facebook’s war room, which became operational in September ahead of elections in Brazil, is staffed by employees rather than outside fact-checkers, although the company said it would include the outsiders in thorny decisions. One morning this week, several Facebook employees were tracking content across Facebook, WhatsApp and Instagram, as well as national and international news. On one wall was a large American flag; clustered in a corner were motivational posters with slogans like “Focus on Impact” and “Be the Nerd.”

Facebook spends billions annually to improve its artificial intelligen­ce for a range of tasks, including content moderation. It paid Factcheck $189,000 (U.S.) in the fiscal year ended in June, according to public documents.

Earlier this year, Facebook also recruited The Associated Press to do fact-checking in all 50 states ahead of the mid-terms, a spokeswoman for Facebook said.

The other groups involved since the start of the fact-checking effort are PolitiFact and Snopes. After ABC News dropped out, the Weekly Standard came on board. Those groups either declined to comment or didn’t respond to requests for comment about how much they are paid, and Facebook also declined to comment.

Facebook for years resisted fact-checking content on the site, with CEO Mark Zuckerberg saying he didn’t want the company’s employees to be “arbiters of truth.” The introduction of third-party fact-checkers was an effort in part to insulate the company from criticism that it wasn’t taking misinformation seriously and that it could potentially inject the biases of its employees into decisions to demote fake news.

Since beginning the program in early 2017, Facebook has expanded it to 19 countries and lately has made several tweaks to make the operation more efficient. It recently introduced the ability for the fact-checkers to vet videos and photos, as well as links.

Ms. Lyons wouldn’t offer an assessment on her team’s efforts to clean up the news feed, but pointed to two recent academic studies, including a collaboration between researchers at Stanford University and New York University that found interactions with fake news stories on Facebook have declined since early 2017. The other, from the University of Michigan, found the overall quality of content on Facebook has improved since mid-2017.

Still, trying to rid the news feed of lies, malicious rumours, fake news and misleading content remains an uphill battle.

“It’s like bringing a spoon to clear out a pig farm,” said P.W. Singer, co-author of the book “LikeWar: The Weaponization of Social Media” and senior fellow at New America, a non-partisan policy think tank in Washington, D.C. “Facebook is never going to be able to hire enough people, and the artificial intelligence is never going to be able to do all of this on its own.”

At Factcheck, the editing process can be time-consuming, to ensure there are no mistakes. Each post is screened by as many as four editors before being published, said Saranac Hale Spencer, one of the two reporters Factcheck hired specifically to work on the Facebook initiative. When Facebook set up the initiative, it required that at least two fact-checking organizations agree something was incorrect before listing it as debunked, but it has since loosened that requirement in the interest of speed, Ms. Spencer said.

Facebook is also doing more to guide the fact-checkers on which items to address. On Friday, Facebook started testing a system to notify fact-checkers with a push notification when it identifies an item that the company has a high degree of confidence is false. Previously, the fact-checking groups had little guidance in how to choose among the thousands of flagged posts that populate the database at any given time.

NOAH BERGER AFP/GETTY IMAGES
