National Post (National Edition)

Facebook’s fake news woes bigger than hoaxes

WEBSITE’S ‘GROUPS’ SECTION NOW POSES MAIN THREAT, ANALYST SAYS

- Sarah Frier in San Francisco

In February, the Yournewswire page on Facebook was at its peak popularity, boosted by its salacious post claiming Prime Minister Justin Trudeau was Fidel Castro’s love child.

That story, shared 1,800 times, represented exactly the kind of content Facebook Inc. had promised to clean up on its site — and which Yournewswire prolifically produced. It was created to drive attention, and therefore advertising revenue. It was also provably false.

Yournewswire is still publishing. But its stories aren’t going viral on Facebook anymore, and the website is finding it more difficult to make money. In the case of Yournewswire, at least, Facebook delivered on its promises in time for Tuesday’s U.S. midterm elections.

“In many ways, they’re an ideal test example in at least the limited scope of what Facebook said they wanted to do — to see blatantly false news debunked, and reduce its reach,” said Alexios Mantzarlis, who leads the International Fact-Checking Network at the Poynter Institute.

It’s been more than two years since the 2016 U.S. election conversation was muddied with viral false information, like the report that Donald Trump had been endorsed by the pope. While Facebook has admitted some responsibility for the spread of fake news, its road to fixing the problem has been slow. The company decided it would limit its efforts to stories that were provably false, and it wouldn’t do so directly. Instead, Facebook works with third-party fact-checkers, including Politifact and Snopes. In interviews, fact-checkers said they often have only enough staff to address a few stories a week, sometimes long after they’ve gone viral. When stories are debunked, Facebook reduces their reach.

This election is unlikely to see a story go viral at the level of the fake pope endorsement article. Absent more data, it’s hard to know how much of that progress to attribute to Facebook. Those who study the internet’s worst offenders say their stories aren’t resonating as much as they have in the past. It may be that readers are wiser, said Brooke Binkowski, managing editor at Truthorfiction.com.

“Readers have become more savvy,” said Binkowski, who used to work at Snopes, the Facebook partner. “They understand that fake news is a problem, and they’ve become more vigilant.”

Facebook built a system that specifically addresses hoax news websites and pages. But that shifted some of the fake news activity to posts and images that go viral in Facebook groups, in which old photos are often doctored or retitled to apply to current news events. Facebook groups have helped spread misinformation about a caravan of immigrants walking on foot to the U.S. border — falsely labelling the group as violent or diseased, in a fear campaign that has bubbled up to President Trump.

“Groups have become the preferred base for co-ordinated information influence activities on the platform,” not the Facebook Pages that were active ahead of the 2016 election, said Jonathan Albright of the Digital Forensics Initiative at Columbia, in a report. “It is Facebook’s Groups — right here, right now — that I feel represents the greatest short-term threat to election news and information integrity.”

If hoax publishers aren’t as much of a problem in the U.S., polarization still is. Publishers on the far right or far left — who don’t publish fake news so much as news in a skewed context, meant to alarm readers — still thrive on Facebook. To address the issue, the social network has been asking users to rate publishers by trustworthiness and baking the scores into its algorithm. Still, hyperpartisan news thrives.

A Facebook spokeswoman said the company is aware that photo and video misinformation has become more common, and this year started enabling fact-checkers to address it, too. Facebook also touted three recent studies from Stanford University, the University of Michigan and the French newspaper Le Monde that concluded the magnitude of misinformation has declined on the social network.

“It’s challenging to prove we’re making progress on this because of lack of consensus on what ‘false news’ means,” a Facebook spokeswoman said in a statement. “But we know that this is a highly adversarial space and we have more work to do.”

Yournewswire, based in California, built its reputation on conspiracy theories, claiming public figures are pedophiles or that vaccines kill, and became one of the top broadcasters of blatant misinformation. Content with no basis in fact is harder to fully disprove, Mantzarlis said.

But Yournewswire looks like a regular news site. At the top of the page, its tag line is “News. Truth. Unfiltered.” In some ways, that has made its false content more dangerous than that of Alex Jones, the propaganda artist behind the boisterous brand Infowars, according to Aaron Sharockman, executive director of Politifact. Jones was banned from Facebook this year after public outrage over his content, especially his perpetual claim that the Sandy Hook elementary school shooting didn’t occur.

Sharockman said every time a Politifact fact-checker finds a Yournewswire story to be erroneous, the reporter calls to inform the website. Since Facebook’s rules about down-ranking content went into place, Yournewswire has started to delete the flagged posts, in an effort to avoid the consequence.

A Facebook spokeswoman said the firm is aware some publishers think that tactic is a workaround, and is soliciting feedback from fact-checking partners to make its policies more clear. Deleting a post isn’t enough to eliminate a “strike” against the page, Facebook said.

Sean Adl-tabatabai, Yournewswire’s editor-in-chief, denied the site uses that strategy to protect itself against fact-checking, but has been in touch with Facebook about fact-checkers he thinks are overeager to nitpick his stories. He said Facebook listens, but tells him to take up his complaints with the fact-checkers directly.

“I would say overall the idea that there are third-party fact-checkers deciding what people can and cannot see on Facebook is problematic,” Adl-tabatabai said. “The fact-checkers have been given more and more leeway and what can I do? If they remove me from the public square and put me in the digital gulag, what can I do?”

Facebook said while sites are able to appeal the conclusions of fact-checkers, the partners are all following Poynter’s fact-checking principles.

Yournewswire hasn’t been removed from the platform. But some of its partners have distanced themselves. Revcontent, which used to serve ads on Yournewswire’s site, started in August to remove promotions and withhold revenue from any stories that outside fact-checkers had debunked. Starting Oct. 19, after repeated violations, Revcontent removed Yournewswire’s advertising support entirely, making it the first site to face that consequence for misinformation reasons, according to Charlie Terenzio, Revcontent’s brand manager. Google, which in the past served ads on the site, has not done so for a few years, according to a person familiar with the matter.

On Saturday, the domain Yournewswire.com started to reroute to newspunch.com. News Punch has a different tag line: “Where mainstream fears to tread.” Sinclair Treadway, who runs the site with Adl-tabatabai, said the site had to rebrand after its reputation and moneymaking abilities were affected by negative publicity and Facebook’s fact-checking program.

At Facebook’s Menlo Park, Calif., headquarters this week, U.S. election activity was being monitored from a War Room with constant dashboards to keep tabs on what’s going viral. Facebook said it’s in contact with secretaries of state and state election bureaus to combat reports of any fake news on the site that could suppress voting.

Meanwhile, the company has expanded its fact-checking network to 14 other countries, where it works with local-language news organizations and fact-checkers. The program doesn’t extend to WhatsApp, the company’s popular encrypted messaging app, where there’s no visibility into which stories are going viral. The app was a major vehicle for disinformation in a recent Brazilian election. Viral content has also helped fuel lynchings in India and ethnic warfare in Myanmar, which Facebook has only started to address this year.

The episodes demonstrate that although Facebook may have tamped down on hoax news in the U.S., bigger problems remain, driven by the viral nature of shocking content.

“I am not convinced that Facebook’s fact-checking program has ever worked,” Binkowski said.


The War Room is symbolic of Facebook’s work against fake accounts, misinformation and foreign interference in elections. DAVID PAUL MORRIS / BLOOMBERG
