Facebook takes serious aim at fake news
Facebook is taking steps to weed out fake news and hoaxes, addressing the growing controversy over its role in the spread of misinformation on the Internet that sharpened political divisions and inflamed discourse during the presidential election.
The giant social network said Thursday it plans to make it easier to report a hoax and for fact-checking organizations to flag fake articles. It’s also removing financial incentives for spammers and plans to pay closer attention to other signals, such as which articles Facebook users read but then don’t share. Last month, Facebook barred fake news sites from using its ad-selling services.
“We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully,” Adam Mosseri, vice president of product management, said in a blog post.
Facebook took heat after the election for not doing enough to remove fake news reports, such as a widely shared but erroneous article claiming Pope Francis had endorsed Donald Trump. Some 170 million people in North America use Facebook every day. Nearly half of all adults in the U.S. say they get their news from Facebook. BuzzFeed News found that people who say they rely on Facebook as a major source of news were more likely to believe politically slanted fake news stories. An earlier BuzzFeed News analysis found that top-performing fake news articles on the election generated more engagement on Facebook than articles from major news outlets in the last months of the presidential campaign.
Fake news creates significant public confusion about current events, with nearly one-fourth of Americans saying they have shared a fake news story, according to a Pew Research Center survey.
While saying it was “extremely unlikely” that phony stories shared on Facebook changed the election outcome, Facebook CEO Mark Zuckerberg said last month that work had begun to help the nearly 1.8 billion users of the social-media service “flag fake news and hoaxes.”
City University of New York journalism professor Jeff Jarvis says he’s pleased to see Facebook take fake news seriously. There’s more that Facebook will need to do to eradicate fake news from the platform but, he says, it’s “headed in the right direction.”
Facebook is reaching out to users in those crucial moments when they are deciding whether to share an article, Jarvis says, and that will improve the Facebook experience, with fewer opportunities to be fooled by fake news and fewer fake news articles flowing through the News Feed.
“It’s a fairly crappy experience to go in and see ridiculously stupid lies in your feed. Some people say people want to believe what they want to believe. I’m not so cynical about mankind. I actually believe that people want to be correct given the opportunity to be correct,” Jarvis said.
Facebook says it will now make it easier to report fake news. “We’ve relied heavily on our community for help on this issue, and this can help us detect more fake news,” Mosseri said. To flag a fake news article, users will be able to click on the upper right-hand corner of a post.
News articles flagged by users will be sent to third-party fact-checking organizations that are part of Poynter’s International Fact-Checking Network, Facebook says. If an article is identified as fake by the fact-checking organizations, it will get flagged as “disputed,” with a link to an article explaining why. Disputed stories will get pushed down in the News Feed.
“It will still be possible to share these stories, but you will see a warning that the story has been disputed as you share,” Mosseri said.
Also, Facebook says it’s going to remove the financial incentives for spammers. Fake news sites lure people from Facebook to show them ads.