The Atlanta Journal-Constitution
Facebook fights back on fake news
Effort underway to keep false articles off social media site.
For weeks, Facebook has faced questions about its role in spreading fake news. The intense scrutiny has caused internal divisions at the social network and has pushed Mark Zuckerberg, the company’s chief executive, to say that he was trying to find ways to combat the problem.
In the company’s most concerted effort to combat fake news, Facebook said Thursday it had begun introducing a series of experiments to keep misinformation and false articles from being disseminated across its site.
The maneuvers the company is trying include one that makes it easier for its 1.8 billion members to report fake news. Facebook is also creating partnerships with outside fact-checking organizations to help it more clearly indicate when articles are false, as well as changing some ad practices to choke off the economics of fake news purveyors.
Facebook is in a tricky position with these tests. The company has long regarded itself as a neutral place where people can freely post, read and view content, and it has said it does not want to be an arbiter of truth. But as the social network’s reach and influence have grown, it has had to confront questions about its moral obligations and ethical standards in what it presents.
Its experiments on curtailing fake news show that Facebook recognizes it has a deepening responsibility for what is on its site. But Facebook also has to tread cautiously in making changes, as the company is wary of opening itself up to claims of censorship.
“We really value giving people a voice, but we also believe we need to take responsibility for the spread of fake news on our platform,” said Adam Mosseri, a Facebook vice president who is in charge of its News Feed, the company’s method of distributing information to its global audience.
He said the changes — which, if successful, may be rolled out to a wider audience — are the result of months of internal discussion about how to handle false news articles shared on the network. How much of a dent Facebook’s moves will make in fake news is unclear. The issue is not confined to the social network: a vast ecosystem of false news creators thrives on online advertising and can use other social media and search engines to propagate their work.
Still, Facebook has taken the most heat on fake news. The company has been under that spotlight since Nov. 8, when Donald Trump was elected president. Trump’s unexpected win almost immediately spurred people to focus on whether Facebook influenced the electorate, especially with the rise of hyperpartisan sites on the network and numerous examples of misinformation, like a false article about Pope Francis endorsing Trump for president that had been shared nearly 1 million times across the site.
Zuckerberg has said that he did not believe Facebook influenced the election result, calling it “a pretty crazy idea.”
In an interview, Mosseri said Facebook did not believe its News Feed directly caused people to vote for a particular candidate, given that “the magnitude of fake news across Facebook is one fraction of a percent of the content across the network.”
In Facebook’s new experiment, users will be able to flag a post as fake news and will have the option of messaging the friend who originally shared the piece to let him or her know the article is false.
If an article receives enough flags as fake, it can be directed to a coalition of fact-checking groups, including Snopes, PolitiFact and ABC News. Those groups will check the article and can mark it as a “disputed” piece, a designation that will be seen on Facebook.
Disputed articles will ultimately appear lower in the News Feed. If users still decide to share disputed articles, they will receive pop-ups reminding them that the accuracy of the piece is in question.
Facebook said it was casting a wide net for additional partners in its fact-checking coalition and may take the initiative outside the United States if early experiments go well.
Facebook also plans to impede the economics of spreading fake articles across the network. Fake news purveyors generally make money when people click on the false articles and are directed to third-party websites, the majority of which are filled with dozens of low-cost ads.
Facebook will scan those third-party links, checking for signs such as whether a page is 85 percent advertising content — a dead giveaway for spam sites — or whether a link masquerades as a different site, such as a fake version of The New York Times.