Facebook or Fakebook?
As President Obama said during a campaign speech in Michigan: “And people, if they just repeat attacks enough, and outright lies over and over again, as long as it’s on Facebook and people see it, and as long as it’s on social media, people start believing it. ... And it creates this cloud of nonsense.”
Facebook founder Mark Zuckerberg has been eager to profit from the power of his creation, but reluctant to take responsibility for the consequences of that power. He’s long insisted that he runs a “technology company,” not a “media company,” sort of like a phone company that simply conveys content without shaping it.
He’s totally wrong about that. Facebook is not Verizon or T-Mobile. But he’s right about something else. As Zuckerberg puts it, Facebook should not be the final “arbiter of truth.” It should be an editor, not a censor; a guide, not a dictator. The last thing we need is a Ministry of Veracity.
What we do need is a sense of balance. Facebook has to embrace its influence, but not misuse it. It has to be part of the solution, not part of the problem. That said, it can’t be the only solution. And there are encouraging signs that Facebook is facing up to that truth.
“We really value giving people a voice,” Facebook Vice President Adam Mosseri told The New York Times, “but we also believe we need to take responsibility for the spread of fake news on our platform.”
Let’s be clear about what he -- and we -- mean by “fake news.” The term has been hijacked by conservatives who are using it as one more weapon to attack the mainstream media. And it’s certainly true that even the best reporters make mistakes, or have blind spots. But that’s not fake news.
Fake news is deliberately fabricated to generate clicks, make money and, in some cases, alter the political debate. Pew reports that 23 percent of American adults have shared fake news stories with others, and 64 percent said made-up news has caused “a great deal of confusion” among voters.
So it’s a serious issue, and as a first step, Facebook is crowdsourcing the problem, “testing several ways to make it easier to report a hoax if you see one on Facebook,” says Mosseri.
Those reports will be forwarded to third-party fact-checking organizations like Snopes and PolitiFact. If those services “identify a story as fake, it will get flagged as disputed,” explains Mosseri. You’ll still have the choice to share a flagged story, but it will carry a clear warning.
In addition, Facebook is “doing several things to reduce the financial incentives” for hoaxers by cutting off their ability to sell ads through the site.
These are good steps, but small ones, and they do nothing to solve another huge problem: Facebook’s algorithms create “echo chambers” by showing readers only news articles that mirror the choices and preferences they’ve expressed in the past.
“Because Facebook tailors your News Feed based on your own behavior,