SOLVING FACEBOOK’S FAKE NEWS PROBLEM
The news, the real news that is, is full of headlines about how Facebook Inc is cracking down on fake news. No matter what it does, though, the company cannot do enough because of definitional reasons, legal liability (or the prospect of it), a contradiction in its core business model, and the seeming inability of its leaders to realise just how powerful and influential the company really is. In the absence of a willingness to address these issues, nothing Facebook does will ever be enough, although, to be fair, the company does seem to be doing a lot.
For instance, just on Thursday past, the company said it is expanding fact checking to include images and videos in 17 countries, including India, through partnerships with 27 companies. The efforts seem well-intentioned (“We know that people want to see accurate information on Facebook, so for the last two years, we have made fighting misinformation a priority,” a blog post on the company’s site announcing the initiative begins); and some of the 27 partners do appear to be doing good work; yet, none of this is enough.
Even as recently as the first week of September, according to a report in Quartz, “the fourth-most engaged story on Facebook... is a story from America’s Last Line of Defense ... that Michael Jordan has resigned from Nike’s board, taking Air Jordans with him.” The Quartz report points out that fact-checker Snopes, one of Facebook’s fact-checking partners, calls America’s Last Line of Defense “a junk news network”. The fake news piece was obviously prompted by Nike’s ad featuring former NFL star Colin Kaepernick.
Even when something is labelled fake news, it doesn’t get removed, only demoted, so that it doesn’t show up where it otherwise might have. As CEO Mark Zuckerberg said in a July interview with Recode’s Kara Swisher, “... we feel like our responsibility is to prevent hoaxes from going viral and being widely distributed”.
It isn’t clear, even after the Infowars incident, what Facebook’s policies are. Infowars, a far-right media outlet owned by Alex Jones, became infamous after alleging that the Sandy Hook incident was staged and that the victims were actually child actors. In August, after much hemming and hawing, Facebook finally decided to take Infowars down, a move that came after much equivocation by its senior executives on misinformation, free speech, and the like. Jones, of course, immediately became a free speech martyr, but it still isn’t clear what exactly will prompt expulsion from Facebook.
It is clear what should.
That would involve some definitional changes, though. There’s enough data to show that Facebook is the single largest source of traffic for most news companies. The exact number may vary, but it is generally accepted that this proportion is almost half in most countries. Facebook and Google also dominate the advertising landscape. Together, they account for around half of all digital advertising, with the former having an edge when it comes to mobile, and the latter when it comes to desktops.
If that business model sounds familiar, it is because this is the traditional business model of media companies. If digital giants such as Facebook are unwilling to accept this when discussing matters related to the responsibility for the content on their platforms — though most gleefully admit at all other times, especially when it suits them, that they are media companies — it is because being intermediaries, and not content publishers, guarantees them legal immunity in most countries. Their primary concern is that too much interference could take away that immunity. There is another problem that Facebook doesn’t talk about too much, although most people who have looked closely at the company and its growth couldn’t help but notice it (as indeed, the Quartz report highlights). That is the popularity of sensational news on Facebook, even sensational fake news. A company whose business model is dependent on engagement and the network effect (sharing) has no incentive to do anything that will reduce both.
The third ingredient of the company’s business model is data. Facebook’s treasure trove of data on its users is used to target the kind of content they like to consume (even if this is not necessarily accurate), and advertising. There’s now adequate evidence that this data may have been weaponised and used, among other ends, to influence elections.
In effect, Facebook isn’t an ordinary media company. It is a global media company, and a highly intrusive one, because it knows far more about its users than any traditional media company does. It knows enough to tailor messages, even content, to influence them — not just to buy a certain car or eat a certain brand of yoghurt but to vote for someone, to fan the kind of inflammatory sentiments that most traditional media companies would eschew. As governments around the world, including in India, realise this — Facebook’s power to insidiously influence elections, even fan hate crimes, say — there will be, as there are now, efforts to regulate it, to tie it down with tighter privacy and data protection laws, and to hold it responsible for what’s on it.
This writer believes that a good starting point would be to take away the immunity platforms such as Facebook enjoy. That would make Facebook responsible and accountable for the content on its website — just like other media are.
As Marshall McLuhan wrote, “... a medium is not something neutral, it does something to people.” He also wrote about the “... roughing up that any new society gets from a medium, especially a new medium...” while explaining a variant of his famous statement that, thanks to a printer’s devil, he ended up endorsing, and which was the title of one of his books, “The Medium is the Massage”.