Gulf News

Facebook should protect the public

Livestreaming has exposed the confused set of guidelines that underpin Facebook's moderation of images

By Suzanne Moore, an award-winning columnist for the Guardian

When Mark Zuckerberg launched Facebook Live last year, he said: “We built this big technology platform so we can go and support whatever the most personal and emotional and raw and visceral ways people want to communicate are as time goes on.” While for many people Facebook is a place to post endless pictures of perfect holiday moments, for some of its two billion users, it is a place to livestream murders, rape, torture and suicides. That is about as raw and visceral as it gets. And yet it is just what Facebook, which continually states that it is not a media company, can be used for.

If you ask users to provide unfiltered content, to share their lives with one another, the darkest parts of humanity can be posted. The onus has been on other users to ask for offensive content to be removed. Computer algorithms are no better at identifying irony than they are at identifying a murder in real time and stopping it being streamed.

Free speech is central to the ethos of Facebook and it does not want to be seen to censor it. Zuckerberg’s early hackeresque motto “Move fast and break things” has supposedly been traded in for something more socially responsible. Or has it?

What has been revealed in the Facebook Files is important and genuinely shocking. When human beings have to monitor the speech of their fellow human beings, which is what moderators do, the moderation guidelines matter. They tell us what that company or community finds acceptable. What has emerged about the internal “ethics” of Facebook is disturbing in itself. Images of nonsexual child abuse can be shown; livestreaming of self-harm and all kinds of hate speech and racism are permitted. This platform acts as a sewer for the most painful and hateful parts of human existence. Facebook cannot control its content — possibly it has grown so big that it never will be able to.

Yet, actually, this company is 13 years old and much of what is happening now was predictable. Politicians such as Yvette Cooper have asked why Facebook has taken so long to remove videos of beheadings, or images of a child being sexually assaulted or videos of a stabbing — and then only after there has been an outcry in other media. Debates about revenge porn and “fake news” have become critical, with Facebook holding on to the claim that it is a mere “platform”. In other words, that it does not have the responsibility for its content that a conventional publisher has. What the Facebook Files reveal is a struggle over moderation and a very confused set of guidelines that need to be questioned. This is not some small community with an agreed set of principles, nor is it Zuckerberg’s idealised global community. It does not operate like other media organisations that day in, day out, have to decide what is acceptable to publish.

Self-harm posts

Yet, the significant fact remains that Facebook makes money from its content. It makes a fortune from advertising. It is not a neutral conduit. It is an exchange, a transaction, a business. It justifies some of its most alarming decisions, such as the showing of self-harm and suicide, by saying that Facebook can then help identify people and encourage them to get help. Similarly, nonsexual child abuse images allow children at risk to be identified.

Surely we need more evidence on this, and a debate about contagion, as there has been an increase in the number of self-harm posts. Of course it is extremely difficult to measure how onscreen interaction may decrease empathy, and it may be impossible to stop terrorists or child abusers using it. Surely, though, we have to ask what is happening when a platform can air the real-time killing of a child? Before “ordinary” people were given access to Facebook Live, it was tested by celebrities. Again, Zuckerberg saw a way to open up and extend his company’s reach and is now bringing in more moderators — but with the huge number of users, does anyone believe it could ever be enough?

For too long it has been accepted that Facebook is what it says it is — a platform, not a publisher. This has given it the freedom to become part of our everyday lives and often a useful and helpful resource. Social media, it is said, does not create but simply reflects the misogyny, racism, hatred and violence that is already there in the world. All is made visible.

The price of this visibility, though, is ethically questionable when all life is shown, including the taking of it. Facebook did not set out to monetise murder, but there have long been warnings about what would happen, as images and videos of terrible cruelty could be shared this way. The online world, we are continually told, can never be effectively policed. Companies within it can, however, effectively make huge profits from “openness”, while the methods by which they operate remain secretive and closed. Facebook does indeed need to share much, much more than it is currently prepared to.

Illustration: Niño Jose Heredia/©Gulf News
