Toronto Star

Are Facebook moderators the answer?

- Emma Teitel

Things you expect to see when you open Facebook in the morning: cute kids, capybaras, fiery political rants, humble brags (“I don’t usually post stuff like this, but I thought all of you might like to know that I aced this IQ quiz”) and invariably, a photo of somebody’s pale, sunny-side-up eggs.

Things you don’t expect to see when you open Facebook in the morning? Rape, murder and attempted suicide streamed in real time.

Yet this is what scores of Facebook users saw in their newsfeeds recently, thanks to the social network's Facebook Live feature, which allows a person to stream herself and the happenings around her in real time. Of course, Facebook Live streams are usually totally benign or just plain annoying (no, I don't need to see a panoramic shot of the lunch buffet at your all-inclusive resort).

But occasionally, they are brutally violent and grotesque. Last week a man in Thailand murdered his infant daughter and then killed himself in a video streamed to Facebook Live. In April, a Cleveland man shot and killed an elderly man in a video streamed to Facebook Live. In March, a 15-year-old girl was sexually assaulted in a video streamed to Facebook Live. And this week, a teen in Georgia attempted to live stream her own suicide on the video platform, until authorities caught wind of the situation and intervened before it was too late.

In the wake of these events and amid public outcry, Facebook has announced a plan to prevent further traumatization of its more than one billion users. This week, the social network said it will hire 3,000 new moderators to screen for and flag violent and hateful video content on the site — and inform authorities about said content when necessary.

Let that sink in for a minute: Facebook, arguably the world's largest media site and disseminator of fake news, isn't laying off thousands of people. It's hiring them. If you're a journalist who fears the next round of cuts at your publication, have no fear: there's always room in Facebook's massive content mines.

The good news about the hiring spree is that if Facebook's new team is successful in its moderating efforts, we can all go back to believing that the most horrifying thing we'll see on the social network is a photo of our friends' vegan cooking.

The bad news, however, is that Facebook has a history of flagging content that shouldn’t be flagged.

Last year, the site removed from a Norwegian newspaper's account the famous photo of Vietnam War survivor Kim Phuc, who as a child was photographed running naked down a road after her village was bombed with napalm. After a backlash from the paper and other users, Facebook restored the photo, but its reputation for flagging and removing inoffensive content remains intact, particularly when it comes to photos of women's nipples and pubic hair.

I have a hunch that the social network will use this particular moment as an opportunity to exercise its penchant for prudishness. After all, it's on especially high alert for inappropriate content, and for very good reason.

But the downside to this newfound vigilance is that some disturbing content shouldn’t be flagged — at least not right away.

Some disturbing content — for example, footage of violent protests or instances of police brutality — should be seen even if it is disturbing. I'm thinking specifically of Philando Castile, the 32-year-old black man from Minnesota who, after he was shot by a police officer at a traffic stop last year, bled to death in a video streamed to Facebook Live. The video horrified almost everyone who saw it, most especially Castile's family, but it also ignited a fierce national debate about anti-black racism — a reality Mark Zuckerberg alluded to in a statement he made after the Castile tragedy.

But I wonder if in today's political climate Facebook is growing increasingly uncomfortable with its reputation as a hub for conflict, debate and near-constant political invective.

Perhaps it wants to shed some of its Twitter-like characteristics in favour of its Instagram ones: in other words, perhaps it would like to reposition itself as a platform rich with lighthearted good news that doesn't exhaust, enrage or traumatize its users.

Hiring 3,000 people to monitor the appropriateness of its content all day long certainly seems like a good start down that path. It will be a shame, though, if on their mission to protect us the new moderators of Facebook shield us from the things we don't want to see but should see regardless.

Emma Teitel is a national affairs columnist.

Photo: Chiranut Trairat's 11-month-old baby girl, in portrait, was hanged by her distraught father on Facebook Live. (The Associated Press)
