Sunday Times (Sri Lanka)

Facebook Live killings: Why the criticism has been harsh


Videos of two murders have been uploaded to Facebook and watched by hundreds of thousands of users over the past two weeks. Mark Zuckerberg has pledged to do more to prevent anything similar happening again, but the problem can't simply be blamed on Facebook.

One of the social network’s biggest responsibilities is moderating users’ posts, but the sheer amount of content that’s uploaded to the site every day makes this an impossible task for people alone. That’s why it uses artificial intelligence.

Facebook’s mysterious algorithms cut through the noise to filter out what’s acceptable and what’s not, sparing the site’s human moderators from sifting through innocent posts and allowing them to focus on a much more manageable sample of data. That’s the theory, anyway. Every so often, something unacceptable gets through that AI filter system.
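In broad strokes, such a filter amounts to a triage step: a model scores each post, near-certain violations are removed automatically, near-certain innocent posts are published, and the uncertain middle is queued for human review. Facebook has not published how its system actually works, so the scoring function, thresholds and labels in the following minimal sketch are invented for illustration:

    # Illustrative sketch only: the scoring model, thresholds and labels
    # below are hypothetical, not Facebook's actual system.

    def triage_post(post_text: str, score_post) -> str:
        """Route a post based on a classifier's confidence that it violates policy."""
        score = score_post(post_text)  # hypothetical model returning 0.0-1.0

        if score > 0.95:
            return "remove"        # near-certain violations are filtered automatically
        if score < 0.10:
            return "publish"       # near-certain innocent posts never reach a human
        return "human_review"      # the uncertain middle goes to moderators


    if __name__ == "__main__":
        fake_score = lambda text: 0.5  # stand-in scoring function for the example
        print(triage_post("example post", fake_score))  # -> "human_review"

The point of the triage is the third branch: humans only ever see the sample the model is unsure about, which is what keeps the workload manageable.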

“The things that you never see are the successes,” Tata Communications’ future technologist, David Eden, said. “What you don’t see are the things that have been removed. You only see the things Facebook’s AI left, and they tend to be massive, glaring mistakes.”

Mistakes don’t come much bigger than failing to spot the video of Steve Stephens killing Robert Godwin Snr. and the Facebook Live of Wuttisan Wongtalay killing his 11-month-old daughter, before taking his own life off-camera.

Unfortunately, it’s not an easy problem to solve. Facebook used an algorithm to successfully tackle clickbait, but only after it realised that, while lots of users were clicking on stories with phrases like “you’ll never guess what happened next…” and “this one trick…” in the headline, they didn’t actually Like the articles or spend much time reading them before returning to Facebook.
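Those signals can be expressed as a simple heuristic. The sketch below, with made-up field names and thresholds, flags stories that draw many clicks but few Likes and little reading time, roughly the pattern the company described:

    # Illustrative heuristic based on the signals described above.
    # Field names and thresholds are hypothetical, not Facebook's.

    from dataclasses import dataclass

    @dataclass
    class StoryStats:
        clicks: int
        likes: int
        avg_seconds_on_page: float

    def looks_like_clickbait(stats: StoryStats) -> bool:
        """Flag stories that are clicked often but rarely Liked or read."""
        if stats.clicks == 0:
            return False
        like_rate = stats.likes / stats.clicks
        return like_rate < 0.01 and stats.avg_seconds_on_page < 10

    # Heavily clicked, barely Liked, abandoned after seconds -> flagged
    print(looks_like_clickbait(StoryStats(clicks=50_000, likes=120, avg_seconds_on_page=4.2)))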

The murder videos are very different. Unlike clickbait, which was universally despised, some Facebook users enjoy viewing content the majority of people would consider unacceptable. That makes the site’s task a lot tougher.

The video of Stephens’ shooting was first reported to Facebook by a user at 12:59pm PDT, over an hour and a half after it was uploaded. Facebook disabled Stephens’ account and made the videos private 23 minutes later, at 1:22pm PDT. Wongtalay’s two videos, however, were up for around 24 hours. The first had been viewed 112,000 times, and the second had been viewed 258,000 times. Users had already uploaded the clip to YouTube before Facebook’s moderators even knew about it.

“It is a huge, huge problem. A whole world of humans wouldn’t be able to crunch through the same volumes of data as these algorithms,” Stuart Laidlaw, the CEO of Cyberlytic, said.

Mr Eden agrees that the problem can’t yet be fixed by AI, and evolving social norms and society’s constantly shifting perception of acceptability mean it may always be ill-prepared to deal successfully with unacceptable content. If Facebook goes the other way and takes a heavy-handed approach, targeting all offensive content, it will be criticised for censorship, and users will simply upload their content to a different site. “You can never please all of the people all of the time,” said Mr Eden. “There is always going to be an element of restriction felt by certain people. But there’s definitely a role for AI to play in terms of pre-determination.”

Jiranuch Trirat (L) holds up the body of her 11-month-old daughter, who was killed by her father, who broadcast the murder on Facebook, at a temple in Phuket, Thailand, April 25. Dailynews/via REUTERS
