Business World

Facebook removes more ISIS content in first quarter by actively looking for it

FACEBOOK, Inc. said it was able to remove a larger amount of content from the Islamic State and al-Qaeda in the first quarter of 2018 by actively looking for it.

The company has trained its review systems — both humans and computer algorithms — to seek out posts from terrorist groups. The social network took action on 1.9 million pieces of content from those groups in the first three months of the year, about twice as many as in the previous quarter. And 99% of that content wasn’t reported first by users, but was flagged by the company’s internal systems, Facebook said Monday.

Facebook, like Twitter, Inc. and Google’s YouTube, has historically put the onus on its users to flag content that its moderators need to look at. After pressure from governments to recognize its immense power over the spread of terrorist propaganda, Facebook started about a year ago to take more direct responsibility. Chief Executive Officer Mark Zuckerberg earlier this month told Congress that Facebook now believes it has a responsibility for the content on its site.

The company defines terrorists as nongovernmental organizations that engage in premeditated acts of violence against people or property to intimidate and achieve a political, religious or ideological aim. That definition includes religious extremists, white supremacists and militant environmental groups. “It’s about whether they use violence to pursue those goals,” the company said.

The policy doesn’t apply to governments, Facebook said, because “nation-states may legitimately use violence under certain circumstances.”

Facebook didn’t give any numbers for its takedown of content from white supremacists or other groups it considers to be linked to terrorism, in part because the systems have focused training so far on the Islamic State and al-Qaeda.

Facebook has come under fire for being too passive about extremist content, especially in countries like Myanmar and Sri Lanka, where the company’s algorithm, by boosting posts about what’s popular, has helped give rise to conspiracy theories that spark ethnic violence. People in those countries told The New York Times that even after they report content, Facebook may not take it down.

AFP
