Oman Daily Observer

Social media giants ‘must do more to police sites’

British MPs said Internet firms were ‘consciously failing’ to stop groups such as IS promoting violence on social media


LONDON: Facebook, Twitter and YouTube should hire more people to monitor hate speech and material inciting violence, and put staff in police operation centres to remove offending posts faster, British lawmakers said.

In a report released on Thursday by parliament’s Home Affairs Select Committee, lawmakers said major Internet firms were “consciously failing” to stop groups such as IS promoting violence on social media and they needed to take more responsibility for the impact of material posted on their sites.

The report said large Internet companies should work with the government, police and security services to create an extensive round-the-clock hub to monitor and immediately shut down such online activity.

“Huge corporations like Google, Facebook and Twitter, with their billion dollar incomes, are consciously failing to tackle this threat and passing the buck by hiding behind their supranational status, despite knowing that their sites are being used by the instigators of terror,” said Keith Vaz, Chairman of the Parliamentary Committee.

The report said it was “alarming” that teams of only a few hundred employees at the Internet firms were monitoring billions of accounts. It called on them to work more closely with the London police’s Counter Terrorism Internet Referral Unit, specifically by putting staff inside a 24-hour police operations centre to identify and remove hate posts more quickly.

The report, which focused on radicalisation, said while there was no evidence of a single path or event that triggered changes in behaviour, the Internet nevertheless had a “huge impact in contributing to individuals turning to extremism, hate and murder”.

The proposals made in the report are expected to be part of new legislation, the Countering Extremism and Safeguarding Bill.

For years, Facebook, Twitter and Google relied largely on user complaints to identify hate speech, rather than playing a more active role by hiring staff to monitor any abuse of their platforms. That changed late last year as social media firms began to step up their efforts.

With pressure growing for action, they agreed in May to tackle hate speech within 24 hours and reiterated on Thursday that they were already fully engaged in the battle to stop their sites being exploited. Google and Facebook have moved to block violent videos automatically.

In response to the parliament­ary report, Simon Milner, Facebook UK’s director of policy, said the company deals swiftly and robustly with any reports of abuse.

“In the rare instance that we identify accounts or material as terrorist, we’ll also look for and remove relevant associated accounts,” he said.

Twitter said last week it had suspended 235,000 accounts believed to have links to militant groups such as IS over the last six months, double the number it suspended from the middle of last year to February.

“As noted by numerous third parties, our efforts continue to drive meaningful results, including a significant shift in this type of activity off of Twitter,” it said.

YouTube said it would continue to work with the British authorities to see what other steps could be taken.

