MPs: Put online giants in dock over extremism
SOCIAL media firms that refuse to remove extremist or abusive content should face criminal charges, MPs said yesterday.
Google, Facebook and Twitter were accused of shameful and indefensible behaviour in failing to take down vicious material even after it had been reported to them.
The Commons home affairs committee said websites that spread terrorist propaganda, far-Right racial abuse and murderous threats were probably already committing offences punishable by jail.
They urged ministers to move after the election to strengthen legislation to ensure criminal laws are used.
Sanctions should include heavy fines for companies that fail to take down hate material within a short time, the MPs said.
In a scathing report, the committee called for a string of regulations to ensure internet giants take their responsibilities seriously.
The MPs accused them of putting profit before public safety. They said Facebook, Twitter and Google, which owns YouTube, were ‘outsourcing their responsibilities at zero expense’ by demanding users report material before they will take it down.
The companies should be made to pay the cost of policing their sites for vicious material in the same way that football clubs pay police to maintain order on matchdays, the committee argued.
[Previous front pages: SHAMING OF WEB GIANTS, Daily Mail, March 15 – ‘This is not beyond them’; GOOGLE ON RACK OVER CASH FROM HATE VIDEOS, March 18; GOOGLE, THE TERRORISTS’ FRIEND..., March 24]
They called it ‘completely unacceptable’ that multinationals decline to say how many people they have working to remove extremist or hateful material. The companies should have to produce transparent reports every three months that say exactly what they are doing to police themselves, and they should give the number and seniority of the people doing the work, the MPs said.
The report added: ‘Quick action is taken to remove content found to infringe copyright rules, but that same prompt action is not taken when the material involves hateful or illegal content.’
Yvette Cooper, the Labour chairman of the committee, said: ‘Social media companies’ failure to deal with illegal and dangerous material online is a disgrace.
‘They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful. These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so.
‘They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.’
A spokesman for YouTube said it took the issue very seriously and was taking action.
Simon Milner of Facebook said it had developed quick and easy ways for people to report content. Nick Pickles of Twitter said it had significantly expanded its efforts to deal with offensive material.
WILL anyone be held to account for the scandalous failings that allowed rogue surgeon Ian Paterson to mutilate hundreds of women with botched and often unnecessary mastectomy operations?
Despite a string of warnings from fellow doctors, he continued to practise his malign ‘experimental surgery’ with impunity for more than a decade. Yet so far not a single NHS employee has even been disciplined.
It’s nothing less than an outrage that those who colluded in this gross betrayal – especially those managers who ignored specific warnings – have not been brought to book. Until they are, Paterson’s victims will have no justice.
A COMMONS committee attacks internet giants including Google and Facebook today for helping to spread terrorist propaganda and sexually abusive material. As the MPs say, they are showing little sign of wiping this filth off their systems voluntarily, so surely it’s time for the law to intervene and force them into showing some social responsibility.