PM: sites must remove terrorist videos within two hours
Theresa May has told tech giants that they should find a way to remove online terrorist content within two hours, or face fines from the Government.
At a meeting in New York with Facebook, Google, Microsoft and Twitter, the Prime Minister said that she wants to see progress within a month. She urged them to do more to block extremist content so it never appears online, and to quickly take down any content that gets through.
Before the meeting, in a speech at the United Nations, Mrs May had urged websites to go “further and faster” in building artificial intelligence that can detect and remove terrorist material.
The PM’s official spokesman said the Government wants websites to act voluntarily, but is prepared to take legal action and issue financial penalties.
Mrs May’s call for more action, backed by French President Emmanuel Macron, comes amid growing concern that terrorist groups such as Islamic State find it too easy to post recruitment videos and hateful propaganda.
Government figures show that Islamic State posted 27,000 links to extremist content in the first five months of 2017. These aim to radicalise young people and give them instructions on how to launch terror attacks.
Security services say that Islamic State is becoming better at spreading content quickly, which is why it’s crucial that it’s removed within two hours. This target will eventually be reduced to one hour, the Government said.
On average, terrorist material stays online for 36 hours, but some bomb-making instructions have remained available for years, including advice on building a ‘bucket bomb’ like the one used in last month’s attack at Parsons Green Underground station in south-west London.
Google said it would need help from “trusted government sources” and users to identify and remove “problematic content”. Kent Walker, the company’s legal chief, told Radio 4’s Today programme that “the challenge is once it’s removed, many people re-post it or there are copies of it across the web”.