Google has one hour to block terror posts
Facebook and Google must remove terrorist content within an hour of detection or face tough new laws.
The European Commission issued the ultimatum on Thursday in recommendations applying to all illegal material online, including terrorist manuals, incitement to hatred and images of child sexual abuse.
The commission is acting most firmly on terrorist content, such as the bomb-making instructions on YouTube used by Salman Abedi to attack Manchester Arena in May last year.
Despite the removal by social media companies of millions of terrorist posts, experts in counterextremism warned that dangerous videos and manuals were still being uploaded.
"Considering that terrorist content is most harmful in the first hours of its appearance, all companies should remove such content within one hour from its referral," the commission said.
Companies should not only respond when notified by the authorities but should use automated detection to remove the material and stop it from reappearing.
Improved automated screening should also be used to remove other illegal posts, such as those of child abuse, it added.
Although companies have improved removal rates using artificial intelligence, a commission spokesman said:
"We still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights."
He said that companies would have to introduce human oversight to ensure that legitimate posts were not removed, compromising the right to freedom of expression. They have three months to show that they are implementing the recommendations or they will face "legislative measures".
The recommendations apply in the United Kingdom while it remains in the European Union. Home Office sources indicated that the government would hold the companies to a similar standard after Brexit. Theresa May warned in September that companies must go "further and faster" in removing content. She said they could face legislation and large fines for failing to delete extremist content within two hours of upload, with the deadline to be reduced to one hour in due course.
The Computer & Communications Industry Association, which represents tech companies, said the commission had failed to reference "any major incidents justifying such a hurry".
It added: "Such a tight time limit does not take due account of constraints linked to content removal and will incentivise hosting services providers to simply take down all reported content."