PC Pro

PC Probe: Terror clampdown – what more can net firms do?

Politicians are demanding action against extremist content. Stewart Mitchell investigates whether that might backfire


Deflecting attention or rightly apportioning blame? That was the debate after the prime minister told internet firms that they must do more to tackle extremism in the aftermath of the recent terror attacks on the UK.

The extent to which any of the recent terrorists were radicalised online is unclear, but there’s no doubt the internet is increasingly the vehicle for terrorist activity. According to figures from international policy group the Counter Extremism Project (CEP), searches for the dead al-Qaeda operative Anwar al-Awlaki alone yielded 80,300 pieces of extremist content as of 5 June 2017, up from 61,900 results in December 2015.

“Despite YouTube’s pledge to remove hateful material, CEP has instead found Awlaki content to be increasingly available on the platform,” said Steven Cohen, director of CEP. “ISIS changed the landscape of extremism by being the first group to fully exploit the digital world to propagandise, radicalise and recruit new members.

“The power of the web to radicalise and inspire attacks has been shown time and time again, as has the inability of internet and social media companies to effectively combat it.”

While all parties acknowledge that there’s a problem, some experts believe it is wrong to assert that access to extremist material is solely responsible for radicalising terrorists. “The suggestion that the internet intermediaries could solve it if they wanted to is completely misplaced,” said Paul Bernal, a lecturer in IT, human rights and media law at the University of East Anglia. “It’s not like they don’t try.”

“To push the blame onto the internet when there are so many contributory factors is an illusion,” Bernal added. “It deflects from the criticism of police cuts.”

Filter and flag

There are two key tactics that security professionals and technology companies use to reduce the spread of radicalising material. The first, and easiest to implement, is removing and filtering material that could be used to radicalise people – those videos glorifying atrocities and preachers calling for action.

By identifying and fingerprinting content, and listing addresses or domains that push such content, filters can block flagged material; but that works best on private networks, such as in schools. “Schools have to abide by the Prevent Duty, which says anyone in your care, you should be preventing from being radicalised,” said Claire Stead, online safety expert with Smoothwall, which provides network filters for schools. “You have to have appropriate filtering and monitoring in place to make sure that people can’t see things that are going to radicalise them.”

The traffic management – fed by lists of extremist sites provided by the police and other sources – may protect children at school, but that protection often evaporates as soon as the home bell rings. “Kids get a good service at school, where they can visit some sites and get blocked from others, but they go home and can get on everything,” said Stead. “There aren’t off-the-shelf filters for phones and the parental filters at home are a block – [but] they don’t block places that are generally safe but could be used for bad reasons. Social networks should be doing more.”
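The domain blocklisting described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s product; the flagged domain names are hypothetical placeholders standing in for a police-supplied list.

```python
# Minimal sketch of blocklist-style URL filtering.
# FLAGGED_DOMAINS is a hypothetical stand-in for a police-supplied list.
from urllib.parse import urlparse

FLAGGED_DOMAINS = {"example-extremist-site.test", "flagged-host.test"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host, or any parent domain of it,
    appears on the blocklist."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    # Check the host itself and every parent-domain suffix, so that
    # subdomains of a flagged domain are caught too.
    return any(".".join(parts[i:]) in FLAGGED_DOMAINS for i in range(len(parts)))

print(is_blocked("https://video.example-extremist-site.test/clip"))  # True
print(is_blocked("https://news.example.org/article"))                # False
```

Real school and ISP filters layer category lists, keyword scanning and HTTPS inspection on top of this, but the core lookup works in the same way.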

Work is being carried out to improve network-level filtering, which aims to block access to certain materials in the same way that child abuse images are flagged and blocked. CEP, for example, has developed a tool based on “robust hashing” technologies, similar to the PhotoDNA algorithms that are currently used to identify and fingerprint child abuse images.

According to CEP, its eGLYPH technology could be deployed on web platforms to detect content and flag it for removal at the point of upload. This could also be attached to a web crawler to actively scrape the internet for content, with takedown notices submitted to companies, hosts or carriers. “eGLYPH can efficiently identify known extremist content,” said CEP’s Cohen. “Notably, it can detect video and audio files as well as images.”

“Companies already work to take down this horrific content, and many already deploy ‘robust hashing’ to address child pornography and copyright infringement online,” said Cohen.
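The “robust hashing” idea can be illustrated with a toy perceptual hash. PhotoDNA and eGLYPH are proprietary and this is not their algorithm – it is a minimal average-hash sketch showing why near-duplicate copies of a known file (re-encoded, slightly altered) can still be matched.

```python
# Toy illustration of robust ("perceptual") hashing, NOT PhotoDNA/eGLYPH.
# A perceptual hash captures coarse image structure, so small changes
# from re-encoding leave the hash unchanged or nearly so.

def average_hash(pixels):
    """pixels: 2D list of greyscale values. Emits one bit per pixel:
    1 where the pixel is brighter than the image mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits; a small distance suggests a match."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 200], [220, 30]]
re_encoded = [[12, 198], [221, 28]]   # same image after lossy re-encoding
h1, h2 = average_hash(original), average_hash(re_encoded)
print(hamming(h1, h2))  # 0 -> matches a known item despite pixel changes
```

A cryptographic hash such as SHA-256 would change completely after re-encoding, which is exactly why platforms use perceptual hashes for this job.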

The problem with filtering and flagging, however, is that terror groups’ tactics extend beyond posting images and videos online. There’s also communication via social networks and systems such as Snapchat, as well as the dark web, all of which remain beyond reach.

ISIS changed the landscape of extremism by being the first group to fully exploit the digital world to propagandise

Direct intervention

The second key approach to stopping radicalisation is to address these issues by infiltrating the groups and finding the people who are doing the grooming. “That’s much harder and much more expensive, but instinct and some of the evidence says that’s what’s far more likely to be effective,” said Bernal. “The dark web isn’t found through Google and Facebook and that’s where the problem is – it emphasises where you have to put the human intelligence, human experts going in there and doing it.”

Ironically, the two approaches counteract each other. If you take down all the extremist material, it no longer acts as a honey trap for those susceptible to radicalisation, making them harder to find. “If we think the problem is people coming across this stuff by accident, then it’s better to block everything we can,” Bernal said. “But if we think that the problem is people in the process of being radicalised – having that protester turned into an actual terrorist – then we need to be looking at the human intensive approach.

“Otherwise you can’t find the people who are visiting these sites and they are precisely the people you want to stop taking the next steps.”

More resources, not more laws

Although successive governments have used terror as justification for stronger surveillance laws, what might prove more effective is additional staff for the security services. “If the police are to take this problem more seriously they have to be given the resources,” said Bernal. “It doesn’t mean greater surveillance powers – they have that already – it means investing in more manpower and expertise and that all costs money. You can’t do it without more money.”

The Home Office hasn’t provided details of funding, but the government did say last year that it was provisioning £1.9 billion over five years to fight cybercrime, although it’s unclear how much of that fund will be dedicated to combating terrorism. “The UK government has said there’s going to be a vast amount given to cybersecurity strategy and some of that could be given to fighting extremism, because this is part of our cyber threat at the moment,” said Stead.

ABOVE Terrorists communicate via social networks and the dark web – all of which are beyond reach
