Khaleej Times

Digital criminals watch out, Big Tech is gunning for you

Just as there is an issue of terrorists' reach on social media, there are challenges relating to potential overreach


In June 2017, Google, Facebook, Twitter and Microsoft announced the formation of the Global Internet Forum to Counter Terrorism (GIFCT). The aim of this industry-led initiative is to disrupt terrorist exploitation of the members' services. Recently, GIFCT members hailed the achievements of its first year of operations. But, while this progress must be acknowledged, significant challenges remain.

Every single minute there are on average 510,000 comments and 136,000 photos shared on Facebook, 350,000 tweets posted on Twitter and 300 hours of video uploaded to YouTube.

Given this volume, the biggest companies rely extensively on artificial intelligence (AI). Facebook’s uses of AI include image matching, which prevents users from uploading a photo or video that matches one previously identified as terrorist content. Similarly, YouTube reported that 98 per cent of the videos it removes for violent extremism are also flagged by machine-learning algorithms.

One difficulty social media companies face is that, if a terrorist group is blocked from one platform, it can simply move to another. In response, GIFCT members have created a shared industry database of “hashes”. A hash is a unique digital fingerprint of a piece of content, which allows copies of that content to be recognised automatically. When pro-terrorist content is removed by one GIFCT member, its hash is shared with the other participating companies so that they can block the same content on their own platforms.
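To make the mechanism concrete, the sketch below shows the general idea of hash-based blocking. It is a minimal illustration only: it assumes a simple exact-match SHA-256 digest and a hypothetical shared_hashes set standing in for the consortium's database, whereas real systems reportedly rely on perceptual hashing, which can also catch copies that have been slightly re-encoded or cropped.

    import hashlib

    # Hypothetical stand-in for the consortium's shared hash database.
    # Real deployments use perceptual hashes rather than exact digests.
    shared_hashes = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def file_hash(path: str) -> str:
        """Return the SHA-256 hex digest of a file's bytes (an exact-match fingerprint)."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def should_block(path: str) -> bool:
        """Block an upload if its fingerprint matches previously flagged content."""
        return file_hash(path) in shared_hashes

    # Example: a platform would call should_block("upload.mp4") before publishing.

The weakness of an exact digest is that changing a single byte of the file changes the hash entirely, which is one reason perceptual hashing is preferred in practice.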

At its recent meeting, the GIFCT announced that 88,000 hashes have so far been added to the database. The consortium is therefore on track to meet its target of 100,000 hashes by the end of 2018, especially now that another nine companies have joined, including Instagram, Justpaste.it and LinkedIn.

These efforts have undoubtedly disrupted terrorists’ use of social media platforms. For example, in the 23 months since August 1, 2015, Twitter has suspended almost a million accounts for promoting terrorism. In the second half of 2017, YouTube removed 150,000 videos for violent extremism. Nearly half of these were removed within two hours of upload.

Yet much further work remains. In response to the disruption of their use of Twitter, supporters of Daesh have tried to circumvent content-blocking technology through what is known as outlinking: posting links to content hosted on other platforms.

Interestingly, the sites most commonly outlinked to include justpaste.it, sendvid.com and archive.org. This appears to be a deliberate strategy to exploit smaller companies’ lack of resources and expertise. Daesh supporters have also moved their community-building activities to other platforms, in particular Telegram, a cloud-based instant messaging service that provides optional end-to-end encrypted messaging. This encryption stops messages being read by third parties, and the service has been used extensively to share content produced by official Daesh channels.

This forms part of a wider movement towards more covert methods. Other encrypted messaging services, including WhatsApp, have been used by extremists for communication and attack-planning. Websites have also been relocated to the Darknet, a hidden part of the internet that is anonymous in nature and can only be accessed using specialist encryption software. A 2018 report warned that Darknet platforms have the potential to function as an extremist “virtual safe-haven.”

In addition, research has found that supporters of extremist groups other than Daesh experience significantly less disruption on Twitter. Supporters of these other groups were able to post six times as many tweets, follow four times as many accounts and gain 13 times as many followers as pro-Daesh accounts.

It is also important to respond to other forms of violent extremism. Extreme right-wing groups have a significant presence on platforms such as YouTube and Facebook. While steps have been taken to disrupt their presence online, it appears that these groups are also beginning to migrate to the Darknet.

Just as there is an issue of terrorists' reach on social media, there are also challenges relating to potential overreach. The difficulties in defining terrorism are well known. Summed up by the slogan “One person’s terrorist is another’s freedom fighter”, one of the most controversial definitional issues is that of just cause. Should a definition of terrorism exclude those such as pro-democracy activists in a country ruled by an oppressive and tyrannical regime? According to many countries, including the UK, the answer is no. As one Court of Appeal judge put it: “Terrorism is terrorism, whatever the motives of the perpetrators.”

If social media companies take a similar approach, this could have some significant ramifications. Indeed, there are already worrying examples. In 2017, thousands of videos documenting atrocities in Syria were removed from YouTube by new technology aimed at extremist propaganda. These videos provided important evidence of human rights violations. Some existed only on YouTube, since not all Syrian activists and media can afford an offline archive. Yet the alternative, seeking to distinguish between just and unjust causes, is fraught with difficulties of its own.

At a time when social media companies face increasing pressure to do more to tackle terrorist exploitation of their platforms, the progress made during the GIFCT’s first year is welcome. But it is only the first step.

Stuart Macdonald is Professor of Law at Swansea University. (The Conversation)

