Bangkok Post

What’s scary about Facebook’s new troll findings

- Leonid Bershidsky writes for Bloomberg Opinion. ©2018 BLOOMBERG OPINION

Facebook’s widely publicised discovery of a possible influence operation through “inauthentic” accounts warrants some scrutiny — and some reflection about the difference between a genuine political debate on social networks and its simulated version.

Facebook said on Tuesday that it had shut down eight pages, 17 profiles and seven Instagram accounts that violated its “ban on coordinated inauthentic behaviour”. That’s a euphemism for setting up entities to amplify politically charged messages — the core activity for which Special Counsel Robert Mueller indicted the alleged owner and employees of the Internet Research Agency (IRA), a troll factory based in St Petersburg, Russia. Mr Mueller’s indictment described this activity as a “conspiracy to defraud the United States” through deceiving government agencies about foreign participation in domestic activities.

If the accounts Facebook discovered recently were foreign-operated, the same crime has been committed. Facebook, however, said it couldn’t determine who operated the accounts because “these bad actors have been more careful to cover their tracks, in part due to the actions we’ve taken to prevent abuse over the past year”. According to Facebook, they used virtual private networks to obscure their identities and paid third parties to run ads on their behalf. All the transactions, $11,000 worth, were in US and Canadian dollars. Facebook didn’t catch the “bad actors” through rubles and Russian IP addresses, as it did the IRA in 2016; instead, it used information from law enforcement to provide leads, and it traced some of the accounts through tenuous links with the now-disabled IRA ones.

The problem with this Facebook narrative is that its stated determination to make abuse harder predictably led to more diligent obfuscation, not less abuse. One doesn’t need huge resources to spoof an IP address or route small payments through the US. “We may never be able to identify the source with the same level of confidence we had in naming the IRA last year,” Facebook’s Nathaniel Gleicher wrote. And Facebook Chief Security Officer Alex Stamos admitted that “technical forensics are insufficient to provide high confidence attribution at this time”.

So has the company’s alleged stance against abuse made the situation better or worse? It has definitely complicated the detection of troll farm activities for both Facebook and US law enforcement. But rather than make a real effort to identify its users, which would have made it easy to check for authenticity and, crucially for the legal issue at stake, to ascertain the citizenship of the person behind the account, Facebook prefers to toss some crumbs to law enforcement from time to time to demonstrate vigilance. That’s exactly what it has done now.

Of the 33 entities Facebook has disabled, only four had more than 10 followers. The social network’s claim that “more than 290,000 accounts followed at least one of these pages” really concerns those four accounts. Facebook shared the data on eight of the disabled entities with the Atlantic Council’s Digital Forensic Research Lab, which has closely studied Russian trolling techniques, and the information it has released so far shows that even the relative popularity of these pages was likely accidental. One of the pages, ReSisterz, purportedly feminist and anti-fascist, achieved its highest engagement by far with a post about an anti-rape device invented in South Africa. The other disabled entities targeted the radical fringes of various minorities. All were either anti-Trump or politically neutral.

One might see why a troll factory like the IRA might want to set up such social network entities: first it builds an audience based on a certain confirmation bias, then, come election time, it starts carrying targeted messages to that audience. If Russia wants further to confuse certain groups of people in the US, which are already prone to confusion, it needs to mix propaganda into their habitual information diet. “The Russian operation in 2014 through 2017 showed how easily disinformation actors could seed their falsehoods into genuine American communities on the right and the left; Americans thus became the unwitting amplifiers of Russian information operations,” the Atlantic Council’s Ben Nimmo and Graham Brookie wrote.

Here’s the problem, though. Only the ReSisterz page contained tell-tale errors that point to its administrators’ Russian origin. The others mainly plagiarised material found elsewhere on the social networks and the web, making it hard even for the Atlantic Council’s lab to come to any conclusions about their provenance. How long before politicians and law enforcement agencies start putting pressure on Facebook to take down pages advocating radical causes simply because they look “inauthentic”? I found other pages on Facebook using the same and similar names and memes as the disabled accounts — it could easily be their turn tomorrow.

One needn’t go too far to see how this could work. As Facebook disabled the 33 suspect entities, it also shut down an event — a counter-demonstration against the Unite the Right march planned for Aug 10 in Washington DC. The ReSisterz page was one of its organisers along with five other Facebook pages, which the company deemed legitimate. But for these five pages, the outcome is the same as for the allegedly troll-created one. They’re essentially told they can’t organise their demonstration via Facebook. The 2,600 users who expressed interest in the event and the 600 who indicated they’d attend will hear from Facebook; they’ll be told someone may have tried to cheat them.

This treatment erases the line between legitimate speech and the kind created in a troll factory test tube. The line wasn’t particularly bold in the first place: Russian trolls, or any other kind, don’t create social divisions or even most of the content. They just amplify existing voices, often radical ones.

“It would be dangerous to fall into the disinformation trap, but ruinous to believe or claim that every user who holds opposing views is part of a Russian information operation,” Mr Nimmo and Mr Brookie cautioned.

But what Facebook does by refusing to embrace proper identification — that is, by allowing duplicate accounts and making it easy to assume an identity — is create an enormous grey area where the authenticity of speech, protest, patriotism and any other kind of belief and intent is a matter of opinion. In this grey area, those whose opinions matter for political or business reasons will end up as enforcers.

Photo caption: This combination of images shows examples from suspicious accounts that the social networking site says are linked to Russia with the intent of influencing US politics.
