Bangkok Post

THE REAL INFLUENCERS

Fake likes remain just a few dollars away, researchers say

- DAVEY ALBA

“A very #MerryChristmas to all,” Margrethe Vestager, Europe’s top antitrust enforcer, wrote on Facebook last December. Her post attracted 144 likes.

A few months later, as an experiment, researchers paid a company a few dollars to attract attention to her well wishes. In 30 minutes, the post had 100 more likes. The researchers had similar results on a holiday post on Vestager’s Instagram account and on a Christmas tweet from Vera Jourova, the European Union’s justice commissioner.

Companies like Facebook and Twitter are poorly policing automated bots and other methods for manipulating social media platforms, according to a new report by researchers from the Nato Strategic Communications Centre of Excellence. With a small amount of money, the researchers found, virtually anyone can hire a company to get more likes, comments and clicks.

The group, an independent organisation that advises the North Atlantic Treaty Organization, tested the tech companies’ ability to stop paid influence campaigns by turning to 11 Russian and five European companies that sell fake social media engagement. For €300 (10,000 baht), the researchers bought over 3,500 comments, 25,000 likes, 20,000 views and 5,000 followers, including on posts from prominent politicians like Vestager and Jourova.

After four weeks, about 80% of the fake clicks remained, the researchers said. And virtually all of the accounts that had been used to generate the clicks remained active three weeks after researchers reported them to the companies.

The report spotlights the continuing challenges for Facebook, YouTube and Twitter as they try to combat online disinformation and other forms of online manipulation. After Russia interfered in the United States’ 2016 presidential election, the companies made numerous changes to reduce the spread of online disinformation and foreign interference. In recent months, the platforms have announced takedowns of accounts in China, Saudi Arabia and, most recently, Africa, where Russia was testing new tactics.

But the report also brings renewed attention to an often overlooked vulnerability for internet platforms: companies that sell clicks, likes and comments on social media networks. Many of the companies are in Russia, according to the researchers. Because the social networks’ software ranks posts in part by the amount of engagement they generate, the paid activity can lead to more prominent positions.

“We spend so much time thinking about how to regulate the social media companies — but not so much about how to regulate the social media manipulation industry,” said Sebastian Bay, one of the researchers who worked on the report. “We need to consider if this is something which should be allowed but, perhaps more, to be very aware that this is so widely available.”

From May to August, the researchers tested the ability of the social networks to handle the for-hire manipulation industry. The researchers said they had found hundreds of providers of social media manipulation with significant revenue. They signed up with 16.

“The openness of this industry is striking,” the report says. “In fact, manipulation service providers advertise openly on major platforms.”

The researchers bought engagements on about a hundred posts on Facebook, Instagram, Twitter and YouTube. They saw “little to no resistance”, Bay said.

After their purchase, the researchers identified nearly 20,000 accounts that were used to manipulate the social media platforms, and reported a sample of them to the internet companies. Three weeks later, more than 95% of the reported accounts were still active online.

The researchers directed most of the clicks to posts on social media accounts they had made for the experiment. But they also tested some verified accounts, like Vestager’s, to see if they were better protected. They were not, the researchers said.

The researchers said that to limit their influence on real conversations, they had bought engagement on posts from politicians that were at least six months old and contained apolitical messages.

The researchers found that the big tech companies were not equally bad at removing manipulation. Twitter identified and removed more than the others, the researchers found; on average, half the likes and retweets bought on Twitter were eventually removed, they said.

Facebook, the world’s largest social network, was best at blocking the creation of accounts under false pretences, but it rarely took content down.

Instagram, which Facebook owns, was the easiest and cheapest to manipulate. The researchers found YouTube the worst at removing inauthentic accounts and the most expensive to manipulate.

The researchers reported 100 accounts used for manipulation in their test to each of the social media companies. YouTube was the only one that did not suspend any, and it provided no explanation.

Samantha Bradshaw, a researcher at the Oxford Internet Institute, a department at Oxford University, said easy social media manipulation could have implications for European elections this year and the 2020 presidential election in the United States.

“Fake engagement — whether generated by automated or real accounts — can skew the perceived popularity of a candidate or issue,” Bradshaw said. “If these strategies are used to amplify disinformation, conspiracy and intolerance, social media could exacerbate the polarisation and distrust that exist within society.”

Bradshaw, who reviewed the report independently, said the reason accounts might not have been taken down was that “they could belong to real people, where individuals are paid a small amount of money for liking or sharing posts”. This strategy, she pointed out, makes it much harder for the platforms to take action.

Still, she said the companies could do more to track and monitor accounts associated with manipulation services. And the companies could suspend or remove the accounts after several instances of suspicious activity to diminish inauthentic behaviour.

“Examining fake engagement is important because accounts don’t have to be fake to pollute the information environment,” Bradshaw said. “Real people can use real accounts to produce inauthentic behaviour that skews online discourse and generates virality.”

People on their phones in New York.

The Facebook campus in California.
