EuroNews (English)

Disinformation probes - too late to make a difference?

- Cynthia Kroet

Meta’s Facebook and Instagram last week (30 April) became subject to an investigation under the EU’s online platform rules for seemingly lacking the robustness to counter Russian disinformation, the latest in a series of probes into platforms’ compliance with the Digital Services Act (DSA).

Baltic and Nordic NGOs and researchers have painted a bleak picture of big tech's efforts to stop pro-Kremlin trolls, and doubt whether these probes will put a halt to ongoing disinformation campaigns aimed at undermining NATO and the EU while trying to bolster Russia’s credibility.

Faktabaari, a Finnish fact-checking and digital literacy service, has been tackling disinformation since the 2014 EU elections. Its founder Mikko Salo told Euronews that the EU has done a lot of pioneering work in this field, but said that these DSA probes are “a bit late” to make a change ahead of the European elections in June. “I believe these proceedings will take time and elections are taking place in a month. They are, however, useful to enforce current policies and build even better ones for the 2025-2029 mandate and to safeguard the upcoming US elections,” Salo said, assuming that the US would look to the EU for some best practices on election integrity.

Lukas Andriukaitis, one of the co-founders of the Civic Resilience Initiative (CRI), a Lithuanian NGO aimed at stopping disinformation, echoes these comments.

“It’s a bit of a double-edged sword; we have been supporting the DSA from the very start, as I think this is a very effective, if not the only, way to motivate the social media platforms to take action,” Andriukaitis said.

He added that the NGO community has been ringing the alarm bells about what is happening with online platforms for a while. “Despite the platforms putting in some effort, it was not enough, and they definitely focused on showing themselves in a positive light. We have seen instances in Lithuania, where the government had harsh semi-public hearings with Meta, after which some positive changes took place,” he said.

EU efforts

Figures published last year by polling agency Ipsos show that levels of disinformation from Russia have risen steeply in recent years, in particular within the Baltic states - Estonia, Lithuania and Latvia - where every fourth citizen says they are heavily exposed to disinformation.

Finland, for example, saw a lot of fake news around its NATO membership - prior to joining the Alliance in April 2023 - mostly trying to convince users that Russia does not present a threat to Finland, Disinfo Lab said in a report.

Solvita Denisa-Liepniece, a Latvian researcher in the field of cognitive security, told Euronews that it “took a decade to understand how to deal with Big Tech”, and while there is progress in regulating “Western” tech giants, there have meanwhile been significant changes in social media consumption.

“In the Baltic States, for example, we observe an increased use of TikTok and Telegram, across different ages and audience profiles,” she said.

During the mandate of the von der Leyen Commission, there have been several efforts to clamp down on misinformation campaigns. An example is the strengthened Code of Practice on Disinformation, which the Commission set up in 2022 and which was signed by 34 companies - including Google and TikTok - that commit to tackling fake information online.

Another attempt to strengthen the basic conditions for free and independent media - the European Media Freedom Act - was greenlighted by EU lawmakers in March. The Act will oblige EU governments to better protect media against malign interference and limit the use of spyware against journalists. Lithuania’s Culture Minister Simonas Kairys stressed back in November the need for rules as the European information space is under “intense attack every day and every hour by Russian war propaganda and disinformation.”

DSA

The Digital Services Act (DSA) appears to be providing an effective remedy, given the six Commission investigations already launched into platforms’ non-compliance since the rules started applying in August.

Under the DSA, companies designated as a Very Large Online Platform (VLOP) - those with more than 45 million monthly average users in the EU - must abide by strict rules, such as transparency requirements and the protection of minors online. Besides the Meta probes, X is also subject to an investigation over its handling of content related to the Israel-Hamas war.

Faktabaari’s Salo said that while these EU actions have led to more awareness of disinformation, the fake news challenge itself has also grown through rapid technological developments such as generative AI.

“Russian full-scale invasion in Ukraine has 'unveiled the masks' in many countries, enabling better and more direct discussions on relevant issues including weaponization of the social media platforms by foreign and domestic actors. We should closely keep track of this, also independently of state actors,” Salo said.

He added that the proof of the pudding will be in the implementation strategy. “It’s important to keep up the pressure, raise citizen awareness and to keep the platforms accountable for their actions and words,” he said.

Latvian researcher Denisa-Liepniece said that mitigation should also be comprehensive. “We should not only include efforts to limit disinformation, but also go beyond fact-checking and focus on strengthening people’s understanding of information processing,” she said.

Content moderators

In a bid to counter fake news, Meta already works with independent fact-checking organisations across Europe. In Finland, for example, it partners with press agency AFP, and in Estonia, Latvia and Lithuania with Delfi and Re:Baltica.

Ahead of the June vote, the tech giant said that it was setting up its own operations centre for the elections “to identify potential threats and put mitigations in place in real time”. In a separate statement, Facebook’s parent company said that it planned to start labelling AI-generated content in May 2024.

However, the number of content moderators for Baltic and Nordic languages remains low, despite the threat of Russian disinformation. Facebook has three employees looking at content in Estonian, two at Latvian, six at Lithuanian and 15 at Finnish, claiming that a lot of the process is automated. By comparison, the platform has about 226 people looking at French, 54 at Dutch and 242 at German content.

A recent (1 May) report published by the independent fact-checking organisation European Digital Media Observatory indicates that Russia’s disinformation campaigns are still very much alive.

It alleged a wide-ranging Russian disinformation campaign run under the former news media brand Pravda, with websites in local EU languages citing state-owned media such as Tass or RIA and often quoting pro-Russian Telegram accounts. The websites post hundreds of articles an hour through AI-based models, according to the report.

Meta said in its EU election preparation statement that it signed the industry-wide tech accord, alongside companies such as Google, Amazon and Snapchat, to combat the spread of deceptive AI content in the 2024 elections. “This work is bigger than any one company and will require a huge effort across industry, government, and civil society,” Meta said.

Facebook CEO Mark Zuckerberg at a meeting at the European Commission in 2020.
