Even without a flood of deepfakes, disinformation still overshadowed this year's EP elections

International experts have assessed how false or misleading information that may have reached mass audiences on major social media platforms influenced the EP election results.

AI-generated disinformation reached unprecedented proportions during the parliamentary election campaign in Slovakia last autumn and the subsequent elections in Poland. As a result, much of the European public had good reason to expect that deepfake videos and audio recordings would be used to influence election results in the months and weeks leading up to the EP elections in June 2024.

This time, however, the expected flood of deepfakes failed to materialise.

This does not mean, however, that disinformation - the systematic flooding of voters with false, misleading or deliberately divisive information, especially online - did not once again emerge as a serious challenge in this election campaign, international experts concluded at the closing conference of the joint project on electoral disinformation run by Lakmusz, the Mérték Media Analysis Workshop and Political Capital. The expert panel consisted of:

  • Krisztina Stump, Head of the Media Convergence and Community Media Unit of the European Commission's DG for Communications Networks, Content and Technology,
  • Paolo Cesarini, Professor at the European University Institute (EUI), Programme Director of the European Digital Media Observatory (EDMO) and the European Media and Information Fund (EMIF),
  • and Michal Šenk, researcher at Charles University in Prague and the Central European Digital Media Observatory (CEDMO).

Photo by Political Capital/Facebook

From the very beginning of the panel discussion, there was a consensus that

we know much more about how disinformation works compared to the previous, 2019 EP elections.

In recent years, a joint effort by researchers and fact-checkers has led to the development of EU-level signalling systems and repositories (such as the EE24 database) to track damaging disinformation narratives spreading in the European public sphere, for example ahead of an election. This signalling system also allowed Lakmusz to show how false information about the assassination attempt against Robert Fico spread across Europe.

But we also have a much clearer picture of which actors - often from outside the EU - are most active in spreading disinformation. (This topic has also been covered in a number of articles at Lakmusz, collected here.)

The tone was less unanimous, however, when the discussion turned to how much progress has been made in the fight against disinformation, which both the global and the EU public clearly perceive as a threat, as recent Ipsos and Eurobarometer surveys have shown.

Krisztina Stump pointed out that the EU has done a lot in recent years to get the biggest social media companies to cooperate and do more on their platforms to fight the spread of disinformation. Before this year's EP elections, big tech companies such as Meta, TikTok, Google and Microsoft had already made various commitments to fight disinformation under the EU's Digital Services Act and Code of Practice on Disinformation.

Participants in the panel discussion on Social Media Platforms and Election Campaigns, from left to right: Blanka Zöldi (moderator), Paolo Cesarini, Krisztina Stump, Michal Šenk. Photo by Political Capital

The other two participants, Paolo Cesarini and Michal Šenk, warned, however, that it is still difficult to judge how much the platforms, which do have a serious responsibility to fight disinformation, are actually willing to do for this cause.

"There is definitely room for improvement,"

said Cesarini, citing a recent study by Spanish fact-checking portal Maldita, which found that TikTok, Meta, Google and X combined

in 45 percent of cases took no visible action on content circulating on their platforms that fact-checkers had flagged as false, misleading or out of context before the elections.

"But even so, I would refrain from any condemnation of the performance of the platforms," said Cesarini, adding that it will take some time to see how well some of the economic incentives and sanctions built into the regulation work. Under the DSA, for example, platforms have to pay fines of up to 6 percent of their global turnover if the European Commission finds they have breached any of their obligations under the DSA.

Šenk pointed out that the resources devoted to content moderation vary widely from one platform company to another. In many cases, moderation is left to employees who do not speak the local language and are severely underpaid, which may explain the poor response rate seen in Maldita's analysis.

The Czech researcher, when asked by the moderator, listed some of the themes that appeared frequently in the EP election campaigns in the Central European countries he has been following closely (Slovakia, Poland and the Czech Republic). However, he said that these did not differ much from the general topics that fact-checkers working in other EU countries had to debunk. Rather than simply listing them, we therefore recommend browsing the Elections24Check fact-checking database, where a coalition of more than 40 European fact-checking organisations has compiled the disinformation content addressed in the months running up to the EP elections. Issues include climate change, war, migration, energy security and EU institutions. Lakmusz was part of this cooperation.

The full panel discussion, along with the other presentations and panel discussions of the conference, can be watched on YouTube:

Neuberger Eszter
She has worked as a journalist for nearly ten years, during which time she has been on the editorial staffs of hvg.hu, Abcúg (which closed at the end of 2019) and 444. Before moving to fact-checking, she most often published articles on education, healthcare and social affairs.