The tactics of Russian trolls should worry us
Carte blanche by Péter Krekó, Csaba Molnár and Lóránt Győri of the Budapest-based think tank Political Capital.
Troll armies have become a staple of the Kremlin’s disinformation playbook. They first came to broad public attention in 2016, when the Internet Research Agency (IRA) of Putin confidant Yevgeny Prigozhin recruited thousands of people at a “troll farm” in St. Petersburg to interfere in key elections, including the US presidential race between Donald Trump and Hillary Clinton. The trolls are now alive and well as part of Russia’s aggression against Ukraine, and all the less conspicuous since the geoblocking of the main disinformation outlets, Sputnik and RT, whose content had been distributed on the world’s biggest social media platforms: Twitter, Facebook and YouTube.
Research published by the British government in May 2022 showed that Russia expanded its bot and troll army after invading Ukraine on February 24, 2022. A new army of trolls linked to Prigozhin, the founder of the Russian Wagner mercenary group, has entered the theatre of war. Besides targeting major media outlets and politicians, such as British Prime Minister Boris Johnson and German Chancellor Olaf Scholz, the trolls have also focused on manipulating public opinion by planting disinformation in the comment sections of various social media platforms (Facebook, Twitter, TikTok, Telegram). Meta researchers showed in August that the war trolls of Cyber Front Z were linked to a troll factory run by Prigozhin, and a study published on November 6 found that social media accounts once linked to the IRA had come out of “winter mode” and were actively attacking President Biden’s handling of the Ukraine crisis ahead of the midterm elections.
Using a combination of algorithm-based text mining and qualitative analysis, our think tank, Political Capital, examined pro-Kremlin troll activity around the war in Ukraine in the V4 countries (Hungary, Poland, the Czech Republic and Slovakia) as well as in Germany, Italy and Romania. Specific posting patterns, such as reusing stock photos and verbatim reposting of the same comment in Facebook threads, revealed the trolls’ inauthentic behaviour. Our team examined comments of at least five words that were posted at least 200 times on social media channels and reached a number of conclusions. First, our study of the V4 countries found marked country-specific differences in activities and narratives. In Hungary and the Czech Republic, for example, we found a large number of identical posts aligned with pro-Kremlin narratives. Of the five most widespread narratives in the two countries, three claimed that (1) Ukraine is committing genocide in the Donbas, (2) neo-Nazis have taken over Ukraine, or (3) Ukraine is not a real state. In Poland such a tactic would not work, given the widespread resentment of Russia there. The posts therefore sought to stoke geopolitical insecurity by suggesting that the ruling PiS party is mismanaging national security and that cooperation with NATO could drag Poland into the war. In Germany, the trolling effort aims to fuel a sense of guilt in public opinion. The dominant narrative there also sought to recast the war as a conflict between Russia and the West (the US and NATO), emphasising that the West had broken its promises to the Soviet Union and Russia regarding NATO expansion.
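To give a sense of what the text-mining step looks like in practice, the core filter described above, flagging comment texts of at least five words that recur verbatim at least 200 times, can be sketched in a few lines. This is a minimal illustration under our own assumptions, not the actual pipeline; the function name, the whitespace-and-case normalisation and the toy corpus are all illustrative:

```python
from collections import Counter

def find_repeated_comments(comments, min_words=5, min_count=200):
    """Return verbatim-repeated comment texts: at least `min_words`
    words long and appearing at least `min_count` times."""
    # Normalise case and whitespace so trivially re-typed copies still match.
    normalized = (" ".join(c.lower().split()) for c in comments)
    counts = Counter(t for t in normalized if len(t.split()) >= min_words)
    return {text: n for text, n in counts.items() if n >= min_count}

# Toy corpus: one coordinated message repeated far more often than organic chatter.
corpus = (["NATO broke its promises to Russia years ago"] * 250
          + ["interesting article", "thanks for sharing"] * 50)
flagged = find_repeated_comments(corpus, min_words=5, min_count=200)
```

Short organic comments fall below the word threshold, while the coordinated message clears both thresholds and is flagged with its repeat count.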
Second, we found that many fake narratives start life in Moscow. Three messages repeated by trolls in Hungary were easily traced back. These were: “No Ukraine”; “NATO’s new dictatorial world order”; and “Eight years of genocide in the Donbas”. The first of these originated with an organisation linked to the pro-Putin Ukrainian oligarch Viktor Medvedchuk and listed a separatist “news agency” as its source. An AFP investigation revealed that the same content circulated in Greek, German, English and Bulgarian.
In some regions the flow of messages from the Kremlin has also increased. Our investigation uncovered Kremlin disinformation under the posts of popular media outlets with large audiences in their home countries, including RTL, RTL Aktuell, Sat1 and ZDF Heute. In some countries, sympathetic political administrations have paved the way for the trolls. In Hungary, for example, pro-government mainstream media falsely report a “genocide” or “ethnocide” against the Russian or Hungarian minorities, citing a diplomatic dispute with Budapest. Politicians of the German AfD, the Social Democrats in Romania, Robert Fico in Slovakia and the Tricolour party in the Czech Republic have also given oxygen to Kremlin narratives.
…and their mistakes
Ultimately, our efforts to identify troll activity were aided by the trolls’ own mistakes. In several cases, for instance, we saw supposedly Slovak Facebook users commenting in Hungarian on Czech Facebook pages, Italian profiles commenting in Italian on Colombian Facebook pages, and so on. Such errors suggest that the Russian operators behind these profiles forgot to switch accounts before moving on to another country. Equally telling were profiles that shared both overtly pro-Kremlin and anti-Kremlin narratives. We also found that fake and hijacked profiles are the most common vehicles for spreading these narratives.
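Language mismatches of the kind described above can in principle be caught automatically. The sketch below is purely illustrative and not our actual method (which combined text mining with qualitative review): it guesses a comment’s language from tiny hand-picked stop-word sets, where a real system would use a proper language-identification model, and flags comments whose guessed language differs from the language of the page they appear on:

```python
# Toy stop-word profiles; illustrative only, far too small for real use.
PROFILES = {
    "hu": {"és", "hogy", "nem", "az", "egy", "van", "igaz"},
    "cs": {"je", "se", "na", "to", "že", "ale", "pravda"},
}

def guess_language(text, profiles):
    """Pick the language whose stop-word set overlaps the comment most."""
    words = set(text.lower().split())
    best, score = None, 0
    for lang, stops in profiles.items():
        hits = len(words & stops)
        if hits > score:
            best, score = lang, hits
    return best  # None if no profile matched at all

def flag_mismatches(page_lang, comments):
    """Flag comments whose guessed language differs from the page's.
    Comments that match no profile are left alone rather than flagged."""
    return [c for c in comments
            if guess_language(c, PROFILES) not in (page_lang, None)]
```

A Hungarian-language comment posted on a Czech page would be flagged, while Czech comments on the same page pass through.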
What should the EU do?
The scale of Russian troll activity in Europe should worry political leaders and citizens alike. Inauthentic online influence operations are easy to set up and cheap to run.
It is therefore important that the EU adopts relevant legislation alongside the Digital Services Act and develops the technical capability to better recognise inauthentic behaviour online. In the end, however, much lies with the social media companies themselves and their appetite for dismantling opaque networks. European legislators should press these platforms, as a top priority, to respond quickly to demands to combat disinformation and, in addition, to deploy their own tools against trolling activity on their platforms.