Russian propaganda in Europe has intensified since the outbreak of war in Ukraine.
Written by Nazar Hlamazda | Gwara Media
In 2024, the EU held its European Parliament elections. The centre-right, led by Ursula von der Leyen, secured the most seats and claimed victory. However, it was also the most successful year for the far right, with various Eurosceptic fringe groups winning 220 out of 720 seats—roughly 30.6% of the total in the European Parliament.
The result was particularly significant for the far right, especially Marine Le Pen’s National Rally. Her party won the European elections in France, securing twice as many votes as President Macron’s Ensemble and prompting him to call early parliamentary elections. Le Pen is known for her pro-Russian stance in international affairs: she has claimed that annexed Crimea belongs to Russia and has repeatedly urged an end to military aid for Ukraine.
In an opinion piece for The Hill, American lawyer Gregory Wallance observed that Russia has shifted its focus from the far left to the far right, using the latter as a vehicle for propaganda in U.S. politics. Research by the International Centre for Counter-Terrorism further indicates that Russia funds far-right parties across Europe to expand its influence and undermine support for Ukraine.
Russian propaganda in Europe
In June 2024, The Washington Post published an investigation revealing a large-scale Kremlin-backed influence operation conducted through the far-right media outlet Voice of Europe. Officially registered in Prague, the platform was used to funnel funds to far-right politicians across Europe, promoting pro-Russian narratives and eroding support for Ukraine.
The Post’s investigation cites European intelligence reports linking Artem Marchevsky—a former Ukrainian television producer with ties to pro-Russian politician Viktor Medvedchuk—to a coordinated influence campaign. Marchevsky reportedly held meetings with far-right representatives in Germany, France, Poland, the Netherlands, and other EU countries, offering monthly payments of up to €1 million in exchange for amplifying pro-Russian narratives through media and public appearances.
Maksym Vikhrov, Senior Analyst at Ukraine’s Centre for Strategic Communications and Information Security, told Gwara Media that platforms like X (formerly Twitter), Telegram, TikTok, Facebook, and YouTube are especially susceptible to Russian disinformation. Unlike traditional media with editorial oversight, these platforms struggle to moderate the overwhelming flow of content. Telegram, in particular, serves as a direct channel for Russian intelligence-led information operations targeting both Ukrainian and European audiences.
Vikhrov also noted that X hosts sprawling multilingual networks of Russian-linked accounts and “bot armies,” while TikTok’s rapid-fire video format leaves little room for critical analysis, making it fertile ground for manipulation. Beyond social media, Russia has developed a vast disinformation ecosystem, including hundreds of websites, in an attempt to construct an “alternative internet” under Moscow’s control.
In 2023, a European Commission study highlighted X's role in amplifying Russian disinformation about Ukraine. Russia has since ramped up its efforts on the platform, pushing narratives about Ukraine's supposed loss of sovereignty and discrediting Western support.
Gwara Media fact-check editor Olga Yakovlieva shares examples of how Russia uses X as a propaganda tool. An anonymous X user posted a fake claim that Hollywood stars had been paid for their visits to Ukraine; it gained traction after being amplified by Elon Musk and Donald Trump Jr. Russian propaganda channels, some with over 100K followers, quickly picked it up, pushing the fake to hundreds of thousands of viewers.
Similarly, the false claim that Zelenskyy bought a private bank in France originated from an anonymous X user and was then spread by Russian pro-Kremlin media such as eNews and news.ru.
Ukrainian state against Russian propaganda
Alina Bondarchuk, deputy head of Ukraine’s Center for Countering Disinformation, told Deutsche Welle that the fight against disinformation in Ukraine is coordinated at the state level. Since 2021, the Center—operating under the National Security and Defense Council—has been actively debunking Russian falsehoods and tracking their spread across social media platforms.
"We have established a partnership with Google. We provide expert assessments of narratives circulating on various channels. YouTube reviews these cases as a Google subsidiary and decides to block content. Over 200 channels have already been blocked."
Information security expert Andriy Bidenko noted that Ukraine has been effective in countering Russian propaganda, thanks in large part to the work of strategic communications centers. He emphasised that Ukrainians have developed a strong resilience to crude and simplistic propaganda, which now rarely reaches them and is virtually absent from domestic media and social networks.
An essential part of Ukraine’s effort to combat Russian propaganda involves broadcasting Ukrainian content to occupied territories. On July 5, 2024, the Zaporizhzhia regional administration launched a TV network offering free access to nine Ukrainian channels, including in areas under Russian control. Vladyslav Moroko, head of the regional Department of Culture and Information Policy, unveiled a roadmap to restore Ukrainian broadcasting starting from Zaporizhzhia. The network is expected to eventually cover 90% of the country’s territory.
Maksym Vikhrov concludes, “Ukraine has amassed significant experience in countering Russian FIMI (Foreign Information Manipulation and Interference) influences, developed through real-world challenges.” He explained that Ukraine's tools evolved not from theoretical frameworks but from urgent practical needs.
However, Vikhrov cautioned that existing tools will never suffice—neither in Ukraine nor globally—since Russian propaganda also adapts and adopts new technologies.
For instance, he noted that Moscow currently aims to "poison" large language models like ChatGPT with its narratives. At the same time, Moscow is promoting an alternative "fact-checking" system globally (the Russian project "War on Fakes," for instance, simply pushes Russian narratives through assertive claims and links to Russian sources under the guise of fact-checking — ed.). While resembling legitimate fact-checking in form, this system "debunks" facts by referencing statements from figures like Lavrov.
Fact-checking tips from Ukraine
Olga Yakovlieva said that cooperation between newsrooms and knowledge sharing are needed to address the disinformation problem worldwide. Most fakes spread in several countries at once; the Russian network Pravda, for example, creates fakes in Russian and then translates them into other languages.
Knowledge sharing, Yakovlieva notes, helps fact-checkers spot patterns of disinformation, recognize and debunk manipulations faster, and build an ecosystem of experts and media workers to counter disinformation.
Kyrylo Perevoshchykov, an analyst at the independent fact-checking project VoxCheck, described the algorithm his team uses to verify information in an interview with Detector Media.
First, they verify who spread the fake information and whether the source—a blogger, politician, media outlet, or other organization—has previously disseminated disinformation. The next step is to search for other publications with similar content to identify variations or "mutations" of the fake, ensuring the fact-check addresses all false claims comprehensively.
Depending on the case, VoxCheck verifies whether statements, news, laws, scientific articles, statistical studies, or investigations are accurately represented in the publication or manipulated. If the fake includes photos or videos, experts check whether they are authentic.
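A rough way to picture this workflow is as a per-claim checklist. The sketch below is purely illustrative and does not reflect VoxCheck's actual tooling; the `Claim` fields and the `fact_check` helper are hypothetical names that simply mirror the steps described above.

```python
# Illustrative sketch only: not VoxCheck's tooling. All names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Claim:
    text: str
    source: str                      # blogger, politician, media outlet, etc.
    source_has_spread_fakes: bool    # step 1: the source's disinformation record
    similar_publications: list[str] = field(default_factory=list)  # step 2: "mutations" of the fake
    primary_sources_checked: bool = False  # step 3: statements, laws, studies, statistics
    media_verified: bool = False           # step 4: authenticity of photos and videos


def fact_check(claim: Claim) -> str:
    """Return the outstanding checks for a claim as a readable note."""
    notes = []
    if claim.source_has_spread_fakes:
        notes.append("source has previously spread disinformation")
    if claim.similar_publications:
        notes.append(f"address all {len(claim.similar_publications)} variations of the fake")
    if not claim.primary_sources_checked:
        notes.append("compare against primary sources (statements, laws, studies)")
    if not claim.media_verified:
        notes.append("verify authenticity of photos and videos")
    return "; ".join(notes) or "all checks passed"


# Example using one of the fakes mentioned above
print(fact_check(Claim(
    text="Zelenskyy bought a private bank in France",
    source="anonymous X account",
    source_has_spread_fakes=True,
)))
```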
"Fact-checkers' challenges are not significantly different from those encountered by other media professionals. These include handling large volumes of information that must be regularly analyzed and responding quickly to challenges and threats," Perevevoshcykov said.
Olga Yakovlieva recommends joining professional fact-checking networks such as the European Fact-Checking Standards Network (EFCSN) and the International Fact-Checking Network (IFCN), as well as OSINT communities like GeoConfirmed, the Bellingcat Community, and r/OSINT on Reddit.
“It’s very important to be part of a professional community — to keep learning, share knowledge, and avoid isolating yourself in a bubble.”
It’s also worth trying out newer fact-checking tools like InVID (video verification), Maltego (an OSINT platform), and Hunchly (a data-organization tool), Yakovlieva says. Investigations into fakes can then be published in open libraries like EUvsDisinfo, DFRLab, and OpenFact.
Yakovlieva added that psychological resilience must also be looked after, recommending an action plan for dealing with hate or attacks and a support team, such as the newsroom, legal advisors, or trusted colleagues.
"Fact-checking that deals with sensitive topics — such as war, politics, elections, or disinformation campaigns — can make you a technical and psychological target. Cyber hygiene and personal safety are just as crucial as fact-checking skills," Yakovlieva noted.
The project is co-financed by the governments of Czechia, Hungary, Poland and Slovakia through Visegrad Grants from the International Visegrad Fund. The mission of the fund is to advance ideas for sustainable regional cooperation in Central Europe.
The project is supported by the Ministry of Foreign Affairs of the Republic of Korea.