As one of the most complex and misunderstood political entities in the world, the EU is frequently the subject of distorted narratives, viral falsehoods, and AI-generated manipulation. Most of this disinformation doesn’t even arrive as bold political accusations. It hides in everyday claims — about food safety, digital rights, environmental rules, or even what currency will look like in five years. For young people growing up online, many of whom are not yet voters but are hyper-engaged on social media, the danger isn’t just being misinformed. It’s being misled without even realising it.

In early 2024, a manipulated video began circulating widely on social media. It appeared to show a France 24 journalist reporting that Ukraine had planned an assassination attempt against French President Emmanuel Macron. The footage looked convincing at first glance — the branding was familiar, the delivery calm — but the entire clip was a deepfake. AI-generated visuals and a synthetic voice had been used to impersonate the journalist and fabricate the story. The video spread quickly across platforms like X (formerly Twitter) and Telegram before being flagged and debunked by fact-checkers. Although it was removed, its brief lifespan highlighted how easily synthetic content can be used to undermine trust and inject confusion into the European information space.

Photo: Virginia Kouridaki ©

A separate but equally concerning campaign was uncovered around the same time. Known as “Doppelgänger,” it involved cloned websites designed to mimic official EU pages and reputable news outlets. These fake sites published false stories with a distinctly institutional tone, such as claims that the EU would replace cash with a digital euro by 2026 or that Erasmus+ funds were being diverted to finance military operations. One article even claimed a new “click tax” was coming for social media users. While these stories were fabricated, the sites’ visual design — including EU logos, formatting, and language — made them appear authentic. The aim wasn’t to shock but to subtly mislead, preying on the trust users place in familiar formats (EU DisinfoLab, 2023; European Commission, 2023a).

These are not isolated incidents. According to the European Digital Media Observatory (EDMO), false content related to the EU increased by 35% between 2022 and 2024 — with the sharpest upticks before major announcements or events such as elections and policy summits (EDMO, 2024). The majority of this misinformation circulates through short-form videos and screenshots, not long articles or manifestos. Its strength lies in being visual, immediate, and, above all, emotionally charged.

How is the EU responding?

At the heart of Europe’s new digital playbook lies the Digital Services Act, or DSA — a sweeping piece of legislation that entered into force in late 2022 and began applying to the largest platforms in 2023. Widely regarded as one of the boldest efforts yet to rein in the power of online platforms, the DSA sets out clear responsibilities for tech giants like Meta, TikTok, YouTube, and X. The goal? To ensure they take real, proactive steps to tackle the spread of disinformation and manage the risks that come with it. It requires these platforms to explain how their algorithms amplify or limit content and to be far more transparent about how their systems work. The idea isn’t just to punish bad behaviour, but to create a clearer, more accountable online environment. Under the DSA, these platforms are legally obligated to remove illegal content quickly, cooperate with independent auditors, and allow researchers to access data to monitor disinformation flows (European Commission, 2023b). Failure to comply can lead to fines of up to 6% of a company’s global annual turnover — a deterrent designed to make tech giants take the issue seriously (European Union, 2022).

A more subtle yet crucial initiative is the updated Code of Practice on Disinformation, revised in 2022 and signed by over 40 stakeholders, including major tech companies, NGOs, and fact-checking organisations. This isn’t just another empty “pledge.” Under the updated code, signatories must regularly report on how they are reducing the monetisation of disinformation, increasing transparency on political ads, and supporting independent fact-checking across languages and regions. In 2024, the European Commission reported that engagement with fact-checked content had increased by 27% compared to 2022, suggesting the code is beginning to shift platform behaviour (European Commission, 2024).

The fight, however, doesn’t stop at the platform level. In the run-up to the 2024 EU elections, the EU institutions kicked off a public awareness push under the slogan “#EUandMe: No Lies, Just Facts.” It was a cross-continent campaign aimed at debunking some of the most stubborn myths surrounding EU policies. With young people as the key target, the initiative rolled out content in 24 languages, tapping into formats that feel familiar — from interactive quizzes and bite-sized videos to collabs with TikTok creators who already speak the language of their audiences. The idea was straightforward: show up where people spend their time online, and cut through the noise with facts that actually stick.

It’s not the first time the EU has had to play myth-buster, either. In the past, even the European People’s Party Group had to step in to squash one of the most absurd rumours to ever go viral: that the EU was forcing all bananas to be perfectly straight. Yes, really. That surreal little gem of misinformation just wouldn’t die — proof, perhaps, that sometimes the biggest policy challenge is fighting fiction with fact and a straight face.

Photo: EPP Group in the European Parliament ©

Perhaps most impressively, the EU is pioneering open-source AI detection tools that allow journalists and civil society groups to identify deepfakes and content tampering. This move, introduced under the EU’s Artificial Intelligence Act, puts Europe at the global forefront of regulating synthetic media. While generative AI tools like ChatGPT and image manipulation software pose new challenges, the EU’s approach is not to ban them outright — but to ensure transparency in how they’re used. Labels on AI-generated content and digital watermarking are expected to become mandatory for platforms operating in the EU in the coming year (European Commission, 2025).

Why this matters for Gen Z

You don’t need to be a policymaker or a political science student to care about how fake information shapes your reality. You just need to be online — and chances are, you already are. Whether it’s deciding who to vote for, whether to trust institutions, or simply trying to figure out what the EU actually does, the stories you’re told shape the choices you make.

Misinformation isn’t just a technical glitch in the system. It’s the system being used against itself. And the EU — for all its bureaucracy and acronyms — is currently one of the few political actors trying to do something coherent, ambitious, and enforceable about it.


References

EDMO. (2024). Trends in EU Disinformation 2022–2024. https://edmo.eu/publications/trends-in-disinformation/
EU DisinfoLab. (2023, July 13). Doppelgänger: A pro-Russian influence operation using fake news sites to mimic European media. https://www.disinfo.eu/publications/doppelganger-a-pro-russian-influence-operation-using-fake-news-sites-to-mimic-european-media/
European Commission. (2023a). Protect yourself from disinformation. https://commission.europa.eu/topics/countering-information-manipulation/protect-yourself-disinformation_en
European Commission. (2023b). The Digital Services Act. https://commission.europa.eu/publications/digital-services-act_en
European Commission. (2024). Code of Practice on Disinformation: Progress Report. https://commission.europa.eu/publications/code-practice-disinformation-progress_en
European Commission. (2025). AI and disinformation: Safeguarding democratic debate. https://commission.europa.eu/publications/ai-and-disinformation_en
European People’s Party Group. (2016, July 17). #EUmythbusting campaign photo [Photograph]. Facebook. https://www.facebook.com/photo?fbid=10153826530782689&set=a.397242497688
European Union. (2022). Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act). Official Journal of the European Union, L 277, 1–102. https://www.eu-digital-services-act.com/Digital_Services_Act_Article_74.html
France 24. (2024, February 15). France 24 journalist impersonated in new deepfake video – Truth or Fake [Video]. YouTube. https://www.youtube.com/watch?v=yXcp_h5ugTQ

Shape the conversation

Do you have anything to add to this story? Any ideas for interviews or angles we should explore? Let us know if you’d like to write a follow-up, a counterpoint, or share a similar story.