Photo credit: Pixabay

For decades, media literacy meant learning how to read between the lines: understanding how television, radio and newspapers shaped public opinion. It taught us to question headlines, recognise bias, distinguish facts from opinions and defend ourselves against misinformation and propaganda. This core mission of teaching critical and creative thinking remains just as relevant today. And in a way, from a distance, that was the easy part.

But today the environment has changed: we no longer live only in a world of traditional or even digital media. We now live in an AI-shaped information space, where the line between reality and fabrication is increasingly thin. People are no longer just consumers of content; they are also producers, amplifiers and, quite often, unintentional influencers.

This is why media literacy cannot remain static: it must evolve.

From critical reading to active responsibility

In the early digital era, media literacy focused on navigating online news, social media platforms and basic digital skills. Users learned how to check sources, recognise clickbait and understand how platforms prioritise content.

Today, in the AI era, the challenge is different.

AI tools allow anyone to generate images, videos, voices and entire narratives within seconds. Content is no longer just shared; it is manufactured at scale. This means that every user carries more responsibility than ever before. We are not only interpreting information; we are shaping it.

Media literacy therefore needs to evolve into what we might call AI-aware media literacy. Not a replacement, but an upgrade.

The new challenges we did not face five years ago

The first major challenge is deepfakes.

Deepfakes are AI-generated images, videos or audio recordings that convincingly imitate real people and events: a politician announcing a policy they never agreed to, a friend’s voice saying words they never spoke, or an explicit scene that never happened. They are problematic not only because they can spread false information, but because they undermine trust itself. When people can no longer tell whether a video is real or fabricated, doubt becomes the default reaction to everything.

This leads directly to the second challenge: erosion of trust.

Disinformation and misinformation do not aim only to deceive. Their deeper goal is to weaken trust in institutions, democratic processes, journalism, science, and even the idea of shared truth. Democracy is rarely fast, flashy or algorithm-friendly. It relies on procedures, verification, debate and accountability. When trust in these processes collapses, so do social cohesion and confidence in freedom and democracy.

The third challenge is speed and volume.

In the pre-AI era, misinformation spread quickly. In the AI era, it spreads at the speed of light. Text, images, videos and narratives can be generated, translated, personalised and disseminated instantly. At the same time, that same speed can be put to work producing credible information. The challenge is not the technology itself, but whether people have the skills to navigate it. This is where evolved media literacy becomes essential.


What media literacy education must look like today

First, it must include a basic understanding of AI and algorithms.

This does not mean turning everyone into programmers. It means helping people understand how algorithms shape what they see, why certain content reaches them, and how AI systems can be misused. Knowing when technology supports creativity and learning, and when it manipulates or deceives, is now a core civic skill.
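To make the idea concrete, here is a deliberately simplified, hypothetical sketch (not any real platform's algorithm) of how a feed that optimises purely for predicted engagement decides what users see. All names and weights are invented for illustration; the point is that accuracy never enters the score.

```python
# Hypothetical engagement-only feed ranker (illustration, not a real platform).
# Posts that provoke strong reactions rise to the top, regardless of accuracy.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    comments: int
    minutes_old: int

def engagement_score(p: Post) -> float:
    # Invented weights: shares count most; recency boosts the score.
    raw = p.likes + 3 * p.shares + 2 * p.comments
    return raw / (1 + p.minutes_old / 60)

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", likes=40, shares=2, comments=5, minutes_old=30),
    Post("Outrageous (false) claim", likes=35, shares=30, comments=40, minutes_old=30),
])
print([p.title for p in feed])  # the provocative post ranks first
```

Even in this toy version, the ranking rewards provocation over substance, which is exactly the dynamic an AI-aware reader should learn to recognise.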

Second, media literacy must focus on systematic verification.

Spotting obvious red flags is no longer enough. People need practical skills: reverse image searches, video verification, lateral reading, cross-referencing sources and understanding metadata. These are not expert-level tools anymore; they are basic survival skills in the information ecosystem.
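One idea behind reverse image search is perceptual hashing: similar images produce similar fingerprints even after recompression or mild edits. The sketch below is a toy "average hash" over tiny grayscale grids, kept self-contained for illustration; real tools work on full decoded images.

```python
# Toy average-hash sketch: the concept behind reverse image search.
# Real systems decode full images; here small grayscale grids stand in.

def average_hash(pixels: list[list[int]]) -> int:
    """Build a bit fingerprint: 1 where a pixel is brighter than average."""
    flat = [v for row in pixels for v in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > avg else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits: small distance means visually similar images."""
    return bin(a ^ b).count("1")

original = [[10, 200], [220, 30]]
slightly_edited = [[12, 198], [221, 28]]   # e.g. a recompressed copy
different = [[200, 10], [30, 220]]

h0, h1, h2 = map(average_hash, (original, slightly_edited, different))
print(hamming(h0, h1), hamming(h0, h2))  # edited copy matches; different image does not
```

The recompressed copy hashes to the same fingerprint as the original, while the genuinely different image is far away, which is why such fingerprints help trace where an image has appeared before.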

Third, education must clearly address the ethical use of AI.

Using AI is not the problem; it is an advantage. Using it unethically is the real problem.

AI can support learning, creativity and productivity when it assists human thinking. But when it replaces thinking entirely, writing for us, reading for us or even deciding for us, it undermines human agency. Teaching young people the difference between assistance and substitution is crucial. Ethical AI literacy is inseparable from media literacy today.

Media literacy as the foundation of human-centred technology

Technology has always evolved. We moved from fax machines and wired phones to smartphones that connect us to the world, help us find jobs, organise travel, build communities and express creativity. Social media evolved from early platforms like MySpace to global spaces where ideas, activist movements and identities are formed.

That is why media literacy must evolve in the same way.

You can read all the books. You can learn and memorise scientific facts. But without the ability to place that knowledge in today’s digital and AI-driven reality, it remains oddly disconnected from real life.

True literacy today means knowing how to use information responsibly, ethically and with people at the centre.

As someone working at the intersection of strategic communication, digital transformation and AI literacy (often being in rooms where fear of technology is louder than facts), I strongly believe that this evolution is not optional. It is the foundation of democratic resilience, personal agency and inclusive progress.

If we get this right, no one is left behind. Not children, not adults, not future generations. Because media literacy, evolved for the AI era, is not about fear. It is about empowerment.

And that is the skill that will decide something simple but fundamental: whether technology serves people, or people quietly end up shaped by technology instead.

