
Teenagers spend hours scrolling on social media | Source: European Union, 2020 – EC Audiovisual Services
On 14 July, the European Commission issued new guidelines under the Digital Services Act (DSA) and launched an EU-developed age verification tool designed to be compatible with the upcoming European Digital Identity Wallets.
Tech giants like TikTok, YouTube, Meta, and X – all classified as “Very Large Online Platforms” under the DSA – will now face stricter scrutiny to show how they protect minors, especially as member states roll out their own national rules.
Five European Union member states – Denmark, France, Greece, Italy, and Spain – are piloting a new age verification app to protect children online. In France, lawmakers have also proposed legislation to ban under-15s from accessing platforms like TikTok or Instagram without parental consent. Other EU member states, such as Ireland and Germany, may follow suit by introducing mandatory age verification mechanisms or minimum access ages for certain categories of digital services.
“These rules are about giving children a digital space that supports their wellbeing,” said Margrethe Vestager, Commission Executive Vice-President.
While non-binding, the new guidelines set clear expectations for how platforms should act – and the European Board for Digital Services will monitor compliance across the bloc.
Key recommendations include:
- Mandatory private profiles for minors by default
- Safer recommender systems to avoid harmful content and addictive loops
- Stronger tools to block, mute, and avoid group spam
- Bans on downloads/screenshots of minors’ posts to curb sexual exploitation
- Disabling engagement-driving features like streaks, autoplay, and push alerts
- Child-friendly terms, AI safeguards, and ad transparency
- Limits on manipulative monetisation tactics like loot boxes
The rules apply to all platforms accessible to minors, except those run by small and micro enterprises. With this move, the EU is once again setting the pace for global digital regulation.