MEPs are calling on the EU to take ambitious measures to protect minors online, including a minimum age of 16 for social media and a ban on the most harmful addictive practices.
On Wednesday, MEPs adopted a report by 483 votes in favour, 92 against and 86 abstentions. It expresses deep concern about the physical and mental health risks minors face in the digital environment and calls for greater protection against manipulative strategies that can fuel addiction and impair minors’ ability to concentrate and engage healthily with online content.
Minimum age for social networking
To help parents manage their children’s digital presence and ensure their online activity is age-appropriate, Parliament proposes an EU-wide harmonised minimum age of 16 for access to social media, video-sharing platforms and AI companions, while allowing access from the age of 13 with parental consent.
MEPs back the Commission’s initiative to develop an EU age-verification app and the European digital identity wallet, insisting that these systems must be reliable and protect minors’ privacy. They add that such systems do not absolve platforms of their responsibility to ensure that their products are safe and suitable for different age groups.
To encourage better compliance with the EU’s Digital Services Act (DSA) and other relevant laws, MEPs suggest that senior company managers be held personally liable in cases of serious and persistent non-compliance, particularly with regard to the protection of minors and age verification.
More decisive action from the Commission
Parliament also calls for:
- a ban on the most harmful addictive practices, with other addictive features (such as infinite scrolling, autoplay, pull-to-refresh, reward loops and harmful gamification) disabled by default for minors;
- the blocking of websites that do not comply with EU rules;
- measures against persuasive technologies, such as personalised advertising, influencer marketing, addictive design and deceptive interfaces, under the forthcoming Digital Fairness Act;
- a ban on recommender systems for minors based on profiling and engagement;
- the application of the Digital Services Act to online video platforms, and a ban on loot boxes and other randomised gambling-like features (in-app currencies, wheels of fortune, pay-to-progress mechanics);
- the protection of minors from commercial exploitation, notably by prohibiting platforms from offering financial incentives to “kidfluencers”;
- urgent action to address the ethical and legal challenges posed by generative AI tools, such as deepfakes, companion chatbots, AI agents and AI-powered nudification apps that create manipulated images without consent.
Rapporteur’s statement
Rapporteur Christel Schaldemose (S&D, Denmark) said: “I am proud that this Parliament has come together to protect children online. Together with strong and consistent enforcement of the Digital Services Act, these measures will drastically raise the level of protection for children. We are finally setting limits and saying clearly to the platforms: your services are not suitable for children. And the experiment ends here.”
Background
The report cites research showing that 97% of young people go online every day and that 78% of 13- to 17-year-olds check their devices at least once an hour. At the same time, one in four minors displays “problematic” or “dysfunctional” smartphone use, i.e. behavioural patterns that resemble addiction.
According to the 2025 Eurobarometer, more than 90% of Europeans believe urgent action is needed to protect children online, particularly because of the negative impact of social media on mental health (93%), cyberbullying (92%) and the need for effective ways to restrict access to age-inappropriate content (92%).
Member States are beginning to act, responding with measures such as age limits and verification systems.
More information: European Parliament