Enforce the Digital Services Act quickly and ban harmful practices such as addictive design and gambling-like game features to protect minors, say MEPs.
On Thursday, MEPs from the Internal Market and Consumer Protection Committee adopted a report, with 32 votes in favour, 5 against and 9 abstentions, in which they express concerns over major online platforms’ failure to protect minors adequately and warn of the risks relating to addiction, mental health, and exposure to illegal and harmful content.
Age assurance and minimums
The text supports the Commission’s efforts to develop privacy-preserving age assurance systems, while warning that such measures must respect children’s rights and privacy, and must not absolve platforms of their responsibility to make their services safe by design.
The MEPs propose an EU-wide digital minimum age of 16 for access to social media, video-sharing platforms and AI (artificial intelligence) companions, unless authorised by parents, and a minimum age of 13 to access any social media at all.
Stronger action by the Commission
The MEPs urge the Commission to make full use of its powers under the Digital Services Act (DSA), including issuing fines or, as a last resort, banning non-compliant sites or applications that endanger minors. They also call on the Commission to:
- consider introducing personal liability for senior management in cases of serious and persistent breaches of minor protection provisions, with particular respect to age verification;
- ban engagement-based recommender algorithms for minors and disable the most addictive design features by default;
- ensure that recommender systems do not present content to minors based on profiling;
- ban gambling-like mechanisms such as “loot boxes” in games accessible to minors;
- prohibit platforms from monetising or providing financial or material incentives for kidfluencing (minors acting as influencers);
- address the ethical and legal challenges arising from AI-powered nudity apps (that allow users to generate manipulated images of individuals without their consent);
- firmly enforce AI Act rules against manipulative and deceptive chatbots.
Closing legal loopholes
MEPs support the idea that persuasive technologies, such as targeted ads, influencer marketing, addictive design, loot boxes and dark patterns, be tackled under the future Digital Fairness Act. The report calls for EU action to address manipulative features like infinite scrolling, autoplay, disappearing stories, and harmful gamification practices that deliberately exploit minors’ behaviour to boost engagement and spending.
Quote
Rapporteur Christel Schaldemose (S&D, Denmark) said: “Our report clearly states the need for increased protection of minors online in two respects. Firstly, we need a higher bar for access to social media, which is why we propose an EU-wide minimum age of 16. Secondly, we need stronger safeguards for minors using online services. My report calls for mandatory safety-by-design and for a ban on the most harmful engagement mechanisms for minors. I’m proud that Parliament is taking this progressive step to raise the level of protection for minors.”
New survey
A new Eurobarometer survey published today shows citizens’ attitudes towards social media. It examines themes like information habits, social media patterns, exposure to disinformation, and engagement with influencer content. The findings show that young people use media differently from previous generations and are turning increasingly to digital sources.
Next steps
Parliament will vote on its recommendations to increase minors’ safety online at the 24-27 November plenary session.
More information: European Parliament