According to research, 97% of young people go online every day and 78% of 13 to 17-year-olds check their devices at least hourly. At the same time, one in four minors display behavioural patterns mirroring addiction to smartphones.
With this in mind, MEPs have adopted a report expressing deep concern over the physical and mental health risks children face online. It calls for stronger protection against manipulative strategies that can fuel addiction and undermine children’s ability to concentrate and engage healthily with online content.
Minimum age for social media platforms
To help parents manage their children’s digital presence and ensure age-appropriate online engagement, Parliament proposes a harmonised EU digital minimum age of 16 for access to social media, video-sharing platforms and AI companions, while allowing 13- to 16-year-olds access with parental consent.
MEPs expressed support for the European Commission’s work to develop an EU age verification app and the European digital identity (eID) wallet. However, they say that age assurance systems must be accurate and preserve minors’ privacy. Such systems should not relieve platforms of their responsibility to ensure their products are safe and age-appropriate by design.
To provide incentives for better compliance with the Digital Services Act and other relevant laws, MEPs suggest senior managers could be made personally liable in cases of serious and persistent non-compliance, with particular respect to protection of minors and age verification.
Stronger action by the Commission
MEPs also call for:
- a ban on the most harmful addictive practices and default disabling of other addictive features for minors (including infinite scrolling, autoplay, pull-to-refresh, reward loops, and harmful gamification);
- a ban on sites not complying with EU rules;
- action to tackle persuasive technologies, such as targeted ads, influencer marketing, addictive design, and dark patterns under the forthcoming Digital Fairness Act;
- a ban on engagement-based recommendation systems for minors;
- application of DSA rules to online video platforms and outlawing of loot boxes and other randomised gaming features (in-app currencies, fortune wheels, pay-to-progress);
- protection of minors from commercial exploitation, including by prohibiting platforms from offering financial incentives for kidfluencing (children acting as influencers);
- urgent action to address the ethical and legal challenges posed by generative AI tools including deepfakes, companionship chatbots, AI agents and AI-powered nudity apps (that create non-consensual manipulated images).
The first draft of the Digital Fairness Act is expected in the second half of next year.