European Commission issues guidance for a safer online space for children

July 30, 2025

The Online Safety Act has been receiving a lot of media attention this week as the age verification requirements and the duty to protect children have come into force. In addition, Australia has announced that it is planning to ban under-16s from having accounts on YouTube.

Alongside this, the European Commission has also issued guidance on the protection of children under the Digital Services Act, as well as a prototype of an age-verification app. The guidance aims to ensure that children can continue to take advantage of the opportunities that being online offers while minimising the risks they face, including harmful content and behaviour.

The Commission has outlined a number of areas that the guidelines address:

  • Addictive design: Minors can be particularly vulnerable to practices that can stimulate addictive behaviour. The guidance suggests reducing minors’ exposure to such practices, and disabling features that promote the excessive use of online services, like ‘streaks’ and ‘read receipts’ on messages.
  • Cyberbullying: The guidance recommends empowering minors to block or mute users, ensuring they cannot be added to groups without their explicit consent. It also recommends prohibiting accounts from downloading or taking screenshots of content posted by minors to prevent the unwanted distribution of sexualised or intimate content.
  • Harmful content: Some recommender systems can expose children to harmful content. The guidance aims to give young users more control over what they see, calling on platforms to prioritise explicit feedback from users rather than relying on monitoring their browsing behaviour.
  • Unwanted contact from strangers: The guidance recommends that platforms set minors’ accounts to private by default – so that these accounts are not visible to users who are not on their friends list – to minimise the risk of minors being contacted by strangers online.

The guidance outlines when and how platforms should check the age of their users. It recommends age verification for adult content platforms and other platforms that pose high risks to the safety of minors.

The prototype age verification app will allow users to prove they are over 18 when accessing restricted adult content online, without revealing any other personal information, such as their exact age or identity. The app will be tested and further customised in collaboration with EU member states, online platforms and end-users.

The guidelines and the age verification app add to the Commission’s work on the protection of minors online through the Better Internet for Kids Strategy, the Audiovisual Media Services Directive and other upcoming initiatives such as the Digital Fairness Act.