ICO sets out priorities for Children’s Code strategy

April 5, 2024

Since the introduction of its Children's code of practice in 2021, the ICO has been working with online services, including websites, apps and games, to provide better privacy protections for children, with the aim of ensuring their personal information is used appropriately within the digital world. The ICO says there has been significant progress, with many organisations having started to assess and mitigate the potential privacy risks to children on their platforms.

The new Children's code strategy builds on the progress to date and sets out the priority areas in which the ICO says social media and video-sharing platforms need to improve over the coming year, as well as how the ICO will enforce the law and support industry conformance with the code.

For 2024 to 2025, the Children’s code strategy will focus on:

  • Default privacy and geolocation settings. The ability to ascertain or track the location data of a child creates risks, including potentially having their information misused to compromise their physical safety or mental wellbeing. The ICO says that children’s profiles must be private by default and geolocation settings must be turned off by default.
  • Profiling children for targeted advertisements. Children may not be aware their personal information is being collected, or that it can be used to tailor the adverts they see. This may affect children’s autonomy and control over their personal information, and it could lead to financial harms where adverts encourage in-service purchases or additional app access without adequate protections in place. The ICO says that unless there is a compelling reason to use profiling for targeted advertising, it should be switched off by default.
  • Using children’s information in recommender systems. Content feeds generated by algorithms may use information such as behavioural profiles and children’s search results. These feeds may create pathways to harmful content such as self-harm, suicidal ideation, misogyny or eating disorders. The design of recommender systems may also encourage children to spend longer on the platform than they otherwise would, leading to children sharing more personal information with the platforms.
  • Using information of children under 13 years old. Children under the age of 13 cannot consent to their personal information being used by an online service; parental consent is required instead. How services gain that consent, and how they use age assurance technologies to assess the age of the user and apply appropriate protections, are important for mitigating potential harms.

Further cooperation with other UK regulators such as Ofcom and international counterparts will also be a focus for the ICO.