The European Commission has made a preliminary finding that TikTok is in breach of the Digital Services Act (DSA) for its addictive design. This includes features such as infinite scroll, autoplay, push
notifications and a highly personalised recommender system.
The Commission says that TikTok did not adequately assess how these addictive features could harm the physical and mental wellbeing of its users, including minors and vulnerable adults. By ‘rewarding’ users with new content, certain features of TikTok can increase the urge to keep scrolling. The Commission highlights that scientific research has shown that this may lead to compulsive behaviour and reduce the self-control of users.
In its risk assessment, TikTok also disregarded key indicators of compulsive use of the app, such as the time children spend on TikTok at night and how often users open the app.
The Commission has preliminarily determined that TikTok fails to implement reasonable, proportionate and effective measures to mitigate the risks stemming from its addictive design features. TikTok’s current measures, particularly its screen time management and parental control tools, do not appear to effectively reduce those risks. The time management tools have been found ineffective in helping users reduce and control their use of the app because they are easy to dismiss. Parental controls have been judged ineffective because they demand additional time and skills from parents to set up.
At this stage, the Commission finds that TikTok needs to change the basic design of its service, for example by disabling key addictive features such as ‘infinite scroll’, implementing effective ‘screen time breaks’ and adapting its recommender system.
The preliminary findings of the Commission are part of its formal proceedings to investigate TikTok’s compliance with the DSA. The investigation covers the ‘rabbit hole effect’ of TikTok’s recommender systems, the risk of minors having age-inappropriate experiences due to misrepresentation of their age and the platform’s obligations to ensure high levels of privacy, safety and security for minors.
The investigation also covered researchers’ access to public data, for which preliminary findings were adopted in October 2025, and advertising transparency, which was closed through binding commitments in December 2025.
The European consumer organisation BEUC has said that it welcomes the Commission’s conclusion that existing measures (e.g. screen time management tools and parental controls) are insufficient and do not effectively reduce harm. It says that “The Commission rightly notes that meaningful DSA compliance will require altering TikTok’s core design – by disabling key addictive features and introducing effective screen-time breaks. Importantly, the Commission acknowledged that addictive design can harm children and adults alike.”