The European Commission has launched a formal investigation against X under the Digital Services Act (DSA). The Commission has also extended the investigation that it launched in December 2023 into X’s compliance with its recommender systems risk management obligations.
As SCL readers will be aware, Grok is an AI tool developed by the provider of X. Since 2024 it has been deployed on X in various ways, including enabling users to generate text and images and providing contextual information on users' posts. As a designated very large online platform (VLOP), X must assess and mitigate any potential systemic risks related to its services in the EU. These risks include the spread of illegal content and potential threats to fundamental rights, including those of minors.
The new investigation aims to assess whether the company has properly analysed and mitigated the risks associated with deploying Grok's functionalities into X in the EU. These include risks relating to the spread of illegal content in the EU, such as manipulated sexually explicit images, including content that may amount to child sexual abuse material.
In light of EU citizens' exposure to these risks, the Commission will further investigate whether X complies with its DSA obligations. These include assessing and mitigating systemic risks, including the dissemination of illegal content, negative effects relating to gender-based violence, and consequences for physical and mental well-being stemming from the deployment of Grok's functionalities on its platform. The Commission has also requested that X conduct an assessment of Grok's functionalities and communicate the resulting report to the Commission.
The Commission has also extended its ongoing formal proceedings, opened against X in December 2023, to establish whether X has properly assessed and mitigated all systemic risks, as defined in the DSA, associated with its recommender systems, including the impact of its announced switch to a Grok-based recommender system.
If proven, these failures would constitute infringements of Articles 34(1), 34(2), 35(1) and 42(2) of the DSA. The Commission has closely collaborated with the Irish regulator and Digital Services Coordinator, Coimisiún na Meán, in preparing the investigation. The proceedings opened in 2023 also covered the use of deceptive design, a lack of advertising transparency and insufficient data access for researchers, for which the Commission adopted a non-compliance decision in early December 2025, fining X €120 million.
The Commission's action follows Ofcom's investigation, which began earlier this month amid widespread concern over the use of X's AI chatbot Grok to create and share undressed images of people, which may constitute intimate image abuse or pornography, and sexualised images of children that may amount to child sexual abuse material.