Sophie Stalla-Bourdillon looks at the controversial CJEU decision in EDPS v SRB and how other courts are interpreting anonymisation in its wake.
The judgment of the Court of Justice of the European Union (CJEU) in EDPS v SRB has become a pivotal reference point in recent debates on the definition of personal data and the conditions for anonymisation. Since that ruling, debates have intensified, with the European Commission (EC) adding to the uncertainty by advancing its own interpretation of the judgment in the Digital Omnibus Regulation Proposal, an approach rejected by the Council of the European Union in a file dated 20 February 2026, following the negative joint Opinion 2/2026 of the European Data Protection Supervisor (EDPS) and the European Data Protection Board (EDPB). A recent report from the EDPB, produced after its December stakeholder workshop on anonymisation and pseudonymisation, captures a range of interpretative questions raised by the EDPS v SRB judgment, which continue to fuel debate within the data protection community.
We appear to be at an important crossroads: either courts and regulators succeed in articulating a coherent approach to anonymisation that can be understood and applied in practice in mainstream cases, or we risk undermining the very foundations of data protection law.
Against this backdrop, this piece examines whether a few regulatory and judicial developments in the aftermath of the EDPS v SRB judgment (commented here) help clarify its implications. First, it analyses regulatory interpretations, in particular how the EDPB together with the EC articulates the standard of anonymisation under Article 6(11) of the Digital Markets Act (DMA). Second, it examines subsequent judicial interpretations, focusing on how EDPS v SRB has been received by the UK Court of Appeal and on how anonymisation is approached by the French Supreme Administrative Court.
The joint DMA guidelines and Article 6(11) DMA
In its draft joint Guidelines on the Interplay between the DMA and the GDPR, the EDPB together with the EC proposes a way forward for approaching the anonymisation of covered search query data. The public consultation for these guidelines closed on 4 December 2025 and no finalised version has been published yet.
The timing of the release of the draft guidelines is notable: it precedes by about a month the publication of the Digital Omnibus, which introduced a controversial definition of personal data (as discussed in a previous article), a definition which, as mentioned above, the EDPB does not endorse.
In this context, it is worth revisiting the text agreed by the EDPB and the European Commission. Doing so may provide valuable insights into the interpretation of the EDPS v SRB ruling within this specific regulatory landscape.
Article 6(11) DMA provides that:
“The gatekeeper shall provide to any third-party undertaking providing online search engines, at its request, with access on fair, reasonable and non-discriminatory terms to ranking, query, click and view data in relation to free and paid search generated by end users on its online search engines. Any query, click and view data that constitutes personal data shall be anonymised.”
Article 6(11) DMA should be read in the light of Recital 61 DMA. There, the trade-off between reducing re-identification risks and utility is expressed in the following terms:
“When providing access to its search data, a gatekeeper should ensure the protection of the personal data of end users, including against possible re-identification risks, by appropriate means, such as anonymisation of such personal data, without substantially degrading the quality or usefulness of the data.”
The wording of Recital 61 DMA is thus somewhat different from Article 6(11) DMA, as it appears to treat anonymisation as merely one of several possible ways to reduce re-identification risks.
At paragraph 180 of the guidelines, the EDPB and the EC explain the goal of the process to be followed by the gatekeeper as “selecting among various possible ways of achieving anonymisation of data of end users shared under Article 6(11) DMA.” According to the EDPB/EC, the process should be optimised so that “the most quality and usefulness of the data for the third party undertaking requesting access to it” is preserved, while also ensuring that the shared data of end users is anonymised within the meaning of Recital 26 GDPR. The draft guidelines acknowledge that a combination of measures is needed to achieve GDPR anonymisation: “technical measures for alteration of the data, complemented by organisational, administrative and contractual measures to mitigate residual likelihood of identification.” The EDPB/EC team is even more precise: technical data transformation measures are indispensable, while organisational, administrative and contractual measures can complement them. The risk threshold referred to seems to be the same as the one mentioned in OC and in EDPS v SRB: at least as a matter of principle, the likelihood of re-identification must be insignificant once the controls have been applied.
In addition, it is acknowledged in the guidelines that EC implementing acts can have an impact upon the assessment of whether the ‘insignificant’ threshold has been achieved. In other words, the EDPB/EC team seems to suggest that data handling obligations can affect threat modelling and thereby the way re-identification risks are measured and/or evaluated. From the wording of the EDPB/EC, EC implementing acts therefore appear particularly important for the characterisation of personal data.
The draft joint DMA Guidelines should, however, be read together with the EDPB/EDPS joint Opinion 2/2026 on the Digital Omnibus in which the EDPB and the EDPS express concerns about the potential effect of EC implementing acts “de facto affect[ing] the material scope of EU data protection law, effectively redefining the scope of when and for whom information is considered personal data.” The EDPB and the EDPS write that only supervisory authorities, subject to oversight by competent courts, can apply the definitions of the GDPR in an independent manner as guaranteed by Article 8(3) of the European Union Charter of Fundamental Rights.
Furthermore, although Article 6(11) DMA does not itself draw such a distinction, the EDPB/EC team distinguish between types of data subjects and suggest that Article 6(11) data should only be anonymised to protect the interests of the end users authoring the queries, and not necessarily the interests of natural persons whose data may appear in search queries. Personal data other than end-user personal data should be at a minimum pseudonymised. This distinction is grounded in Recital 61 DMA.
As regards end-user data anonymisation, the core of the assessment lies in evaluating the means available to the requesting party, as well as to potential attackers, whom the EDPB and the EC refer to as unintended third parties, a category of actors often overlooked by those seeking to explain the implications of EDPS v SRB. The EDPB/EC team explain that the closed environment in which the sharing would take place is an important consideration, which implies that the data should not be shared beyond the range of recipients’ processors. Once again, the guidelines refer to the EC’s implementing acts, which appear to be essential for assessing the types of controls put in place to secure such closed data sharing environments and protect the confidentiality of end users. With regard to data transformation techniques, the guidelines refer in particular to removal and generalisation techniques. As for organisational and contractual measures, the EDPB and the EC highlight data access restrictions, as well as monitoring and notification obligations.
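To make the data-transformation side of this concrete, the following sketch illustrates, on hypothetical query records, what removal and generalisation techniques of the kind mentioned in the guidelines can look like in practice. The field names, thresholds, and suppression rule are the author’s assumptions for exposition only, not anything specified by the EDPB/EC:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class QueryRecord:
    query: str
    timestamp: str  # e.g. "2025-11-03T14:22:31"
    region: str     # e.g. "Southampton"
    clicks: int

def generalise(record: QueryRecord) -> QueryRecord:
    """Generalise quasi-identifiers: coarsen time, location, and counts."""
    return QueryRecord(
        query=record.query,
        timestamp=record.timestamp[:10],  # truncate to the date, dropping time of day
        region="UK",                      # coarsen location to country level
        clicks=min(record.clicks, 10),    # top-code outlier click counts
    )

def suppress_rare(records: list[QueryRecord], k: int = 5) -> list[QueryRecord]:
    """k-anonymity-style suppression: drop queries that appear fewer than k times,
    since rare free-text queries are the most re-identifying."""
    counts = Counter(r.query for r in records)
    return [r for r in records if counts[r.query] >= k]
```

Whether such transformations alone could ever bring re-identification risk below the ‘insignificant’ threshold for free-text queries is, as discussed below, precisely what deserves scrutiny.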
Assuming that the approach to GDPR anonymisation remains horizontal, the position of the EDPB/EC team would imply that internal data segmentation measures may be deemed both relevant and effective for actors that are not gatekeepers. If this is correct, it represents a significant shift, since data protection assessments have traditionally been conducted at the organisational level. Under the conventional approach, whenever an organisation holds identifying information in one part of the organisation, it is usually hard to claim that data held elsewhere within the same organisation, even when subject to strict access control, could qualify as anonymised.
Moreover, when monitoring and auditing are particularly important for data controls, one solution often cited is the use of secure processing environments, i.e., query-based systems operated by data holders. Is this the model envisaged by the EDPB/EC, analogous to the approach the Commission envisaged under Article 40 of the Digital Services Act?
What is more, given the challenges of anonymising free text associated with log data, especially as search queries are increasingly in the form of chatbot prompts, claims that the insignificant re-identification risk threshold has been met through the removal of direct identifiers and the generalisation of a few indirect identifiers deserve careful scrutiny. This is particularly important, as such claims could lead to the establishment of a significant precedent for interpreting the implications of EDPS v SRB and may give rise to numerous problematic assertions, which have proliferated following the EDPS v SRB ruling, as illustrated by the Court of Appeal’s decision in the ICO v DSG Retail case.
The UK Court of Appeal in ICO v DSG Retail
It is not surprising to see that EDPS v SRB has triggered attempts to considerably weaken the protection of personal data. This is illustrated by the recent judgment of the Court of Appeal in ICO v DSG Retail of February 2026.
In ICO v DSG Retail, the appeal arose from a cyberattack carried out in 2017–2018 against the systems of DSG Retail Limited (“DSG”), the owner and operator of major retail brands. Over roughly nine months, attackers harvested millions of data entries by scraping transaction details from point-of-sale terminals during live purchases. The scraped information was stored on DSG’s servers, and the attackers attempted to exfiltrate it. In total, more than 5.6 million payment cards were affected. In approximately 8,000 cases, the attackers obtained the full 16-digit card number (PAN), expiry date, and cardholder name. However, most cards were protected by the chip-and-PIN system based on the EMV standard (Europay, Mastercard and Visa), meaning that in the majority of instances the attackers accessed only the PAN and expiry date (“the EMV data”), without names or other cardholder details.
Following an investigation, the Information Commissioner found DSG in breach of Data Protection Principle 7 (DPP7) under the Data Protection Act 1998 and issued a monetary penalty notice of £500,000. DSG appealed to the First-tier Tribunal, arguing, among other points, that DPP7 (known as the security duty) did not require it to implement appropriate technical and organisational measures against third-party acquisition of the EMV data because such data would not constitute “personal data” in the hands of those third parties. The First-tier Tribunal rejected that argument, holding that it was sufficient that the EMV data qualified as personal data in DSG’s own hands. The Tribunal upheld the monetary penalty notice, although it reduced the penalty by half.
DSG appealed to the Upper Tribunal, which reversed the findings of the First-tier Tribunal and considered that the question of whether third-party acquisition of the EMV data involved personal data had to be analysed from the perspective of the third-party attacker. When viewed from that perspective, the data was not personal, and the unauthorised acquisition of the data therefore did not amount to “unauthorised or unlawful processing of personal data,” which essentially meant no breach of the security duty. The Information Commissioner appealed.
As stated by Lord Justice Warby writing for the Court of Appeal,
“the question raised by the appeal, simply stated, is whether the law requires a data controller to guard against the risk that data which relate to individuals who can be identified by the data controller will be subject to unauthorised or unlawful processing by a third party who cannot identify those individuals.”
The Court of Appeal allowed the appeal: it found that the Upper Tribunal had erred in law, while discussing EDPS v SRB positively. For the Court of Appeal, it was enough to find that under the Data Protection Act 1998, read in the light of the Data Protection Directive, every data controller is subject to a security duty and should therefore protect the data against unauthorised or unlawful processing, irrespective of the status of the data in the hands of the organisation undertaking the unauthorised or unlawful processing.
The Upper Tribunal’s refusal to characterise the conduct as a breach of the security duty reveals a persistent misunderstanding of both the rationale underlying data protection law and established good practices in de-identification. First, as Lord Justice Warby explains, determining the scope of the security duty by considering the status of the data from the perspective of the third party and its means would in effect “expose individuals to real and substantial risks of harm,” which would significantly undermine the credibility of the overall framework. Importantly, Lord Justice Warby was able to reach this conclusion even though he conceives the security duty as a safeguarding duty, in other words, as an obligation of means to guard against a risk, rather than as an obligation of result to guarantee a particular outcome.
Second, PANs have for many years been protected through the Payment Card Industry Data Security Standard (PCI DSS), a global data security standard that governs how entities store, process, and transmit account data. A PAN is usually considered a personal identifier when the cardholder is a natural person: it is a unique string of digits that is repeatedly presented to many different parties when paying for transactions, making it both distinguishable and available, and thereby making cardholders indirectly identifiable from PANs. The PCI DSS standard therefore imposes a set of controls when PANs are processed, such as storage restrictions and data transformations, which give concrete substance to the security duty. Importantly, these controls are designed not to enable public disclosure of the data (under a “release-and-forget” model) but to permit its secure retention in multiple controlled environments. Yet, in the event of unauthorised access, control over the data environment is lost, even if the applied data-transformation technique initially remains in place. In other words, part of the set of measures relied upon to reduce risks has been stripped away. De-identification practices show that focusing on the data itself alone is too narrow a perspective.
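By way of illustration, two of the PAN controls commonly associated with PCI DSS, display truncation and keyed tokenisation, can be sketched as follows. This is a simplified sketch for exposition only, not a PCI DSS-compliant implementation, and the function names are the author’s:

```python
import hashlib
import hmac

def truncate_pan(pan: str) -> str:
    """Mask a PAN for display, keeping the BIN (first six digits) and the
    last four, a common truncation pattern under PCI DSS."""
    digits = pan.replace(" ", "")
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]

def tokenise_pan(pan: str, key: bytes) -> str:
    """Replace a PAN with a keyed pseudonym (HMAC-SHA256) so that records can
    still be linked without storing the PAN itself; the key must be held
    separately, in a controlled environment."""
    return hmac.new(key, pan.encode(), hashlib.sha256).hexdigest()
```

As the discussion above suggests, the protective value of the tokenisation step depends on the environment: once an attacker obtains both the tokens and the key, the transformation no longer separates the data from the individuals concerned.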
When interpreting EDPS v SRB, it is worth noting that Lord Justice Warby expressly acknowledges at paragraph 58 that
“[t]he CJEU noted that the GDPR does not specify the relevant perspective for assessing whether the data subject was identifiable (…). It said that the case law showed that “the relevant perspective for assessing whether the data subject is identifiable depends, in essence, on the circumstances of the processing of the data in each individual case.”
In other words, the GDPR does not specify that the relevant perspective for assessing whether the data subject is identifiable is necessarily the perspective of the data holder. What is more, it also emerges from Lord Justice Warby’s opinion that when assessing the status of the data, it is not sufficient to consider the actual re-identification means of the data holder only. Such a position is also confirmed by the French Conseil d’Etat in Sociétés Gers et autres.
The French Conseil d’Etat in Sociétés Gers et autres
Another weak argument was advanced in Sociétés Gers et autres to back up the claim that the data at stake should not be considered personal data. (For those interested in the CE case law, see also its decision of 4 March 2026.)
The Conseil d’État (CE) decision in Sociétés Gers et autres of February 2026 was triggered by three deliberations through which the CNIL had imposed administrative fines on members of the Cegedim group, including GERS (the applicants), following inspections of the management of two databases: the “THIN” database and the “GERS Études Clients” database.
The inspections showed that the “THIN” database is fed by data collected from physicians using the “Crossway” practice management software, which allows physicians to manage their schedules, patient records, and prescriptions. The “GERS Études Clients” database, in turn, is fed by data collected from pharmacies, extracted from their management software via a module integrated into these programs.
It was established that by the end of March 2021, GERS was responsible for both databases and held in the “THIN” database, data relating to 13.4 million consultations associated with 4 million patient codes, and in the “GERS Études Clients” database, approximately 78 million client identifiers for the 8,500 pharmacies from which it collected data. Using these data, GERS conducts quantitative studies and commercialises statistical data in the healthcare sector for both public and private clients.
The applicants challenged the analysis of the CNIL, which concluded that the data collected in this manner constituted personal data and, consequently, in the absence of the data subjects’ consent, the processing in question had to be authorised under the conditions set out in Article 66 of the French Data Protection Act of 6 January 1978 (Loi relative à l’informatique, aux fichiers et aux libertés).
The French Supreme Administrative Court confirms that the data at issue amounts to personal data, refusing to refer questions to the CJEU. It thus concludes that the CNIL did not make an error of law and correctly characterised the facts by concluding that the data in question, although pseudonymised, was not anonymised.
In reaching this conclusion, the CE relies on the CJEU’s judgment in OC and holds “that data can only be considered to have been rendered anonymous through pseudonymisation if the risk of identification is insignificant, such identification being practically unachievable, particularly because it would require an excessive effort in terms of time, cost, and labour.” (author’s translation).
For the CE, the databases at stake contained a massive amount of data collected from medical practices or pharmacies, including (indirectly) identifying elements such as age, sex, or socio-professional category, as well as health data, including, in particular, medical records, prescriptions, sick leave, and vaccinations for data transmitted by physicians, and information on medications purchased and the prescribing professional for data transmitted by pharmacies.
More specifically, although the data collected from physicians contained only a patient code, and the data collected from pharmacies only a client code, and could therefore be considered pseudonymised, the databases contained precise information about the individuals concerned, such as age, sex, pathologies, and prescribed or purchased medications, together with dates and sometimes the exact time of the medical visit or purchase, as well as elements enabling the identification or location of the healthcare professionals involved. Regarding pharmacy data, GERS collects prescriber identifiers, which make it possible to determine the identity of the healthcare professional using a publicly accessible online search engine. A regional code was also collected until 2022.
The CE explains that the CNIL’s investigation shows that “it is possible, from these data, to reconstruct care pathways and identify individual clients and their pathologies.” Referring to the CNIL, the CE states that, according to the French supervisory authority, such individualisation of the data requires only minimal time and resources, including the use of standard spreadsheet software and the nomenclature provided by the companies to link alphanumeric codes to patient information and medical acts performed. Ultimately, the CNIL’s deliberations suggest “that the risk of re-identification is high, particularly when prescribed treatments are rare, and when additional information held by the companies, such as identifiers for healthcare professionals, or external data sources, notably geolocation data, may increase this risk.” Finally, the CE makes it clear that the fact that no actual identity inference has been made is irrelevant to determining the potential for re-identification, i.e. to determining whether individuals remain identifiable.
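The linkage reasoning described by the CNIL can be illustrated with a toy example: hypothetical pseudonymised records are joined against a publicly searchable prescriber directory and filtered on quasi-identifiers, so that a rare combination isolates a single patient code. All data, field names, and code values below are invented for illustration; none are drawn from the actual databases:

```python
# Hypothetical pseudonymised records of the kind at issue: a patient code
# instead of a name, but precise quasi-identifiers alongside it.
records = [
    {"patient_code": "A93F", "age": 47, "sex": "F",
     "drug": "rare_treatment_x", "prescriber_id": "P-102"},
    {"patient_code": "B71K", "age": 47, "sex": "F",
     "drug": "common_treatment_y", "prescriber_id": "P-551"},
    {"patient_code": "C28M", "age": 63, "sex": "M",
     "drug": "rare_treatment_x", "prescriber_id": "P-102"},
]

# Stand-in for the publicly accessible online directory of healthcare
# professionals mentioned by the Conseil d'Etat.
registry = {
    "P-102": {"name": "Dr. Example", "town": "Auch"},
    "P-551": {"name": "Dr. Sample", "town": "Toulouse"},
}

def linkage_candidates(records, registry, age, sex, town):
    """Return the patient codes matching background knowledge about a person:
    their age, sex, and the town where their prescriber practises."""
    matches = []
    for r in records:
        prescriber = registry.get(r["prescriber_id"], {})
        if r["age"] == age and r["sex"] == sex and prescriber.get("town") == town:
            matches.append(r["patient_code"])
    return matches

# A rare combination of quasi-identifiers singles out one record:
print(linkage_candidates(records, registry, age=47, sex="F", town="Auch"))
# → ['A93F']
```

The point of the toy example is the one the CNIL made: no access to the pseudonymisation key is needed, only publicly available auxiliary data and ordinary tooling.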
As a result, the CE observes that the CNIL’s concrete assessment of the risk of re-identification (a necessary second step when singling out, linkability and inference risks persist) showed that it was possible to reverse the pseudonymisation of the individuals concerned by reasonable means without accessing the additional information. Although the CE does not need to refer to EDPS v SRB to reach this conclusion, it is aligned with the CJEU decision. In that case the CJEU acknowledged that beyond access to the additional information separated from the pseudonymised data, “cross-checking with other [available] data” is a relevant threat for determining the status of the data.
Conclusion
Despite the intense debate that followed the EDPS v SRB judgment, courts rightly appear to be proceeding cautiously with a view to preserving the integrity of data protection law. That said, it remains unclear whether the EDPB and the EC have found a way to explain how the trade-off between confidentiality and utility should be resolved through anonymisation, given that preserving utility in practice tends to resolve the trade-off in its favour, or whether they are instead moving towards the view that a case-by-case approach is inevitable and that real (as opposed to merely formal) controls put in place upon the environment of the data will in some cases play quite an important role. In any case, as Lord Justice Warby writes, “we must aim to ensure, as best as we can, that this area of the law develops in a principled and coherent fashion.” Such a systematisation process would help clarify the distinction between the “relate to” and the identifiability arguments, which are too frequently confused.
Returning to EDPS v SRB, one sentence that has generated numerous creative interpretations is: “data which are in themselves impersonal may become ‘personal’ in nature where the controller puts them at the disposal of other persons who have means reasonably likely to enable the data subject to be identified”. Such a scenario is likely to be of potential relevance in only a limited set of circumstances: either where the data could be considered not to relate to natural persons, despite its distinguishability potential, before an anticipated data-sharing use case is ever taken into consideration, and such a data-sharing use case is then deemed in scope and would result in the data being linked with personal data; or perhaps where the data was never intended to be shared but the (newly recognised) controller inadvertently put it at the disposal of an attacker. In these situations, the data should then be considered personal data for both parties, as acknowledged by the CJEU in Scania and reiterated in EDPS v SRB with respect to the first situation.

Sophie Stalla-Bourdillon is a visiting professor within Southampton Law School, having held the Chair in IT Law and Data Governance there from 2018 to 2022. She is also Co-Director of the Brussels Privacy Hub (BPH). Alongside her academic career, Sophie has extensive industry experience, having worked at Immuta for six years, where she led the Legal Engineering team focusing on the legal and ethical implications of data operations in analytics and AI environments, as well as the impacts of compliance automation.