UK law
Online Safety (CSEA Content Reporting by Regulated User-to-User Service Providers) Regulations 2026 made
The Online Safety (CSEA Content Reporting by Regulated User-to-User Service Providers) Regulations 2026 (SI 2026/268) have been made. They come into force on 7 April 2026 and set out the legal requirements for services to comply with the child sexual exploitation and abuse (CSEA) reporting duty under the Online Safety Act 2023. The Regulations cover registering with the National Crime Agency (NCA) to submit reports; the key responsibilities of providers required by section 66 of the OSA to report to the NCA; the format of reports; the time frame for reporting; data retention requirements; and the data required within a report, such as information about the person submitting the report, information about the detected CSEA content, where possible a priority assessment of the content, and information about the suspected user. A service that has registered with the NCA but later intends or decides to report to an equivalent body outside the UK must notify the NCA.
Online Safety Act 2023 (Commencement No 7) Regulations 2026 made
The Online Safety Act 2023 (Commencement No 7) Regulations 2026 SI 2026/262 have been made to bring into force the duty on providers of regulated user-to-user services to report CSEA content to the National Crime Agency and the offence in relation to CSEA reporting under this duty. They also bring into force other provisions in relation to the duty and the offence, including conferring functions on Ofcom in relation to obtaining information and supplementary provision about offences under the Online Safety Act.
Ofcom consults on reflecting new priority offences of serious self-harm and cyberflashing in its guidance
Ofcom is consulting on changes to the Illegal Harms regulatory documents and guidance under the Online Safety Act. In December 2025, the government created two new priority offences under the Act: encouraging or assisting serious self-harm, and cyberflashing. Ofcom therefore needs to update its regulatory documents and guidance to reflect this change in the law. It is proposing to combine the two priority offences of encouraging or assisting suicide and encouraging or assisting serious self-harm into a single kind of illegal harm which providers must risk assess: “suicide and self-harm”. This would mean providers should assess their risks of both kinds of content and assign an overall risk level for suicide and self-harm. It is also proposing to introduce cyberflashing as a new kind of priority illegal harm that providers must risk assess separately. It also proposes to update the Register of Risks with new evidence on self-harm and cyberflashing, to update the User-to-User Risk Profile to include the risk factors most strongly associated with cyberflashing and with suicide and self-harm, and to introduce a reference to self-harm in the Search Risk Profile. These changes would amount to “significant changes” to the risk profiles, meaning providers would need to update their Illegal Content Risk Assessments. Providers would be required to take these updated risk factors into account when conducting their risk assessments. The risk levels assigned will determine which recommended measures apply. Ofcom also plans to update the Illegal Content Codes of Practice to ensure existing measures apply to cyberflashing and suicide and self-harm in the same way as to other priority illegal harms where relevant. The consultation ends on 24 April.
Ofcom and ICO issue joint statement on common approach to age assurance
Ofcom and the Information Commissioner’s Office (ICO) have published a joint statement aimed at providing clarity for industry on how to comply with the Online Safety Act and data protection legislation when deploying age assurance. It is designed to be risk-based, flexible, tech-neutral and future-proof, allowing space for innovation in the context of rapidly evolving technology and market developments. It sets out Ofcom and the ICO’s common approach to age assurance; age assurance obligations under the Online Safety Act; age assurance obligations under data protection legislation; practical examples illustrating how to implement an age assurance process that meets both online safety and data protection requirements; and a summary of requirements related to age assurance under the Online Safety Act and UK data protection legislation.
ICO fines TMAC £100,000 for unsolicited marketing calls
The ICO has fined Birmingham-based company TMAC £100,000 for making over 260,000 unsolicited marketing sales calls to numbers registered on the Telephone Preference Service. TMAC sells personal pendant alarms and security systems and, between February and September 2024, made what the ICO called predatory calls to people who may need extra support to protect themselves, including the elderly. Call transcripts revealed that TMAC employees did not disclose their true identity, claiming to be calling on behalf of a variety of different local crime and fire prevention initiatives to dupe recipients. The transcripts also appear to show that callers were actively targeting people aged over 60 as part of the unlawful activity. Furthermore, one of TMAC’s company directors admitted that the telephone numbers had been taken from second-hand data acquired at a company he had previously worked for. Enforcement action was taken under section 40 of the Data Protection Act 1998 for serious contraventions of regulations 21 and 24 of the Privacy and Electronic Communications (EC Directive) Regulations 2003.
Call for evidence on taxation of stablecoins
The government is seeking views on the taxation of stablecoins, a type of cryptoasset that seeks to maintain a stable value by reference to another asset, such as a fiat currency. The government is seeking views on issues around the tax treatment, as well as any administrative burdens this might cause as the stablecoin market develops. Stablecoins are generally treated like other cryptoassets for tax purposes. With stablecoins potentially playing a more significant role in both wholesale and retail payments in the future, the government is considering whether this treatment remains appropriate and whether it should make changes. The call for evidence ends on 7 May.
First-tier Tribunal says Ofcom could withhold information about meetings with tech companies
In Babbs v Information Commissioner [2026] UKFTT 389 (GRC), the First-tier Tribunal considered the general restrictions on the disclosure of information under section 393 of the Communications Act 2003. It confirmed that, in the instant case, Ofcom was entitled to refuse disclosure of information about meetings with tech companies under the Freedom of Information Act 2000 because section 393 of the 2003 Act prohibited it from disclosing information obtained in the exercise of its statutory powers, whether obtained compulsorily or voluntarily. It found that sections 44(1) and 44(2) of the Freedom of Information Act were engaged, meaning that Ofcom correctly withheld some information and refused to confirm or deny that it held other information.
CMA issues Annual Plan 2026 to 2027
The CMA has issued its Annual Plan for 2026-2027. Among other things, the CMA’s new digital markets competition regime focuses on ensuring fair and effective competition in key digital sectors, particularly search and mobile platforms. Following the first Strategic Market Status designations for Google and Apple, the CMA is now developing targeted, proportionate interventions to promote greater choice, fair treatment of businesses, improved interoperability, and more control for consumers and publishers, while continuing to monitor emerging technologies like AI and assessing whether further SMS investigations are needed. From a consumer perspective, it is progressing its first investigations under the Digital Markets, Competition and Consumers Act, which focus on price transparency, misleading online choice architecture (and, hot off the press, fake reviews).
EU law
Advocate General says member states may exclude hardware and software from telecommunications infrastructure if the manufacturer poses a risk to national security
In Case C‑354/24 (Elisa Eesti), the Advocate General advised that EU law permits Member States to exclude specific hardware or software from 2G–5G telecommunications networks where the manufacturer is deemed to present a national security risk, emphasising that such restrictions can be justified on public security grounds provided they are proportionate and based on an objective risk assessment.
CJEU says that in a criminal investigation, a police authority may collect biometric data only if strictly necessary
In C‑371/24 (Comdribus), the Court of Justice of the European Union held that police authorities may collect biometric data, such as fingerprints or photographs, during a criminal investigation only where the collection is strictly necessary. This means that it cannot be carried out systematically and must be justified on a case‑by‑case basis with clear, explicit reasons. The Court emphasised that biometric information is sensitive under EU law and requires enhanced safeguards, and that national laws mandating automatic biometric collection without an individualised necessity assessment breach EU data‑protection rules. Importantly, a criminal penalty for refusing to provide biometric data is lawful only if the underlying collection itself meets the “strict necessity” test and respects proportionality.
CJEU says subject access request can be abusive if made for sole purpose of subsequently claiming compensation for an alleged infringement of the GDPR
In C‑526/24 (Brillen Rottler), the CJEU held that even a first access request under Article 15 of the GDPR may be deemed “excessive” and abusive where evidence shows it was made solely to manufacture the conditions for a subsequent compensation claim, rather than to understand or verify personal‑data processing. The Court confirmed that controllers may refuse such requests under Article 12(5) of the GDPR, and that while data‑subjects can obtain compensation for material or non‑material damage arising from infringements of their access rights, they must prove actual harm and cannot recover damages where their own conduct is the determining cause of that harm.
CJEU says a subscriber may terminate a contract for internet access, without costs, where a modification is made to comply with a decision of the Court of Justice
In C‑514/24 (Magyar Telekom), the CJEU held that end‑users retain the right to terminate their internet‑access contracts without cost when a provider amends the contract to comply with a national authority’s decision implementing a CJEU ruling, because such changes are not “directly imposed” by EU or national legislation, and therefore do not fall within the exception that removes the right to penalty‑free termination. The Court stressed that this exception must be interpreted strictly to ensure a high level of user protection, noting that neither a CJEU preliminary ruling, being declaratory rather than legislative, nor BEREC guidelines, nor a national authority’s decision interpreting EU law, amount to legislative acts capable of compelling contractual modifications.
International law
White House publishes new national AI legislative framework
The White House has issued a national legislative framework aimed at addressing the most pressing policy topics that AI presents. It addresses six key objectives: protecting children and empowering parents; safeguarding and strengthening American communities; respecting IP rights and supporting creators; preventing censorship and protecting free speech; enabling innovation and ensuring American AI dominance; and educating Americans and creating an AI-ready workforce. It says “Importantly, this framework can succeed only if it is applied uniformly across the United States. A patchwork of conflicting state laws would undermine American innovation and our ability to lead in the global AI race. The Federal government is uniquely positioned to set a consistent national policy that enables us to win the AI race and deliver its benefits to the American people, while effectively addressing the policy challenges that accompany this transformative technology. The Administration looks forward to working with Congress in the coming months to turn this framework into legislation that the President can sign.”