This Week’s Techlaw News Round-Up

July 18, 2025

UK law

Investigatory Powers (Communications Data) (Relevant Public Authorities and Designated Senior Officers) Regulations 2025 made

The Investigatory Powers (Communications Data) (Relevant Public Authorities and Designated Senior Officers) Regulations 2025 SI 2025/808 have been made. They amend Schedule 4 to the Investigatory Powers Act 2016. Schedule 4 (relevant public authorities and designated senior officers etc.) sets out the public authorities, other than local authorities, who may exercise powers under Part 3 of the Act to obtain communications data, the statutory purposes for which the communications data may be obtained, the type of communications data which may be obtained and any designated senior officers within those authorities who may authorise the obtaining of communications data internally, including in urgent cases. The amendments permit certain new public authorities to apply for an authorisation to acquire communications data from the Investigatory Powers Commissioner.

Ofcom launches enforcement programme on children’s risk assessments

Ofcom has required providers of a number of online services to submit the records of their children’s risk assessments by 7 August, or face enforcement action. Ofcom says that risk assessments are fundamental to keeping users safer online. On 24 April, Ofcom published its children’s safety codes of practice and guidance under the UK’s Online Safety Act. From that point, providers had three months to carry out a suitable and sufficient children’s risk assessment, in line with Ofcom’s guidance. Providers must also make and keep a written record of their risk assessment, including details about how it was carried out, its findings and how it is kept under review. To assess and monitor industry compliance with these children’s risk assessment duties under the Act, Ofcom has launched an enforcement programme. This follows a similar enforcement programme launched in March into providers’ compliance with duties to carry out and record illegal harms risk assessments, under which Ofcom has already opened several investigations.

Ofcom publishes paper on deepfake attribution tools under Online Safety Act

Ofcom has published a paper on the merits of so-called “attribution measures”. These include watermarking tools, provenance metadata schemes, AI labels, and context annotations. These measures are designed in one way or another to attribute certain types of information to a piece of content, for example information about who created it, how and when it was created, and – in some cases – whether the content is accurate or misleading. The paper looks in detail at how each measure works and assesses its strengths and weaknesses. It also considers what it would take to deploy them successfully. The paper draws on the findings of a literature review, interviews with experts, a survey and a series of interviews with users of online platforms, and Ofcom’s own internal technical evaluations of openly available watermarking tools. The sharing of certain types of deepfakes is regulated under the Online Safety Act 2023. Ofcom will use the insights to inform its policy development and supervision of regulated services.

Ofcom publishes report about researchers’ access to information about online safety

Ofcom has published a report under section 162 of the Online Safety Act about how and to what extent independent researchers access information about online safety matters from providers of regulated online services. In addition, the report explores the current barriers to information sharing for research purposes and assesses how greater access might be achieved. The report will provide an evidence base to inform the design of any future access framework supporting research into online safety matters.

DSIT and DCMS launch expert working groups on AI copyright framework

The Departments for Science, Innovation and Technology and for Culture, Media and Sport have launched expert working groups with members from the creative industries and AI sectors. The groups aim to resolve the copyright problems that have arisen in relation to training data for generative AI tools, among other things. This follows the December 2024 consultation on AI and copyright, which attracted 11,500 responses. The groups will focus on evaluating the impacts and opportunities surrounding AI and the copyright debate, with an emphasis on enhancing mutual understanding between rights holders and AI developers.

DHSC launches consultation on brand advertising exemption for less healthy food restrictions

The UK government is consulting on draft Advertising (Less Healthy Food) (Brand Advertising Exemption) Regulations 2025 aimed at explicitly exempting brand advertising from upcoming restrictions on less healthy food and drink advertising online and on TV. The restrictions are due to come into force in January 2026. The consultation ends on 6 August 2025.

EU law

European Parliament Committee issues report on copyright and generative AI training

The European Parliament’s Committee on Legal Affairs has issued a report which considers the legal implications of using copyright-protected works to train generative AI systems. The report’s authors do not believe that current EU copyright law, including the text and data mining exceptions in the Directive on Copyright in the Digital Single Market, is up to the job of addressing the scale and nature of AI training. The report recommends statutory remuneration schemes, clearer opt-out mechanisms, and enhanced transparency obligations.

European Commission seeks feedback for the revision of EU antitrust enforcement framework

The European Commission is launching a Call for Evidence and a public consultation on the future of the EU procedures for the application of EU competition rules. It aims to keep up with transformative changes such as digitalisation of the economy. The call for evidence will focus on the main areas the Commission is currently considering revising in the Regulations, with the objective of enhancing effective and speedy antitrust enforcement. This includes: the Commission’s investigative powers (specifically inspections, requests for information and interviews); certain aspects of its decision-making powers (specifically how it can adopt interim measures and commitments); the process for granting access to the Commission’s file, which is currently burdensome for parties, information providers and the Commission; the procedure for the participation of complainants and third parties in competition investigations; and how to optimise co-enforcement with national competition authorities and national courts. The consultation ends on 2 October 2025.

Commission seeks input on review of the State aid General Block Exemption Regulation

The European Commission has launched a Call for Evidence and a consultation on the scope and content of its review of the General Block Exemption Regulation. The aim of the review is to reduce red tape for businesses as well as for member states, and to facilitate necessary support for industry. At the same time, EU State aid rules should continue protecting the level playing field within the EU. The consultation ends on 6 October 2025.

European Parliament briefing examines potential impact of DUA Act reforms on EU GDPR and UK’s adequacy status

The European Parliamentary Research Service has issued a briefing which considers aspects of the reforms introduced by the Data (Use and Access) Act 2025 to the UK regime, and their potential impact on fundamental data protection rights and the UK’s adequacy status with the EU.

Irish DPC announces inquiry into TikTok Technology Limited’s transfers of EEA users’ personal data to servers located in China

The Irish Data Protection Commission (DPC) has announced that it has opened an inquiry into TikTok Technology Limited’s (TikTok) transfers of EEA users’ personal data to servers located in China. The inquiry follows on from the DPC’s decision of 30 April 2025, which also considered TikTok’s transfers of EEA users’ personal data to China under a separate inquiry. However, during that previous inquiry, TikTok maintained that transfers of EEA users’ personal data to China did not occur – user data was stored on servers located outside China and was accessed remotely by TikTok staff from within China. Accordingly, the DPC’s decision of 30 April 2025 did not consider TikTok’s storage of EEA users’ personal data on servers located in China. However, in April 2025, TikTok informed the DPC that it had discovered that limited EEA user data had in fact been stored on servers in China, contrary to TikTok’s evidence to the previous inquiry. The DPC has now decided to open a new inquiry under section 110 of the Data Protection Act 2018. It will decide whether TikTok has complied with its relevant obligations under the GDPR in the context of the transfers now at issue, including the lawfulness of the transfers under Chapter V of the GDPR.