UK law
Financial Services Regulation Committee holds inquiry into stablecoins in the UK
The House of Lords Financial Services Regulation Committee is holding an inquiry into the development of, and proposed regulatory response to, stablecoins in the UK. In particular, the inquiry will examine the extent to which stablecoins might disrupt the traditional models of provision of financial services, including banking and payments services. The inquiry will also assess the potential opportunities and risks that the growth of stablecoins might present for the UK’s financial services sector, and whether the Bank of England’s and FCA’s proposed regulatory frameworks provide measured and proportionate responses to these risks. The inquiry is open for evidence until 11 March 2026.
IPEC strikes out claims alleging indirect copying of computer program source code
In Edozo Ltd v Valos (UK) Ltd [2026] EWHC 93 (IPEC), the claimant, Edozo, applied to strike out parts of the Defence and Counterclaim of the defendant, Valos. Edozo and Valos are competitors in a market providing valuations and other information about real property. Both offer their customers access to software-based systems which supply the information. Valos argued that Edozo was effectively copying its product. The IPEC struck out those parts of the Defence and Counterclaim alleging indirect infringement of the literary copyright in the Original Valos Computer Program, the Subsequent Valos Computer Programs and the Valos Coded Prompts. In practice, this means competitors are permitted to analyse the functional outputs of a program and develop their own version, as long as they do not copy the underlying source code or any resulting graphic works that may be protected by artistic copyright.
FCA takes action against HTX to stop illegal financial promotions
The FCA has begun legal proceedings against global crypto exchange HTX (formerly Huobi) for illegally promoting cryptoasset services to UK consumers. From October 2023, all cryptoasset firms marketing to UK consumers, including firms based overseas, must comply with the FCA’s financial promotion regime. The FCA previously warned about HTX’s illegal promotion of crypto services to UK consumers. However, HTX has continued to publish financial promotions in breach of the FCA’s rules on its website and on social media platforms, including TikTok, X, Facebook, Instagram and YouTube. HTX operates an opaque organisational structure, hiding the identities of its owners and the operators of its website. Repeated attempts by the FCA to engage with HTX have been ignored. Since the proceedings were issued, HTX has taken steps to restrict new UK customers from registering an account. However, existing UK users can still log in and access unlawful financial promotions, and HTX has given no assurance that the changes will be permanent. The FCA therefore remains concerned that the risk of ongoing breaches continues. The FCA has requested that social media companies block HTX’s social media accounts for UK-based consumers, and has requested the removal of HTX applications from the Google Play and Apple stores in the UK.
Draft Statement of Strategic Priorities for telecommunications, the management of radio spectrum, and postal services laid before parliament
The Statement of Strategic Priorities for telecommunications, the management of radio spectrum and postal services was laid in draft before Parliament on 11 February 2026. It sets out the UK government’s strategic priorities and desired outcomes, including fixed and wireless digital infrastructure, furthering the interests of telecoms consumers, spectrum management, the security and resilience of telecoms infrastructure, and postal services. This follows a consultation on the proposed Statement of Strategic Priorities last year. The government is also carrying out a Mobile Market Review to consider the factors shaping industry’s ability to invest in networks over the long term, from investment challenges to technological developments and changes in the market. Telecoms companies have also signed up to a charter which aims to make sure that customers will know exactly what they’ll be paying when they sign up for a new mobile or broadband deal – with no unexpected price rises midway through a contract. Customers will be given clear information on any future price changes up front, so the price they sign up to is the price they can expect to pay.
Ofcom calls for evidence to support statutory report on content harmful to children
The Online Safety Act places a duty on Ofcom to publish a statutory report on content that is harmful to children at least every three years, with the first report due by 26 October 2026. Ofcom has issued a call for evidence to support that first report, which will build on its assessment of content harmful to children in its April 2025 Children’s Register of Risks. The call for evidence ends on 10 March 2026.
Ofcom fines porn company £800,000 for failing to introduce age checks
Under the Online Safety Act, sites that allow pornographic material must use highly effective age assurance to prevent children from readily accessing that content. Following an investigation, Ofcom has fined Kick Online Entertainment SA £800,000 for failing to comply with these age check requirements between 25 July and 29 December 2025. In response to Ofcom’s enforcement action, Kick has since implemented a method of age assurance that is capable of being highly effective. Ofcom also emphasises that gathering accurate information from companies is fundamental to its job. Information requests help it to assess and monitor industry compliance with safety duties, and firms are required by law to respond in an accurate, complete and timely way. For failing to abide by these requirements, Ofcom has also fined Kick Online Entertainment SA £30,000. It will also impose a daily penalty of £200 on the company until it responds, or for a period of 60 days, whichever is sooner.
ASA issues research on influencer labelling
The ASA has issued the outcome of a study which explored how consumers recognise advertising disclosures across current platforms and formats, with a focus on fast-scrolling, mixed-feed environments. It centres primarily on Instagram and TikTok, reflecting how influencer advertising is most commonly encountered in practice. The research shows that while many people feel confident spotting influencer ads, that confidence does not always translate into accurate recognition. Compared with traditional brand advertising, influencer advertising is more variable and often less clearly identified. Recognition depends heavily on how content is presented and how clearly advertising signals are communicated. Clear and prominent disclosure plays a critical role. Where content closely resembles organic or editorial posts, disclosure helps people recognise advertising quickly and with confidence. The research also shows that both the wording and placement of labels matter, particularly in short-form video and fast-moving feeds.
EU law
European Commission issues guidelines to protect media content on online platforms
The European Commission has issued new guidelines with the aim of making sure that professional journalism is recognised and protected across the world’s largest digital platforms. These guidelines aim to help Very Large Online Platforms (VLOPs), as defined by the Digital Services Act, and media service providers implement the relevant provisions of the European Media Freedom Act. Article 18(1) of the European Media Freedom Act introduces specific safeguards to protect media content online produced according to professional standards from unjustified removal. These safeguards require VLOPs to notify media providers in advance when they intend to remove journalistic content and clearly explain the reasons for their decision. Media providers are also given 24 hours to respond before the removal takes effect. To benefit from these safeguards, media service providers need to declare that they fulfil certain elements, such as being editorially independent and subject to regulatory oversight, through a functionality put in place by VLOPs. The guidelines help VLOPs to implement the declaration functionality and guide media service providers in completing and handling their declarations. They also outline procedures for VLOPs to consult regulatory authorities when in doubt, and to involve civil society organisations, including fact-checkers, in reviewing declarations.
No communication to public of work published online in countries where access is geo-blocked
In Anne Frank Fonds v Anne Frank Stichting and others (Case C‑788/24), Advocate General Rantos considered questions from the Dutch courts on whether Article 3(1) of the Copyright Directive (2001/29/EC) requires that an online publication constitutes a communication to the public in a particular country only if it is directed at that country’s public, and on how geo‑blocking affects that assessment. He said that the Directive does not require the publication of a work on a website to be addressed to the public of a particular country for it to be regarded as a communication to the public in that country. However, the publication of content on a website does not constitute a “communication to the public” under Article 3(1) in a country where that content is protected by copyright and where the website is subject to effective geo-blocking, together with any other non-technical measures that impede or discourage access and have a deterrent effect in the blocked country. That assessment takes into account the circumstances of the case and, in particular, the ability of users in the blocked country to circumvent such measures using a virtual private network (VPN) or similar service, and service providers in the public domain country cannot be subject to unreasonable requirements. Finally, if, taking into account the likelihood of circumvention of geo-blocking measures, the publication of content on a website were to be regarded as a communication to the public in the country concerned, Article 3(1) precludes a provider of VPN or similar services from being held liable for the acts of a user in a country where access to the work is blocked, when that user makes use of those services to circumvent the geo-blocking measures, unless the service provider actively encourages such unlawful use to access the protected work in that country.
European Commission issues consultation on evaluation and update of EU rules on audiovisual media services
The European Commission is consulting on a review of the Audiovisual Media Services Directive which aims to simplify and adapt the audiovisual media rules to a shifting media landscape. The key priorities include streamlining advertising rules, improving the level playing field between traditional and new digital players, and strengthening the protection of minors on video-sharing platforms. The Commission also aims to secure the prominence of media services of general interest while improving coherence with other EU laws, including the Digital Services Act. The consultation covers four main topics: scope and enforcement, audiovisual commercial communications, protection of viewers, and strengthening of media diversity in the internal market. The consultation ends on 1 May 2026.
European Commission announces cyberbullying action plan
The European Commission has announced an anti-cyberbullying action plan, which includes a review of the Digital Services Act guidelines on the protection of minors to strengthen the measures that online platforms must take to prevent minors from being exposed to harmful content and to enable them to report it easily; adopting DSA guidelines on trusted flaggers to clarify their role in tackling illegal content, including illegal cyberbullying content; addressing cyberbullying on video-sharing platforms in the ongoing evaluation and review of the Audiovisual Media Services Directive; supporting the effective implementation of the AI Act provisions on prohibited AI practices, including when they are used for cyberbullying; and facilitating the effective implementation of the AI Act transparency obligations, including through a code of practice on marking and labelling AI-generated content, which can be misused for cyberbullying. In parallel, the Commission is working on upcoming initiatives such as the piloting of an EU privacy-preserving age verification solution, the upcoming Digital Fairness Act, a panel of experts to inform the Commission’s work on protecting children online, and an inquiry into the impact of social media on mental health.