This Week’s Techlaw News Round-Up

December 15, 2023

UK law

High Court dismisses application for reverse summary judgment in Getty v Stability AI case

In Getty Images (US) Inc & Ors v Stability AI Ltd [2023] EWHC 3090 (Ch), the High Court has turned down an application by Stability AI for reverse summary judgment. This related to a claim for copyright, database right and trade mark infringement, and passing off. Getty Images claimed that Stability AI had scraped images from its websites without consent and used the images unlawfully to train and develop its AI model. It also claimed that Stable Diffusion’s output infringed Getty’s copyright works. The court refused to grant reverse summary judgment. It ruled that the training and development claim had a real prospect of success. Stability AI had argued that the training and development of Stable Diffusion did not take place in the UK. The court said that there were reasonable grounds for taking the view that having more knowledge about the facts would affect the outcome of this matter. It also said that Getty’s secondary infringement claim, which depended on whether sections 22 and 23 of the Copyright, Designs and Patents Act 1988 could also encompass dealings in intangible things (such as making software available on a website), had a real prospect of success. This was a novel question which needed to be resolved at trial once the relevant facts had been established.

Guidance issued for judicial office holders on AI

The use of AI continues to increase, and so does its relevance to the court and tribunal system. All judicial office holders must be alive to the potential risks. Of particular importance is the need to be aware that the public versions of these tools are open in nature and therefore that no private or confidential information should be entered into them. New guidance has been developed to assist judicial office holders in relation to the use of AI. It sets out key risks and issues associated with using AI and some suggestions for minimising them. Examples of potential uses are also included. It makes clear that any use of AI by or on behalf of the judiciary must be consistent with the judiciary’s overarching obligation to protect the integrity of the administration of justice. It points out that it may be necessary at times to check that lawyers have independently verified the accuracy of any research or case citations that have been generated with the assistance of an AI chatbot. AI chatbots are now being used by unrepresented litigants. They may be the only source of advice or assistance some litigants receive. Litigants rarely have the skills to independently verify legal information provided by AI chatbots and may not be aware that such tools are prone to error. If it appears an AI chatbot may have been used to prepare submissions or other documents, it is appropriate to inquire about this and ask what checks for accuracy have been undertaken (if any).

ICO fines MoD for data breach

The ICO has fined the Ministry of Defence £350,000 for disclosing personal information of 265 people seeking relocation to the UK shortly after the Taliban took control of Afghanistan in 2021. In September 2021, the Ministry of Defence’s Afghan Relocations and Assistance Policy team sent three emails to 265 people using the “To” field. All email addresses were visible to other recipients, with 55 people having thumbnail pictures on their email profiles. On one occasion, two people “replied all” to the entire list of recipients, with one of them providing their location. The data disclosed, if it had fallen into the hands of the Taliban, could have resulted in a threat to life. As part of the ICO’s public sector approach, fines remain an important regulatory tool for egregious breaches that could cause significant harm to people. The ICO says that the fine serves as a deterrent to data breaches, ensuring that both the MoD and other organisations have appropriate policies and training in place to minimise the risks of people’s information being inappropriately disclosed via email.

ICO consults on guidance on keeping employment data

The ICO is producing an online resource with topic-specific guidance on employment practices and data protection. It is releasing its drafts of the different topic areas in stages and adding to the resource over time. A draft of the guidance on keeping employment records is now out for public consultation. The draft guidance aims to provide practical guidance about how to comply with data protection law when keeping records about workers, and to promote good practice. The ICO also intends to produce additional practical tools (such as checklists) to go alongside the guidance to help support employment practices. The consultation ends on 5 March 2024.

Digital Regulation Cooperation Forum publishes Immersive Technologies Foresight Paper

The DRCF has published an Immersive Technologies Foresight Paper. It considers the future of immersive technologies, to better understand how these environments might evolve, the key uncertainties that could drive changes to them, and the benefits and risks to consider. It also sets out the potential regulatory implications that may arise depending on how these environments develop in the future, and how DRCF member regulators will seek to support their development in ways that promote open, competitive markets, as well as protecting consumers and their information rights.

MPs warn that development of a Bank of England retail digital pound should “proceed with caution”

The Treasury Committee has issued a report on the possible introduction of a digital pound, which urges the Bank of England and HM Treasury to address data privacy and financial stability concerns before considering the implementation of a retail Central Bank Digital Currency. The Committee expresses concerns about the new risks a retail CBDC or digital pound could pose to the UK’s financial stability without careful management. It also urges the government to alleviate privacy concerns that organisations or the Government could misuse personal data generated by the introduction of a retail digital pound, for example to monitor or control how users spend their money. It also highlights the importance of ensuring a retail digital pound does not exacerbate financial exclusion by accelerating the demise of physical cash, upon which many in the UK still rely. The Treasury Committee notes that the next stage of work on a retail digital pound could incur significant costs and urges both the Bank of England and Treasury to be transparent about these costs through annual reporting. The Treasury Committee supports the further design work being undertaken by the Bank of England but emphasises that this project must not distract the institution from its primary objectives of controlling inflation and maintaining financial stability. Any introduction of a digital pound must be underpinned by a robust cost-benefit analysis.

ICO issues guidance on transferring personal information to US under UK GDPR

The ICO has published guidance about completing a transfer risk assessment when transferring personal information to the US using a transfer mechanism under Article 46 of the UK GDPR. Article 46 sets out the “appropriate safeguards” for making a restricted transfer of personal information to a recipient in the US. Carrying out a transfer risk assessment is a strict legal requirement. It helps organisations ensure that, in the specific circumstances of the restricted transfer, the Article 46 transfer mechanism will provide appropriate safeguards, and effective and enforceable rights for people. It must address risks to people’s rights arising in the destination country from third parties that are not bound by the Article 46 transfer mechanism accessing the information (in particular government and public bodies), as well as risks to people’s rights arising from difficulties enforcing the Article 46 transfer mechanism as a result of the laws in the destination country.

EU law

Views sought on implementing regulation for transparency reporting under EU DSA

The European Commission is consulting on the Implementing Regulation on the templates that intermediary services and online platforms will have to use for their future transparency reports under the Digital Services Act. The DSA requires providers of intermediary services and online platforms to publish periodic transparency reports on content moderation on their services. These reports must include information such as the number of orders providers have received from member states’ judicial or administrative authorities, the human resources dedicated to content moderation, the number of accounts and items of content taken down voluntarily by the provider, and the accuracy and rate of error of their automated content moderation systems. The consultation ends on 24 January 2024.

Commission launches Implementing Act consultation on data-sharing platform

The European Commission is also consulting on the Implementing Act on a data-sharing platform between member states and the Commission. The platform (called AGORA) aims to provide an effective, user-friendly tool for the implementation of the Digital Services Act, specifically in connection with the supervision, investigation, enforcement and monitoring of services within the scope of the DSA. The Commission will establish and maintain a secure information-sharing system to support communications between the Digital Services Coordinators (DSCs) in the member states, the Commission, and the European Board for Digital Services (composed of the DSCs). The Commission, the DSCs and the Board must use AGORA for all communications relating to the enforcement of the DSA. The consultation ends on 24 January 2024.

European Parliament calls for new EU rules to address digital addiction

The European Parliament has called for the development of ethical digital products that do not rely on dark patterns and addictive designs. It issued a report which warns of the addictive nature of online games, social media, streaming services, and online marketplaces, which exploit users’ vulnerabilities to capture their attention and monetise their data. MEPs want to increase consumer protection through safer alternatives, even if these are not as profitable for social media platforms. The European Commission is currently evaluating the need to update certain consumer protection legislation to ensure a high level of protection in the digital environment, the results of which are expected in 2024. The report will feed into this ongoing fitness check.

MEPs call for revision of EU geo-blocking rules

According to MEPs, Regulation (EU) 2018/302 (the EU Geo-blocking Regulation) needs to be better enforced and updated to strengthen the digital single market and better respond to consumer expectations. They have issued a report which highlights the need to re-assess the EU’s rules on geo-blocking, particularly in light of the accelerated digital transformation and surge in online shopping in recent years. As the current rules do not apply to specific digital services offering copyrighted content (such as e-books, music, software and online games), MEPs highlight the potential benefits of including those under EU rules if the service has the requisite rights for the relevant territories. MEPs urge the Commission and member states to facilitate consumer access to cross-border parcel delivery services and to support a reduction in cross-border shipping costs. MEPs also say that online registration and payment methods need to be improved, as in their current form they undermine the “shop like a local” objective of the rules. MEPs also want more cross-border catalogue availability, and greater cross-border access to, and findability of, sports events through streaming services. They specifically want the European Commission and member states to carefully assess all options to reduce the prevalence of unjust and discriminatory geo-blocking barriers, while also considering the potential impact of this on existing business models and on financing for creative industries. However, MEPs argue that extending the scope of the rules to the audio-visual sector would result in a significant loss of revenue, threaten investment in new content, reduce the cultural diversity of content, decrease distribution channels, and ultimately raise prices for consumers.

European Commission publishes DMA gatekeeper consumer profiling reports template

The Commission has published the template for reporting on consumer profiling techniques and the independent audit of such reports. Gatekeepers are obliged to submit the reports to the Commission under Article 15 of the Digital Markets Act. The reports on consumer profiling techniques must describe, in a detailed and transparent manner, all relevant information on all techniques used for profiling of consumers applied to or across any core platform services offered by gatekeepers. Gatekeepers are required to submit this description to an independent audit, and the reports should also contain the auditor’s assessment of the completeness and accuracy of the description. The gatekeepers designated on 5 September 2023 need to submit their first report, as well as a non-confidential overview, by 7 March 2024. The Commission has also published the non-confidential replies to the consultation on the draft template for the independently audited description.