High Court grants order under Defamation Act 2013, ICO issues updated regulatory statement and Select Committee launches inquiry about the UK government’s approach to tackling harmful online content in this week’s round-up of UK and EU techlaw news developments not covered elsewhere on the SCL website.
High Court grants order under section 13 of Defamation Act 2013
In Blackledge v Person(s) Unknown (the Website METOOUCU.BLOGSPOT.COM) [2021] EWHC 1994 (QB), the High Court made an order against Google LLC under section 13 of the Defamation Act 2013. Section 13 confers a rarely used power on the court for cases where a defendant does not engage with proceedings and cannot easily be identified. Where a defendant is unlikely to comply with an injunction, it gives a claimant a way to obtain a remedy nonetheless. In this case, the order required Google, as the operator of a website containing defamatory statements, to take down the statements complained of. In practice, this meant that Google had to take down the website entirely, because the website appeared to have been set up specifically to publish the allegations about which the claimant had complained.
ICO publishes updated regulatory statement
The ICO has published an updated version of its regulatory approach document. It states its commitment to continue taking into account the challenges organisations face, but also makes clear the value of information rights. For example, it expects organisations to be able to deal with complaints they receive from members of the public, and it expects them to have robust recovery plans in place to reduce any backlogs. It says that it will continue to provide updates on its regulatory approach, to give organisations clarity both during the pandemic and beyond. This will include updating its Regulatory Action Policy, on which it will consult later this year.
Select Committee launches inquiry about the UK government’s approach to tackling harmful online content
The House of Commons DCMS Sub-Committee on Online Harms and Disinformation has launched a new inquiry into the government's approach to tackling harmful online content, outlined in its draft Online Safety Bill. The draft legislation would compel social media sites and search engines to remove harmful content such as terrorist content, child sexual exploitation and abuse material, and disinformation that causes individual harm. The Sub-Committee will investigate how the focus has shifted since the introduction of the Online Safety Strategy Green Paper in 2017, including concerns that the definition of harm is now too narrow and may fail to address issues such as non-state intervention in elections, racist abuse, and content that contributes to self-harm and negative body image. It will also explore key omissions from the draft Bill, such as a general duty for tech companies to deal with reasonably foreseeable harms, a focus on transparency and due process mechanisms, and regulatory powers to deal with urgent security threats, and will consider how any gaps can be filled before the Bill is finalised. Another focus will be on where lessons can be learnt from international efforts to regulate big tech, such as in France, Germany and Australia. The inquiry is distinct from any work by the Joint Committee on the Draft Online Safety Bill, established by the House of Lords and the House of Commons on 23 July. This inquiry will take a broad approach to scrutinising the Bill, and may cover areas such as how the changed circumstances since the Online Safety Strategy Green Paper and the Online Harms White Paper have shaped the Bill's development, and how it will interlock with other areas of government policy. Submissions are requested by 3 September 2021.
EU sends letter to Google asking for improved disclosure and compliance
The Commission and Consumer Protection Cooperation authorities, under the lead of the Netherlands Authority for Consumers and Markets and the Belgian Directorate General for Economic Inspection, have written to Google asking it to make clear when it acts as a direct seller or as an intermediary, and to improve its compliance with EU law generally across its various services. The Commission will support national consumer authorities in evaluating Google's response, taking into account any commitments it makes to modify its websites and services. If the commitments made by Google are not deemed sufficient, a follow-up dialogue will take place, and national authorities may eventually decide to impose sanctions. Issues identified by national consumer authorities include geo-blocking, transparency of search result rankings, transparency of services' business models, visibility of final pricing, and the reliability of reviews.
Bundeskartellamt says mobile apps offer insufficient consumer protection
The German Bundeskartellamt has presented the results of its sector inquiry into consumer rights in mobile apps. The Bundeskartellamt considered three problem areas for mobile devices running the Android or iOS operating systems. The first is a lack of information about the data being accessed when apps are used: for a large number of apps, users are not adequately informed of the extent to which third-party companies such as Facebook or Google obtain personal data, or of precisely what personal information is obtained from the use of apps. Neither the app descriptions in the app stores nor the privacy policies of the app publishers provide sufficient information on this point. Ideally, users should be able to search more selectively for consumer-friendly apps (eg without trackers or advertisements) via an improved app store search function. The second issue is a lack of transparency about contractual partners: consumers are not adequately informed about who they actually conclude a contract with when downloading an app, and there is no clear guidance on whether the app store operator or the app publisher should be contacted for warranty claims. To some extent, conditions of use, online help pages and presentations in app stores contradict one another in this respect. The final issue is a lack of options to control data processing: consumers' wishes for more control over the processing of their personal data are only rudimentarily addressed in the iOS and Android operating system settings. Despite some innovations in the area of data protection, there is still much room for improvement: clear and comprehensive information must go hand in hand with simple settings options, so that consumers can effectively deny apps access to their data and delete all apps that are not system-relevant.
EDPB adopts Art. 65 decision regarding WhatsApp Ireland
The European Data Protection Board has adopted a dispute resolution decision under Article 65 of the GDPR. The binding decision seeks to address the lack of consensus on certain aspects of a draft decision issued by the Irish supervisory authority regarding WhatsApp Ireland Ltd, and the objections subsequently expressed by a number of concerned supervisory authorities. The Irish authority issued the draft decision following an own-volition inquiry into WhatsApp, concerning whether WhatsApp had complied with its transparency obligations under Articles 12, 13 and 14 GDPR. On 24 December 2020, the Irish authority shared its draft decision with the other authorities under Article 60(3) GDPR. The other authorities raised objections under Article 60(4) concerning, among other issues, the infringements of the GDPR identified, whether specific information at stake was to be considered personal data and the resulting consequences, and the appropriateness of the envisaged corrective measures. Having considered the objections of the other authorities, the Irish authority was unable to reach consensus; it therefore indicated to the EDPB that it would not follow the objections and initiated the dispute resolution procedure. The EDPB has now adopted its binding decision, which addresses the merits of the objections found to be "relevant and reasoned" in line with the requirements of Article 4(24) GDPR. The EDPB will shortly notify its decision formally to the concerned supervisory authorities. The Irish authority must then adopt its final decision, addressed to the controller, on the basis of the EDPB decision, without undue delay and at the latest one month after the EDPB notifies its decision. The EDPB will publish its decision on its website without undue delay after the Irish authority has notified its national decision to the controller.