ICO launches data analytics toolkit, ICO issues fines for nuisance calls, CMA says Gumtree purchase raises competition concerns, Penrose review published on competition and consumer law, and more in this week’s round-up of UK and EU techlaw developments.
ICO launches data analytics toolkit
The ICO has launched a toolkit for organisations considering using data analytics (including algorithms and AI). The toolkit is aimed at organisations at the beginning of their data analytics project lifecycle. It is designed to help organisations recognise some of the central risks that the use of data analytics creates for the rights and freedoms of individuals, providing a basic introduction to the risks that data analytics may create or exacerbate and placing those risks, rights and freedoms in the context of data protection law. However, the ICO stresses that it is not a comprehensive analysis of every factor organisations will need to consider when implementing a data analytics system. The toolkit covers four main themes: lawfulness, accountability and governance, the data protection principles, and data subject rights. Once organisations have used the toolkit, a short report is generated that suggests practical actions to take and provides links to additional guidance to assist with improving data protection compliance.
ICO issues fines totalling £270,000 to firms making nuisance calls
The ICO has issued fines totalling £270,000 to two separate companies for making unlawful marketing calls to numbers registered with the Telephone Preference Service (TPS). The two companies, Call Centre Ops of Nottingham and House Guard of Bournemouth, were found to have made almost 860,000 illegal calls between them, resulting in complaints to both the ICO and the TPS. The ICO’s investigation found that Call Centre Ops, a marketing company, made a total of 159,461 unsolicited direct marketing calls between May and October 2019; it has been fined £120,000. House Guard, which provides masonry protection solutions, was found to have made 699,966 nuisance calls between May and December 2018, over half of which were to TPS-registered numbers; it has been fined £150,000.
Adevinta’s purchase of Gumtree raises competition concerns
The CMA has found that Adevinta’s anticipated £6.5bn ($9.2bn) purchase of eBay Classifieds Group (eCG), which includes Gumtree, from eBay could lead to higher prices and less choice for consumers. The CMA has decided, on the information currently available to it, that it is or may be the case that the merger may be expected to result in a substantial lessening of competition within a market or markets in the UK. It will be referred for a Phase 2 investigation unless the parties offer acceptable undertakings to address these competition concerns. Through the sale to Adevinta, eBay will acquire a 33.3% voting stake in Adevinta and positions on the Adevinta board. This means that eBay will be able to participate in the management of Adevinta and could influence its business strategy. In addition, having reviewed eBay’s internal documents from the time the decision was made to sell eCG to Adevinta, the CMA considers there was a realistic chance that, absent this deal, eBay would have sold Gumtree to a different purchaser without retaining any influence over it. This would have resulted in Gumtree becoming an independent competitor to eBay’s marketplace. Following its Phase 1 investigation, the CMA is concerned that the merger could lead to a loss of competition, with only Facebook Marketplace remaining as a significant competitor. This could reduce consumer choice, increase fees or lower innovation in the supply of platforms that allow people to buy and sell goods online. Adevinta and eBay now have until 23 February 2021 to offer legally binding solutions to resolve the CMA’s competition concerns. The CMA then has five working days to consider whether to accept the offer instead of referring the deal for an in-depth investigation.
Penrose review on consumer protection and competition published
The UK government has published an independent report by John Penrose MP on ways to improve consumer protection and promote competition. In particular, the report says that the extra-strong upfront powers of the CMA’s new digital unit must be tightly ring-fenced to prevent regulatory creep; otherwise they will steadily spread to cover every digital sector of the economy, with enormous increases in red tape and bureaucracy. To reinforce this central point, the new unit should be called the Network & Data Monopolies Unit (NDMU). Its extra-strong upfront powers must: be a ring-fenced addition to the rest of the CMA’s existing competition and consumer powers, so it can use the normal ones wherever possible; only apply to individual firms that own and run new network and data monopolies, rather than to the rest of the sector in which they operate; only apply to problems which the CMA’s existing competition and consumer powers cannot already solve; and only be extended with the UK parliament’s consent. According to the report, the NDMU should have a legal duty to extend and promote competition in the monopolies it regulates, by making pro-competition interventions to reinstate normal competitive conditions wherever possible and proportionate, including, for example, data portability schemes.
ENISA and JRC publish report on cybersecurity challenges of AI in autonomous driving
A new report published by ENISA and the JRC sheds light on the cybersecurity risks linked to the uptake of AI in autonomous vehicles, and provides recommendations to mitigate them. The AI systems of an autonomous vehicle work to recognise traffic signs and road markings, detect vehicles, estimate their speed and plan the path ahead. Apart from unintentional threats, such as sudden malfunctions, these systems are vulnerable to intentional attacks that aim specifically to interfere with the AI system and disrupt safety-critical functions. Such attacks can include adding paint on the road to misguide the navigation system, or stickers on a stop sign to prevent its recognition. These alterations can lead to the AI system wrongly classifying objects, and subsequently to the autonomous vehicle behaving in a way that could be dangerous. The report makes several recommendations, one of which is that security assessments of AI components should be performed regularly throughout their lifecycle. Systematic validation of AI models and data is essential to ensure that the vehicle always behaves correctly when faced with unexpected situations or malicious attacks. Another recommendation is that continuous risk assessment processes, supported by threat intelligence, could enable the identification of potential AI risks and emerging threats related to the uptake of AI in autonomous driving. Proper AI security policies and an AI security culture should govern the entire supply chain. The automotive industry should embrace a security-by-design approach for the development and deployment of AI functionalities, in which cybersecurity is a central element of digital design from the beginning. Finally, it is important that the automotive sector increases its level of preparedness and reinforces its incident response capabilities to handle emerging cybersecurity issues connected to AI.
MEPs call for democratic oversight of tech giants to safeguard freedom of expression
MEPs have criticised the vast power of social media platforms and what they say is their worrying impact on politics and freedom of speech. Citing various decisions taken by the platforms to censor content or accounts, a large majority of MEPs highlighted the lack of clear rules governing such decisions and the lack of transparency of big tech practices. They urged the European Commission to address the issue in the Digital Services Act and the Digital Markets Act, and as part of the Democracy Action Plan. Most speakers focused on the need to provide legal certainty when removing content, and to ensure that such decisions lie with democratically accountable authorities, not with private companies, in order to safeguard freedom of speech. Other topics raised included: the need to defend democracy and EU values by tackling disinformation and attempts to subvert them or incite violence; ensuring technology is used to enhance rather than limit political discourse, while addressing the proliferation of hate speech and discrimination online; algorithm transparency, the use of personal data and the restriction (or ban) of microtargeting and profiling practices, which would fundamentally alter the business models of tech giants; the problems caused by the emergence of tech monopolies and their impact on media pluralism, as well as on pluralism in public discourse; the false dichotomy between the online and offline spheres and the need for rules that cover all aspects of life; and the systemic risks, as well as the societal and economic harm, that major platforms can cause or exacerbate.
BEUC files complaint against TikTok for multiple EU consumer law breaches
The European Consumer Organisation, BEUC, has made a complaint to the European Commission and the network of consumer protection authorities against TikTok. Consumer organisations in 15 countries have also alerted their authorities and urged them to investigate the social media giant’s conduct. BEUC contends that TikTok breaches various EU consumer laws and fails to protect children from hidden advertising and inappropriate content. According to BEUC, several terms in TikTok’s terms of service are unfair: they are unclear, ambiguous and favour TikTok to the detriment of its users. Its copyright terms are also unfair, as they give TikTok an irrevocable right to use, distribute and reproduce the videos published by users, without remuneration. One popular feature of TikTok is that users can purchase coins, which they use to buy virtual gifts for TikTok celebrities whose performances they like. TikTok’s ‘Virtual Item Policy’, which governs this feature, contains unfair terms and misleading practices. TikTok also fails to protect children and teenagers from hidden advertising and potentially harmful content on its platform. TikTok’s marketing offers to companies that want to advertise on the app contribute to the proliferation of hidden marketing. TikTok is also potentially failing to conduct due diligence when it comes to protecting children from inappropriate content. TikTok does not clearly inform its users, especially in a way comprehensible to children and teenagers, about what personal data is collected, for what purpose and on what legal basis. BEUC and its members want authorities to launch a comprehensive investigation into TikTok’s policies and practices and to ensure that TikTok respects EU consumer rights.