ICO's Data Sharing Code of Practice comes into force, CMA issues response to consultation on new pro-competition regime for digital markets, UK government publishes five-point plan for digital trade, and more in this week’s round-up of UK and EU techlaw news developments not covered elsewhere on the SCL website.
ICO's Data Sharing Code of Practice comes into force
Firstly, a reminder that the Information Commissioner's Office's Data Sharing Code of Practice came into force on 5 October 2021. It was laid before Parliament on 18 May 2021 and issued on 14 September 2021 under section 125 of the Data Protection Act 2018. For more information about the Code see here.
CMA issues response to consultation on new pro-competition regime for digital markets
The Competition and Markets Authority has issued its response to the UK government’s consultation on a new pro-competition regime for digital markets. The government had consulted on proposals to enhance the CMA’s ability to tackle breaches of competition and consumer law, and to empower the Digital Markets Unit. The CMA supports the reforms proposed by the government and will offer assistance and support as they are taken forward. The CMA is clear that large digital platforms should take greater responsibility for monitoring and removing content, and for banning sellers that breach consumer law on their sites. It is also asking the government to amend existing consumer law to make it clearer and more efficient to apply, both to tech giants and in other consumer markets. The CMA sees the government’s proposal to provide the DMU, which is currently operating in “shadow form”, with the necessary powers to tackle problems in digital markets as a major, positive step towards better protecting consumers and supporting businesses, and says that this is consistent with legislative changes under discussion, or recently introduced, in other major economies. In the CMA’s view, some powerful digital firms are showing signs of entrenched market power, which is leading to a worse deal for consumers and businesses across the UK; this significant market power is likely to result in reduced competition, and less innovation and growth in the UK economy. The CMA, along with other leading international competition authorities, considers existing competition tools insufficient to address these challenges. The proposals outlined by the government will, it says, help tackle these issues by enabling the DMU to implement codes of conduct for digital firms which, if broken, can result in meaningful remedies and, if appropriate, significant penalties.
The proposals would also give the DMU powers to intervene in the market to boost competition in the long run, as well as new powers to scrutinise, and, if necessary, block, mergers involving the most powerful digital firms.
UK government publishes five-point plan for digital trade
The Department for International Trade has published a five-point plan for establishing a free and fair digital trade landscape to help UK businesses and consumers thrive. It says that many businesses currently face barriers that restrict their ability to benefit from digital technology, such as paperless trading, or force them to meet unjustified requirements to localise data or disclose their intellectual property, such as source code. The five points are:
- facilitating more open digital markets, to ensure British consumers and businesses benefit from greater access to digital markets in other countries;
- advocating free and trusted cross-border data flows that will make it simpler and cheaper for businesses which use data to trade internationally, while maintaining the UK’s standards for personal data protection;
- championing consumer and business safeguards through enhanced consumer and intellectual property protections;
- promoting the development and adoption of innovative digital trading systems, such as digital customs processes, e-contracting and paperless trading, which can cut red tape and make trade easier, cheaper, faster and more secure; and
- establishing global cooperation on digital trade through free trade agreements with international partners, and using the UK’s G7 presidency and seat at the WTO to push for countries to become more open to digital trade.
Council agrees position on Data Governance Act
The Council of the EU has agreed a negotiating mandate on the proposal for a Data Governance Act. The Act seeks to set up solid mechanisms to facilitate the reuse of certain categories of protected public-sector data, increase trust in data intermediation services and promote data altruism across the EU.

It would also create a mechanism to enable the safe reuse of certain categories of public-sector data that are subject to the rights of others, including, for example, data protected by intellectual property rights, trade secrets and personal data. Public-sector bodies allowing this type of reuse would need to be technically equipped to ensure that privacy and confidentiality are fully preserved. In this respect, the DGA would complement the Open Data Directive from 2019, which does not cover such data.

The proposal creates a framework to foster a new business model - data intermediation services - to provide a secure environment in which companies or individuals can share data. Providers of data intermediation services would need to be listed in a register, so that their clients know that they can rely on them. The providers would not be allowed to use shared data for other purposes, and could not benefit from the data, for example by selling it on; they could, however, charge for transactions. The Council position clarifies the scope of these provisions to indicate more clearly which types of companies can act as data intermediaries.

The proposal also aims to make it easier for individuals and companies to make data voluntarily available for the common good, such as for a particular research project. A new structure, the European Data Innovation Board, would be created to advise and assist the European Commission in enhancing the interoperability of data intermediation services and ensuring consistent practice in processing requests for public-sector data, among other tasks.
European Commission announces prospective signatories to join Code of Practice on disinformation revision process
The European Commission has announced that eight new prospective signatories have joined the revision process of the Code of Practice on disinformation. The revision of the Code is based on the Commission's Guidance issued in May, which sets out how the current Code should be strengthened to provide a firm response to disinformation. The Commission has also published the outcome of its sixth evaluation of the EU Code of Conduct on countering illegal hate speech online. The evaluation showed that some companies had improved their practices, while results for others had worsened. For the first time, the 2021 evaluation contained detailed information about the measures IT companies have taken to combat hate speech outside the monitoring exercise, including automatic detection of hateful content. The Commission will continue to monitor the implementation of the Code and is encouraging IT companies to take action to address gaps in reviewing notifications and to improve their feedback to users.
BEREC issues proposals for a swift, effective and future-proof regulatory intervention towards digital gatekeepers
BEREC has adopted and published its final Report on the ex-ante regulation of digital gatekeepers. It has now provided further detail about its initial proposals, such as the tailoring of remedies, the setting up of an Advisory Board for the Digital Markets Act, the need for and conditions of a regulatory dialogue with different types of stakeholders, and BEREC's proposal for a dispute resolution mechanism in the context of the DMA. The Report highlights the need for ex-ante asymmetric regulatory intervention towards digital gatekeepers, which BEREC sees as key to ensuring that competition and innovation are encouraged, end-users’ interests are protected, and the digital environment remains open and competitive. While supporting regulatory intervention at the EU level, BEREC believes that national competent independent authorities have existing skills on which the European Commission can rely in enforcing the DMA.
MEPs demand common EU cyber defensive capabilities
MEPs have called for a comprehensive set of measures and a coherent IT policy, as well as improved military cyber defence coordination, to strengthen EU cyber resilience. They have stressed that a common cyber defence policy and substantial EU cooperation on cyber capabilities are among the key elements needed for the development of a deepened and enhanced European Defence Union. They also say that it is essential to overcome the current fragmentation and complexity of the EU’s overarching cyber architecture and to develop a common vision for achieving online security and stability. Parliament recommends the creation of a Joint Cyber Unit to address the lack of information sharing among EU institutions, bodies and agencies and to foster a secure and rapid information network. MEPs add that the EU must become technologically independent and innovate and invest more.
MEPs oppose mass surveillance by AI
MEPs have voted on a resolution to combat discrimination and ensure the right to privacy. They say that strong safeguards are needed when artificial intelligence tools are used in law enforcement. MEPs highlight the risk of algorithmic bias in AI applications and emphasise that human supervision and strong legal powers are needed to prevent discrimination by AI, especially in a law enforcement or border-crossing context. They say that human operators must always make the final decisions and that subjects monitored by AI-powered systems must have access to remedies. They also say that AI-based identification systems already misidentify minority ethnic groups, LGBTI people, seniors and women at higher rates, which is particularly concerning in the context of law enforcement and the judiciary. To ensure that fundamental rights are upheld when using these technologies, algorithms should be transparent, traceable and sufficiently documented, and, where possible, public authorities should use open-source software to increase transparency. To respect privacy and human dignity, MEPs ask for a permanent ban on the automated recognition of individuals in public spaces, noting that citizens should only be monitored when suspected of a crime. They also call for a ban on the use of private facial recognition databases (such as the Clearview AI system, which is already in use) and on predictive policing based on behavioural data. MEPs also want to ban social scoring systems, which try to rate the trustworthiness of citizens based on their behaviour or personality. Finally, they are concerned by the use of biometric data to identify people remotely, for example at border control gates that use automated recognition.