Ofcom proposes updates to its enforcement guidelines, PSA announces transfer of regulatory responsibility to Ofcom in late 2023, OPSS publishes report on impact of AI on product safety, and more in this week’s round-up of UK and EU techlaw news developments not covered elsewhere on the SCL website.
Ofcom proposes updates to its enforcement guidelines
Ofcom is consulting on changes to its enforcement guidelines, both to incorporate new powers it has received in recent years and to make the guidelines easier to follow. The guidelines set out how Ofcom approaches enforcement of its regulatory and consumer protection powers across the industries it regulates. Ofcom proposes to expand them to reflect its new powers covering video-sharing platforms and network security, and to simplify the information explaining its different enforcement powers so that it reflects more clearly how investigations are carried out. The consultation ends on 19 July 2022, and Ofcom plans to publish its decision later this year.
PSA announces transfer of regulatory responsibility to Ofcom in late 2023
The Phone-paid Services Authority, which regulates premium rate services, and Ofcom have announced that regulatory responsibility will transfer to Ofcom in late 2023, subject to further DCMS approval. From that point, the PSA would cease to operate as an independent body. The proposed transfer has been approved by the Ofcom board, and the Department for Digital, Culture, Media and Sport has agreed that in-depth discussions should continue, meaning a programme of work will begin to effect the change. This will include a statutory process and related consultation later this year, and will be subject to final approval from DCMS. The PSA anticipates that Ofcom will assume responsibility for regulation in the second half of 2023, with PSA staff transferring to Ofcom. Ofcom will look to retain the key components of the PSA’s regulatory Code 15 to continue effective regulation in the consumer interest. PSA regulation and Code 15 will remain in place until the transfer.
OPSS publishes report on impact of AI on product safety
The Office for Product Safety and Standards (OPSS) has published a report, commissioned from the Centre for Strategy and Evaluation Services, on the impact of AI on product safety. Its objective was to examine the current and forecast future impacts of artificial intelligence in consumer products, and what this means for product safety. Its scope was manufactured consumer products subject to the General Product Safety Regulations 2005 and other legislation for specific goods.
Applications open for membership of the Government Cyber Security Advisory Board
The Government Cyber Security Strategy (GCSS) was launched in January 2022, setting out the vision of ensuring that core government functions are resilient to cyber-attack. Expressions of interest are now invited for membership of the Government Cyber Security Advisory Board (GCSAB). The objectives of the GCSAB will be to: support the government in staying on track to achieve the GCSS aim that all government organisations across the whole public sector are resilient to known vulnerabilities and attack methods no later than 2030; aid the long-term improvement of government cyber security by providing advice, expertise, guidance and, where required, additional programme resource; bring depth and experience across multiple fields, with a diversity of viewpoints, to improve policy making by providing solutions to particular problems and challenges; identify solutions and proposals on how to achieve the 24 GCSS outcomes; and ensure government is leveraging industry expertise to deliver on the goals of the strategy. The deadline for receipt of applications is 6 June 2022.
Study published saying that current discrimination laws fail to protect people from AI-generated unfair outcomes
A new academic paper has called for changes to current laws to protect the public from AI-generated unfair outcomes. It says that, due to the growing use of AI, the public is increasingly the unwitting subject of new and worrying forms of discrimination. For example, using a certain type of web browser can result in a job applicant being less successful when applying online, and candidates in online interviews may be assessed by facial recognition software that tracks facial expressions, eye movement, respiration or sweat. The paper argues that there is an urgent need to amend current laws to protect the public from this emerging discrimination. The use of AI is creating new digital groups in society – algorithmic groups – whose members are at risk of being discriminated against, and the paper contends that these individuals should be protected by reinterpreting existing non-discrimination law. AI-related discrimination can occur in very ordinary, everyday activities, with individuals having little awareness of it. In addition to job applications, other scenarios include applying for a financial loan, where an applicant is more likely to be rejected if they use only lower-case letters when completing their digital application, or if they scroll too quickly through the application pages. The paper highlights that these new forms of discrimination often do not fit the traditional norms of what is currently considered discrimination and prejudice, and so legislation needs to be updated.
SLSC reports on proposed amendments to Highway Code for self-driving vehicles
The House of Lords Secondary Legislation Scrutiny Committee considers the policy effects of statutory instruments and other types of secondary legislation subject to parliamentary procedure. It has considered the Draft Highway Code Amendment (Self-Driving Vehicles) 2022. This instrument brings into effect a new section in the Highway Code which sets out the responsibilities of a driver when using a self-driving vehicle. More comprehensive legislation on the subject was announced in the Queen’s Speech, but interim measures have been necessary to keep up with the pace of development.
European Commission publishes Q&As on new standard contractual clauses
On 4 June 2021, the European Commission adopted two sets of standard contractual clauses: one for use between controllers and processors within the EEA, and one for the transfer of personal data to countries outside the EEA. The Commission has now published Q&As on how to use the standard contractual clauses and to help with GDPR compliance. These were developed following feedback from various stakeholders on their experience of using the new clauses in the first months after adoption. The Commission highlights that the Q&As will be updated as new questions arise.
Behavioural study on unfair commercial practices in the digital environment published
The European Commission has published a behavioural study on unfair commercial practices in the digital environment. It says that the digital environment contains an increasing number of effective artificial solicitations of consumers’ attention that influence them to take transactional decisions which may go against their best interests. Unfair business-to-consumer commercial practices, such as dark patterns and manipulative personalisation, may jeopardise consumer trust in digital markets and exploit consumer vulnerabilities. The report says that these practices call for an investigation of the market situation to ascertain whether the existing EU consumer protection framework continues to meet these challenges. A key challenge is that such practices often operate in a blurred area between legitimate persuasion and illegitimate manipulation techniques.

The research conducted for the study shows that dark patterns are prevalent and increasingly used by traders of all sizes, not only large platforms. According to the mystery shopping exercise, 97% of the most popular websites and apps used by EU consumers deployed at least one dark pattern; the most prevalent were (1) hidden information/false hierarchy, (2) preselection, (3) nagging, (4) difficult cancellations, and (5) forced registration. The prevalence of dark patterns nonetheless varies between different types of websites and apps. For example, countdown timers or limited-time messages are quite prevalent on e-commerce platforms, while nagging is more common on health and fitness websites and apps. In general, prevalence levels are similar for mobile apps and websites, as well as across member states and EU/non-EU traders.