UK government publishes response to Select Committee report on immersive and addictive technologies: calls for evidence on ‘loot boxes’

June 10, 2020

The UK government has issued its response to the Digital, Culture, Media & Sport Select Committee Report on Immersive and Addictive Technologies. The Committee’s inquiry investigated how games companies operate across a range of social media platforms and other technologies, generating large amounts of user data and operating business models that maximise player engagement in a lucrative and growing global industry.

In summary, the report called for video game “loot boxes” to be regulated under gambling law and for their sale to children to be banned. The report also said that there was a lack of honesty and transparency in some of the evidence given to the Committee by games and social media companies. It also called on games companies to accept responsibility for addictive gaming disorders, to protect their players from potential harms caused by excessive play-time and spending, and, along with social media companies, to introduce more effective age verification tools for users.

The government says that it welcomes the Committee’s report and believes that immersive technologies and content offer great potential for economic, cultural, and social benefits to the UK. However, it acknowledges that there is also potential for harm.

In its response, the government highlights its Online Harms White Paper consultation and says it will publish a full response later this year with further policy details covering several of the Committee’s recommendations. As part of the Online Harms White Paper, it said that it intends to introduce in law a new duty of care that will ensure companies that facilitate the sharing of user-generated content have appropriate systems and processes in place to deal with harmful content and activity on their services. It says that the application of the regulatory requirements and the duty of care model will reflect the diversity of organisations in scope and will ensure a risk-based and proportionate approach. Companies will be expected to take reasonable steps to respond to and minimise harms, corresponding to the type of service they provide. The government aims to minimise excessive burdens, particularly on small businesses.

Video game research

The government says that it will lead a programme of work to set a framework supporting future independent video games research, including workshops with relevant Research Councils, academia and industry. It also wants to explore the potential for the government to create a mechanism to request and analyse industry data, whilst complying with the Data Protection Act 2018 and taking account of advice from the Centre for Data Ethics and Innovation.

Online age ratings

The Committee had said that the Video Recordings Act should be amended to ensure that online games are covered by the same enforceable age restrictions as games sold on discs. Last year the government called for the adoption of the Pan European Game Information (PEGI) age ratings for every game available online, and the majority of video game platforms already use these best practice age ratings. The government will shortly be making a further assessment of voluntary compliance and will continue to work with industry to drive adoption on every major platform. If progress is not forthcoming, it will consider amending or creating legislation to ensure that consumers are protected from potentially harmful material online.

Online harms regulation

Many of the issues discussed in the Committee’s report will be addressed under the government’s online harms workstream, or have been partially addressed already, for example by the ICO’s Age Appropriate Design Code. The government plans to enhance digital regulation. The White Paper proposes establishing in law a new ‘duty of care’ towards users, which will be overseen by an independent regulator. Companies will be held to account for tackling a comprehensive set of online harms, ranging from illegal activity and content to behaviours which are harmful but not necessarily illegal. This regulatory framework would apply to companies that provide services or tools that allow, enable or facilitate users to share or discover user-generated content, or to interact with each other online. The government will also support further research on screen time and expects the regulator to support research on designed addiction. It expects companies to use a proportionate range of tools, including age assurance and age verification technologies, to prevent children from accessing age-inappropriate content and to protect them from other harms.

Loot boxes and gambling

“Loot boxes” allow players to buy a randomised item (with real or virtual currency), which is only revealed after the transaction takes place. They are controversial and there have been calls to regulate them as gambling.

The government has already indicated that it intends to review the Gambling Act 2005. It will also be issuing a call for evidence on loot boxes while monitoring the effect of various industry and trade association initiatives around transparency and protecting consumers. Areas of focus include the size and variation of the market; the design of mechanisms; the context in terms of other types of in-game spending; the effect on young people including links to problem gambling; and the effectiveness of the current statutory and voluntary regulation.

Esports

The Committee had called on the government to set out how a framework similar to the duty of care practices enshrined and enforced by the governing bodies of other sports can best be applied within the growing area of esports. The government is going to carry out research in this area and will support initiatives that encourage best practice in areas such as player well-being and esports integrity.

Disinformation

The Committee said that social media platforms should have clear policies in place for the removal of “deep fakes”, and that the government should include action against “deep fakes” as part of the duty of care on social media companies planned in the Online Harms White Paper.

The Online Harms White Paper proposed that companies, where appropriate, take prompt, transparent, and effective action to address online harms, including the propagation of false and misleading content. The government recognises the potential challenges that artificial intelligence and digitally manipulated content, including “deep fakes”, may pose. The technology used to manipulate audio and video content is becoming more sophisticated, and the government is considering these issues carefully as part of its efforts to tackle online manipulation and disinformation.

Diversity

The video games industry is male-dominated. The government is committed to diversity and will continue to monitor existing industry initiatives to improve diversity, as well as carrying out research. Areas that research should explore include representation within the sector’s workforce, in participation, and within gaming content, and the effect that this has, including on societal norms and attitudes.