Lords Committee says democracy under threat from ‘pandemic of misinformation’ online

June 28, 2020

The House of Lords Select Committee on Democracy and Digital Technologies has published its report, Digital Technology and the Resurrection of Trust.

The report says that the UK government should act immediately to deal with a ‘pandemic of misinformation’ that ‘poses an existential threat to our democracy and way of life’. It calls on the government to ensure that tech giants are held responsible for the harm done to individuals, wider society and democratic processes by misinformation spread widely on their platforms.

The Committee says online platforms are not ‘inherently ungovernable’, but that power has been ceded to a ‘few unelected and unaccountable digital corporations’, including Facebook and Google, and that politicians must act now to hold those corporations to account when they are shown to distort public debate and undermine democracy.

The Committee’s report sets out a proposed package of reforms which, if implemented, could help restore public trust and ensure democracy does not ‘decline into irrelevance’. Its recommendations are set out below:

Publish draft Online Harms Bill now

According to the Committee, the UK government has failed to get to grips with the urgency of the challenges of the digital age and should immediately publish a draft Online Harms Bill covering the effects of disinformation. This should give Ofcom, as the proposed Online Harms regulator, the power to hold digital platforms legally responsible for content they recommend to large audiences or that is produced by users with a large following on the platform.

The Committee points out that many content providers are, in effect, in business relationships with the platforms that host their content, and that the platforms have a duty of care to ensure the content is not harmful, either to individuals or to democratic principles. This duty should be backed by the power for Ofcom to fine digital companies up to four percent of their global turnover, or to force ISPs to block serial offenders.

Ofcom should also be given the power to ensure that online platforms are transparent about how their algorithms work, so that they do not operate in ways that discriminate against minorities. To achieve this, Ofcom should publish a code of practice on algorithms, including internal and external audits of their effects on users, taking into account the characteristics protected under the Equality Act 2010.

Regulate political advertising

The report states that political advertising should be brought into line with other advertising in its requirement for truth and accuracy. It says the political parties should work with the Advertising Standards Authority and other regulators to develop a code of practice that would ban ‘fundamentally inaccurate advertising during a parliamentary or mayoral election or referendum’. The code would be overseen by a committee including the ASA, the Electoral Commission, Ofcom and the UK Statistics Authority, which would have the power to remove political advertising that breached it.

This should be supported by a significant strengthening of electoral law, including a requirement for online political material to carry imprints indicating who has paid for it, real-time databases of all political advertising on online platforms, and an increase in the fines the Electoral Commission can impose on campaigners to £500,000 or four percent of the total campaign spend, whichever is greater.

Introduce a digital ombudsman

The Committee calls on the UK government to establish an independent ombudsman for content moderation. This would give the public a representative who could both require tech giants to take down inappropriate content and protect individuals whose content is unfairly taken down by platforms.

Increase digital media literacy

The Committee also makes recommendations for increasing digital media literacy and developing active digital citizens, through changes to the school curriculum and adult digital literacy initiatives. It highlights examples of digital literacy education in Finland and Estonia.

Review the design of digital platforms

The Committee also considers the design of digital platforms. It suggests that the Centre for Data Ethics and Innovation should conduct a review of the implications of platform design for users, focusing on identifying best practice in explaining how individual pieces of content have been targeted at a user. Ofcom should use this review as the basis of a code of practice on design transparency, which should in turn feed into the Department for Education’s review of the curriculum, so that pupils are taught what to expect from user transparency on platforms.

Ofcom should also require large platforms to user-test all major design changes to ensure that they increase rather than decrease informed user choice, and should help devise the criteria for this testing and review the results. Finally, there should be genuine and easily understandable options for people to choose how their data is used.