When it comes to online privacy, “the consent model is broken”, says the Joint Committee on Human Rights

November 5, 2019

The Joint Committee on Human Rights has issued a report on the right to privacy and the digital revolution. A key issue is that some services are not, or are no longer, available offline, leaving consumers no real choice; yet people are asked to hand over their personal information without really understanding the implications of doing so.

The report says that privacy policies are too complicated for the vast majority of people to understand. Although individuals may understand that they are consenting to data collection by a given site in exchange for “free” access to content, they may not understand that information is being compiled, without their knowledge, across sites to create a profile. As an example, the Committee heard evidence about eye-tracking software being used to make assumptions about people’s sexual orientation, whether they have a mental illness, or whether they are drunk or have taken drugs. All of this information was added to their profiles.

The Committee says that the “consent model is broken” and should not be used as a blanket basis for processing. It is impossible for people to know what they are consenting to when making a non-negotiable, take-it-or-leave-it “choice” about joining services such as Facebook, Snapchat and YouTube on the basis of lengthy, complex terms that are themselves subject to future change. It is questionable whether such consent is freely given. The consent model places too much onus on individuals; the responsibility for understanding the risks of using web-based services should not rest with them alone. Regulation should be strengthened so that safe passage on the internet is guaranteed, and the system should provide adequate protection from these risks by default.

Further, the report says that it is completely inappropriate to rely on consent when processing children’s data: under the current legal framework, children aged 13 and older are considered old enough to consent to their data being used, even though many adults struggle to understand what they are consenting to. The digital age of consent, currently set at 13, should be revisited.

The Committee’s report also highlights the risks of discrimination against some groups and individuals arising from the way data is used. The Committee heard evidence about some companies using personal data to ensure that only people of a certain age or race, for example, see a particular job opportunity or housing advertisement. There are also long-established concerns about the use of such data to discriminate in the provision of insurance or credit products.

Unlike traditional print advertising, where such discrimination would be obvious and potentially illegal, the personalisation of content means people have no way of knowing how what they see online compares with what is shown to anyone else. In the absence of whistleblowers or work by investigative journalists, there currently appears to be no mechanism for protecting against such privacy breaches or discrimination.

As a result, the Committee calls on the UK government to ensure there is robust regulation over how personal data can be collected and used. In addition, it calls for better enforcement of that regulation.

It should be made much simpler for individuals to see what data has been shared about them, and with whom, and to prevent some or all of their data from being shared. The government should look at creating a single online registry that would allow people to see, in real time, all the companies that hold personal data on them, and what data they hold.

The government should be regulating to keep people safe online in the same way as it does in the real world: not by expecting people to become technical experts who can judge whether personal data is being used appropriately, but by having strictly enforced standards that protect the rights to privacy and freedom from discrimination.