Algorithmic Surveillance: True Negatives

August 14, 2016

The algorithmic interrogation of information for commercial surveillance purposes is now a well-recognised phenomenon. Such practices have also spread to local law enforcement and crime control. Police forces can use software packages to run ‘threat reports’ on an individual’s potential for violence, in much the same way that a credit report may be generated on an individual applying for a loan.[1] We know, then, that humans and institutions can outsource important decisions to algorithms, and we now know that such decisions can even be of the life or death variety.[2]

It is also now clear that the algorithmic processing of mass data sets plays an essential role in the modern government surveillance apparatus. Long before the Snowden revelations, it had become clear that the earlier constraints of physical space, memory, and cost had been greatly eroded by developments in technology. The Snowden documents merely confirmed how large the surveillance apparatus had become, with the ‘collect it all’ mentality of the intelligence agencies being explicitly acknowledged. When the collection and storage of data is cheap and convenient, focus shifts to developing the methods necessary to filter that information. When suspicion is not individualised, and when the aim is to prevent terrorism as opposed to investigating terrorist acts, profiles can be used to draw together data points indicative of a ‘probable terrorist’. This, of course, requires the development and application of sophisticated algorithms that can process large amounts of information in an effort to predict future behaviour.
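To make the mechanics concrete, the following is a minimal sketch of what profile-based scoring might look like; the indicators, weights, and threshold are entirely hypothetical and are included only to illustrate how disparate data points can be aggregated into a single ‘risk score’ that triggers further attention.

```python
# Purely hypothetical illustration of profile-based scoring: disparate data
# points are weighted and summed into a single 'risk score'. The indicators,
# weights, and threshold are invented and carry no empirical meaning.

PROFILE_WEIGHTS = {
    "visited_flagged_forum": 3.0,
    "contact_with_watchlisted_number": 5.0,
    "purchased_flagged_item": 2.0,
    "travel_to_flagged_region": 4.0,
}

FLAG_THRESHOLD = 8.0  # arbitrary cut-off for generating a 'threat report'


def risk_score(data_points: set) -> float:
    """Sum the weights of every indicator present in an individual's data."""
    return sum(PROFILE_WEIGHTS.get(point, 0.0) for point in data_points)


if __name__ == "__main__":
    individual = {"visited_flagged_forum", "travel_to_flagged_region"}
    score = risk_score(individual)
    print(f"score={score}, flagged={score >= FLAG_THRESHOLD}")
```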

With GCHQ’s mass surveillance activities due to come before the European Court of Human Rights, it is important to consider whether such activities can ever be compatible with fundamental human rights standards. As the ECtHR considers the compilation and processing of data to have private life implications, it is clear that algorithmic profiling comes within the scope of Article 8 of the European Convention on Human Rights. While the majority of surveillance cases before the ECtHR have concerned targeted surveillance, the Court has also addressed the question of generalised surveillance. For example, in Liberty v UK [2008] ECHR 568, the ECtHR examined the practice of intercepting and subsequently filtering all external communications—including telephone, facsimile, and email communications—carried on microwave radio between two British Telecom radio stations.

Article 8 of the Convention is, of course, not an absolute right. An interference can be justified where the surveillance measure in question is found to be ‘in accordance with the law’ and ‘necessary in a democratic society’ in the pursuit of a ‘legitimate aim’. When considering whether an action is ‘in accordance with the law’ in the surveillance context, the ECtHR has placed significant emphasis on foreseeability. While the ECtHR acknowledges that some government powers will be exercised in secret, the Court also recognises that secrecy increases the risk of arbitrariness.[3] This reasoning informs the Court’s position that laws should be accessible and that citizens should be able to foresee their consequences.

When examining a surveillance act that is deemed to constitute a ‘serious interference’ with Article 8, the ECtHR requires that provision be made for a detailed list of prescriptive safeguards. For example, surveillance legislation should specify the nature of the offences that could give rise to surveillance and the categories of persons affected by the surveillance. While it may be a straightforward matter to detail the relevant offences and persons liable to targeting when directing surveillance at individuals, general surveillance creates new challenges. If the law provides that every individual is potentially (or actually) the subject of surveillance, the value of the safeguard is seriously undermined.

The ECtHR attempted to strike a balance in the admissibility decision of Weber and Saravia v Germany [2006] ECHR 1173, where it rejected a challenge to the German system of strategic monitoring, designed, in part, to prevent terrorism in Germany. While the surveillance was generalised, the ECtHR drew particular attention to the use of ‘catchwords’ to filter the information. The German law stated that the catchwords had to be listed in the monitoring order, and the ECtHR praised the fact that the catchwords used in strategic monitoring had been authorised by the G10 Commission. While this may be an intuitively satisfying solution at the level of simple and understandable selectors, it is extremely difficult to imagine what rights-enhancing effect such review could provide for more complex algorithms. The difficulty is magnified when the algorithms are dynamic and the eventual criteria for selection are impossible to predict. When the outcome and reasoning of a tool may not even be understood by those who design it, the essential value of foreseeability appears to be unattainable.
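The contrast can be illustrated with a short, purely hypothetical sketch: a catchword selector can be read line by line by an oversight body, whereas a learned selector turns on parameters that cannot be reviewed as a simple list and that may shift whenever the model is retrained. The catchwords and the ‘model’ below are invented for illustration and do not reflect any real monitoring order.

```python
# Hypothetical contrast between a reviewable catchword selector and an opaque
# learned selector. The catchwords and the 'model' are invented for illustration.

CATCHWORDS = ["example-term-1", "example-term-2"]  # reviewable line by line


def catchword_selector(message: str) -> bool:
    """Transparent selector: an oversight body can read exactly what it matches."""
    return any(word in message.lower() for word in CATCHWORDS)


class OpaqueModel:
    """Stand-in for a trained classifier whose decision rule is not a readable list."""

    def __init__(self, weights, threshold):
        self.weights = weights        # learned parameters, liable to change on retraining
        self.threshold = threshold

    def predict(self, message: str) -> bool:
        score = sum(w for token, w in self.weights.items() if token in message.lower())
        return score >= self.threshold


if __name__ == "__main__":
    msg = "a message containing example-term-1"
    model = OpaqueModel(weights={"containing": 0.4, "message": 0.7}, threshold=1.0)
    print(catchword_selector(msg), model.predict(msg))
```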

But let us assume that the necessary transparency and foreseeability can, somehow, be provided; perhaps by extensive auditing and the establishment of a group of technologically, legally, and ethically trained individuals who can review such decisions. The Convention also requires us to assess the necessity of the surveillance measure. The ECtHR has stated that necessity implies the existence of a ‘pressing social need’ for the interference in question. While we would all agree that the prevention of terrorist attacks is a pressing social need, it is much more difficult to conclude that the algorithmic processing of bulk data sets is proportionate to that legitimate aim.

In order to be considered proportionate, the measure should be suitable and should also strike a reasonable balance between the various interests. On consideration of the effectiveness of such analytical tools, however, they would appear to fall at the very first hurdle, ie they are not even suitable to achieve the sought-after legitimate interest. One of the most concerning issues of algorithmic surveillance is the potential for ‘false positives’. As Bruce Schneier explains, because terrorist attacks are so rare, pattern searching is prone to error.[4] In a climate of heightened fear, these false positives can have tremendously negative effects on the targeted individual and can also be a waste of resources. Where false negatives are also highly probable, the cost of squandering intelligence resources at the expense of traditional investigation can be tragic.
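Schneier’s point is, in essence, a base-rate problem, and a short, entirely hypothetical calculation shows why: even a remarkably accurate classifier, applied to a population in which genuine plots are vanishingly rare, will flag far more innocent people than real threats. The figures below are invented solely to illustrate the arithmetic.

```python
# Hypothetical base-rate arithmetic: every figure below is invented purely to
# show why pattern searching for very rare events generates mostly false positives.

population = 60_000_000        # people whose communications data is screened
actual_threats = 60            # genuine plots hidden within that population
true_positive_rate = 0.99      # chance a real threat is flagged
false_positive_rate = 0.001    # chance an innocent person is wrongly flagged

true_positives = actual_threats * true_positive_rate
false_positives = (population - actual_threats) * false_positive_rate

precision = true_positives / (true_positives + false_positives)
print(f"innocent people flagged: {false_positives:,.0f}")
print(f"genuine threats flagged: {true_positives:,.0f}")
print(f"chance that a flagged person is a genuine threat: {precision:.2%}")
```

On these deliberately flattering assumptions, roughly sixty thousand innocent people would be flagged alongside some sixty genuine threats, meaning well over ninety-nine per cent of flags would be false positives, which is precisely the squandering of investigative resources described above.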

Once again, even if we set aside the problems of legality and necessity of algorithmic surveillance, we are still left with the incredibly thorny problem of initial generalised collection. Based on recent case law from both the ECtHR and the CJEU, this may actually be the most challenging human rights issue for the advocates of mass surveillance to overcome.[5] The claim is sometimes made that existing human rights standards are incompatible with modern technologies. The ordering of that claim is questionable. Human rights standards were not created as some intangible good; they were created to best enable the protection of fundamental values, like democracy. Where new technologies cannot work within the boundaries of human rights law and rule of law principles, we need to re-evaluate our acceptance of those technologies rather than rethink our fundamental ideals.

Dr Maria Helen Murphy researches and lectures in the areas of privacy, surveillance, human rights, and technology law at Maynooth University. Twitter: @maria_h_murphy; email: maria.murphy@nuim.ie.


[1] Justin Jouvenal, ‘The new way police are surveilling you: Calculating your threat “score”’ The Washington Post (10 January 2016).

[2] Christian Grothoff and JM Porup, ‘The NSA’s SKYNET program may be killing thousands of innocent people’ Ars Technica UK (16 February 2016).

[3] Malone v UK [1984] ECHR 10, 67.

[4] Bruce Schneier, Data and Goliath: The hidden battles to collect your data and control your world (WW Norton & Company, 2015) 137.

[5] Case C-293/12 Digital Rights Ireland and Seitlinger v Minister for Communications, Marine and Natural Resources [2014]; Zakharov v Russia [2015] ECHR 1065; Szabó and Vissy v Hungary App No 37138/14 (ECHR, 12 January 2016).