How To Make Bad Law

January 27, 2010

When we say a law is good or bad, this is usually a social or political judgment. But can we assess the quality of law from a specifically legal perspective? My focus in this article is on the body of computer and communications law (or law-system). I argue that it is, taken as a whole, of a lower quality than it need be. One of the main reasons for this low quality[1] is that lawmakers have concentrated excessively on technical precision.

1    The search for certainty

In all fields of law, legislation and regulation have become increasingly detailed. Law makers now attempt to spell out the law’s requirements in exhaustive detail, rather than as broad obligations of principle. Law-making which aims at detailed precision has three main characteristics:

1.  Behaviour is regulated by reference to objective and/or quantitative measurements. The aim is to make compliance with the law achievable by meeting these measurements. There is no need to make qualitative judgments about matters such as fairness or reasonableness.

2.  Compliance can usually be achieved by completing legal check lists, a ‘tick-box’ approach to compliance.

3.  Although it is clear what the subject is required to do to comply with the law, it is often unclear why the law imposes those obligations. This may be because of a common unstated understanding between a regulator and its regulatees, or because the lawmaker aims to establish a controlled market for a new activity without analysing the likely development of that market.

This law-making technique is most evident in the regulation of financial services, but is also particularly common in laws regulating the computer and communications sector.

1.1   Why is precision attractive?

Computing is precise, and this seems to encourage precise legal drafting. If an activity is seen as purely ‘human’ the law is still generally drafted in terms of general obligations.[2] Where it is economic or technological in nature, like the computer and communications sector, technically precise regulation seems an obvious match for the nature of the regulated activity. This also occurs where primary legislation establishes a regulator, whose task is to produce regulation which implements the primary aims of the legislation.

Technical precision is also attractive where the aim is to facilitate a market for a new technological activity. Governments regularly announce policies to encourage innovation and the widespread adoption of information technology. The consequences of innovation and technology adoption have never been predictable, so a favoured technique is to foster a market through which the technology can be adopted and developed. This is done by introducing regulation which purports to encourage commercial use of the technology by removing legal uncertainty.[3] Unsurprisingly such regulation takes the form of detailed technical provisions which address the known uncertainties at the time, rather than attempting to specify a desired social and commercial outcome which the market might not achieve.

The effects of this law-making technique are very evident in the computer and communications sector. To take just one example, the e-Signatures Directive’s concept of an advanced electronic signature based on a qualified certificate and created by a secure-signature-creation device[4] aimed to grant such a signature the same legal effectiveness as a hand-written signature. It did this by prescribing the nature of such a signature in minute detail.

The definition of ‘advanced electronic signature’ in art. 2(2) contains four elements[5], all of which are in theory objectively verifiable by a technical expert. A ‘qualified certificate’ meets the requirements of Annex I and is provided by a ‘certification-service-provider’ who fulfils the requirements of Annex II. Annex I sets out ten data elements which the certificate must contain, and Annex II contains twelve requirements which the provider must meet, only three of which have any real qualitative element. Annex III sets out four matters which a ‘secure-signature-creation device’ must ensure, all of which are technically verifiable. Taken together, this produces a check list of 30 items which, if met, bring the signature within art. 5(1).

As a further example, to prove that this phenomenon is not confined to EU legislation, it is worth examining the Australian Broadcasting Services Amendment (Online Services) Act 1999, which inserted a new Sch. 5 into the Broadcasting Services Act 1992. The aim was to control obscene and indecent material available via the Internet by subjecting ISPs to the same censorship regime as for broadcasting and film. ISPs were intended to comply by refusing to host information classified by the censor as unsuitable, and blocking access to it if hosted by another. Doing so was difficult both because of the nature of the internet technologies and because it required ISPs to make constant value judgments about the nature of hosted material. An alternative was to provide customers with technology which enabled them to filter out undesirable content. If an ISP did this, it had no liability under the Act, so Australian ISPs seized on this objective method of compliance and entered into bulk licensing deals with producers of content filtering software. It is notorious that such software is not very effective, and easy for expert users (such as children) to circumvent. The law has thus had little effect on the accessibility of ‘adult’ content in Australia, which remains much the same as elsewhere in the world.[6]

2    Does certainty make bad law?

Certainty has a number of potential advantages. It is particularly helpful where a person needs to decide whether to take a particular course of action and there is no time to ask a legal decision-maker for a ruling. For example, in many jurisdictions an intermediary who receives notice that unlawful material is being hosted will lose its immunity from liability unless it quickly takes down or removes access to the material. What should the intermediary do if the person whose site is being hosted asserts that the material is lawful? Article 14 of the Electronic Commerce Directive gives no guidance at all, so a sensible intermediary will override the protests of the site owner and disable access for fear of becoming liable to the complainant. By contrast, the US Digital Millennium Copyright Act 1998 s. 512(g) has detailed provisions for a site owner to contest the allegation of infringement. If these procedures are followed, the host knows whether it is potentially liable to the complainant and does not need to decide whether to risk leaving the material accessible.

However, in many instances this technical certainty is illusory. The aim of the 30-item checklist in the e-Signatures Directive is clearly to enable a relying party to answer objectively the question whether the signature is legally valid, because if the accuracy of the certificate, or the security of the technical processes, had to be checked afresh each time, no-one would accept such a signature. Unfortunately these apparently objective tests are not objective at all. A legal specialist is needed to identify which parts of the 30-item checklist are important and what they mean in the context of the particular transaction. A technical expert must then advise whether those requirements have been met. Finally the legal expert needs to review the technical expert’s opinion, to assess whether a court would be convinced by the technical expert’s argument. If this is certainty, it is a very uncertain type of certainty, and it is not surprising that the law has done almost nothing to encourage the use of electronic signatures.

The disadvantages of the search for certainty are less obvious, but more dangerous. One of the most unfortunate consequences of the search for certainty has been a real weakening of the normative effect of some parts of computer and communications law. This weakening occurs in at least three ways:

In the case of a law which aims to enable or encourage an activity, the law is ignored by avoiding that activity. Compliance with the detailed technical requirements of the e-Signatures Directive is so difficult, and because of their complexity so uncertain, that enterprises have sought and found alternative ways of doing business. In the worst case, the law is actively disobeyed. An example of this occurred in 2003 when the UK FSA attempted to impose the requirements of the e-Money Directive on mobile telephone companies.[7] If they complied, they would be prohibited from continuing to offer telephony services and would thus be forced out of business. If they continued to offer telephony services, they could not seek authorisation as electronic money issuers. The companies dealt with this problem in a commercially realistic way by defying the FSA, which eventually backed down and left the issue to be resolved as part of the review of payment services regulation at EU level.

More insidiously, the law’s subjects may cease to consider the normative aims of the law, and instead seek to do no more than comply with the detailed, tick-box requirements, even if such compliance does not further the law’s aims. This might be described as complying only with the letter of the law whilst ignoring its spirit.

3    Standards for law-system quality

The search for certainty in law-making encourages the law’s subjects to ignore the propriety of their actions in relation to the aims of the law. Does this mean it is bad law?

Lon Fuller argues that a minimum internal morality is a necessary prerequisite for a purported law-system to be effective in ‘the enterprise of subjecting human conduct to the governance of rules’.[8] This internal morality of law does not prescribe what must be done, but rather sets out aims which lawmakers should aspire to achieve. The more nearly those aims are achieved, the ‘better’ the result is as a law-system. By means of his parable of Rex the King, Fuller elucidates eight guiding principles of internal morality.[9] Much of the law relating to computer and communications activities fails to comply with at least three of these.

First, says Fuller, the law’s rules should be understandable by those who have to comply with them. Laws expressed in great detail can impose so great a volume of requirements that it becomes extremely difficult to work out which of them apply. The EU data protection regime provides a clear example of this. It consists not simply of the Directive and its national implementing law, but also of a large mass of reports, recommendations, guidance notes from regulators and the like. The UK Information Commissioner’s web site lists 33 Good Practice Notes and 19 Technical Guidance Notes as at November 2009, together with a similar volume of other guidance material.[10] This is clearly difficult to assimilate. Organisations regularly cite data protection concerns as a reason for failing to do something which is perfectly lawful.[11] Lawyers may be able to understand how to comply with data protection law (or, more accurately, some of them may understand how to comply with some of it), but this does not seem to be true of those whom the law regulates.

Such a volume of detailed provisions can make it impossible to understand the overall legislative aims without working through the complex interlocking of the detailed requirements. Where a new technology or business model is adopted, the detail of the law may not match the new activity.[12] If the aims of the law are not clear, its application will also be uncertain.

Second, rules should not be contradictory. The greater the mass of technical detail the more scope there is for internal contradiction. As an example, the UK Regulation of Investigatory Powers Act 2000 distinguishes between the content of a communication, which can only be intercepted under a warrant, and data about a communication (not including its content), which can be demanded by notice given by a designated person. Each of these processes is governed by a detailed Code of Practice which has statutory force.[13] The dividing line between these two categories of data is often hard to draw, in part because the concepts derive from pre-internet communications technologies which made their separation far clearer.[14] Often both might potentially apply with contradictory results. Contradictions of this kind are particularly likely in the computer and communications field because of technological change. Because law changes at a far slower rate than technology, contradictions can persist for substantial periods of time.

Third, rules must not change so frequently that compliance becomes impracticable. An inevitable consequence of legislating objective and quantitative requirements is that frequent amendment is required to take account of changes in social or business conditions. The burden of keeping up to date and changing operations in accordance with the revised law can weaken the law’s force as a guide to behaviour. Anecdotally, the employment policies which HR departments produce in response to legal and regulatory change are often seen as barriers to be overcome rather than as normative guidance about proper employment behaviour.[15] Data protection is in danger of following the same route. The UK Data Protection Act 1998 has seen 29 statutory instruments amending or supplementing the rules since the passage of the Act. In addition, there is copious guidance material, as we saw above. All this information is, of course, very helpful in enabling data controllers to comply with the law, but a law which requires such regular amendment and explanation is likely to be less well-understood, and less well-respected, than a law which is stable over time.

As well as internal morality, the substantive content of the law’s rules is also important. A law-system could meet all the elements of Fuller’s test but still be a ‘bad’ system because it achieves nothing at all. Imagine that the dictator of Ruritania, inspired by Rex the King, decides to produce a comprehensive e-commerce code. Unfortunately the dictator has also been influenced by the example of the megalomaniac villains in Bond films, and so the code says nothing about the formation and validity of online contracts, how to comply with legal requirements for signatures, the obligations of online sellers or the protection of online consumer buyers. Instead it requires all those engaged in e-commerce to wear Ruritanian national costume and demands that all online-sellers should include a sound file of the Ruritanian national anthem on each web site page. This law is clear, non-contradictory and will not need regular amendment. However, it cannot reasonably be disputed that the dictator’s code is a bad law-system for e-commerce because it fails to achieve its aim of regulating e-commerce effectively.

A failure to achieve some minimum proportion of effective rules would make it fair to describe the law-system as bad, because ineffective rules reduce the willingness of the law’s subjects to comply with the law, and thus weaken its normative effect. On this test, how good is the computer and communications law-system?

I have examined aspects of this question at length elsewhere[16] and so this part of the article offers just two examples. The first is the Databases Directive,[17] where the aim of the legislator was that:

… databases which qualified for copyright protection under the ‘sweat of the brow’ regime would no longer be protected. In exchange, and in order to compensate for the loss of the ‘sweat of the brow’ protection, the ‘sui generis’ form of protection for ‘non-original’ databases was introduced as an entirely novel form of intellectual property.[18]

However, the actual wording adopted in the Directive was ineffective to achieve this aim. In British Horseracing Board Ltd and Others v William Hill Organization Ltd[19] the ECJ held that the sui generis right did not apply to single source databases, those databases whose content was generated by their maker rather than being acquired from external sources. The result is that most single source databases, often the most commercially valuable, not only receive no protection via sui generis right but have also lost copyright protection in those countries where it subsisted prior to the implementation of the Directive.

From outside the EU the US Communications Decency Act 1996 similarly failed. The intention was to introduce new criminal offences of knowingly creating, sending, transmitting or displaying obscene or indecent materials to minors, or knowingly permitting the use of one’s telecommunications systems for these purposes. As a counterbalancing element, s. 230 provided protection for ‘Good Samaritan’ activities on the part of ISPs, allowing them to introduce blocking or filtering technology without becoming responsible for the third party content. However, the new criminal offences were struck down in ACLU v Reno[20] as infringing the First Amendment protection for freedom of speech, thus leaving the immunity provisions to stand alone. The outcome was almost exactly the opposite of the lawmaker’s aims. No criminal offences were created, and ISPs were encouraged to refrain from acting as Good Samaritans for fear of losing the immunity. In subsequent litigation it has been held that s. 230 provides a complete immunity from civil actions for defamation,[21] even where the ISP pays the author for the right to provide access to the defamatory material,[22] and even from a claim alleging negligence in failing to prevent continued solicitations to purchase child pornography made via the ISP’s system.[23] On the positive side, the Communications Decency Act is likely to have played an important part in encouraging the development of Internet content. However, there can be no dispute that, assessed against its legislative aims, it was a complete failure.

These examples of failure demonstrate, unsurprisingly, that the computer and communications law-system has not achieved perfection. But are the failings resulting from the search for certainty so extensive that a change is needed?

Aernout Schmidt has recently suggested that the quality of a law-system could be objectively assessed.[24] His argument is too complex to summarise here, but relies on finding or collecting data about the choices made by those who are currently part of the law-system under analysis and those who are outside it. Three variables are likely to be a good guide to quality:

A positive net inflow of participants to the law-system suggests it is of high quality, whereas a net outflow suggests the opposite. In the case of e-money, we know that there were few who joined the e-money law-system, and at least one of the major players, PayPal, chose to leave for the, presumably better, system of ordinary banking regulation.[25]

The proportion of participants who attempt to evade the provisions of the law-system is inversely related to its quality. On this measure, the EU e-Signature regime would appear to be a low-quality law-system. Most strikingly, the law of copyright as it applies to music downloads exhibits such a high degree of evasion by those subject to the law[26] that, on this measure, its quality is very low indeed.

Finally, a high proportion of participants who revolt against the law-system, coupled with outsiders who fight against it, is also an indicator of low quality. Copyright in online music is a good example of this phenomenon, as is the defiance of the UK FSA by the mobile telecommunications providers. At the time of writing, there appears to be a high level of both internal and external opposition to the Australian online censorship proposals.[27]

Although no overall conclusions can yet be reached by this method, enough evidence is available to suggest that at least parts of the system are in a poor state.

4    Reducing certainty to make better law?

What would happen if we abandoned the drafting style which attempts to define compliance in objective terms? Law-making would then have to influence behaviour by requiring the law’s subjects to make their own qualitative assessments as to whether they were meeting the obligations imposed on them. Laws would concentrate on human actors, rather than on the technological activities those actors engage in.

An obvious objection to this approach to law-making is the assertion that technologically-based activities can only be regulated through laws which match the technology, and in particular set out detailed technological or objective compliance requirements. The only way to rebut such an assertion is to redraft an existing law in accordance with these principles.

In an attempt to deal with this issue, I set out below a partial redraft of data protection law which aims to comply with the principles of law-system quality. This redraft deals only with the obligations of those who make use of personal data (data controllers and data processors, in the terminology of the Directive). I contend that this redrafting, brief though it is, captures the entire essence of that part of the law.

The Data Protection (Internal Morality and Effectiveness) Bill 2009

1.  No person shall use personal data other than in a way which the user reasonably believes does not contravene the data protection principles. In determining the reasonableness of such belief, regard shall be had to any agreement between the user and the data subject and any published privacy policy of the user.

2.  No person shall disclose personal data to any person whom the discloser does not reasonably believe will comply with the obligation set out in Section 1.

3.  No person shall store personal data unless they reasonably believe that precautions have been taken which will prevent its use or disclosure by any person in a way which would contravene the data protection principles.

This drafting approach meets the test for law-system quality in four respects:

The obligations imposed are sufficiently concise, and the wording is sufficiently simple (even though in legal style), that subjects have a real chance of understanding how the law expects them to behave. The normative effect of the law should be stronger, leading to an increased likelihood that it will be complied with in a way which meets the aims of the law.

The conceptual complexity of the law is reduced to three simpler concepts: use, disclosure and storage. These are much closer to what actually happens to personal data than the more complex defined terms of the current law. Metaphysical discussions such as that in Lindqvist,[28] which held that placing information on a web site was not a ‘transfer’ whereas sending the same information on disk would have been, are thus avoided – both are clearly disclosures of the data.

This approach more closely mirrors the fundamental aim of data protection law, which is to provide adequate protection for personal privacy in data. The current law would allow a UK data controller to transfer personal data to, say, an Italian company which is known to engage in extensive spamming, leaving it to Italian law to control the activities of the recipient. The drafting above would make the transfer (‘disclosure’) an infringement by the UK company. It seems to me that this is closer to the aims of data protection than the existing law.

In order to comply, those subject to the law must form a belief about the appropriateness of their intended use or disclosure. This directs their attention to the aims of the law rather than to a checklist of detailed requirements to be met.

To confine this redraft to the current scope of the law, the definition of personal data would need to be modified to include the concept of a filing system, taken over from art. 2(c) of the Data Protection Directive; otherwise the law would extend to gossip in a letter or personal information communicated verbally. The distinction would be between personal information which is merely known and that which is collected or stored in a systematic way. The uncertainty as to whether the law’s subjects had managed to comply with their obligations could be mitigated by enforcement provisions which imposed an initial sanction of requiring the processing to be modified within a reasonable time, rather than a fine or imprisonment. The fundamental principle of the draft, that there should be a belief based on reasonable grounds that personal data will be treated properly, is similar to much of the criminal law, and there is no major complaint that criminal law is too uncertain to allow compliance.

5    Conclusions

In this article I have attempted to demonstrate three things:

The search for certainty in the law, which manifests itself through exhaustively detailed obligations expressed in as objective terms as possible, pervades much of the computer and communications law-system.

One consequence of this is a degradation in the quality of the law-system, assessed by reference to Fuller’s minimum internal morality of law and the principle that a law-system’s rules should largely be effective in meeting their aims.

An alternative approach to law-making which addresses human behaviour and beliefs, rather than specifying compliance at a technical level, can produce a law which is not immediately implausible and which more closely meets the tests for quality. The sacrifice of certainty is more than outweighed by the improvement in the ability of the law’s subjects to understand the obligations imposed on them and the increase in the normative effect of the law.

Of one matter, I have no doubt. Many parts of the computer and communications law-system are seriously defective. How far I have explained the reasons for this, and whether my proposed solution is indeed an improvement, must remain for the reader to judge.

 

Chris Reed is Professor of Electronic Commerce Law, Queen Mary University of London School of Law, Centre for Commercial Law Studies.

He offers his special thanks to Graham Smith of Bird & Bird and Diane Rowland, who commented on a very different version of this paper at the Society for Computers & Law Policy Forum in September 2009, and to those who have commented on later versions.

A lengthier version of this article is available at http://ssrn.com/abstract=1538527

 

 



[1]  I have examined some other reasons in Chris Reed, ‘The Law of Unintended Consequences – embedded business models in IT regulation’, (2007) 2 JILT http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2007_2/reed ; Chris Reed, ‘Taking Sides on Technology Neutrality’, (2007) 4:3 SCRIPT-ed 263 http://www.law.ed.ac.uk/ahrc/script-ed/vol4-3/reed.asp

[2]  Thus the UK Fraud Act 2006 contains no definition of dishonesty, nor any compliance checklist for the makers of statements.

[3]  See e.g. European Commission, Proposal for a European Parliament and Council Directive on a common framework for electronic signatures, COM(1998) 297 final, 13 May 1998 p 3: ‘In order to ensure the functioning of the Internal Market and to support the rapid development of the market in terms of user demand and technological innovation, prior authorization has to be avoided. As a means to gain the confidence of consumers, voluntary accreditation schemes for certification service provider aiming at providing enhanced levels of security is considered to be useful. As far as such measures are required by the market, they could give a clearer or more predictable level of legal security for both the certification service provider and the consumer.’

[4]  Directive 1999/93/EC on a Community framework for electronic signatures OJ L13/12, 19 January 2000 art. 5

[5]  These are that: (a) it is uniquely linked to the signatory; (b) it is capable of identifying the signatory; (c) it is created using means that the signatory can maintain under his sole control; and (d) it is linked to the data to which it relates in such a manner that any subsequent change of the data is detectable.

[6]  However, the Australian government is renewing its efforts to introduce internet content filtering via what has been described as the Great Australian Firewall. See the announcement in December 2007 by the Telecommunications Minister Stephen Conroy (http://www.abc.net.au/news/stories/2007/12/31/2129471.htm), and the report of revised plans of October 2008 (http://www.heraldsun.com.au/news/mandatory-censorship-on-web/story-0-1111117883306). The proposals have not yet received legislative approval at the time of writing.

[7]  UK FSA, Electronic Money: perimeter guidance (February 2003).

[8]  Lon Fuller, The Morality of Law (Revised ed, Yale University Press 1969) p 96.

[9]  Ibid pp 33-38, under the heading ‘Eight ways to fail to make law’. The eight principles are examined in detail at pp 46-91.

[10]  The Information Commissioner has recognised the difficulties this causes and has produced a new, plain English, Guide, which is available via http://www.ico.gov.uk/home/for_organisations/data_protection_guide.aspx. However, this Guide does not replace the other guidance documents but merely seeks to explain them more simply. A full understanding still requires that all the other guidance materials be read.

[11]  See e.g. ‘Police chief admits Huntley records were wiped out’, The Times 19 December 2003; ‘Medical research need not fall foul of the law’ (letter by Richard Thomas, Information Commissioner), The Times 20 January 2006.

[12]  Chris Reed, ‘The Law of Unintended Consequences – embedded business models in IT regulation’, (2007) 2 JILT http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2007_2/reed

[13]  UK Regulation of Investigatory Powers Act 2000 ss. 1, 25 and Interception of Communications Code of Practice.

[14]  Ian Walden, Computer Crimes and Digital Investigations (Oxford University Press 2007) 4.244-4.256.

[15]  See e.g. Luke Johnson, ‘The truth about the HR department’ Financial Times 30 January 2008; Stefan Stern, ‘What is HR really for?’, Management Today 29 April 2009; Sathnam Sanghera, ‘Human resources departments: I’ve never understood the point of them’ The Times 5 October 2009.

[16]  Chris Reed, ‘The Law of Unintended Consequences – embedded business models in IT regulation’, (2007) 2 JILT http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2007_2/reed

[17]  Directive 96/9 on the legal protection of databases, OJ L77, 27 March 1996 p. 20.

[18]  DG Internal Market and Services Working Paper, ‘First evaluation of Directive 96/9/EC on the legal protection of databases’, Brussels 12 December 2005 p. 8.

[19]  [2001] EWHC 516 (Pat.) (High Court); [2001] EWCA Civ 1268 (CA); Case C-203/02 9th November 2004 (ECJ).

[20]  929 F Supp 824, 830–838 (ED Pa, 1996), affirmed 117 S Ct 2329 (1997).

[21]  Zeran v America Online, Inc 129 F 3d 327 (4th Cir, 1997), 1998 US 4047 (cert denied).

[22]  Blumenthal v Drudge and America Online Inc (1998 District of Columbia Civil Action No 97-1968 (PLF), 22 May 1998).

[23]  Doe v America Online Inc 718 So 2d 385 (4th Cir, 1999).

[24]  Aernout Schmidt, ‘Radbruch in Cyberspace: about law-system quality and ICT innovation’ p 10, available via http://ssrn.com/abstract=1423105

[25]  PayPal press release, ‘Paypal Building the Foundations for European Growth’ 15 May 2007 (www.pppress.co.uk)

[26]  IFPI, Digital Music Report 2009: Key Statistics (http://www.ifpi.org/content/section_resources/dmr2009.html) estimates that 16% of European users regularly swap infringing music files online, and that in 2008 the number of infringing music files shared exceeded 40 billion, suggesting an infringement rate for downloads of about 95%.

[27]  Internal opponents include Electronic Frontiers Australia (http://www.efa.org.au), the Digital Liberty Coalition (http://dlc.asn.au/) and GetUp! (http://www.getup.org.au/campaign/SaveTheNet), and externally NetChoice (http://www.netchoice.org/)

[28]  Case C-101/01 6 November 2003, OJ C7 10 January 2004 p. 3.