Time to Drop the Bomb?

November 1, 2003

The war against software theft and misuse is ongoing. Dissident users have a strategic advantage – software is intangible, infinitely reproducible and incapable of repossession – but the apocalypse may be nigh. Suppliers have in their arsenal what they hope is the doomsday weapon – the “software bomb”.

A software bomb is code built into software that enables the supplier to disable it unilaterally. A “logic bomb” allows the supplier to send a remote command to the software at any time, disabling it immediately. “Time bombs” explode automatically on expiry of a set period (typically the licence term), immediately disabling access to the software.
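The mechanics can be sketched in a few lines. The following Python fragment is purely illustrative – the names, dates and crude boolean checks are invented for this article; a real bomb would be obfuscated and woven throughout the application:

```python
from datetime import date

# Illustrative only: all names and values here are hypothetical.
LICENCE_EXPIRY = date(2004, 11, 1)   # time bomb trigger: end of the licence term
KILL_SWITCH_SET = False              # logic bomb trigger: set by a remote command

def time_bomb_tripped(today: date) -> bool:
    """A time bomb fires automatically once the licence term expires."""
    return today > LICENCE_EXPIRY

def logic_bomb_tripped() -> bool:
    """A logic bomb fires only when the supplier sends the command."""
    return KILL_SWITCH_SET

def start_application(today: date) -> str:
    """The application refuses to run once either bomb has gone off."""
    if time_bomb_tripped(today) or logic_bomb_tripped():
        return "DISABLED: contact your supplier"
    return "running"
```

The practical difference lies in the trigger: the time bomb needs no further act by the supplier, whereas the logic bomb detonates only on the supplier’s command.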

Why bomb?

In an IT dispute, ordinarily the supplier[i] will have limited objectives: to get paid, and, where continued use is causing the supplier loss, to bring use to an end. The supplier may pursue an action for damages or an injunction through the courts, but litigation is expensive and extremely time consuming. Even the winner never recovers all costs, and there is always a risk of losing.

Bringing a claim involves offsetting these ‘litigation costs’ against the spoils of victory – in other words, undertaking a cost-benefit exercise. The sad fact for suppliers is that some claims are not worth bringing, no matter how meritorious. This is music to the ears of a troublemaking user – invoices may be paid late or incompletely, unlicensed software used, and a raft of other misdemeanours committed.

Against this backdrop, the attraction of the bomb is obvious. A detonated bomb is final, absolute, and usually devastating. If the software is mission-critical then the user may go out of business. In any event, the software is disabled and the user has no choice but to cease using it. The threat of detonation gives the user little choice, putting it in a very weak bargaining position. It is a case of “comply or else.”

Legal problems

Some software engineers, it seems, pride themselves on sneaking a logic bomb into their software. After all, it’s a great bargaining chip if things get messy – isn’t it? Well, think of it like this: detonating a bomb to disable software is analogous to hiring some heavies, marching onto the user’s premises, and manually disabling the software (without, of course, physically damaging any property or persons – that’s where the analogy breaks down). If you did all of that without the user’s permission (or threatened to do it), you’d expect to get into trouble, not only with the user, but also with the police. The same is true for software bombs.

(i) Justification

The Computer Misuse Act 1990 was enacted to bring “hacking” and virus transmission firmly within the scope of the criminal law. Under s 3 (the virus provisions), a person is guilty of a criminal offence[ii] punishable by a prison sentence of up to five years and a fine if he does an act:

· which causes an unauthorised modification of the contents of any computer;

· knowing that what he is doing is unauthorised; and

· intending to impair or hinder access to any software or data.[iii]

Clearly, these provisions cover use of a software bomb – the key is whether or not the supplier’s act was authorised. There is little case law on the point. There have, however, been two illustrative prosecutions.

In R v Goulden,[iv] the defendant installed security software for a printing company. The software included password access functionality. The defendant detonated the bomb (i.e. password-protected access) to support his claim that he was owed £2,275 in unpaid fees. The software was business critical and the printing company claimed £36,000 damages for loss of profits, including £1,000 spent engaging a specialist to override the password protection. Although the defendant’s actions were understandable, they were unauthorised, and he was convicted.

In R v Whitaker,[v] the defendant, a software developer, detonated a logic bomb in a dispute over payment, and argued that, because he owned the intellectual property rights in the software, he was entitled to modify it. Unsurprisingly, the court had no trouble rejecting this argument – suppliers do not have the right to modify licensed software simply because they own the intellectual property rights. The defendant was convicted.

Goulden and Whitaker would have been acquitted had their actions been authorised. Authorisation must be informed consent. Think back to the analogy with physically entering the user’s premises and disabling the software. To do this, the supplier would need a key, the alarm code, and (contractual) rights to enter the premises and access the computer system for the purpose of disabling the software. All of this amounts to authorisation, which legitimises the supplier’s actions.

For clarity and certainty, authorisation should be granted from the outset of the relationship, in the contract. After all, a user is hardly likely to authorise a supplier to bomb the software during a dispute. Insufficient authorisation will, however, not only implicate the supplier under the 1990 Act, but will also impose liability for breach of contract. In Rubicon Computer Systems v United Paints Limited,[vi] a payment dispute arose in a contract to supply a computer system. The supplier, who was able to access the system, then embedded a logic bomb into the software. The payment dispute continued and the supplier detonated the bomb. The court held that upon delivery and installation the user was entitled to “enjoy quiet possession of the goods except so far as it may be disturbed by the owner or other person entitled to the benefit of any charge or encumbrance so disclosed or known”.[vii] In other words, there is an implied term that the supplier must not interfere with a working system without authority. Unauthorised detonation was a repudiatory breach of this implied term, entitling the user to restitution – in this instance repayment of the purchase price.

In Rubicon the supplier was not able to point to any contractual provision entitling it to detonate a bomb. Inserting a ‘bomb clause’ into the contract may have changed the outcome. A bomb clause should identify the bomb, and set out specific uncontroversial and easily proven circumstances in which the supplier is entitled to detonate, such as failure to pay an invoice within X days of the due date. Ideally, determining compliance with such clauses should be a simple question of fact so that all scope for challenging the validity of detonation is removed. By contrast, tying detonation to breach of a “co-operation clause”, for example, is clearly going to attract trouble – whether or not the user has co-operated will always be a moot point.

Inserting a catch-all clause entitling the supplier to detonate the bomb in response to any breach is not recommended, for three reasons. Firstly, it is not realistic. Nuclear powers do not enforce international trade agreements by obliterating non-compliant states. Likewise, a supplier would not use a software bomb unless it had run out of alternatives. Therefore, its breadth does the supplier few favours. Secondly, it is of dubious legal effect. A court would not look at the literal wording of such a clause, but would apply a common sense gloss. This renders it ambiguous and open to challenge – the last thing the supplier wants. By analogy, see Rice (T/A The Garden Guardian) v Great Yarmouth Borough Council,[viii] where the Court of Appeal held that an open-ended entitlement to terminate upon any breach “offended commercial common sense”. Thirdly, it is likely to deter all but the most desperate (or careless) users from ever signing. For obvious reasons, it is an awful sales tool.

Finally, even if you are authorised to detonate, there are good reasons to think twice. Terminating for breach is one thing, but detonating a software bomb is quite another. It is not only antagonistic, but the user’s loss (and therefore damages) will typically be very high. After all, detonating a bomb does not mitigate damage; it creates damage, and can cause collateral damage to data, systems, and products. In addition, in the event that the supplier succeeds at trial for pre-detonation breach of contract, the point of detonation would be a cut-off point for its damages. A supplier cannot suffer loss for the period in which the user has access to disabled software.

(ii) Other legal constraints

Suppliers must also be aware of potential legal liability from another, less expected source. The ability to detonate a logic bomb, or to disable a time bomb, is a valuable bargaining chip, but under s 21 of the Theft Act 1968 a person is guilty of an offence, punishable by up to 14 years in prison, if with a view to making a gain he:

· makes a demand with menaces without reasonable grounds for making the demand; or

· supports a demand with menaces that are not a proper means of reinforcing the demand.[ix]

This is more commonly known as blackmail.

The fact that you have a contractual right does not necessarily mean that you are entitled to threaten to carry it out. Thus, even if the supplier has contractual authorisation to detonate the bomb, unless detonation is a proper means of reinforcing the demand (which is decided by the court, not the parties) the supplier will have committed blackmail. This means that, irrespective of what the contract says, the criminal law dictates that a threat of detonation must be reasonable.

(iii) Escrow

The parties should not forget the less exciting, but nevertheless important, implications of time-bombed software for source code escrow contracts. These are multi-party contracts whereby source code is entrusted to an independent third party (an escrow agent) who keeps the source code safe and confidential until the occurrence of a release event (typically the supplier going out of business – but occasionally the supplier failing to, or electing not to, continue to provide maintenance and support), whereupon users can call for the release of the source code. The users then maintain the software themselves.

Time-bombed software limits the effectiveness of source code escrow. The source code of ‘timed-out’ (ie disabled) software is of little use to users: compiling it simply reproduces the disabled software. True, users with in-house programming expertise should be able to ‘decommission’ the bomb, or re-enable the software. However, this will come at a cost – a significant cost for smaller users. In any event, detonation may put the (former) supplier on the hook under the 1990 Act.
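The point about in-house expertise can be illustrated with a hypothetical sketch (the names and mechanism below are invented; a real time bomb would be far less conspicuous). With the escrowed source in hand, ‘decommissioning’ may be as little as bypassing one check:

```python
from datetime import date

# Hypothetical escrowed source containing a time bomb.
LICENCE_EXPIRY = date(2003, 12, 31)

def licence_valid(today: date, decommissioned: bool = False) -> bool:
    """Returns False once the time bomb trips.

    A user with the source code can 'decommission' the bomb by patching
    out the expiry check (modelled here by the `decommissioned` flag);
    a user with only the compiled binary cannot.
    """
    if decommissioned:  # patch applied by the user's own programmers
        return True
    return today <= LICENCE_EXPIRY
```

A user holding only the compiled software has no such one-line fix available, which is why it matters that the deposited source is bomb-free or accompanied by the necessary passwords.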

To solve these problems before they arise, both supplier and user should ensure that the source code deposited with the escrow agent is bomb-free, or that all necessary passwords are also deposited.

Bombing abroad

The EU has not harmonised the laws of member states on the topic of software bombs. Therefore, the law will vary from country to country. The Copyright Directive (due for implementation at the end of 2002, but not yet implemented in the UK) calls for “adequate legal protection and effective legal remedies against the circumvention of effective technological measures that are used by authors in connection with the exercise of their rights”.[x] It is not clear how this will be implemented into domestic law, or whether it will be implemented consistently throughout the EU, but if implementing legislation is consistent with the wording of the Directive, one would expect domestic law to expressly outlaw the act of, and methods for, circumventing valid software bombs.

The US is currently undergoing an overhaul of its IT law, in the form of the pro-supplier (and now notorious) Uniform Computer Information Transactions Act (UCITA). By default, it legalises software bombs, subject to various limits (less stringent but not unlike the UK limits). However, states are entitled to amend the legislation before enacting it, so UCITA will vary from state to state.[xi]

Suppliers should take local advice.


Conclusion

Software bombs are not illegal per se. There are, however, legal limits to their use:

· The user must have clearly and unambiguously authorised detonation.

· A threat of detonation must be proportionate.

Suppliers should not use (or threaten to use) a software bomb without legal advice.

Whether software bombs should be used is another matter. Their results, however impressive, are achieved at a sometimes unnecessarily high cost to the user. The risk of the user attempting to pass that cost back to the supplier, via the courts, should not be ignored. In addition, in an increasingly competitive market, a software bomb clause is not a great selling point. Before the development of the software bomb, suppliers didn’t ask for a right to enter the user’s premises and disable the software manually – so why now? Suppliers would be better advised to police their contracts through effective account management and auditing.

Nevertheless, like its atomic cousin, the software bomb deters warmongers by its publicised existence alone. Ideally it is never used, but it is worth remembering that detonation (or the threat of it) is a unique ‘solution’ and, as such, the bomb will always have its place in times of crisis – as a weapon of last resort.

Tim Sewart is a solicitor at niche IT law practice, v-lex Limited.

[i] For simplicity, I refer to suppliers (meaning the owners of the intellectual property rights) and users (meaning the users/customers).

[ii] The question of who, in the supplier’s organisation, carries the can for a criminal offence is outside the scope of this article. Suffice to say that, as a general rule of thumb, it is usually the most senior person who authorised the action (implicitly or explicitly).

[iii] See CMA 1990, s 3(7). If tried summarily the maximum prison sentence is six months and the fine is capped at the statutory maximum (currently £5,000) (CMA 1990, s 3(7)(a)).

[iv] The Times, 10 June 1992, Southwark Crown Court.

[v] Unreported, 1993, Scunthorpe Magistrates’ Court.

[vi] (2000) 2 TCLR 453 (CA).

[vii] Sale of Goods Act 1979, s.12(2)(b).

[viii] (2000) LTL 30/6/2000.

[ix] Theft Act 1968, s 21(1)(a) and (b) and (3).

[x] Directive 2001/29/EC, Article 6(1); the quoted wording follows Article 11 of the WIPO Copyright Treaty 1996.

[xi] For example, the State of Maryland has tweaked the software bomb provisions. Expect US suppliers, in their choice of law provisions, to select the state with the most pro-bomb version of UCITA. But, in theory at least, that won’t relieve them of criminal liability if they detonate in the UK.