Y2K: Myths v Reality

March 1, 1999

I have listened to many more lawyers than software engineers hold forth about Y2K issues. That may be a result of the company I keep – my son says I am not a real engineer at all; he claims I spend all my time talking to lawyers!

Certainly lawyers have been saying and writing a great deal on this subject, mainly, it seems, in contemplation of litigation. My concern is that much of the discussion appears to be based on technically false premises. It is rather like an experienced ship’s master listening to a discussion of the problems of navigation on the assumption that the earth is flat.

I want to challenge three main propositions which I regard as myths. I do so because, if they are left publicly unchallenged, two consequences seem likely to follow.

Consequences of Misconceptions

The first consequence of relying on current popular misconceptions is that some parties, mostly systems suppliers, will not recognise and face up to their responsibilities. If they are assuming (or have been advised) that they have no liability for the supply of non-compliant systems, the consequences for themselves – to say nothing of their unfortunate customers – are likely to be severe. By now, of course, the remaining options for doing something about it are diminishing rapidly.

The second consequence is that numerous expensive and protracted pieces of litigation will continue, and will clog up the courts, because in each case one party is clinging to a position which is technically invalid.

The First Myth

There is a very common tendency to speak of ‘Millennium Compliance’ in software as if it were some kind of enhancement or optional extra. Refuge is taken in the fact that it has not been expressly stated to be a feature of software in sales literature or representations on the one hand, or stated to be a requirement in procurement on the other. Simplistic views such as ‘if it is not in the specification, it’s out’ have been expressed.

I describe that view as simplistic because it fails to recognise important areas of requirements which are often implicit. I have never seen a Requirements Statement, or even a Functional Specification, which states that results should be arithmetically correct, that customer transactions should be posted only to the current customer account, or that a system must be able to process dates correctly in leap years.

There are essential requirements which derive from such phenomena as the fact that night follows day, that power supplies and telecommunications will sometimes fail, that data entry has inherent and predictable error rates, and so on; any ordinarily competent system developer will be well aware of these and will take them into account whether expressed or not. That is quite different from asserting that developers should be mind-readers and that customers can properly assert that every unexpressed peculiar aspect of their requirements can be implied into specifications. The need for systems to keep on working as the days and years pass is not a requirement peculiar to some particular application.

It is important to remember that the transition from 1999 to 2000 is not a peculiarity of the calendar; it is perfectly logical. What is out of step is the convention adopted in many computer programs of making a transition from 99 to 00. The need to deal with data stored according to that convention arises not from an imposed ‘requirement’ thought up by users, but from a convention adopted by programmers.
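
To make the nature of the defect concrete, here is a minimal sketch (in C, purely for exposition; the variable names and values are hypothetical) of the kind of arithmetic that goes wrong when only two digits of the year are kept:

    #include <stdio.h>

    int main(void) {
        /* Years held as two digits, per the programmers' convention. */
        int current_yy = 0;   /* the year 2000, stored as 00 */
        int start_yy = 95;    /* a policy taken out in 1995, stored as 95 */

        /* Intended result: 5 years elapsed. Actual result: -95. */
        printf("years elapsed: %d\n", current_yy - start_yy);
        return 0;
    }

Nothing in the calendar causes this; the stored representation does.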

Put bluntly, ‘millennium compliance’ is not an enhancement – ‘millennium failure’ is a defect, and a foreseeable one at that.

The Second Myth

The second myth is that those who caused (and have subsequently failed to rectify) the millennium defect cannot be regarded as negligent since so many others behaved in the same manner. It is said that if there is a ‘body of responsible/respectable professional opinion’ which indulged in the production of millennium defective systems, that would absolve developers from negligence. This argument comes close to saying ‘everyone did it’.

First I would challenge the assumption that almost ‘everyone did it’. Who says so? What is the evidence?

Secondly, I strongly reject the possibility that any body of responsible/respectable professional opinion would accept the creation of the millennium defect as acceptable, non-negligent practice. I would deny that such an opinion was responsible, worthy of respect or in any sense professional. I do not in this paper purport to speak officially for the British Computer Society, which as the relevant Chartered Engineering Institution must represent the qualified professionals’ view. However, I would be very surprised if my colleagues in the BCS supported the view that developers were behaving in a professional manner if they turned out millennium defective programs in circumstances where there was any reasonable prospect of the program being required to process dates beyond the end of 1999.

Thirdly, even if it is true that the practice was widespread, however bad, that does not excuse it (at least morally and possibly legally). I recall that it was asserted in defence of the officers of the Herald of Free Enterprise that it was not uncommon practice for car ferries to set sail with open bow doors, but that did not absolve them from responsibility for the consequences. In any case, the consequences in that case were far less predictable than the imminent arrival of 1 January 2000.

For those systems whose failure may have safety implications, no such defence would avail. The so-called ‘state of the art’ defence places the burden of proof on defendants to show that the best available contemporary scientific and technical knowledge was applied. It only required the simplest level of common sense to be aware that Y2K was coming. Safety Critical practice requires the application of well-known methods of Hazard Analysis, anticipation of faults and their consequences, and recognition of concepts such as Acceptable Degrees of Risk and risk levels As Low As Reasonably Practicable (ALARP). Health and safety legislation obliges suppliers to monitor their products (with no exemption for software) for safety and to notify users if any risks are discovered. That regime is intended to cater for hazards much more subtle and remote than the millennium defect, but it obviously embraces it.

One would expect the standards applicable to Safety Critical systems to be more stringent than those applicable to commercial software generally. In the latter field, the burden of proof may not automatically shift to the defendant in the same way, nor may the defendant be left with only the ‘state of the art’ defence. I do not believe that matters. The occurrence of the millennium defect is not some remote (eg 1 in 10⁹) possibility but in many cases a certainty; in other, slightly older, systems it was somewhere between 1:10 and 9:10. Contemporary technical knowledge would have enabled it to be avoided. Some commentators have interpreted the Court of Appeal judgment in St Albans DC v ICL as setting the acceptable threshold for faults in commercial software at zero, potentially more stringent than that for Safety Critical Systems.

There remains a question as to the date by which any reasonably responsible and competent developer would have ensured that software would not fail when it attempted to process dates later than 1999. I believe that date for any particular system can be calculated with respect to two factors, which I call ‘longevity’ and ‘horizon’.

  • Longevity. Longevity is the reasonably foreseeable maximum life expectancy of the software (not the average, for obvious reasons). There have long been plenty of examples of software with useful lives well in excess of ten years. I was invited by my erstwhile employer, the Post Office, to the closing-down party for the first system I ever programmed, some 12 to 14 years after it went live. I have advised clients on the procurement of replacements for systems which were already more than ten years old when the procurement process started. For any significant, professionally developed system, it is unlikely that a useful life of less than ten years (and more probably 12 or more) should have been anticipated.
  • Horizon. By a system’s horizon I refer to its need to look ahead and to process dates in the future. Horizons vary from system to system. Some real time systems are concerned only with the here and now; except for those which keep track of their own maintenance schedules and the like, they may well have no need to process dates at all. At the other end of the scale, systems written from the 1950s and 1960s onwards dealing with, say, pensions or mortgages always had to take cognisance of dates beyond 1999. It should be straightforward to ascertain for any system what its horizon is. By adding that to its longevity, the date by which it should have anticipated the need to process such dates can be calculated, as the sketch following this list illustrates.
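
To illustrate the arithmetic (a minimal sketch of my own, not a formal method; the function and parameter names are hypothetical), the test reduces to a single comparison:

    #include <stdio.h>

    /* A system written in dev_year, with the stated foreseeable longevity
       (years of service) and horizon (years it must look ahead), will have
       to process dates up to dev_year + longevity + horizon. */
    int needed_post_1999_dates(int dev_year, int longevity, int horizon) {
        return dev_year + longevity + horizon >= 2000;
    }

    int main(void) {
        /* A billing system written in 1990, with a twelve-year life and a
           one-year look-ahead, reaches 2003: it plainly needed to cope. */
        printf("%d\n", needed_post_1999_dates(1990, 12, 1));  /* prints 1 */
        return 0;
    }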

This is not some arcane insight into the phenomenon. It is simple common sense relying on a fact which is well known to the majority of the population of the industrialised world.

The Third Myth

It is still being asserted that until recently it was not possible for programmers at large to process dates beyond 1999 correctly, because only the last two digits of the year were stored as data, and/or proffered by the operating system. That confuses how data is stored with how it is processed.

When it is stored, data may be compressed in various ways. Storing only two digits for the year element of dates is a very simple, basic form of data compression. Any programmer with the most basic competence would be able to devise one or more simple ways to process such dates correctly before and after the end of 1999, in literally one or two lines of code. If anyone is hoping to run a defence on the basis that the technology prevented them from avoiding the millennium defect, they had better think again.
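
By way of illustration only (a minimal sketch; the pivot value of 50 is a common windowing choice, not anything mandated), one such technique, commonly called ‘windowing’, expands a two-digit year into a four-digit one in a single line:

    #include <stdio.h>

    /* Windowing: treat two-digit years below the pivot as 20xx and those at
       or above it as 19xx. A pivot of 50 suits data spanning 1950 to 2049;
       the appropriate pivot depends on the application's data. */
    int expand_year(int yy) {
        return (yy < 50) ? 2000 + yy : 1900 + yy;
    }

    int main(void) {
        printf("%d %d\n", expand_year(95), expand_year(0));  /* 1995 2000 */
        return 0;
    }

Once years are expanded before any comparison or arithmetic, the stored data need not change at all – which is precisely the distinction drawn above between storage and processing.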

Some Conclusions

This short article does not pretend to be an exhaustive analysis of every facet of millennium non-compliance and its consequences. Its aim is to set out a professional, Chartered Information Systems Engineer’s view of the relevant technological issues and the kind of evidence which might be expected from a qualified chartered IS engineer or practitioner acting as an expert. It is hoped that it will inform the positions being adopted by those who may be in, or heading towards, disputes in this area, their legal advisers and insurers. I have already been instructed in a dispute in which an allegation of millennium non-compliance was made. In that instance the allegation was unfounded in fact, but I had to advise my instructing solicitors that I could not give evidence to support a defence that the client (a supplier) had no responsibility in that respect. To plead such a defence as a knee-jerk reaction, denying everything in order to force the plaintiff to prove the contrary, may look like a good legal tactic, but if it became public knowledge it would only diminish the client’s reputation, to no purpose.

As a piece of practical advice, I would urge any suppliers who have within the last ten years (at least) supplied systems which may still be in use, and which they are not confident will work with dates beyond 1999, to mitigate their exposure by advising their customers of the position. That may be painful, but I believe it will be much less painful than the consequences of relying on a denial of responsibility which, I believe, will be of no avail. Such notification is a statutory obligation if there is any Safety Critical element, but it is also a sensible commercial measure. It is to be supposed that customers want systems which will work, rather than some possibility of redress after their systems have failed, by which time they and/or their suppliers might have gone out of business.

The corresponding piece of pragmatic advice to users is two-fold. First, do not allow yourselves to be bamboozled by the myths I have discussed above. Be clear about responsibility. Secondly, take a pragmatic approach to negotiation with suppliers, so as to arrive, if at all possible, at solutions in preference to recompense for failure. Remember the infamous epitaph ‘He had the right of way’.