Data Confidentiality: Policy Proportionality

October 9, 2013

Shortly after Edward Snowden’s revelations about the US PRISM surveillance program, it emerged that typewriters were back in use in at least some of the Russian government’s most sensitive departments. According to the Daily Telegraph, they chose electric typewriters as an unconnected, and therefore presumably more secure, alternative to contemporary computers.

This unexpected but increasingly common choice (I read last month that some law firms in Eastern Europe had made a similar choice for their most sensitive cases) reminded me of a cryptography course I had taken in 2008. One key point stressed during the course was that only one cryptographic algorithm is truly unbreakable if properly implemented: the one-time pad. The protocol is very simple but extremely resource-consuming, as it requires a key as long as the data to be protected, and that key may never be reused. In other words, you will need to transfer the same amount of data twice, once as the message and once as the key, the latter over a safe medium (eg a diplomatic pouch).
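The mechanism behind the one-time pad can be sketched in a few lines of Python. This is an illustration only: the function names are mine, and the `secrets` module stands in for a genuinely random key source.

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # A one-time pad needs a truly random key exactly as long as the message.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"Meet at the embassy at noon"
ciphertext, key = otp_encrypt(message)
assert otp_decrypt(ciphertext, key) == message
assert len(key) == len(message)  # the key is as long as the data itself
```

The last assertion is the whole cost of the scheme: perfect secrecy, but only if you can securely move a key the same size as the message.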

Excluding experimental technologies such as quantum cryptography, all other commercially exploitable cryptographic algorithms we know today can be broken. The question is not whether a cipher can be broken, but when.

Therefore, choosing the right technology and the right key length (ie complexity) requires not only good technical experts but also a realistic and relevant assessment of how critical the data is and for how long it must remain safely encrypted.
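A back-of-the-envelope calculation shows why key length dominates this assessment. The attack rate below is a purely hypothetical assumption, not a claim about any real adversary:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def years_to_break(key_bits: int, keys_per_second: float) -> float:
    # On average, an exhaustive search tests half the keyspace before success.
    return (2 ** (key_bits - 1)) / keys_per_second / SECONDS_PER_YEAR

# A 56-bit DES key falls in about ten hours at this assumed rate...
print(f"56-bit:  {years_to_break(56, 1e12) * SECONDS_PER_YEAR / 3600:.1f} hours")
# ...while a 128-bit key remains far beyond any exhaustive search.
print(f"128-bit: {years_to_break(128, 1e12):.2e} years")
```

Each extra bit doubles the attacker’s work, which is why the answer to ‘how long must this stay secret?’ translates directly into a key-length decision.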

Beyond cryptography, ensuring confidentiality of data in a sensitive business such as a law firm requires a good sense of priorities and a realistic and efficient methodology.

Assessing the Risk

For instance, no doubt the Russian officials will have thought of the risk of compromising emanations, and will have buried all their brand new £10,000 electric typewriters several feet underground in a military bunker. But what about the private law firm working on a highly critical case for a wealthy, politically exposed client? Will they manage to keep their typewriters away from windows, or even from non-shielded walls? Will they implement the right measures in their premises’ security policy, eg holding all smartphones and other electronic devices capable of recording or photographing at a checkpoint before entering the ‘typewriters room’?

It is a safe bet that these extreme measures will almost never be justified in a normal private law firm.

The first questions one should ask when assessing how critical the data is for both a firm and its clients are:

–       What would be the impact of stolen (or leaked) data, in terms of brand image, direct and indirect financial cost and sometimes even criminal liability for partners or directors? 

–       How attractive is the data and why? (This can usually be answered in relation with the firm’s expertise and the nature of the cases involved.) 

–       What are the actual threats, and how much effort is a potential thief likely to invest in the theft? (In other words: where do the threats come from? Employees themselves are usually – whether deliberately or in good faith – the first breach in a security policy.)

This last question may require a deep knowledge of IT security best practice and cybercrime techniques. The attacker’s identity (eg competitors, domestic or foreign intelligence agencies), techniques and available budget will dramatically affect their effectiveness.

Choosing Key Measures, and Cost

One major concern brought to the fore by Snowden’s revelations is about confidentiality in the context of cloud solutions.

Who could have access to sensitive data? Where is the data physically stored? Which jurisdiction is applicable? To what extent is the service provider liable?

These questions are legitimate: cloud computing is not the universal perfect solution often sold by service providers. But neither is it always a bad choice!

For some small or medium-sized firms, implementing the level of IT security that can be found in very large cloud providers is both economically and practically unrealistic. In this context, cloud computing and its usual set of complex built-in security features shared between the clients (eg password complexity policy, token authentication, integrated intrusion detection) can be a good opportunity to rely on expensive technologies at an affordable price. It can be a valuable asset in a firm’s policy, provided that the risks do not include open access by intelligence agencies from the country where the cloud services provider is based.

Burying a pool of paralegals with their electric typewriters in the basement of the firm’s building to prevent electromagnetic emanations being captured and analysed by competitors – leaving aside the impact on working conditions and the related employment law issues it might raise for the employer – can prove extremely costly, and will invariably be totally disproportionate to the real threats.

Finding Your Level of Optimal and Proportionate Security

Even if a firm genuinely required a drastic ‘typewriter burial’ measure, its implementation would be just one element of a whole policy, which should be consistent and made up of a coherent set of complementary measures.

Reliably implementing a policy requires establishing an optimal level of security, i.e. striking the right balance between the risk involved and the cost of fighting the threat.
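That balance can be made concrete with the classic ‘expected annual loss’ calculation. The figures and function names below are entirely hypothetical, chosen purely to illustrate the proportionality test:

```python
def expected_annual_loss(likelihood: float, impact: float) -> float:
    # Classic risk formula: annualised likelihood times the cost if it occurs.
    return likelihood * impact

def is_proportionate(risk_before: float, risk_after: float, annual_cost: float) -> bool:
    # A control is only worth deploying if the risk it removes exceeds its cost.
    return risk_before - risk_after > annual_cost

# Hypothetical figures: a 5% annual chance of a £2m leak, reduced to 1%.
risk_before = expected_annual_loss(0.05, 2_000_000)  # £100,000 per year
risk_after = expected_annual_loss(0.01, 2_000_000)   # £20,000 per year

print(is_proportionate(risk_before, risk_after, 250_000))  # False: the 'bunker' costs too much
print(is_proportionate(risk_before, risk_after, 20_000))   # True: the cheap control earns its keep
```

Real risk assessments are of course not this mechanical – likelihoods are hard to estimate and impacts are not purely financial – but the arithmetic captures the principle: the same risk reduction can be proportionate or disproportionate depending entirely on what the measure costs.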

For instance, preventing employees from bringing their own devices to work – so that sensitive e-mails are only ever stored on the approved corporate smartphone, remotely managed by the IT department – is pointless if employees know their own credentials and no check is made to ensure that only the approved terminal is allowed to authenticate to the mail server.
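The missing check amounts to something like the following sketch, in which all names and identifiers are invented for illustration:

```python
# Hypothetical authentication hook: reject logins from terminals that are
# not on the IT department's approved list, so that valid credentials
# alone are not enough to fetch mail from a personal device.
APPROVED_TERMINALS = {"corp-phone-0412", "corp-laptop-0087"}

def may_authenticate(credentials_valid: bool, terminal_id: str) -> bool:
    return credentials_valid and terminal_id in APPROVED_TERMINALS

assert may_authenticate(True, "corp-phone-0412")       # approved corporate model
assert not may_authenticate(True, "personal-iphone")   # right password, wrong device
```

Without this second condition, the ban on personal devices is policy on paper only: anyone holding valid credentials can simply configure their own phone.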

In the same vein, keeping sensitive documents in paper form in a safe with strict instructions not to remove them from a room is likely to be of limited value if the photocopier is stored in that same room.

Therefore, each key measure must have its set of minor (but nonetheless essential) associated measures.

This brings me to the last point: implementing too restrictive a policy (i.e. disproportionate to the risk) always leads to the policy being discredited and eventually encourages the users to bypass it, making the whole effort void.

Who has never seen a password scribbled on a post-it note stuck to the frame of a computer screen in an office? ‘Of course, the password policy is such a nightmare… I couldn’t remember it!’

Conclusion

While zero risk cannot be achieved – at best we can tend towards it – implementing an efficient and valuable data confidentiality policy requires rigour and methodology.

My framework for thinking in this area comprises three steps:

–       realistically assessing the risk; 

–       choosing key measures, by ensuring their proportionality to the risk; and 

–       establishing the optimal level of security and the associated minor measures.

Yoann Le Bihan is a freelance consultant in IT specialised in infrastructure and project management for leading stock exchanges and banks. He is a member of the SCL and a GDL (part-time) student at the University of Law Bloomsbury, intending to specialise in IT Law. His thanks go to Kieran McDonagh for his input.