2012 Predictions 9: Domains and Data

December 12, 2011

{b}From Jane Seager, Counsel, Hogan Lovells: jane.seager@hoganlovells.com{/b}

Much ink was spilt in 2011 in the domain name world over the proposed launch of new generic Top Level Domains (gTLDs). This has been on the cards for a number of years now and will result in a fundamental change to the Internet as we know it, with .generic, .brand or .city strings proliferating alongside the popular .com. Despite the continuing furore, ICANN has announced that the first round application window will open on 12 January 2012 and close on 12 April 2012, although I predict that the closing date may well be extended as many companies wake up only at the last minute to the full implications of not applying, despite the significant costs that running a TLD will involve. The list of applied-for strings will be published shortly after the closing date, and predictions as to how many will be received vary wildly. I would expect over 1,000 applications to be made initially, with the most straightforward approved and in use by early 2013, thus changing the face of the Internet forever.

Also on the theme of Internet infrastructure, in 2012 the growing scarcity of IPv4 addresses means that we will see a burgeoning trading market in them, as companies that need large numbers of IP addresses at short notice seek to buy them from those with a surplus. As a result, IPv4 addresses will become increasingly valuable assets, although such value will last only until the widespread adoption of IPv6 at some unforeseen point in the future.
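
The arithmetic behind that scarcity is worth spelling out: IPv4 uses 32-bit addresses, giving roughly 4.3 billion in total, while IPv6 uses 128-bit addresses. As a minimal illustrative sketch (not part of the original commentary, and using a hypothetical example block), Python's standard ipaddress module makes the comparison concrete:

    import ipaddress

    # Total size of each protocol's address space.
    ipv4_total = 2 ** 32    # 4,294,967,296 addresses in all of IPv4
    ipv6_total = 2 ** 128   # roughly 3.4 x 10**38 addresses in IPv6

    print(f"IPv4 space: {ipv4_total:,} addresses")
    print(f"IPv6 space: {ipv6_total:,} addresses")

    # A holder of a single /8 block controls about 16.8 million addresses,
    # exactly the kind of surplus a trading market would put a price on.
    # 10.0.0.0/8 is used here purely as a hypothetical example.
    block = ipaddress.ip_network("10.0.0.0/8")
    print(f"A /8 block contains {block.num_addresses:,} addresses")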

{b}From Richard Graham, Partner at Edwards Wildman (rgraham@edwardswildman.com){/b}

Over the course of the last few years, we've witnessed a radical shift in the way that organisations collect, store and process their business data. This has been helped by seismic improvements in high-performance computing and software interoperability, and the cloud can be seen as the jewel in the crown of this technological revolution. The sheer volume of data has also increased exponentially, and for some organisations, such as Google, Facebook and Twitter, the underlying business model is built around the analysis and exploitation of that data. This analysis will help transform business processes and achieve the ultimate goal of understanding the customer, with the added consequence of encouraging job creation, innovation and growth. It will also keep us all grappling with the legal issues at the core of data ownership and licensing, ranging from confidentiality, intellectual property, cyber security and privacy to more complex issues surrounding competition law and insurable risk.

The term ‘big data’ refers to mass repositories of data whose size is beyond the ability of standard database software to structure and analyse. This can be data collected as part of a social network, including location data, ‘life-logging’ data, video, photos, blogs, articles, documentaries, and browsing and purchasing behaviour data, gathered across multiple platforms, in multiple jurisdictions and in multiple languages. The cloud now allows us to store and manage this data in a way that would have seemed unachievable this time five years ago. However, the real value for business comes not simply from the collection and storage of this data in the cloud, but from the sophisticated analysis and exploitation of the data itself. We may associate SAS Institute with a 2011 legal case concerning copyright protection in computer programs, yet it is worth noting that SAS Institute are also pioneering ‘big data’ with their business intelligence and analytics software, and my view is that technological pioneers like these are the ones to watch in 2012.
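
By way of a purely illustrative sketch (not drawn from the article), the aggregate-per-key pattern at the heart of this kind of analysis can be shown in a few lines of Python; the file name purchases.csv and its product column are hypothetical placeholders:

    import csv
    from collections import Counter

    def top_products(path, n=10):
        """Stream a purchase log one row at a time, so the whole
        dataset never has to fit in memory, and count sales per product."""
        counts = Counter()
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                counts[row["product"]] += 1
        return counts.most_common(n)

    # Distributed analytics platforms apply this same count-per-key idea
    # (the essence of MapReduce) across many machines rather than one process.
    for product, count in top_products("purchases.csv"):
        print(product, count)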