The Net Neutrality Zombie and Net Neutrality ‘Lite’

January 27, 2009

No matter how many economists plant a stake in its heart, or come to bury it, not praise it, net neutrality will not die. President Obama is likely to resuscitate the fully-fledged version of US net neutrality in some form. His advisors include arch-net neutrality proponent Lawrence Lessig and many other ‘openness’ advocates, not least Vint Cerf and Google Inc. Since he also proposes to breathe new life into the dormant antitrust division of the Department of Justice, this may prove a two-edged sword for dominant companies.

In Europe, the Commissioner finishes her current term at the end of 2009. She robustly states her belief in net neutrality, but it is not Vint Cerf or Lessig’s net neutrality but rather what I have previously termed ‘negative net neutrality’ or net neutrality ‘lite’. It essentially permits discrimination on speed and price for new network capacity, but insists that existing networks do not discriminate ‘backwards’ – i.e. do not reduce the existing levels of service. That is also what came to be known as ‘Amazon net neutrality’ in September 2006 (when Amazon decided that was an acceptable minimal level of regulation) – it admits the possibility of future investment sharing, but draws a line to protect existing networks.

There is also the possibility of future revision of the Universal Service Directive in 2009 (impacting UK law in 2011-12) and obligations thereunder, which will mark a new ‘line in the sand’ in Europe for minimum service levels. Levinson and Odlyzko (2007) refute many of the arguments for fine-scale charging that underlie the architecture of IMS, the building block for Next Generation Networks. They note that:

‘Technology appears to be making fine-scale charging (as in tolls on roads that depend on time of day or even on current and anticipated levels of congestion) increasingly feasible. Standard economic theory supports such measures, and technology is being developed and deployed to implement them. But their spread is not very rapid, and prospects for the future are uncertain . . . the case for fine-scale charging is not unambiguous, and in many cases may be inappropriate.’

I see no obligation to take any firm position on the issue (Marsden 2008, Marsden et al 2006). What is important is the extent of such potential discrimination, and its justification. Frieden (2006) ‘accepts as necessary and proper many types of price and QoS discrimination’ and attempts ‘an identification of best practices in “good” discrimination that should satisfy most network neutrality goals without creating disincentives that might dissuade ISPs from building the infrastructure needed.’ That should also be our goal, in a specifically European context.

A group of academics and engineers have proposed rules on what can be called Internet service, in the suggested wording of an Internet Platform for Innovation Act (2006) at http://www.dpsproject.com/legislation.html. Those rules might be considered a form of transparency regulation. Essentially, they claim that any service that differentiates between packets is breaching the end-to-end principles of the Internet protocol and therefore should not be labelled as an Internet service. They suggest legislative wording as follows:

‘Network providers that offer special features based on analyzing and identifying particular applications being conveyed by packet transmissions must not describe these services as ‘Internet’ services. Any representation as to the speed or ‘bandwidth’ of the Internet access shall be limited to the speed or bandwidth allocated to Internet access.’

I suggest that regulators will need to form a view of what access to the public Internet is required in order to reach effective conclusions on the future of universal service obligations (USO) during the course of 2008-2009. I emphasize that this debate is likely to grow in complexity during that period and urge regulators to conduct research in this area. Unfettered Internet access of some type is a public good currently enjoyed by consumers, particularly in the use of Web 2.0-type applications and services, and this public sphere merits continued regulatory consideration.

Net neutrality ‘lite’ is of course itself a work in progress. I have written at great length about the forensic difficulties in exposing breaches of existing service levels, and indeed the 2006 Ofcom Annual Conference exposed the frustrations of gamers and file-sharers at the throttling policies of ISPs. With this evidence placed in front of it, Ofcom has investigated threats to net neutrality ‘lite’ and taken concrete steps to expose such practices and permit consumers to switch between ISPs. More needs to be done to ensure greater transparency regarding Quality of Service (or rather, lack of service), and the success of the self-regulatory scheme now in place needs to be proven. The Consumer Panel could usefully audit this initiative.

Light Touch but Effective Regulation

Regulation to ensure any form of net neutrality in Europe should have as light a touch as possible, while maintaining effectiveness, based on three recourses:

  1. Transparency: consumers must be informed of the commitments they are making when they sign up for a service and of any relevant changes to the service, for example, the blocking of certain services. The relevance of the changes is consumer-driven, and therefore full and prompt disclosure by companies via their Web sites is necessary. Even if not all customers choose to exercise the option to monitor the situation, the provision of the information itself promotes transparency. It also may head off calls to help desks, given that the technical fault may actually be a change of network policy.
  2. Continually upgraded monitoring and surveillance.
  3. Where necessary, investigation and timely but evidence-based intervention to correct harmful and unjustified discrimination.

These regulatory interventions do, however, require regulators to impose a reporting requirement on service providers to provide transparency in their traffic-management practices. This reporting requirement is to be implemented through a co-regulatory forum, with a code of practice to be adopted by the Internet Service Providers’ Association (ISPA) in the UK. Pedants might argue that it is formally self-regulatory, but I note that the Code was published by Ofcom and that Ofcom will audit the outcome and adherence to the Code – is this ‘co-regulation lite’?

Its counterparts in EuroISPA might do the same, but by no means all 27 Member States have such a forum. There will be a need for content provider participation in, and consultation over, such a scheme to ensure it receives full industry backing. Consumers should also be consulted and the Consumer Panel’s involvement is essential.

I note that the danger of fragmentation and regulatory arbitrage is apparent for two reasons: (1) a type of net neutrality ‘regulatory holiday’ for ISPs in one country but not another is quite likely, simply because regulating the area is complex and different levels of regulatory commitment are inevitable, and (2) enforcement of net neutrality may be highly divergent under the current 2002 framework, no matter what improvements are produced in the Telecoms Council.

Therefore, the European Commission and Member States will need to monitor developments in this area closely, especially in view of policies for ContentOnline and the wider i2010 goals, under which the importance of content provision (as well as network deployment) for jobs and growth is emphasized. In particular, the role of start-up companies and other small companies in content and service provision is likely to be a substantial engine for such growth.

Notwithstanding the backstop of regulatory intervention, based on the incomplete evidence thus far, I suggest that net neutrality be primarily enforced via reporting requirements. This can be classed as self-regulation where market players have an incentive to cooperate, escalating to co-regulation or formal regulation where cooperation is insufficiently unanimous. In this approach, market actors and self-regulatory bodies maintain a constant dialogue with regulators and consumers. This lighter-touch regime is preferable to either government-funded regulation or non-regulation of European net neutrality, and offers a flexible and responsive framework.

Expanding on the recommendation for timely, evidence-based intervention, I note that regulators will need to ensure that network operators report more fully and publicly the levels of QoS that they provide between themselves and to end-users. Internet architecture experts have explained that discrimination is most likely to occur at the inter-network level, as it is close to undetectable by those outside the two networks concerned in the handover of content. It is very difficult (if not impossible) for anyone other than the two network operators themselves to monitor inter-network QoS, and therefore shedding light on this area will require reporting. As this information is routinely collected by the network operators for internal purposes, such reporting should not impose a substantial burden.

Ofcom has so far responded with an acceptance of, and willingness to deal with, those problems shown to have emerged: regulation of customer switching problems between ISPs, regulation of video on broadband offered by the public service broadcaster BBC, and a light-touch attempt to persuade ISPs to offer greater transparency to users (Kiedrowski 2007).

Future Directions: Research, Training, Technology

The pace of change in the relation between architecture and content on the Internet requires continuous improvement in the regulator’s research and technological training. This is in part a reflection of the complexity of the issue set, including security and Internet-peering issues, as well as more traditional telecoms and content issues.

Dominant and entrenched market actors in regulated bottlenecks play games with regulators in order to increase the sunk costs of market entry for other actors and to pass through costs to consumers and innovators. Co-regulation and self-regulation with very high entry barriers can be as effective in curbing market entry as direct content regulation, especially when ISPs are incentivized to tier and charge for QoS, which raises doubts as to their desire to implement self-regulation. By and large, the greater the levels of regulation, the more likely the market is to develop toward more closed and concentrated structures. There are three reasons for this:

  1. Larger companies are able to bear compliance costs much more easily than SMEs, and therefore it is important that such entry barriers, where necessary, are minimized.
  2. Larger companies have the resources and lobbying power to seek to influence regulation in a direction favourable to themselves.
  3. Large ISPs in a concentrated market may offload costs upstream onto content providers and developers or downstream onto consumers.

Therefore, any solution needs to take note of the potential for larger companies to game a co-regulatory scheme and create additional compliance costs for smaller companies, whether content providers or network operators; the combination of sectors makes this a particularly complex regulatory game.

Conclusion

Ofcom has made a good start on regulating for ‘Net Neutrality Lite’. It needs to encourage sensible European regulation, to ensure its fellow ERG regulators have the ability and commitment to make their ISPs keep consumers and innovators informed of any restrictions, both present and future, and to resist the over-extension to the more competitive European landscape of any excess regulation proposed in the US.

Chris Marsden is Senior Lecturer at Essex Law School and Director of the LLM in IT Media E-commerce Law: cmars@essex.ac.uk

References

Frieden, Rob (2006) Internet 3.0: Identifying Problems and Solutions to the Network Neutrality Debate, at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=962181

Kiedrowski, Tom (February 2007) Net Neutrality: Ofcom’s view, at http://www.radioauthority.org.uk/media/speeches/2007/02/net_neutrality

Lemley, Mark A. and Lawrence Lessig (2001) The End of End-to-End: Preserving the Architecture of the Internet in the Broadband Era, 48 UCLA Law Review 925

Levinson, D. and Odlyzko, A. (2007) Too Expensive to Meter: The Influence of Transaction Costs in Transportation and Communication, at http://www.dtc.umn.edu/~odlyzko/doc/metering-expensive.pdf

Marsden, C., Cave, J., Nason, E., Parkinson, A., Blackman C. and Rutter, J. (2006) Assessing Indirect Impacts of the EC Proposals for Video Regulation, TR-414 for Ofcom. Santa Monica: RAND

Marsden, C. (2008) Net Neutrality: The European Debate, 12 Journal of Internet Law 2, pp. 1, 7-16 (Wolters Kluwer)