The ‘Future-proof’ Communications Bill: Thinking Afresh

January 27, 2009

Future-proofing

The importance of future-proofing the Communications Act 2003 was a constant theme in the policy debates leading up to the legislation. But it should not be assumed uncritically that future-proofing is necessarily a good thing and that it should set the parameters for a new Communications Bill for the UK. Future-proofing implies that legislation can be based on enduring principles that will apply regardless of changes in technology and business models. If those principles can be identified, they can be applied in a technologically neutral way, which means that the regulatory regime will not be dominated by transient features of media developments and will not have to be amended in response to obsolescence. In the UK, we have seen a lack of technological neutrality in the regulation of media ownership, cable television and digital terrestrial television, reflected in various iterations of broadcasting legislation since the 1980s, which has contributed to complexity and overregulation. Furthermore, the frequent need for change has had a negative impact on business stability and predictability.  

All this may seem so self-evident as to be not worth articulating. But there are counter-considerations of equal significance. An incremental, “wait and see” approach may actually be more desirable, proving more effective in a period of rapid and variable change. Most significantly, the continued political salience of media and communications policy suggests that legislation should not be set in future-proof stone but should be subject to frequent re-appraisal and re-evaluation. For all the lip-service paid to future-proofing, this is the view that has in practice been favoured in the UK; but, because we do not have a systematic and consistent approach to policy generation in this field, it does cause problems for long-term planning.

In terms of substance, the tradition of public service broadcasting (PSB) has dominated policy development. This has had the effect of polarising public service against the market and against deregulation: everything is measured by reference to the effect it will have on public service. The result is that it is very difficult in practice to take a holistic view of media policy in order to develop a complete media strategy. Certainly, Ofcom has tried to be more comprehensive in its PSB reviews, but they remain constrained.  

In terms of process, media and communications remains a politically sensitive area, so it is subject to party political swings (notwithstanding a remarkable consensus about the 2003 Act) and to turf wars between government departments and regulatory bodies. In the UK, the basic model is one of political dominance over expert judgment. Policy is set in legislation and the regulator simply implements. It is true that legislators show considerable deference to regulators’ experience when framing legislation (the IBA was notably influential in debates about the Broadcasting Act 1990, and Oftel was similarly influential over parts of the 2003 Act). However, there is resistance and sometimes resentment when regulators are perceived to take the initiative in filling policy voids: for example, Ofcom’s thinking about convergence and about the BBC may be contrasted with the work of the Convergence Think Tank and the BBC Charter Review, respectively, and there have been similar tensions in the development of broadband policy.  

The ideal solution would be for politicians to set only the strategic direction in legislation and to allow regulators more scope to put it into practice. It may be thought that the Communications Act 2003 did precisely that, but it has turned out that the overall thrust of policy in the legislation is too open-ended. Ofcom has not had a sufficiently clear mandate about competing priorities, nor has it had a mandate to fill the gaps itself. At the same time, it does not necessarily want the discretion that creating policy would imply, most likely preferring to follow clear rules. The result in such a scenario is a reinforcement of the political salience of policy, since much will depend on the choice of individuals who are appointed to regulatory office.

Foundations of a Media and Communications Strategy 

Deciding a strategy requires choosing a sense of direction in respect of three matters. Industrial policy would deal with innovation, enterprise, and competition policy, including the UK’s global position. Cultural dimensions would deal with the positive support that may be provided for culturally valued content. Protection for the vulnerable would deal with both economic issues (consumers, prices and advertising) and political/social issues (citizens, news, decency, privacy and security), without denying the importance of choice and responsibility in decisions about media use. Assuming a political determination of the balances to be struck over these matters, some indication of priorities among the resulting strategies is crucial in order to control institutional politics (whether within a single regulator, or between separate regulators) and to encourage consistency between different approaches to regulation (whether command and control, or variations on co- and self-regulation). Obviously, this may be difficult, because this is where the politics are most keenly felt. But, without some basic consensus about the ranking of priorities, a position will arise similar to that under the 2003 Act, where the relationship between citizen and consumer objectives has too often been ambivalent. Yet, within the framework of a clear strategy, it would be right for regulators to take initiatives and suggest new ideas on the basis of their expertise. Furthermore, it would be possible to create formal constitutional structures for ministers to respond to such initiatives, in order to preserve democratic superiority without subjecting long-term policy to shifts in political expediency.

Thinking about Web 2.0 

Technological neutrality is not a policy strategy. It is an approach, perhaps a regulatory principle, for applying strategy. Failure to recognise this has confused debates about the implications of the Internet for communications policy. Concerns about push and pull, multi-distribution and one-to-one, and linear and non-linear have encouraged a focus on discovering legal categories that might reflect technological distinctions. Solutions have proved very difficult to agree, as the negotiations over the Audiovisual Media Services Directive illustrate, as does its adoption of a policy to regulate “television-like” services. This should not be surprising, because the underlying rationale for regulating media depends neither on the technology that is used nor on the method of receiving or interacting with the information conveyed. Rather, the rationale is that the communication is a critical component of social activity. This is regardless of whether the users and participants are physically alone or in groups or families, or whether they experience the material through differing permutations of broadcast, on-demand choices, or interactivity. Because the media are an integral part of social activity, the way that they reflect and convey values and beliefs is a matter of public significance. There is a democratic interest in the way that media assist or obstruct the dissemination of viewpoints (accuracy, fairness, pluralism) and in their ability to provide an adequate basis for constructing social mores and political perspectives. While many manifestations of media can rightly be considered commodities, the clearer producer-consumer nexus offered by newer kinds of interactive media does not define the social character of such media.

The implication of recognising the “social activity” significance of the media is that Web 2.0 cannot be regarded as immune from regulation. Indeed, the apparently defining features of Web 2.0 (participation, especially through user-generated content; and interactivity and interconnectivity, which facilitate the combining and re-assembling of content) are intrinsically social. The social and community potential of Web 2.0 may be considered to make the case for public policy oversight all the stronger. In this context, it will be important to focus on the nature of the transaction involved: is the Internet being used for (in no particular order) streaming or downloading, broadcasting or retail selling, a temporary experience or something more permanent, user-generated material or professional production?

Whether it is actually possible to regulate the Internet is a separate question, albeit one that is often wrongly asked first and answered negatively. There are indeed well-known problems of jurisdiction and practical enforcement, and effective regulation requires careful consideration of the appropriate technique (law, command and control regulation, self-regulation or the market). An important consideration is where regulation might “bite” in the value chain. At a national level, the jurisdictional connection is key, and that points to the role of ISPs. To raise that possibility is, of course, to court controversy, because it is assumed in some quarters that ISPs are unable, and unwilling, to participate in regulatory enforcement. Yet they already have a role: they assist in blocking illegal content and content related to crime and security, and they also comply with civil process in the enforcement of rights related to music and video file-sharing. “Blocking” does have negative connotations, but “geo-location” is accepted as an important feature of new media content distribution.

It turns out that content control is possible, so the strategic question is what kind of content, if any, should be considered for regulation. Here, it need not be assumed that the broadcasting model is the starting point. It might be better to begin by considering some of the problems raised, for example, by social networking sites, such as invasions of privacy, bullying, defamation or discriminatory speech; and then to consider “broadcasting” matters such as exposing children to indecency or causing offence. Self-help and self-regulation may well provide, and in some instances already provide, practical solutions where regulatory enforcement proves difficult. This is a far cry from censorship and unjustified interference with freedom of speech, or from overbearing government intervention; such objections are too often deployed selectively when considering the current state of Internet regulation.

Ultimately, the fact that media and communications are an integral part of our social fabric means that we have to take responsibility for evaluating their impact and for being willing to shape their activities for the wider good. This, indeed, is the critical future-proof element for any new Communications Bill. The political and social relevance of regulatory oversight has not diminished in the face of new technologies, and it will have to remain the basis for the more refined and targeted kinds of regulation that will be needed in the next round of law-making.

Tom Gibbons is Halliwells Professor of Law at the University of Manchester: Tom.Gibbons@manchester.ac.uk