Report sets out twelve challenges for AI governance to meet if public safety and confidence in AI are to be secured.
The Science, Innovation and Technology Committee has issued a report on the governance of AI.
It says that the recent rate of development has made debates regarding the governance and regulation of AI less theoretical, more significant, and more complex. It has also generated intense interest in how public policy can and should respond to ensure that the beneficial consequences of AI can be reaped whilst also safeguarding the public interest and preventing known potential harms, both societal and individual. There is a growing imperative to ensure governance and regulatory frameworks are not left irretrievably behind by the pace of technological innovation. The Committee says that policymakers must take measures to safely harness the benefits of the technology and encourage future innovations, whilst providing credible protection against harm.
The Committee has identified twelve challenges of AI governance that policymakers, and the frameworks they design, must meet.
In March 2023, the UK Government set out its proposed "pro-innovation approach to AI regulation" in the form of a white paper. The paper sets out five principles to frame regulatory activity and to guide the future development and use of AI models and tools. These principles would not initially be put on a statutory footing; instead, they would be interpreted and translated into action by individual sectoral regulators, with assistance from central support functions.
The Committee says that the AI white paper should be welcomed as an initial effort to engage with this complex task, but its view is that the paper's proposed approach already risks falling behind the pace of AI development. This threat is made more acute by the efforts of other jurisdictions, principally the EU and the US, to set international standards. In the Committee's view, a tightly focused AI Bill in the next King's Speech would help to position the UK as an AI governance leader.
Published: 5 September 2023