Chris Kemp, associate at Kemp IT Law, explores the Institute for the Future of Work’s proposals for an Accountability for Algorithms Act.
As the heady clamour of this summer’s exam-results algorithm fiasco fades into the darkening evenings of the UK winter, the Institute for the Future of Work’s (IFOW) recent proposals for an “Accountability for Algorithms Act” are timely, to say the least.
The proposals, supported by an op-ed in The Times by David Davis MP, aim for “an overarching, principles-driven approach to put people at the heart of developing and taking responsibility for AI, ensuring it is designed and used in the public interest.”
For observers in the UK this is an interesting development. 2020 has seen a flurry of paperwork on AI regulation and some lively debate, particularly around the European Commission’s AI White Paper in February and the subsequent consultation. But the UK, having bigger fish to fry, has seemed much less involved.
The proposed Act, detailed in Part 5 of the IFOW’s “Mind the Gap” report, “would regulate significant algorithmically-assisted decision-making, which met a risk-based threshold, across the innovation cycle, legal spheres and operational domains in the public interest”. An “umbrella, ‘hybrid Act’”, it would help guide and align the existing regulatory ecosystem, the current law, and decisions taken by the makers of algorithms.
A number of proposed statutory duties are given top billing in the report.
Proposals are also made around increasing transparency in the innovation cycle and support for collective accountability (rights for unions and workers vis-à-vis AI-driven algorithmic systems used at work).
In terms of regulatory supervision, the IFOW isn’t proposing a new regulator – instead the Act would “establish an intersectional regulatory forum to coordinate, drive and align the work of our regulators, and enforce our new duties, which would otherwise lie between the EHRC [the Equality and Human Rights Commission] and the ICO.”
The IFOW is clear that the proposals “need very wide consultation” – in other words, we are at a very early stage – but there appears to be some parliamentary support. How much governmental and legislative bandwidth the proposals will get, given the competing pressures of COVID-19 and Brexit planning, is clearly another matter.
Chris Kemp, associate at Kemp IT Law, has a keen interest in the emerging field of AI regulation, and in making sure his A-levels don’t get downgraded.