There’s been a lot of noise recently about AI - specifically generative AI and large language models (LLMs). But the ODI - and our network - wants to see more emphasis on building good data infrastructure: the bedrock on which most, if not all, of these technologies stand.

It’s difficult to avoid AI at the moment: open letters and debates about whether we should focus on existential risk or on the harms already happening, Congressional hearings in the US, the AI Act in the EU.

There’s a lot of AI-related activity in the UK too. In June alone, we had the Prime Minister and Technology Secretary meeting AI (business) leaders; announcements of a global summit on AI safety, of £54m to boost AI research, and of early access to major models for research and safety purposes; the head of the new foundation model taskforce being unveiled; the PM at London Tech Week; the ICO looking into generative AI; and new guidance on using generative AI tools inside government. The momentum has continued, and the government is now consulting on Generative AI in Education - the consultation closes on 23rd August, and you can respond here.

June also brought an end to the consultation period for the government’s AI White Paper, A pro-innovation approach to AI regulation. The volume of AI announcements over the last month illustrates just how much the world has changed since the government published those initial proposals at the end of March – and the challenges in trying to regulate such a rapidly developing field. Nonetheless, the consultation presented an important, and exciting, opportunity to make our voices heard. The ODI submitted a response – which you can read in full. We drew on internal expertise and our previously published work touching on AI and automated decision-making, which includes:
- Our response to the Data: a new direction consultation (2021)
- Our response to the government’s green paper on transforming public procurement
- A blogpost on open models
- Our mapping of organisations across UK government responsible for data (2021)
- Our report on how the UK government approaches data literacy
- Our evidence to the APPG on artificial intelligence
- Our blogpost on the recent evidence given by the chair of the ODI, Sir Nigel Shadbolt, to the Commons Science, Innovation and Technology Committee
Getting the ODI network involved
We also held a workshop for ODI members, which was invaluable in informing our response – and we will be sharing our key findings on this blog next week.

Broadly, we think the government’s general approach – setting cross-sector principles which allow existing regulators to apply them in context, as close as possible to any resulting harms – is the right one. However, as several of our members noted at our roundtable, there will be challenges in coordinating across so many different regulators, and the government needs to provide more detail about the funding, statutory powers and support provided by the proposed ‘central functions’, so that regulators can properly apply the principles.

As you might expect from the Open Data Institute, we also felt that greater transparency about algorithms is vital – but not sufficient. We believe transparency is one important element, along with greater public and political awareness, other accountability measures (such as redress and contestability), and regulators having the capability, capacity and teeth to do their jobs.

And as you might also expect, we underlined the importance of data in discussions about AI, going so far as to say that a sixth cross-sector principle centred on data should be added to the list. My colleague spoke about data as a sixth cross-sector principle at a recent Demos panel - you can watch her remarks here. As our co-founder and Chairman Sir Nigel told the Science and Technology select committee earlier this year, ‘although we are talking a lot about AI, for the algorithms, their feedstock—their absolute requirement—is data.’

Such a principle should help move the UK towards creating and facilitating an open and trustworthy data ecosystem. It would see regulators play a more proactive role in ensuring high-quality datasets, and would enable them to manage the significant power imbalance that will continue to grow as companies accumulate data. This focus on data as a core principle would increase the attention given to treating data as a force for good, and could encourage data access, data sharing, and the creation or development of interoperable datasets. We also hope it would bring greater attention to the importance of data literacy, data assurance and data standards.
The importance of data infrastructure
We welcome the government’s recognition of the importance and potential of AI, and its ambition to make the UK a world leader in the field. But the proof of its intention to capture the value of the technology for the UK and mitigate its harmful effects will be in how seriously it is willing to back and fund the development and regulation of a robust, supporting data infrastructure.

Doing so also requires the government to view regulation holistically, and to consider the impact of other proposed legislation. As my colleague has recently written and spoken about, the Data Protection and Digital Information (No. 2) Bill reduces certain requirements, such as the need for organisations to have a Data Protection Officer (DPO), and scales back the requirements for Data Protection Impact Assessments (DPIAs). Weakening data infrastructure could have a negative knock-on effect on the AI that relies on it.

We also agree with our members at the roundtable who said that we probably don’t yet know all of the risks - nor indeed the opportunities - that the development of AI might bring. This will pose a challenge for the new regulatory regime. It also means the ODI will continue to think about the uses - and abuses - of AI, and how to develop the best possible AI landscape, with an open, trustworthy data ecosystem at its core. If you have views on that, please get in touch.