Conference season is often dismissed as political theatre - a performance for a diminishing membership base. This year, it mattered. Fifteen months into a Parliament and still years from the next general election, the major parties used the autumn gatherings to test their narratives and to revive their broader appeal.
Across packed main halls and the more revealing fringe rooms, one theme cut through: data is no longer a niche technical issue, but a live political priority. How the UK governs its data and steers AI will affect everything from NHS waiting lists and productivity to climate change and education. Our job at the ODI is to help decision-makers make those choices well - translating standards, governance and evidence into tangible benefits people can feel.
Below is my read-out from the season, drawn from dozens of conversations and a lively webinar with practitioners, policymakers and campaigners that we hosted after the conferences closed.
What the parties emphasised - and why it matters*
Labour: big-ticket investment and state capacity
Labour set out a muscular vision for technology-led growth: a headline £86 billion R&D package across AI, quantum and space, AI “growth zones”, and a push to recognise data centres as critical infrastructure. They also aired the digital ID proposal - the “BritCard”.
Why it matters: The investment story lands because it links compute and research to productivity. But capital spend is only part of the puzzle. Without open standards, interoperable public data, clear stewardship and trustworthy access models, money risks chasing shiny pilots that don’t scale. Digital ID raises familiar questions about assurance, proportionality and redress - along with significant concerns about data privacy and protection. These debates aren’t abstract. Assurance means being clear about who is accountable when systems fail or data is misused. Proportionality means ensuring that identity verification is no more intrusive than necessary, and that people can still access services offline or through alternative routes. Redress means giving citizens clear, workable routes to challenge errors, discrimination or wrongful data use, without burdening marginalised communities. Underpinning all of this is privacy - ensuring that data collected for identification cannot be repurposed, sold, or linked across systems without explicit consent.
Conservatives: security tech and the energy lens
Conservative Party messaging focused on live facial recognition to support immigration enforcement and a cost-of-energy package for data centres and AI infrastructure.
Why it matters: There’s a legitimate public interest in secure borders and national resilience. But surveillance capabilities only sustain trust when accompanied by independent bias testing, impact assessments, audit trails and hard limits on function creep. On energy, the UK needs more capacity - and disclosure. We would like to see open reporting of data-centre energy and water use, better siting decisions tied to grid constraints, and incentives for efficiency and heat reuse. With better data, we as a country can make informed decisions about how we use our resources. What gets measured gets managed.
Where the ODI can help: convening credible assurance infrastructure (tests, audits, red-team access) and advocating open environmental data standards so communities and investors can see the real footprint and benefits of digital infrastructure.
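To make “open environmental data standards” concrete, here is a minimal, purely illustrative sketch of the kind of per-site disclosure record such a standard might define. The field names, units and figures are invented for the example and are not drawn from any existing standard; PUE itself is a widely used efficiency metric, calculated as total facility energy divided by IT energy.

```python
from dataclasses import dataclass


@dataclass
class DataCentreEnvironmentalReport:
    """Hypothetical per-site, per-quarter disclosure record (illustrative only)."""
    site_id: str               # stable public identifier for the facility
    reporting_period: str      # e.g. "2025-Q3"
    grid_region: str           # grid connection / network area
    total_energy_mwh: float    # all electricity drawn by the facility
    it_energy_mwh: float       # electricity consumed by IT equipment only
    water_use_m3: float        # potable water consumed, in cubic metres
    heat_reused_mwh: float     # waste heat exported to nearby users

    @property
    def pue(self) -> float:
        """Power Usage Effectiveness: total facility energy / IT energy."""
        return self.total_energy_mwh / self.it_energy_mwh


# A single published record that a community group or investor could read
# and compare against other sites reporting to the same schema.
report = DataCentreEnvironmentalReport(
    site_id="example-site-001",
    reporting_period="2025-Q3",
    grid_region="example-region",
    total_energy_mwh=12_000.0,
    it_energy_mwh=9_600.0,
    water_use_m3=35_000.0,
    heat_reused_mwh=1_200.0,
)
print(f"PUE for {report.site_id}: {report.pue:.2f}")  # 1.25
```

The particular fields matter less than the principle: a common, open schema is what makes disclosures comparable across operators and regions, so that “what gets measured gets managed” can actually happen.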
Liberal Democrats: online harms and research access
The Lib Dems leaned into child online safety, a “doom-scrolling cap” (a proposed limit on how long children can spend endlessly scrolling through social media, particularly addictive feeds), broader platform accountability and tech trade.
Why it matters: Platform policy only works with transparent, standardised risk/harms data and researcher access that protects privacy. Vague exhortations won’t move the needle; mandated disclosures, independent audits and safe data intermediaries will.
Where the ODI can help: designing open data frameworks for content moderation and recommender-system risk, and building privacy-preserving access routes for independent researchers.
Reform UK: deregulation and energy liberalisation
Reform offered few new AI or data specifics, but continued to emphasise deregulation and cheaper energy.
Why it matters: It’s easy to caricature regulation as a brake. In data and AI, good rules are growth infrastructure: standards, clarity and accountability reduce transaction costs and market uncertainty. The risk here is a narrative that undermines the very open data and assurance mechanisms innovators need.
Where the ODI can help: setting out the economic case for well-governed data - showing how standards and trusted access enable markets, competition and new entrants.
Green Party: ethics and accountability first
The Greens didn’t unveil significant new tech policies, but they have consistently championed stronger AI regulation, public accountability and digital rights, alongside environmental data priorities.
Why it matters: The caution is understandable - and popular - but it risks drifting into over-prescription that locks up useful data and slows beneficial sharing. The challenge is balancing safeguards with access and interoperability so evidence-based innovation isn’t throttled.
Where the ODI can help: articulating proportionate models of stewardship - clear guardrails, yes, but also standardised pathways to share and use data where it serves the public interest.
Cross-party themes you should notice (and act on)
- Data as national infrastructure is finally mainstream. Compute, connectivity and data centres are now framed as critical assets. That’s progress - but without rules, they will still underdeliver value. We need to treat standards, stewardship and assurance as part of the infrastructure bill, not nice-to-haves.
- Assurance and rights are the new fault line. Whether it’s digital ID or facial recognition, the divide is less “for or against tech” and more “how do we govern it credibly?” Expect impact assessments, auditability, explainability and clear redress mechanisms to become political tests.
- Energy and sustainability have entered the data debate. Given AI’s insatiable resource appetite, expect scrutiny on siting, water use and grid impact - and growing calls for open environmental reporting.
- Online harms need real data. If we want safer feeds and better accountability, we need comparable metrics, standardised disclosures that can be analysed across platforms, and research access that works across them.
- Prevention is back - if we leverage data and AI effectively. From social prescribing to community services, preventative approaches to public health - from mental health to obesity - offer a brighter future. They will stall without interoperable local data, safe cross-organisational access, and clear ownership of standards.
What our audience told us
To test and enrich these insights, the ODI hosted a cross-sector webinar after the conference season, bringing together voices from government, academia, civil society, business and the data community. Participants discussed how political announcements on data and AI are landing, where they see the most significant policy gaps, and what government should prioritise next. The conversation was lively - and revealing - offering a real-time snapshot of expert sentiment.
We also ran ten live polls to capture how our attendees interpreted the policy landscape. Here’s what they told us - and what it means.**
1. Who has the clearest data & AI strategy?
Nearly half of respondents (47%) said Labour currently has the clearest strategy on data and AI, followed by the Greens (12%). A notable quarter (25%) chose “none of the above.”
Interpretation: Labour’s visibility on AI policy is landing but, judging by the discussion that followed the poll, it has not necessarily inspired confidence in their ability to execute. The “none” result signals a broader trust deficit - the sense that all parties still talk about AI more than they act with data.
2. Which conference pledge could shape the digital future most?
37% chose Labour’s £86 billion R&D package and AI growth zones, with 27% backing the Greens’ stronger regulation and 18% Labour’s digital ID plan.
Interpretation: Respondents are looking for balance - investment coupled with credible guardrails. AI optimism and ethics aren’t opposing camps anymore; people want both.
3. Confidence in policymakers
A sobering 85% were not very or not at all confident that UK policymakers understand the value of data as national infrastructure.
Signal: Our participants believe that the political class hasn’t yet earned the right to lead this debate - which is precisely why the ODI’s work translating the technical into the tangible is so critical.
4. What’s hardest to grasp but most urgent to fix?
Responsible AI and assurance (43%) topped the list, followed by skills and digital literacy (22%) and data sharing for innovation (19%).
Interpretation: The audience sees responsible AI and assurance as the missing foundations of UK data policy - the system of testing, oversight and accountability that turns lofty principles into safe practice. But they’re also telling us that capability still matters. Policymakers need literacy, not just legislation.
5. Where should the government prioritise investment?
Climate and energy (29%) narrowly beat healthcare (27%), followed by business and industry (20%) and public administration (18%).
Interpretation: As climate urgency sharpens, the audience recognises that data is a net-zero enabler. But health remains a close second - a reminder that public trust will be won in sectors where AI touches daily life.
6. What would most build public trust?
Public engagement and citizen participation (43%) came first, with stronger regulation (31%) second and independent assurance (22%) third.
Interpretation: Trust isn’t built by regulation alone - it’s built by involvement. People want to shape the systems that shape them. For the ODI, this means embedding citizen voice in standards, governance and oversight.
7. The “quick win” for the UK’s data ecosystem
Common data standards across sectors (43%) led decisively, ahead of stronger personal-data protection (27%).
Takeaway: Interoperability is finally being recognised as the foundation for everything else - growth, innovation and accountability all depend on it.
8. What politicians most often get wrong
45% said politicians treat AI as purely a tech issue rather than a societal one. 36% said they over-hype benefits.
Takeaway: People want leaders to talk about AI’s social contract - fairness, accountability, trust - not just the headlines about productivity or existential risk.
9. How the ODI should engage politicians
This was a close race. Public campaigns/media commentary (32%) and commissioned research/policy papers (30%) led, followed by closed-door roundtables (24%) and practical demonstrations (14%).
Implication: Our participants want us to stay visible, evidence-driven and collaborative - translating our technical authority into both accessible communication and rigorous influence.
10. What persuades decision-makers
Economic impact (55%) was judged the most persuasive evidence for policymakers, well ahead of voter trust (23%) or public-service improvement (13%).
Reality check: Our participants believe that economic framing still rules Westminster. If we want progress on standards and assurance, we must continue to show the growth, efficiency and productivity dividends of doing data well.
From signals to strategy: what the UK should do now
The conversations across conference season and our webinar pointed to one conclusion: the UK’s data and AI future depends not on rhetoric but on how well we build and govern our data infrastructure. The ODI’s five-year strategy sets out six guiding principles, and they offer a compass for the seven recommendations below - each mapped to the principles it draws on.
1. Recognise data standards as critical infrastructure
(Principles 1 & 2)
Strong, trustworthy data infrastructure is the foundation of an open and fair digital economy. That means formally recognising sector data standards as critical public infrastructure, embedding them in procurement, and funding stewardship to maintain them.
Open data should remain the foundation of this system. Only with open, well-governed standards will people, businesses and government be able to realise the full potential of the UK’s data infrastructure.
2. Build the UK’s AI assurance backbone
(Principle 3)
For data and AI to work across borders - geographic, organisational and political - they must be underpinned by trust. That requires clear assurance mechanisms: algorithmic impact assessments open to civil society and researchers, tiered risk classifications, and clear communication with the people whose data is collected and used.
This is how we create ethical, cross-sector confidence in AI, ensuring citizens, innovators and regulators can trust both the data and those who use it.
3. Open up energy and sustainability data for digital infrastructure
(Principles 1 & 4)
A resilient digital economy must also be a sustainable one. We need open reporting of - and access to - energy use, water use, grid connection and environmental impact data for all large-scale digital infrastructure.
Trusted, independent organisations play a crucial role in making this possible - verifying, standardising and sharing data in ways that enable transparency, innovation and accountability.
4. Create privacy-preserving research access
(Principles 3 & 4)
We should expand trusted research environments and data intermediaries so accredited experts can analyse sensitive datasets securely and responsibly.
By giving researchers safe access to high-value data while maintaining privacy and public confidence, the UK can lead the way in ethical data use across borders and sectors.
5. Raise policymaker literacy - fast
(Principle 6)
To deliver meaningful change, we need a new generation of data leaders across government. MPs, peers and officials must understand the value, limitations and ethics of data - not as a technical skillset, but as a lens for policymaking.
The ODI will continue to provide practical training that helps policymakers navigate interoperability, data quality and assurance with confidence.
6. Make public engagement the default
(Principles 3 & 5)
Public trust is built through participation. Citizens’ juries, participatory design and transparent feedback loops should be standard practice in any major data or AI initiative.
This isn’t just about communication - it’s about equity. To make data work for everyone, those designing and using it must be conscious of bias, inequality and power asymmetry, and act to correct them.
7. Procure for openness and outcomes
(Principles 1, 2 & 5)
Government procurement should set the tone for the entire ecosystem: mandating open standards, auditability, interoperability, clear redress mechanisms and measurable outcomes.
Every contract should help build a fairer, more open data environment - one that unlocks innovation while protecting citizens from harm.
A note on proximity to Big Tech
Several participants raised concerns about the government's closeness to large US tech firms. It’s right to scrutinise influence and conflicts of interest. But the answer isn’t isolationism - it’s confident communication and guardrails:
- Set the rules (standards, assurance, transparency).
- Procure with backbone - contracts that deliver real benefits for the British public.
- Back domestic capability where it matters and partner internationally from a position of clarity.
The UK can be both an AI maker and a rule-maker if it builds the strong foundation - standards, stewardship, and assurance - that lets public value and private innovation reinforce each other.
How the ODI will play it
Our policy team is focusing on three areas:
- Translating the technical into the tangible. We’ll link data governance to what people care about - faster diagnoses, safer streets, lower bills, better-targeted spending.
- Meeting decision-makers where they are. Productivity, growth, trust, cost control, regional prosperity - we’ll connect each to data done well.
- Evidence plus storytelling. Rigorous research and analysis, real case studies, and hands-on demonstrations that make it tangible.
We’ll continue convening a broad coalition including civil society, industry, local leaders, researchers and communities to discuss these issues. If you’d like to partner on standards adoption, assurance pilots or public engagement, get in touch.
The choice in front of us
If there was one frustration in this autumn’s conversations, it was that we still treat AI and data as “tech issues” to be managed in a corner. They are societal choices: about who benefits, who is accountable, and how we balance innovation with rights and trust.
Conference season showed the outlines of competing answers. The next phase is implementation. If we focus on standards, stewardship and assurance, we can convert rhetoric into results, and rebuild public confidence along the way.
Let’s get on with it.
* Parties listed in order of Parliamentary presence
** Eighty-six people participated in the webinar, representing organisations across the public, private and third sectors. While this is not a representative sample, it offers a valuable snapshot of expert and practitioner views at that moment in time.