Objective data? Reflections on the Commission for Race and Ethnic Disparities report

Thu Apr 15, 2021
In this blog post, Dr Jeni Tennison OBE, ODI’s Vice President, looks at the objectivity of the data used in the Sewell report by the Commission for Race and Ethnic Disparities – focusing on how ethnicity is categorised, the government’s use of data, and inequalities in the UK’s digital public services. 

In July 2020 the UK government created the Commission on Race and Ethnic Disparities – chaired by Tony Sewell – to review inequality in the UK, focusing on areas such as poverty, education, employment, health and the criminal justice system. On 31 March 2021 the Commission published its report (‘the Sewell report’), which has been widely – and in our view, credibly – criticised for its methods as well as its conclusions.

We believe that the responsibilities and opportunities for addressing race and ethnic disparities in the UK lie across a broad range of sectors and domains in our society and economy, and that there are ways for people and organisations to address those disparities effectively, without being constrained by the limitations of the Sewell report.

The Sewell report contains 24 recommendations. One of these focuses on data: ‘Recommendation 23: Use data in a responsible and informed way.’ But what does it mean for data to be used in a responsible and informed way? In this mini-series we look at aspects of this question, and the implications for addressing race and ethnic disparities.

Dr Milly Zimeta, the ODI’s Head of Policy, has also written about this topic, focusing on structural biases and on digital technology, and how we can all innovate in data policy and data practice for transformative results.

Objective data?

One of the striking features of the Sewell report is how much it emphasises the importance of data in the conclusions it argues for. This emphasis should be heartening: policy should, after all, be evidence-based. However, our work, and the work of other scholars in data policy and data practice, suggests that responsible and informed use of data requires a much more critical approach than that demonstrated in the Sewell report.

Responsible and informed use of data requires a much more critical approach than that demonstrated in the Sewell report

The Sewell report contrasts ‘single-issue identity lobby groups’ that ‘stress the “lived experience” of the groups they seek to protect’ with those (presumably including the report authors) who rely on ‘objective data’. But data is rarely objective. Reality is multi-faceted, and the angle from which you view it determines what you will see. We all choose, in more or less rigorous ways, what data to collect, how to collect it, how to model it, how to analyse it and which bits to emphasise in our visualisations and reports. For example, different organisations can produce very different numbers for even relatively simple things like ‘Covid-19 deaths’ (‘deaths within 28 days of a positive test’ compared to ‘deaths registered where COVID-19 was the underlying (main) cause on the death certificate’).

Responsible data use means seeking out as full a picture as possible using multiple sources and types of data

Responsible data use means seeking out as full a picture as possible using multiple sources and types of data – including both quantitative data and qualitative data. Responsible data use also requires an explicit recognition of the limitations of the data, the assumptions in the questions being asked, and possible blindspots or gaps in the data itself. In our project on data about children’s lives in the pandemic, we combined data from the civil society organisation Barnardo’s, the social media company Mumsnet, and open government data from the Department for Education (DfE) and the Department for Work and Pensions (DWP) to create a rounded and nuanced picture of impacts on children, families and teachers. Organisations of any size with an interest in tackling ethnic disparities can collect and share different kinds of data to create a fuller understanding of the challenges and opportunities.

The BAME game

The Sewell report recommends that data standards be introduced to help ensure that ethnicity data from different public sector sources is easy to compare, with standardised reporting. This could help make analyses more consistent – something that has sometimes proved difficult when ethnicity data has been categorised in different ways, such as ‘White/non-White’ or ‘BAME’. But, as Eleanor Shearer’s excellent essay on the project of representing race in data observes, the unwelcome ‘BAME’ acronym is just one entry in a long history of clumsy approximations and moving goalposts for what’s really being measured. And as we try to standardise data about race and ethnicity, we should also aim to standardise data about racism and bias.

It means examining the characteristics of the environments in which ethnic disparities occur, and not just examining the characteristics of ethnic minority individuals and communities.

Informed use of data in complex areas such as race and racism requires critical thinking about the purpose of analysis. It means examining the characteristics of the environments in which ethnic disparities occur, and not just examining the characteristics of ethnic minority individuals and communities. And it means collecting and analysing data from sectors or domains with a positive track record of reducing inequalities – and not just those with the worst problems – so that everyone can learn from good practice and contribute to developing it further. All of these are key areas for data collection and analysis by researchers or research funders keen to play a part in reducing ethnic disparities.

Seeing like a state

The Sewell report leans heavily, as the basis for its findings, on the database created by the Cabinet Office’s Race Disparity Unit (RDU), which it describes as containing ‘all the important data on race and ethnicity’. But the RDU database only includes government data, and government figures and official statistics necessarily view society from the perspective of the state. Work on indigenous data sovereignty highlights the limitations of these views. The data the state collects reflects its assumptions and goals, which are frequently limited and can be misguided or harmful. Communities can be unseen or misunderstood in government data, particularly if, by culture or by choice, they interact less with the state (for example, when they do not trust the government to use data for their benefit).

The data the state collects reflects its assumptions and goals, which are frequently limited and can be misguided or harmful.

This means there’s a vital role for organisations other than the state. Our work with the NASUWT teaching union on teachers’ experience of the pandemic highlights the role of organisations such as trade unions in collecting and sharing quantitative and qualitative data on a significant segment of society. This brings a richness, granularity and timeliness that complement the sparser, less frequent data collection by government. And our recent report on inclusive data contains a fascinating case study about the Romani Cultural and Arts Company, which works with Gypsy, Romani and Traveller researchers to collate data on their own communities – drawing on access, insights and networks that are typically out of reach for non-Gypsy, Romani or Traveller researchers.

There is a strategic opportunity for civil society organisations to step up to act as data institutions – stewarding data on behalf of communities. There is also an opportunity for the public sector to make better use of data that doesn’t originate from itself – for example, building on experiences of public–private data-sharing collaboratives for pandemic response.

Digital divides in public services

Finally, the public sector has to be committed to collecting data on its own services. The Sewell report does not address one of the public sector data gaps we explored early last year: data for monitoring equality in digital public services. With so many of our key interactions now taking place through online tools, we need more data to understand who is, and who isn’t, accessing digital services; and how experiences of these services might vary across social groups and across different digital services. It’s equally important that this data collection is done in a way that is acceptable to people, and doesn’t have the unintended consequence of deterring people who might already be disenfranchised from accessing digital services for support.

We need more data to understand who is, and who isn’t, accessing digital services

One way of looking at all of this is through the lens of data skills. Businesses increasingly recognise that the skills needed for making the most of data go beyond technical skills, such as quantitative analysis and coding. Data literacy encompasses ‘soft skills’ such as building communities, working ethically, developing strategy and leading change. For several years now, our ODI Data Skills Framework has been widely used by organisations to build their capacity to work well with data, cultivating a holistic, sustainable, embedded data competency. Data literacy of this kind can be put into practice in any sector, and by any organisation that wants to do its part to understand and address race and ethnic disparities.

Any kind of decision making, especially in areas that are complex, contentious and nuanced, requires the use of data from multiple sources and viewpoints, to make communities feel seen and understood, and to create a more complete picture of both challenges and opportunities. Government has to learn how to do this better, but all kinds of organisations – from academia to civil society to the private sector – can contribute to creating a rich data landscape that supports this deeper understanding.