Designing for fairness panel at the ODI Summit 2018. Image by Paul Clarke (cc by 2.0)

ODI Summit 2018: How can we design for fairness?

Fri Jan 11, 2019

How can data benefit everyone? Researcher Ben Snaith reflects on discussions from the ‘Designing for fairness’ panel at the ODI Summit 2018

In November 2018 over 500 delegates attended the ODI Summit 2018 at Kings Place in London for a day of talks, panel sessions, unconferences and poetry recitals exploring how we currently value data, and how to extract more value for governments, businesses and citizens.

The ‘Designing for Fairness’ panel looked at who benefits from data, and ways to ensure data’s value can be felt by the many, not the few.

Azeem Azhar (Founder of Exponential View) chaired the panel, made up of: Nóra Ni Loideain, Director and Lecturer at the Information Law and Policy Centre; Kit Collingwood, Deputy Director at the Department for Work and Pensions; Martin Tisné, Managing Director at Luminate; and Claire Melamed, Executive Director of the Global Partnership for Sustainable Development Data.

Neutral data?

The world is relying increasingly on algorithms, analytics and automation – to make decisions for us, advertise to us and improve services. These functions rely on data, but what if that data is biased, skewed or incomplete?

The benefits of data are not felt equally by all. Organisations may hoard data in silos to use for their own advertising and insights, with minimal benefit to the consumer and wider society. Data is also being used in harmful ways, and is prone to amplify discriminatory real-world biases. The idea that data is neutral is slipping out of the collective lexicon. As the AI Now Institute states, data will ‘always bear the marks of its history’.

Data collection, dataset standardisation and algorithm training all share one big issue: they depend on human decision making. Yet humans are flawed – ‘to err is human’ – and those flaws are reflected in the data, which can produce unfair results.

However, ethics and fairness can be embedded within the design of data-driven services to enable better social and economic outcomes for everyone.

Who decides what is fair?

The panel dove straight into a fundamental issue: who decides what is fair?

Martin Tisné suggested that “fairness is a question of perspective”, and that society needs to decide who they want to set the moral framework that data operates within. The ‘who’ matters as they will have their own personal and subjective definitions of fairness – reflective of power dynamics, experiences and motivations.

People-focus

The panel explored whether focusing more on people would overcome these challenges. A people-centric approach can, and should, be looked at in two ways, they argued.

Firstly, there is the fundamental point of ensuring that humans are not harmed – or, as Kit Collingwood put it, “how do we exploit data without exploiting humans?”. Considering this further, members of the panel argued that the description of the session itself (‘Designing for fairness: growing data’s value for markets and citizens’) was unfair in using the word ‘citizens’ (legally recognised people) rather than simply ‘people’.

The second factor is to look beyond simply avoiding harm and examine how people can benefit from data. Debates around giving citizens ownership of data can often be naive, according to Kit Collingwood, as many people aren’t in a position to use data ownership effectively. She added that many lack the digital skills, experience, time or knowhow to capitalise on it, and so are not empowered. But data can and should be used to benefit citizens – data should inform and empower.

Nóra Ni Loideain also discussed what she terms the ‘consent myth’, noting that few people have the time to read ever-sprawling terms of service, and even those who do often can’t understand the legalese. So even if they have ticked the box, have they really consented? Instead, the panel argued, citizens should be empowered – potentially through simplified terms that can be read, understood and acted upon. This should rightfully be safeguarded by other checks and balances that may currently be lacking, offering more complete protection for people.

Claire Melamed mused that it was “interesting how the debate on privacy is divorced from other human rights”, making the point that we are not all individually required or responsible for ensuring that we have clean air, or that we are free from torture – so why is the onus on the individual to protect their own privacy from invasion and disregard?

Who will drive this change?

Some people call for more regulation, and for that regulation to embed stronger ethical and fairness considerations. Martin Tisné suggested that regulation isn’t necessarily given the respect or value it deserves – he described the “false dichotomy between innovation and regulation.” Innovation and regulation should work hand-in-hand to protect individuals and act fairly whilst still seeking growth and development.

GDPR is a “mission statement” in that regard, according to Nóra Ni Loideain, but it is not a “silver bullet”. Whilst the law may, naturally, always be chasing tech to keep up with the latest developments, there are other institutional remedies available – such as the courts, petitions and other legislation – which can provide additional protection when things go wrong.

For help and guidance, we can look to what has already been done elsewhere. Claire Melamed discussed how ethics have been embedded in the medical profession for millennia through the Hippocratic oath. Indeed, there have been calls to create a similar oath for developers and data scientists, so that they too can be guided – and held accountable – in acting fairly and ethically. The panel made the point that, rather than starting from scratch, the tech industry can borrow from existing ethics frameworks.

Ethics by design?

We are not necessarily closer to defining ‘fairness’ for data, but the growing expectation that emerging and established technologies should include ethics in their design and operation will keep the conversation going. Many experts are at least aware that this unfairness exists, and some of the big tech companies are making the right noises – but whether that results in concrete actions, protecting individuals and bringing greater economic and societal benefits, only time will tell.

For more content from the ODI Summit check out the other panel blogs and the #ODISummit on Twitter.