We felt that each of the scenarios we explored was technically possible, and that there would be viable business models for organisations offering these services. But there were things that might stop them emerging easily – for example, whether data stewards – the people who hold data – decide to work together to make the data they hold portable to an open standard, or whether regulators intervene to force them to do so. These echoed the limitations to data portability that we have seen in other sectors and contexts.
We could also see risks in each scenario. A badly built Holo-day might lead to disaster tourism, with people getting pleasure from virtually experiencing other people’s discomfort. The Hypr-Local bike tour could lead to identikit holidays with everyone touring the same few tourist traps. All of this could damage local economies although, perhaps, some local communities might enjoy corralling tourists into one place.
The Reputation Barometer caused the most internal debate
At the ODI we are strongly in favour of individuals having more control over data about them. We believe that data should be used to make more and better decisions, but everyone needs to be careful about the kinds of futures that can be created by and for societies. People, organisations and societies that end up being driven by data, delegating all of their decisions to ‘clever algorithms’, might be efficient and convenient for some, but could also prove flawed, ineffective and unfair for others.
The Reputation Barometer envisioned a society in which the services people received were determined by the reputation scores other people give them. It is a scenario familiar to viewers of the Black Mirror episode Nosedive. In such a society, people, businesses and governments might expect, or even demand, to see reputation data when making decisions.
Would it be fair that a bad rating on a sharing economy platform could damage your ability to rent a flat for the long term or get a job? How would you understand or appeal the decisions? Would you even know if an organisation was making a decision about you using bad data or a dumb algorithm? Would you be able to change your mind about giving those organisations access to data about you and stop those organisations from using it? Or be able to correct bad data?
Data needs a stronger human element
A system based on reputation data might be what some societies want and decide to build. Some say China’s social credit system is heading in that direction, or that Western society’s credit score systems will do the same. But we believe that many societies will choose to have a stronger human element.
This will start with a clear rights framework. This framework will include strong regulators and protections in favour of privacy, against discrimination and in support of innovation. Organisations that deliver services will include ethical considerations and good design practices in their processes. Human decisions – with all of their flaws, biases and moral dilemmas – will improve by virtue of being informed by data, with support from consumer organisations, well-designed ethical services, and regulators as a backstop.
Monitoring progress towards good outcomes
It is still hard to monitor the effects of data portability and whether, despite all of our efforts, we will end up with adverse outcomes. These might be due to accidents or malicious actors. Once data is shared between organisations it can be copied, and it becomes ever harder to understand who has access to it and how it is being used.
Despite these difficulties, societies – and particularly policymakers – will need to learn how to monitor and respond to the effects of data portability, both in terms of the benefits and the harmful impacts. This monitoring will need to include being open about how personal data is being used as well as how equitably the benefits are spread. It will be necessary to help societies shape technological progress to deliver the outcomes they decide on.
If harmful impacts occur, and trust in data starts to be lost, then we can imagine stronger government intervention across the world. Building from our example of reputation data, perhaps reputation will become what the UK calls a ‘protected characteristic’, with warning signals and inspection measures in place to check that organisations are only using reputation under certain conditions.
At the extreme there could be a public backlash that takes us into a ‘data wasteland’. People might withdraw permissions for data about them to be used, even for public good. As individuals restrict their privacy settings, we risk adding bias into what might otherwise have been useful datasets. In some countries, controls may be put in place that increasingly limit researcher and business access. This could unnecessarily inhibit the potential of data portability to create more effective and efficient services and a fairer data ecosystem for us all.
Building a better and more open future
The team debated these issues long and hard.
Our consensus was that the future we glimpsed was as scary as it was full of potential to make things better. We think that a future more informed by data will be better for governments, businesses and people. We think that a future in which people can move and share data between platforms and services has many benefits, such as more choice, greater access to more efficient and effective services, and a thriving economy. But we can also see risks of harmful impacts, and even dystopian possibilities.
The work of the Open Data Institute, and other organisations, needs to help achieve the many benefits while heading off the risks. Our innovation programme, which both this project and our recent exploration of data portability in the telecoms sector with IF have been part of, helps with this work.
We are about to start another project in that programme looking at different ways to widen access to data while retaining trust, and we are also starting more data portability projects for business and governments.
If you want to talk about the issues raised in this post or work with us to use data to help build a better future, get in touch at [email protected].