Some think individuals should own data about them to ensure they fairly benefit from it. Some go as far as saying people should be paid when other organisations use it. In this post, Jeni Tennison looks at how we can all share in the benefits of personal data.
Facebook and Google have two of the largest stock market valuations in the world, according to statistics compiled by Statista. As Visual Capitalist charts have shown, most of their revenue comes from ads. That ad revenue is generated from their colossal reach and from how they use the data they glean from user activities, enabling organisations to place ads in a targeted way.
Some see this business model as centred on these organisations selling personal data. That’s not precisely what they’re doing – they are selling ad space rather than personal data – but to many it can feel like something that is ‘theirs’ is being sold. People ask whether these companies are unfairly benefiting by exploiting data about us, and want those benefits to be more fairly spread.
Getting paid for data about us
One alternative model that some propose is for individuals to have an explicit property right (ownership) over data about them, and for revenues made from that data to be returned to those individuals. Data ownership differs from having data protection rights: it enables people to license, lease or sell data about them – transferring rights temporarily or permanently – whereas data protection rights cannot be transferred in this way. Making this happen would require a significant change to legal frameworks.
In practice, large company profits would not equate to large payouts to consumers. In 2017, Facebook generated about $9 of revenue per user in the EU. The most recent equivalent figure for Google is less than $7 per user. Their profits are lower than this revenue, which would reduce any potential payout further. Sharing in the profits of even the most profitable tech companies would not mean much actual income for any of us individually.
Would receiving $5-10 a year (on top of access to the free service itself) feel like a fair return for the way Facebook uses data about you?
The troubling implications of direct payouts
At the ODI, we worry that introducing direct payouts to consumers could introduce new and troubling dynamics between people and data about them.
Specifically, it gives people a financial incentive to share more information about themselves. For example, some suggest that data-rich but economically poor Indians should monetise data about themselves to get credit or better healthcare. The notion of people exchanging data about them for their health or wellbeing is particularly troubling. It is just one example of where direct payouts could lead, contributing to a pernicious dynamic that could play out in different ways across the world.
The fundamental human right to privacy cannot be upheld only for those rich enough to afford it. As Privacy International says, privacy should not be a luxury. This is also the position of the European Data Protection Supervisor – an independent institution of the EU:
"There might well be a market for personal data, just like there is, tragically, a market for live human organs, but that does not mean that we can or should give that market the blessing of legislation. One cannot monetise and subject a fundamental right to a simple commercial transaction, even if it is the individual concerned by the data who is a party to the transaction."
Understanding the controls we all have
In many countries, people already have substantial rights to control the collection and use of data about them. For people in Europe, or people who use services provided by a company that processes data in Europe, more rights are being introduced from May 2018 by the EU General Data Protection Regulation (GDPR). GDPR sets a higher standard for explicit consent when data is to be collected or shared, with exceptions for things like vital interests and public tasks. We will be able to access data about us and force organisations to delete it. Many organisations already provide versions of these rights in their services, not just in the EU but worldwide.
GDPR is only just coming into force. It was designed to address some of these data issues, and it will take a while to see which issues it helps solve, which it doesn't, and what its effects will be on individuals, businesses, the market and society. Unfortunately, many of us do not exercise the controls we already have over how data about us is used. Having data ownership won’t suddenly make people more able or inclined to navigate and use the controls they have.
In many cases, the controls we have can be hard to understand or find. Not many of us read and understand the terms and conditions of services before giving consent for them to use data about us. Not many of us modify our privacy settings away from the defaults.
As more data is collected about us, it is becoming harder to understand the consequences of the permissions and consent we do give. And, because data is seldom about only one person, other people may use their controls to enable access to data about us in ways we do not know about.
Payments for data introduce a financial incentive to provide broader consent to data access and use. In the Cambridge Analytica case, the first users of the thisisyourdigitallife app were paid $1 or $2 to use it. A condition of that payment was giving access to data about them already held by Facebook – their likes, posts and other activity – and access to data about all their friends too, unless those friends had changed their default privacy settings. The result was data on 50 million Facebook users being used in ways they had technically consented to, but had never anticipated.
We can educate people better and we can improve the design of terms and conditions. But we also need effective regulation, and for regulators, consumer groups and the media to be able to access information about how firms are using and sharing data, so they can be held accountable.
Better ways to share benefits: better taxation and data portability
Market forces are not driving towards better consumer outcomes around the use of data. People feel resigned to the way data about them gets used rather than making rational decisions about the tradeoffs, as a 2015 Annenberg survey showed. Facebook has both huge adoption and high dissatisfaction rates – something brought to light in doteveryone’s 2018 Digital Attitudes Report.
But there are more effective, less dangerous ways to share benefits than through data ownership and paying people for data about them.
For governments, better taxation regimes can return a fairer share of the revenue from large technology firms, so it can be used to fund public services.
For individuals, data portability – which becomes a right in GDPR – provides an opportunity, if implemented with the right safeguards, to benefit from services that provide insight from data collected about us, such as using data about your shopping basket to advise you about where to shop. Those services could be built by new startups, community groups or governments, creating jobs and services in a local context. This data could also be used to benefit society, whether by providing secure access to accredited researchers or by firms like Uber publishing anonymised and aggregated open data.
Safeguards are vital. After all, the scandal currently surrounding Facebook and Cambridge Analytica stems from misuse of data provided through a similar mechanism. We will return to the role of government in data portability in another post.
Data is moving from being scarce and difficult to process to being abundant and easy to use.
However, people are losing trust in the models around how data is currently collected, shared and used. We must find ways to ensure the benefits of that data are felt not only by big tech, not only by those able to afford it or knowledgeable enough to control it, but by everyone.
You can read all of our thought pieces and interviews on the Facebook and Cambridge Analytica story on our ODI View page. If you have comments or experience that you’d like to share, pitch us a blog or tweet us at @ODIHQ.
If you are getting to grips with ethical data use in your organisation, the ODI offers workshops, coaching and resources designed to help you navigate it. Speak with us at [email protected] to find out more.