Patrick Fagan

The former Lead Psychologist at Cambridge Analytica talks with Anna Scott about trends in consumer behaviour, how a growing number of sectors analyse it to influence people’s buying, voting, donating or travelling habits, and where he thinks responsibility and trust come in. You can read the ODI’s comments on the story around Cambridge Analytica and Facebook, and our views on personal data being shared, on our ODI View page.

Patrick will be on a panel about trust in data and tech – with panellists from DeepMind, the UK Information Commissioner's Office and Doteveryone – at the ODI Summit. Book your place here.

Hi Patrick. How are you, and where are you working at the moment? 

I’m very well, thanks – I’ve just come back from Italy. At the minute I’m at a company I contract for that does short-term work placements. I’m helping them understand employees so they can be put in the best jobs.

You study consumer behaviour. Could you tell us about what that means?

My elevator pitch is that I turn minds into money. I take insights from across the range of psychological research, from developmental psychology – how the stages of development in childhood affect what people buy – through to cognitive psychology, and everything in between.

It’s about taking science and academia and applying it in a commercial setting – understanding how to get people to buy something, vote for somebody or go to a leisure destination.

What sort of companies have you seen becoming more interested in consumer behaviour recently?

It’s really taking off right now; most companies across the board are getting more interested in it. I would say it built up speed in advertising – which has always been about understanding and influencing an audience – but also in financial services, because it’s very rational and there are concrete behaviours and data involved. Charities are interested for the same reasons, and the public sector too. Everyone is interested in it – all sorts of companies and sectors.

How much do you think consumers understand how data about them is being used by companies? Do you think it matters?

I don’t think they understand at all, and I put myself very firmly in that category too. People generally care about emotional things, and data is very rational and complex – YouTube is more popular than Wikipedia, for example.

The vast majority of people don’t read the fine print on financial services products, so they won’t have the time, energy or information to look into data privacy. To them it probably doesn’t matter, because it’s such a rational thing.

With Facebook, for example, I suppose there’s a very emotional thing about people not liking to feel creeped out. But we care more about being able to keep up with friends than we do about data privacy, and it’s too complex for anyone to get too deep into.

But does it matter, ethically? Yes, of course.

Do you think the onus is on people to understand more about how data about them is being used?

It is, but then people don’t make the best decisions in their own self-interest. I don’t think we can rely on people – and I include myself in this – to make the best decisions and to be informed on data privacy.

While I’m not personally a fan of the ‘nanny state’ or what have you, I do think there needs to be some oversight on these things, because people just don’t have the time or the know-how to do it themselves.

Do you think GDPR has made a difference in how people see data about them being used?

I suppose it has raised awareness of the issue and made people think about it. The data scandals have done the same – people didn’t realise that data about them was being used in that way.

So, it’s useful for awareness – but in terms of making people knowledgeable, I’m not really convinced. It might just be me, but whenever one of those new GDPR pop-ups comes up on a site, I just click whichever button is biggest to get rid of it. I don’t read it, and I think most people are the same.

Do you think there’s a crisis of trust in data or tech at the moment?

I think there’s a crisis of trust overall, in everyone and all institutions – particularly the old guard, such as politics and the media. I think that’s driven by data, because there’s freedom of information and it’s much easier to find out if a company is acting unethically or if a media outlet is not reporting as honestly as it should be.

It’s also more about the people behind it. There’s probably more of a crisis of trust in Mark Zuckerberg than in the data itself. People – particularly young people – recognise the value and usefulness of data. It just needs to be used ethically.

How far do you think trust in a service affects its use, and do you think this is changing?

Trust certainly does affect use. There’s research showing that people will punish something they deem unfair even if it hurts them to do so – they’ll go somewhere else even if the service there is not as good or not as cheap. It definitely affects use.

I’m not sure if it’s changing. The effect is still there – people are getting less and less trusting – and they now have more opportunity to switch to other services quite easily.

You’re very much an expert in methods for audience segmentation. Do you see those shifting in the next few years?

Yes. Obviously, behavioural science is taking an increasingly important role in understanding groups and individuals. Micro-targeting will be a trend, and it will go beyond the ‘big five’ personality traits – the big five is a good model, but we will become a bit more sophisticated in terms of the traits we look at.

Segmentation is likely to happen more and more through passive data collection rather than through explicit trait measurement. For example, we can infer what someone’s personality looks like from the types of groups they ‘like’ on Facebook. You can tell how extroverted someone is from how many people they call on average each day, or see whether they’re depressed by looking at their location data and how much they’re moving around. I think we will almost certainly see a move towards that kind of passive segmentation.

The final thing is AI – it’s not really AI, but people call it AI: automated segmentation with no human involved, and automated use of that segmentation, such as optimising adverts without any humans in the loop.

Aside from the humans who program them, right?

I guess so … for now.
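To make the passive-segmentation idea Patrick describes concrete for technically minded readers, here is a minimal, purely illustrative sketch in Python. The sample data, variable names and proxies – distinct contacts called per day as a stand-in for extroversion, and the average distance of GPS fixes from their centroid as a stand-in for mobility – are invented for this example; they are assumptions, not Patrick’s method or anyone’s production system.

```python
from math import radians, sin, cos, asin, sqrt
from statistics import mean

# Hypothetical daily records for one user: distinct contacts called per day,
# and GPS fixes (lat, lon) from their phone. Purely invented sample data.
calls_per_day = [3, 5, 2, 7, 4, 6, 3]
gps_fixes = [(51.5074, -0.1278), (51.5155, -0.0922), (51.5033, -0.1195)]

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

# Crude extroversion proxy: average number of distinct people called per day.
extroversion_proxy = mean(calls_per_day)

# Crude mobility proxy: mean distance of each GPS fix from the user's
# centroid location (a simplified "radius of gyration").
centroid = (mean(lat for lat, _ in gps_fixes),
            mean(lon for _, lon in gps_fixes))
mobility_km = mean(haversine_km(p, centroid) for p in gps_fixes)

print(f"extroversion proxy: {extroversion_proxy:.1f} contacts/day")
print(f"mobility radius:    {mobility_km:.2f} km")
```

In practice, features like these would not be read off directly as traits: they would be combined with many other behavioural signals in a statistical model trained against questionnaire scores, which is what the research on inferring personality from Facebook likes does at scale.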

What are you excited about for the ODI Summit this year?

I’m excited about the panel on Emerging Tech, because that’s what I’ve always been interested in – particularly biometrics and emotion recognition. I hope that might be covered in the discussion.

Patrick will be speaking on ‘Building trust in data and tech’ at the ODI Summit. Tickets are available here.