Here, we speak to Pamela Duncan, Acting Data Project Editor at The Guardian, and Andy Dudfield, Head of Automated Fact Checking at Full Fact about trust and misinformation
As part of the Data Decade, at the Open Data Institute (ODI), we are exploring how data surrounds and shapes our world through 10 stories from different data perspectives. Trust and misinformation explores the practices and technologies that are emerging to address the spread of untruths and misinformation.
Listen to the podcast now
You can also subscribe to the ODI’s podcast on your preferred platform to receive future episodes when they’re released: Apple Podcasts | Spotify
Emma Thwaites: Hello, and welcome to Data Decade, the podcast from the ODI. I’m Emma Thwaites, and in this series, we look at the last 10 years of data, the decade ahead, and the transformational possibilities for data in future. So far, we’ve looked at data in arts and culture, how data is shaping our cities and who looks after data. But in this episode, we’re exploring an issue really close to my heart as a former journalist: we’re asking the question, true or false? False and misleading information is nothing new. We can go back thousands of years to ancient Rome to find stories that aren’t true.
Whether to make people question who we can trust or to change people’s views about something. But we now live in an age of the web and social media, and there’s far more false and misleading information that travels faster than ever. And misinformation affects everyone. It can cost people, their livelihoods, their mental health, and in extreme cases, even their lives.
At the ODI we want data to work for everyone. For this to happen, people must be able to trust the way that data is used, shared, collected, and stored. But how can we build this trust when we hear about fake news and misinformation? Let’s find out. Welcome to Data Decade.
So I’m really looking forward to this episode, and over the next 30 minutes, we’re going to explore the practices and technologies that can help us to stop the spread of misinformation. And also what happens if it’s left unchecked. I’m delighted to be joined, to talk about this by Pamela Duncan, the acting data project editor at The Guardian and Andy Dudfield, head of automated fact checking at Full Fact.
Welcome to both of you. Great to have you with us. So Andy, I’m going to start with you, if I may. Full Fact is about the same age as the ODI – I think you’re 11 years old. You sound quite advanced and mature for an 11 year old, I’ve got to say! What changes have you seen in the last decade around your work and around data in general?
Andy Dudfield: That’s yeah, that’s a great question. So I’ve been at Full Fact for about three years now. Full Fact for those who don’t know is a fact checking organisation based in the UK with fact checkers and campaigners looking to expose and counter the harms of bad information.
And I think the main thing that we’ve seen over the last 11 years or so is a change in scale. We’ve gone from a world where – we weren’t quite only checking the front pages of newspapers, but it was a fairly obvious, smaller group of things – to an explosion. The internet has fundamentally changed the way that we communicate, and it’s changed the scale of the problem of misinformation that we’re addressing.
And so we’ve had to adapt, and I’m sure we’ll be talking a bit more about this later, but we’ve looked at the use of technology to support our work because the volumes of data we’re dealing with have become huge.
And we’ve also had to adapt our tactics. Fact checking is a relatively new discipline, I suppose, but we’ve moved from the idea of publishing a fact check and thinking that it will automatically change the world just by being published, to more of a world where we publish our fact check and we follow up: we ask people to correct the record, we pursue change, we look at where we can lobby for change in legislation, we partner with internet companies, and we try to work with a wider cycle of making sure that we can have the greatest impact possible.
And also the thing that we notice with misinformation is that it goes in cycles. We see things come and go over time. Just the other day I was looking at a fact check whose origination was in 1954 – about somebody arriving in Japan without a passport. And there’s a modern twist of this where apparently a man arrived in the 1950s in Japan from an ‘alternate universe’ – but we see these things come and go.
And that’s a kind of a glib example, but we see these things time and again around vaccine hesitancy and things that really have a damaging impact on people’s health and the choices that they’re making.
Emma Thwaites: So there are kind of trends of falsehood?
Andy Dudfield: Yes, there are certain things that do come and go, sadly, around some of the medical things – we can see this with, say, vaccine hesitancy around the COVID pandemic, but we’d also seen this for other vaccines previously.
And we also see some of this around democracy and electoral misinformation. Pretty much every time we have an election, we will see some kind of information about maybe the polling day having moved for a particular party, or that you aren’t allowed to use a pencil or a particular type of pen when you are doing your democratic duty. And those things come around time and again – you can almost set your clock by them.
Emma Thwaites: Absolutely fascinating. You’ve mentioned in that answer that the scale of the issue has grown almost exponentially. Can you give us a sense of what sort of scale we’re talking about? How many untruths are circulating on a daily basis?
Andy Dudfield: Sure. So we use a range [of methods], particularly artificial intelligence, to monitor the UK media landscape. And we have an AI model that we’ve developed, based on a lot of annotations from fact checkers, to identify claim-like statements – things that we might be able to fact check – and each day we’re probably identifying around a hundred thousand potential things that we could fact check.
And so that is a huge amount of information that we’re seeing as an input. And if we compare that to the volume of fact checks that exist in the world – fact checking, in the way that we are looking at it, is at most maybe a 20 year old discipline – there have been around about a hundred thousand fact checks written in that time.
So the totality of fact checks that exist is round about the same as the number of claims we are spotting that we could fact check each day. And so we really are looking at a world where there are hundreds of thousands of things, and the information that we are publishing – the fact checks – is seen by hundreds of millions of people.
Emma Thwaites: Some of this can have very serious consequences, but there is a light side to it. You mentioned the person arriving in Japan without a passport, from an alternate universe. Have you got any other examples? The sort of slightly more bizarre things that circulate?
Andy Dudfield: I think the world continues to surprise us. And very much, it depends on where the information is coming from and what we’re expecting to see. So we partner with some of the internet companies to start looking at information that spreads on social media.
And that is the world where we might see claims about people from alternative dimensions and the like. But then we also partner with other organisations where we’re seeing things that are much more pertinent around health information. We did a project where we partnered with an organisation called Pregnant Then Screwed, who are an amazing charity who look out for the rights of pregnant people.
And we saw some very particular stories coming through from that as well. So working with communities who are affected by misinformation means that we see a really wide spectrum of the different types of information that it is important to check for people’s lives.
Emma Thwaites: And are you able to spot social trends, you know, along similar lines?
Andy Dudfield: So I think there’s an element of being able to do that. We have a vast corpus of data now – we’ve been using technology for a number of years and we have a pretty strong back catalogue of that information.
And we can certainly see things, for example, around 5G. This was maybe a couple of years ago now – there was quite a lot of misinformation around 5G, where people were worried about 5G telephone networks coming near their houses and things like that. It’s completely false, obviously, but something like that has carried on for quite some time.
4G had some other things, 3G had some other things, and we can see these trends build up over a number of years. To an extent it is an impossible art, but we do try to keep ahead of where those trends are going, so that we can make sure we’re taking preventative action where possible.
Emma Thwaites: Excellent thanks. Pamela, you’re a journalist and you use data in your stories and write about how data is used. What issues have you seen in the last few years?
Pamela Duncan: There are a couple of different ways of answering that question. I mean, journalistically my job is usually to provide solid facts rather than to disprove misinformation. And we’re very lucky to have the Full Facts of this world to do that job for us, because it really is no easy task, as you can tell by the scale.
But while disproving misinformation does make up a small part of my job, it does occur. So for example, a number of years ago, my colleague Alex Hern and others looked at the number of times that Twitter accounts were being cited in UK news articles by established mainstream organisations, including our own – although we were very glad to only have a handful of these examples. But we were able to trace a number of tweets back to deliberate misinformation pushed out by Russian bots.
In June 2018, the US Congress released details of 1,000 accounts that Twitter believed to be run by the Internet Research Agency, a state-backed misinformation operation run out of Russia. That was on top of 2,000 accounts that they’d already identified. And our exercise was to take these accounts and trace them back to UK media outlets. And we found that these Russian trolls had been cited more than 100 times by established mainstream UK news organisations.
And I think that’s really telling in that we, as journalists, are always striving for accuracy, but we’re also time poor, and sometimes you want to reflect the zeitgeist. But even in doing so – even in trying to hoover up information from the internet, or Twitter in this instance – you can come across these deliberate misinfo postings that you can then in turn embed in your piece. So that’s a really good example of how it can trip up even people who are professionally trained against it.
And then in a different investigation with the same journalist, Alex, in collaboration with the German broadcaster Norddeutscher Rundfunk, we looked at predatory open access publishers – organisations that purport to put out scientific papers, the majority of which undergo no peer review. So again, you can see where there’s a deliberate effort to make misinformation – and in this case, sometimes disinformation – look legitimate through these means.
Emma Thwaites: It must be so hard for you as journalists, I can’t imagine, you know, operating in that kind of environment, because as you say, you are really time poor. I would imagine often working on several stories at once with quite a lot of editorial pressure. How do you navigate that on a daily basis?
Pamela Duncan: Firstly, it’s lovely to hear sympathy for journalists, because that’s not something you tend to get very often. It’s really very difficult to navigate, and we as journalists have a responsibility to be careful not to give oxygen to the fire of this misinformation, either by rehashing information via tweets or by citing information that is false.
And there’s no easy answer to that. Some of it comes down to journalistic instinct. We as data journalists are often asked to help interpret things that involve data, because data can be spun or misconstrued. Part of our job, I often say, is not actually to create stories – it’s also to kill stories, because we can’t stand over the data behind them. We have a kind of fire warden role in that regard. And it’s really important that all journalists, young or veteran, are aware of this. Regardless of a journalist’s experience, you have to be aware, because the traditional routes that some veteran journalists might have used to source their stories might not necessarily be what they look like anymore.
So I think everybody has to be very, very careful about who they’re citing and who might be behind that information. Most trained journalists will do that by rote, but I think our ultimate fear is that we will one day get caught out, and that’s just another consideration in any journalist’s mind.
Emma Thwaites: Some of this misinformation – where does it all come from?
Andy Dudfield: The way misinformation propagates and spreads – and, I suppose, where it ultimately comes from – is a really difficult question to answer. It is a very complicated thing, and there are definitely differing factors at play. Part of it is around conspiracy theories, which tend to flourish amid uncertainty. We’ve certainly seen studies showing that people are likely to turn to conspiracy theories when they experience a state of anxiety and powerlessness – when their motivations are being played with, I suppose. And that is really important: one of the things we often say to people about spotting misinformation is, how does something make you feel? Do you feel a strong connection? Do you really strongly agree with this? Do you really strongly disagree with this? Does it seem too good to be true? Because potentially those are the early warning signs to think about: where did something come from? Can you see an authoritative source for it? Can you see lots of people talking about it? Do you recognise the name or the brand of the site where this information is coming from?
And then in terms of the different routes, I suppose, that things can come from, it can be lots of different things. It can be people who are bored and messing around on the internet, the kind of things that might come out of some of the more fringe message board communities.
It can be, perhaps at the other end of the scale, something like full state-actor-level misinformation. And it can be those jumping-off points, where something goes from a particularly fringe or viral thing to a more mainstream piece of information. And that’s a direction we see things coming from.
And ultimately, conspiracy theories perhaps aren’t limited to the fringes of the internet. We fact check national media organisations who have moved into these kinds of areas as things become more and more popular. Something like the QAnon movement – mainly out of America, and turning into really real-world consequences – is maybe a good example of how powerful these things can become very quickly.
Pamela Duncan: That idea of: it’s how you feel, right? So I think there are various examples where you can see people’s motivation being deliberate misrepresentation or disinformation, and it serves a purpose, right? An ideological purpose.
And then there are people for whom it is based on a feeling or a personal experience. So, I hate to bring up such a sad example of data, but when it came to COVID deaths that formed a massive part of our journalism from 2020 onwards.
Maybe naively, at the very start I thought: Covid deaths – it can’t be that hard to do. We’ll get the number of deaths from the ONS, we’ll count them week on week, and those will tally with what the government figures are telling us. And we found really quickly that that was actually a very difficult thing to do, and to do accurately.
So the ONS data is still the most categorical count of Covid deaths, and is the gold standard when it comes to the statistical counting. They represent deaths both in terms of occurrences, as per the date of death, and in terms of registrations: how many deaths are registered in a week.
The issue with the ONS data is that the level of methodological checking they put into all of their statistical bulletins means it takes a long time to complete. So there’s a 10 or 11 day lag between the registration week they’re reporting on and the data actually being published.
Because we had a live situation, the government chose to go a different route. It counted the number of deaths that had occurred among people who’d had a positive Covid test within the past 28 days. And at the start, that only applied to people who died with Covid in hospital. And so we found very quickly that we had this relatively small number coming out of government and a relatively large number coming out of the ONS, but at a delay.
And those who were conspiratorially minded, or who didn’t believe in Covid, or who had various different ideological reasons, jumped on this gap between the two figures as somehow proof that it wasn’t correct. And then it transpired later on that the government figures were overcounting in certain instances – because if you had gotten Covid and then recovered, but a week and a half later, say, you were hit by a car, your death would still be recorded in the government figures as a Covid death. And so when that was corrected for, again we had this spike in people using our article explaining why this had happened and the difference between the two – taking what we were saying, all fact checked, all verified, and using it for the purposes of misinfo by saying: “see, even The Guardian are saying that this is wrong”. And you feel that very keenly, because you’re very aware that all of a sudden your name is being used for evil.
But then there are other examples where it’s nearly like innocent misinformation. As Andy said, you have to look at how people feel. And we got a letter from a reader, basically saying that they had a personal experience whereby someone close to them had died. That person had a longstanding condition, so the primary cause of death was not Covid, but they also had Covid at the time of their death, and so they were listed as being a Covid death.
And around the same time, there was a piece of misinformation put out based on an FOI – so it looks legitimate – which said how many people had died directly of Covid in a certain time period. That time period was quite small, so it wasn’t representative of the wider pandemic. And so this reader wrote to us saying, “see, your figures aren’t right, because I have this figure from an official source that says that this is not the right way of counting them”, trying to undermine the ONS’s information. And you’re just in this kind of minefield then, because you can completely understand where that person is coming from, but at the same time, you don’t want to discount what they’re experiencing personally, because they feel that they have a legitimate claim.
I think that there’s like deliberate misinformation, and then there’s getting taken in because you may be in a vulnerable position for whatever reason and need to believe something to be the truth.
Emma Thwaites: It feels to me like there’s a very thin wafer between fact and fiction, truth and falsehoods. It all just feels a little bit out of control in a way. And Andy, I’m going to come to you with this next question. Do you feel things are getting out of control both in terms of volume and some of the subtleties that Pamela was talking about there? And actually, how do we as normal citizens navigate the world and know what is, and isn’t real?
Andy Dudfield: This is the existential question we’re starting to get to now. So, is it out of control? No. Is there more that needs to be done? Absolutely. Part of this is connected to the mediums where this kind of information is propagating. The web is a wonderful thing – it’s a beautiful medium, and I suspect we’re all fans on this podcast. But it does mean that we have this decentralised approach where anybody can say anything about anything at any time, and that creates just a huge volume of information. And there is no real centre to the web – there’s no ‘bit’ where you can go: well, this is just where the real, solid information is, that you can totally rely on.
What we do have is fact checks. And I think one of the key things with fact checks is that they are there to allow people to make up their own mind. It’s a really important part of the fact checking process to say: here is our working out, here is how we’ve arrived at this conclusion – you can decide whether you agree or disagree with what we’ve put together. But we try to make sure that we have those kinds of steers. And I think it is really important to think about how we make sure that kind of information propagates far and wide – how that content appears on other platforms. One of the really interesting data points, I suppose, from a Full Fact point of view is that we have a really healthy volume of people coming to our website, but far more people will consume the information on other platforms. So hundreds of millions of people will see our content in Google search results, or in Facebook interfaces, or in YouTube or Instagram. And thinking about the right way to approach people when they do see this information, so that they’re not annoyed or frustrated by it and they see it as helpful and supportive, I think is a really important thing.
Emma Thwaites: And there’s certainly a real awareness amongst the public of misinformation and disinformation.
Andy Dudfield: Yes – I’ve worked for Full Fact for three years now, and it really feels like fact checking has come into the mainstream. It feels like a very zeitgeisty issue these days. I think there’s a general awareness of it, which is really important, but there are also systems. So fact checks provide excellent evidence of challenges and problems that need to be addressed. Some of that can be countered by individual people seeing individual bits of good information, and some of that is about the wider processes – the policies of large internet companies. And certainly in the UK, that is connected to things like legislation. So we have the online safety bill, a proposed act being worked through at the moment, which the government is trying to establish as a regulatory framework to tackle harmful content online.
And I think there’s definitely an important part of that, to think about how it tackles bad information – to make sure that harmful misinformation and disinformation online is actually part of that, and also that we really consider the harm it can cause to individuals and communities. And I guess ultimately it’s about democracy, because when you start to get people losing trust in some of these institutions, and that can be appropriated by bad information, you have really serious consequences, I think.
Emma Thwaites: I was going to ask you about the most under-reported aspects of misinformation and disinformation – what we don’t hear enough about. And I think, in the interests of fairness, I’ll come to you with that one first, Pamela.
Pamela Duncan: You’ve got to credit people with a lot of common sense. A lot of people already realise that there is a lot of misinformation in the world, and they themselves want a better standard. So if there’s anything positive to be said on this whole topic, it’s that misinformation and disinformation do have the knock-on effect of inspiring people to invest in our journalism. And at The Guardian we are able to measure that through subscriptions. So people are doubling down on the truth by subscribing to The Guardian at times of upheaval: during the Cambridge Analytica scandal, the 2020 presidential election – and Trump’s bogus claims at that time – all the way through to the Capitol riots. And around the climate crisis, when we’re exposing climate deniers, correcting inaccurate data and myth busting. So I think if there’s any positive to be taken, it’s that people also have a strong appetite for good, well researched, traditional journalism.
Andy Dudfield: My background is in being a nerd and doing data things, and I feel like there’s definitely a part of the process that is making sure that good information exists to support the work of fact checking.
And I don’t think we spend much time talking about good information. By that I mean things like the importance of the way that official statistics are published, or the way that information is published on government websites. I think we’re tremendously lucky in the UK to have the ONS, who produce really high-quality content. But there are always cases where information can be in a PDF somewhere, or in an Excel file that’s hard to find, and that creates complexity. So making sure that good, authoritative information is easy to find and has all the correct context and caveats, to ensure that it can be used in the right ways, is really important, because it’s a really strong foundation for good information. But this is also the kind of unglamorous plumbing that is required to speed up the process of fact checking.
And that’s something that we really need to engage with. We want to be able to go faster, we want to be able to do more, and some of the things that stop us doing that are the inputs – so better quality data being published is definitely part of that. And then there’s being able to use technology to support parts of that as well. We’re experimenting with the use of artificial intelligence to support our work, and I think that is definitely going to be part of the picture in terms of the things that we need to talk about. Any time technology is being used in a process – and for Full Fact, everything is always human in the loop, with technology supporting somebody to make a choice rather than replacing them – we need to be talking about the consequences of that technology: being transparent about how it’s used, what algorithms are used, what they’ve been trained on. All of those things are really important parts of building trust in any system, I think.
Emma Thwaites: Excellent. I feel like we’ve done quite a lot of crystal-ball gazing in this podcast, trying to look ahead at what the future might hold. But I do have to ask you to make a prediction. I’m going to do a Desert Island Discs now and allow you one prediction for the future, which can be one that you’ve already mentioned or something new. And if you could absolutely make yourselves hostages to fortune by making one prediction each, that would be amazing – for the next 10 years, or even beyond. Where do you think we’re headed?
Pamela Duncan: Unfortunately, I fear that things are going to get worse before they get better. I think that trust in the media has really dipped, which is kind of terrifying, because as a journalist you’re permanently worried about a correction or clarification. I can’t put across how much we strive for accuracy in what we do, and we can’t always get it right.
You stake your reputation every time you publish an article, and there are also legal implications, so you really do want to get things right. But now we’re at a juncture where your work can be misrepresented either as a false flag or as a means of proving a conspiracy theory, as I mentioned earlier on. For a lot of journalists, it feels like the rug is being pulled from under us: we spend so much time checking sources, only to be undermined, without proof, by somebody with a strong conspiracy theory and a big audience. I think that’s muddying the waters quite considerably. I’m hoping that things will come good, but I fear that there’s more road left to run for conspiracy theory, mis- and disinformation for some time to come.
Emma Thwaites: Andy, maybe you can offer us some hope!
Andy Dudfield: I can try to be hopeful. I think there will be a lot more focus on international collaboration in this space. One of the really interesting things about misinformation is that it is never neatly confined to a particular country – things come and go.
And I can see a real world where people are collaborating and working with each other, and particularly sharing the use of technology to support each other. It’s something Full Fact have done a fair bit of over the years, and something we will continue to do more of.
But the kinds of technologies we’ve been describing that have been monitoring the UK media have been used in Kenya, Nigeria and South Africa over the last year. And trying to make sure that the precious resources, time and money we can focus on this problem are joined up as well as possible feels like something that we have no choice but to engage with and succeed at in the coming years.
Emma Thwaites: So, in your own individual, unique ways, I wish you well in fighting the good fight. Pamela Duncan from The Guardian and Andy Dudfield from Full Fact, thank you so much for joining the Data Decade podcast.
In the age of the internet and social media, there is a huge volume of false and misleading information, and it can travel faster than ever before. This misinformation affects everyone. For some people, it has cost them their livelihoods, their mental health, and in extreme cases, their lives.
At the ODI, we want data to work for everyone. For this to happen, people must be able to trust the ways that data is used, shared, collected and stored. But how can we build this trust when we hear about fake news and misinformation?
Here, we speak to Pamela Duncan, Acting Data Project Editor at The Guardian, and Andy Dudfield, Head of Automated Fact Checking at Full Fact, to explore the practices and technologies that are emerging to address the spread of untruths and misinformation, and what happens if it’s left unchecked.
The internet has changed the world of fact checking, from a relatively small-scale task to an explosion of information and claims. The internet has fundamentally changed the way that we communicate – now, everyone is a publisher – and the scale of the problem of addressing misinformation has expanded exponentially.
Full Fact is a fact checking organisation based in the UK with fact checkers and campaigners looking to expose and counter the harms of bad information.
Andy Dudfield, Head of Automated Fact Checking at Full Fact, explains: ‘We’ve had to adapt. We've looked at the use of technology to help us support our work because the volumes of data we're dealing with have become huge.’
Dudfield also notes the cyclical nature of misinformation. ‘We see these things time and time again, around vaccine hesitancy – things that really have a damaging impact on people's health and the choices that they're making.’
Elections are also a time where misinformation is rife: ‘Pretty much every time we have an election, we will see some kind of information about, maybe the polling day has moved for a particular party; or that you aren't allowed to use a pencil or a particular type of pen when you’re doing your democratic duty’.
Dudfield explains how the team uses artificial intelligence (AI) to monitor the UK media landscape, including the development of an AI model, trained on a large body of annotations from fact checkers, to identify ‘claim-like statements’.
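Full Fact's actual system is a machine-learning model trained on fact checkers' annotations, the details of which aren't given here. As a rough illustration of the underlying idea – scoring sentences for checkable, factual assertions – here is a minimal, purely heuristic sketch (the cue patterns and threshold are invented for the example):

```python
import re

# Toy heuristic for flagging "claim-like statements" in text.
# The real approach described above is a trained model; these
# hand-written cue patterns only illustrate the concept.
CLAIM_CUES = [
    r"\b\d+(\.\d+)?\s*(per cent|%)",                        # statistics
    r"\b(million|billion|thousand)\b",                      # large quantities
    r"\b(rose|fell|increased|decreased|doubled|halved)\b",  # trends
    r"\b(more|fewer|less|higher|lower) than\b",             # comparisons
]

def claim_score(sentence: str) -> int:
    """Count how many claim cues a sentence matches."""
    return sum(bool(re.search(p, sentence, re.IGNORECASE)) for p in CLAIM_CUES)

def find_claim_like(sentences, threshold=1):
    """Return sentences whose cue count meets the threshold."""
    return [s for s in sentences if claim_score(s) >= threshold]

sentences = [
    "Unemployment rose by 3 per cent last year.",
    "I really enjoyed the conference.",
    "The scheme cost more than 2 billion pounds.",
]
print(find_claim_like(sentences))
```

A production system would replace the cue list with a classifier learned from annotated examples, but the pipeline shape – sentence in, checkability score out – is the same.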
Pamela Duncan, Acting Data Project Editor at The Guardian, explains how her work as a journalist dovetails with this: ‘Journalistically my job is to provide facts rather than disproving misinformation,’ she explains, but notes that sometimes her work does stray into investigating (sometimes dubious) sources.
Along with colleagues, Duncan investigated how often Twitter accounts were cited in UK news articles. They found that deliberate misinformation pushed out by Russian bots was cited more than 100 times by established mainstream UK news organisations.
Duncan also discusses sources she describes as ‘predatory open-access publishers’ that purport to publish scientific papers, but the majority of which undergo no peer review. ‘So again, you can see there’s a deliberate effort to make misinformation and disinformation look legitimate’. She notes the responsibility of journalists to act as a ‘fire warden’: ‘We as journalists have a responsibility to be careful not to give oxygen to the fire of this misinformation either through rehashing information, via tweets or citing false information’.
Origins of misinformation
But where and why does misinformation occur, and what is the motivation to create and share misleading content?
Dudfield notes that conspiracy theories flourish in times of uncertainty. ‘We've certainly seen studies that are showing that people are likely to turn to conspiracy theories when they experience a state of anxiety and powerlessness,’ he says, adding that these can originate from the fringes of message-board communities, all the way through to full state-actor-level misinformation.
‘Ultimately conspiracy theories perhaps aren't limited to the fringes of the internet. We fact check national media organisations who have moved into these kind of areas, as things become more and more popular.’ And when conspiracies such as the QAnon movement are picked up so widely, there are real-world consequences.
Out of control?
So is the scale of misinformation out of control? And how can people navigate the world and know what's real?
‘Is it out of control? No. Is there more that needs to be done? Absolutely,’ says Dudfield. He points to the web as a source of propagation. ‘The web is a wonderful thing. It's a beautiful medium… But it does mean that we have this decentralised approach where anybody can say anything about anything at any time. And that creates just a huge volume of information. And there is no real centre to the web. There's no ‘bit’ where you can go where there is real solid information that you can just totally rely on.’
And that brings us back to fact checking. ‘I think one of the key things with fact checks is that they are there to allow people to make up their own mind. It's a really important part of the fact checking process,’ he says. ‘You can decide whether you agree or disagree with what we've put together’.
Trust in government, statistics and the media
While the UK government and the ONS publish a lot of high-quality content, there is room for improvement in the format and method of data sharing, which, in turn, can improve transparency and trust. Important data, statistics and information are all too often still buried in a PDF or MS Excel file, making them hard to access, use and share.
‘That creates complexity,’ Dudfield says. ‘So making sure that good authoritative information is easy to find and has all the correct context and caveats to ensure that can be used in the right ways, is really important because I think it's a really strong foundation for good information. But also these are the kind of things, the kind of unglamorous plumbing, that is required to speed up the process of fact checking.’
Making data FAIR (findable, accessible, interoperable and reusable) is key to unlocking its power. ‘We want to be able to go faster. We want to be able to do more. And some of the things that stop us doing that are the inputs. And so better quality data being published is definitely part of that.’
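The difference the publishing format makes can be shown with a small sketch. A statistic released as machine-readable CSV (the data below is hypothetical) can be consumed programmatically in a few lines; the same figure buried in a PDF would first need extraction and cleaning:

```python
import csv
import io

# Hypothetical statistics published as machine-readable CSV –
# trivially parseable, unlike a figure embedded in a PDF.
published_csv = """region,year,unemployment_rate
North East,2021,5.9
London,2021,6.2
"""

rows = list(csv.DictReader(io.StringIO(published_csv)))
london = next(r for r in rows if r["region"] == "London")
print(london["unemployment_rate"])  # → 6.2
```

This is the ‘unglamorous plumbing’ Dudfield refers to: when the data arrives with structure, context and caveats intact, fact checking can be faster and more reliable.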
Trusted media outlets are also vital, and one of the side effects of the avalanche of (mis)information to navigate is the renewed demand for trusted, factual reporting.
Duncan explains that people are demanding a better standard of fact-based journalism and reporting. ‘If there's anything positive to be said on this whole topic, it's that misinformation and disinformation do have the knock-on effect of inspiring people to invest in our journalism’. She notes that subscriptions are up: ‘People are doubling down on the truth by subscribing to The Guardian at times of upheaval. So, during the Cambridge Analytica scandal; the 2020 US presidential election and Trump’s bogus claims; all the way through to the Capitol riots; and around the climate crisis when we were exposing climate deniers – correcting inaccurate data, myth busting’.
The decade ahead
Looking ahead to the next 10 years of truth seeking and falsehood fighting, has the spread of misinformation plateaued, and are the tools and mechanisms now in place enough to keep it in check?
Duncan thinks we’re not quite there. ‘Unfortunately I think things will get worse before they get better,’ she says. She notes that when fact-based articles can be misrepresented as a false flag or misconstrued as proving a conspiracy theory, for a lot of journalists it feels like ‘the rug is being pulled from under us’. She adds: ‘I’m hoping that things can come good, but I fear that there's more road in conspiracy theory, mis- and disinformation, for a time to come.’
International collaboration is an area where there is room for optimism, as information crosses borders and languages, and so joining up fact checking globally is a key step. Dudfield says: ‘Fact checking and good information as a community – that's really powerful,’ adding: ‘I can see a real world where people are collaborating and working with each other and particularly sharing the use of technology to support each other.’