Wonks and War Rooms

Information Disorder with Claire Wardle

February 23, 2022 Season 4 Episode 2

Claire Wardle is a professor at Brown University and the co-founder of First Draft, a non-profit that focuses on misinformation and the tools needed to fight it. She and Elizabeth chat about information disorder, a term Claire helped coin. The term helps us think about issues related to mis- and disinformation as bigger than questions of what is fact and what is not. Claire explains how it is actually much more important to think about the information environments people find themselves in, how they might be different from other people's information environments, and how things like emotion and sense of community come into play. They also talk about the idea of inoculation against mis- and disinformation.

Additional resources:

  • Claire and her colleague Hossein Derakhshan coined the term ‘information disorder’ in this 2017 report where they break down different types, phases and elements of mis- and disinformation.

  • The hypodermic needle theory also comes up, which is an early idea about how media messages affect audiences. This video primer explains what it is (and why it definitely isn't accepted anymore).
     
  • Claire reminds us that humans are hardwired “to be really bad at this stuff” and talks about the role that emotions play in spreading mis- and disinformation. Claire explains the connection in this First Draft video.
     
  • This article by Linda Monsees looks more closely at the emotional reasons that people share mis/disinformation, and some of the weaknesses of relying only on media literacy as a solution.
     
  • Elizabeth brings up the idea of inoculation theory as a promising way to deal with mis/disinformation, and Claire introduces the notion of pre-bunking. This First Draft guide explains what pre-bunking is and how it can help fight mis/disinformation.
     
  • At the end of the show, Claire also brings up algorithmic amplification, which sometimes gets lost in the mix of conspiracy theories and ‘fake news’ – this CJR post includes examples and a bunch of resources to learn more.

Check out www.polcommtech.ca for annotated transcripts of this episode in English and French.

Elizabeth: [00:00:04] Welcome to Wonks and War Rooms, where political communication theory meets on-the-ground strategy. I'm your host Elizabeth Dubois. I'm an associate professor at the University of Ottawa, and my pronouns are she/her. Today, I'm recording from the traditional and unceded territory of the Algonquin people. In today's episode, we're talking about information disorder with Claire Wardle. Claire, can you introduce yourself, please?

 

Claire: [00:00:25] So, my name is Claire Wardle. If my grandma was listening, I would be Dr. Claire Wardle. I am originally an academic. I did a PhD in communication at Penn. And then I moved back to Europe and I was a professor and thought that's what I was going to do with my life. And then after a few years, I worried that really it was only my mum that was reading anything that I was writing, and around that time I'd become obsessed with this question, well, questions around user-generated content. So how did organizations like the BBC know, when somebody emailed them a picture of a sunset for BBC Breakfast, whether or not it was really a sunset from that morning, etc.?

 

[00:00:59] So weirdly, back in 2008, my research subject was: "How do you tell what's true on the internet?" And so over a decade later, I am still somebody who cares a lot about this question. And it turns out a lot of other people care about this question. So I co-founded a nonprofit, First Draft, which I've been running for the last few years. I'm also a professor at Brown University, in the School of Public Health, which is a recent move. So I sort of straddle academia and practice, which I think on a subject like this is very useful.

 

Elizabeth: [00:01:29] I agree, I think having those different perspectives really sheds light in ways that coming from just one angle can't. So today we're talking about information disorder, which is a term you and your colleagues have really brought to the fore and really defined for us. So normally I start podcast episodes where I'm like, I'm going to give you an academic definition and see whether or not you think it fits. But I'd venture to say you probably think it still fits. So instead, I'll just start off with this. Information disorder, as I understand it, is a bit of an umbrella conceptual framework that helps us organize our thinking around all the kinds of things that might come to mind when people say fake news. The, you know, most hated term in academia.

 

[00:02:16] Fake news. It's this widely used term. Popular media in particular has really spent a lot of time talking about fake news, calling parody "fake news", calling propaganda "fake news", calling things we just don't like "fake news", and it's not super helpful. So there's been a bit of a shift towards saying mis- and disinformation, and that's our focus this season of Wonks and War Rooms; we frame it as mis- and disinformation.

 

[00:02:43] But I think your framework around thinking about the types, phases and elements of information disorder is really helpful to get beyond just, you know, is this content that's been created with false information, with or without an intent to harm? And getting into, like, who's creating it and how is it interpreted? I think that's really helpful. So why don't we kick it off with a little bit about those three sort of buckets that you propose: the types, phases and elements. What made you want to go in looking at types, phases and elements distinctly?

 

Claire: [00:03:27] It's a good question. I mean, I think what I'd say is, the report came about because I'm somebody who has spent a lot of time in these spaces watching what happened. And your point is absolutely right. I mean, I kind of famously, back in 2017, just refused to use the term "fake news". I mean, I just was so frustrated that a lot of the content wasn't fake, a lot of the content wasn't news, and, as you said, it was actually being used by people as a tool against a free and independent media, against academia. So I was just like, I don't know why we're using the term. And actually, a lot of academics still use the term. I would say they're the only people who still use it. Journalists have stopped using it, most policy people have stopped using it, but it's almost as if academics who want to make sure that they turn up in Google Scholar still use the term anyway.

 

Elizabeth: [00:04:14] Hmm.

 

Claire: [00:04:14] That's another conversation, but as somebody who is in this space all the time, I just was deeply troubled that there was a failure to understand the full spectrum of the problem. And particularly, as you said, information disorder is kind of an umbrella term, but the biggest category of problematic content we see is genuine content. It's just used out of context.

 

Elizabeth: [00:04:37] Mm hmm.

 

Claire: [00:04:37] So this focus on falsity and fakeness was making people obsess about that, as opposed to, no, this is just a photo from 2016, or, well, this is just, you know, a piece of hate speech, or it's something that's just being used to cause harm. So in terms of why I decided to write the report in the way that I did, it was to try and drill down to the things that I was seeing all the time, which is that sources are distinct from articles, and something often starts as disinformation. So it might be, if we were having this conversation in 2017, we'd talk about Russian trolls. But what annoys me is that actually the bigger threat now is domestic, yet people still have this idea of Russian trolls in the basement. But somebody creates a piece of disinformation and then it shifts: my mom picks it up and she shares it, and then it's misinformation. And then somebody else thinks it's satire and reshares it.

 

[00:05:29] And so I was trying to get people to understand the complexity of the way in which this content moves through the whole ecosystem and crosses platforms, and the sources change, and whether or not you know the person directly changes. And you know, the same claim might start as a meme, but then move into a video, and then move into a tweet, then move to an Instagram post.

 

[00:05:48] So I just felt that the debate was so focused on what I call atoms of content. So, should this YouTube video be taken down? Should this tweet be labeled? As if it was this one thing. And I just was frustrated, because ultimately we as people who track this stuff would often find something on 4chan and watch it move to Telegram. That movement was everything we were trying to capture. Yet the conversation was as if there was one piece of content that happened at one time, and then there was this hypodermic needle idea, which is, oh my goodness, one person has seen a meme and therefore they're not going to vote.

 

Elizabeth: [00:06:22] Yeah.

 

Claire: [00:06:22] Like a really simplistic idea of this. And my PhD is in communication. In communication theory we all, you know, by the 1970s had figured out that the hypodermic needle model was rubbish.

 

Elizabeth: [00:06:35] Yep.

 

Claire: [00:06:35] And yet we've now come back to so-called new media and we're having the same conversations as we were having in 1945 about the role of radio. So that's the piece for me. That's why I was trying to make this whole conversation more complex. I don't think I really succeeded, but that's what I was trying to do.

 

Elizabeth: [00:06:53] And something I really love about the idea of information disorder is it does really bring in like the social context in which information is received and shared, and that is something that - you're right - communication scholars for decades now have recognized as an important part of how media is taken on board or not, and whether it changes opinions or behaviors or any of those things. And at some point, maybe we'll chat about the idea of inoculation against disinformation, because that's-

 

Claire: [00:07:22] I would love to talk about that. I would. But I think the other thing I really wanted to focus on in my writing was to try and get people to understand the role of emotion and performance in all of this. So, you know, I joke all the time about my mom and it's unfair and she doesn't actually do it much, but it works as a way to talk about it. But you know, somebody like my mom, you know, she's a member of her neighborhood group or whatever, like, she's performing what she decides to share.

 

Elizabeth: [00:07:48] Mm hmm.

 

Claire: [00:07:48] This idea that information is either factually based or it's not, and that people share it rationally, it's nonsense. We all, all of us, myself included. Doesn't matter how educated you are, doesn't matter what age you are, doesn't matter what, you know, gender or country you're from. We are hardwired as humans to be really bad at this stuff.

 

Elizabeth: [00:08:06] Ha ha.

 

Claire: [00:08:07] Really bad. And yet all of the policy conversations about it are still really, I think, pretty simplistic, as is the idea that more facts are the answer. And unfortunately, that's not how we behave. So anyway.

 

Elizabeth: [00:08:18] Yeah. Yeah. More facts are the answer, or like just getting better at identifying what is fact and what isn't fact. And not to say that media and digital literacy aren't useful and aren't important, I think certainly they are, but we can't understand how content flows through an information system by only looking at one piece of content at a time, right? Like the whole idea is it's a system and we exist in that system and interact with that system.

 

Claire: [00:08:47] Yes. No, 100 percent. Yeah, and I think, you know, when we think about literacy training, I'm with you. It's absolutely fundamental and not enough resources went into it. Actually, Canada is excellent. One of the best countries for doing this. But ultimately, now there's a recognition that we're better off teaching people how their emotions are manipulated than to say, here's how you do a better Google search.

 

[00:09:11] Yes, it's important to teach people how to Google headlines and assess whether or not something is trustworthy. But actually a more effective way is to say: your emotions are being manipulated to ensure that you don't use those critical thinking skills, and you don't even get to Google, because, you know, part of your brain is just like "Oh my god!" You're angry, you're not going to go to Google. So I think that when we talk about media literacy training, we need to think about a kind of broader framework of doing that, rather than simply, we need to teach people how to find more facts.

 

Elizabeth: [00:09:42] Yeah, I would agree. Would you say this is kind of where that comes in? So your framework for information disorder, as it was laid out in that report, is types, phases, elements. And then in elements, we've got the agents, like "who's creating the thing in the first place?", the message, "what's the message? what's the type, the format, et cetera?", and then the interpreter, "who's it being received by and how are they interpreting it?" Is that where the emotion part comes in, like you as the interpreter? Is that like where we might intervene to say, think better about what's on your screen and how you're interpreting it?

 

Claire: [00:10:21] Yeah, I mean, it's all the way through, because the disinformation agents, they know that emotion works, so they are deliberately choosing topics that they know will make people fearful or angry. They then create messages, the next phase, which are full of emotion, and the format, if it's vaccines, will probably involve a picture of a screaming baby, you know; they're using images that will deliberately make you emotional. Then as the receiver, you have to be aware of how your emotions are being manipulated and how particularly visual imagery might make you less critical.

 

[00:10:53] And then you also have to recognize, as you talk about the phases, that you've received the message and then I decide I'm going to reshare it now. I might hate-share it. I might then reshare it to my friendship group on WhatsApp: "I can't believe this is happening, it makes me so mad." And I'm thinking, "Oh, they all know it's false," but then somebody in my friendship group doesn't. They're like, "Oh my god, Claire's angry. I trust Claire. Oh my God, I'm going to reshare."

 

[00:11:17] And then kind of the cycle continues. So emotion, and also performance: I might hate-share it with my friendship group because I'm saying, "Look at these people who are anti-vax. They make me so angry. Don't you all agree?" Like, I'm performing what I think, with my friendship group, is me being a good citizen who supports vaccines. I'm making a decision to share that with my friendship group. I'm getting social capital by doing that. So all of it, the whole process, it's about emotion, it's about performance, it's about identity. But that stuff is hard to measure. It's hard to explain. So we end up not talking about it enough and we think, more facts please.

 

Elizabeth: [00:11:54] Yeah, and it's hard to measure, hard to explain, and a lot harder to say something is a good approach and something is a bad approach, right? Like, when we're thinking about teachers in schools teaching kids these kinds of literacy skills, the idea of having any sort of say on what one's performance of their identity should be or shouldn't be is really thorny and really political. And that's in a school context, where we would expect teachers to be teaching children. When we go out to the wider society, it's even harder to do that.

 

Claire: [00:12:30] Yeah, no, 100 percent, because the people in this space who would have to talk about it would be journalists, researchers, teachers, you know, policy people, like all the people who are, you know, taught to be impartial, to be objective, to be neutral. And so we - and I put myself in that space, you know, I spent a lot of time with journalists, I'm a researcher - it's hard to talk about emotion because that feels like a weakness. And there's this sense of, well, you know, facts and rationality are what society is based on. The problem is, those of us who think that way have failed to understand the role that emotion plays for all of us. It's not like, "Oh, I'm immune to emotion." But those of us who have been trained as journalists or researchers have kind of been taught to train it out of ourselves. And that makes us very bad at having these kinds of conversations. And I would argue the disinformation space does not have those fears. They understand that it's about fear. It's about division. It's about identity. It's about performance. They understand all of that. And so that's the kind of thing that we have to figure out, because we're not doing particularly well at it right now.

 

Elizabeth: [00:13:35] Yeah, yeah, I think that makes a lot of sense, unfortunately. When you're saying "the disinformation space" and people or groups like that, are you talking about specific kinds of groups or people? How are you defining that collection?

 

Claire: [00:13:53] Yeah, so, I mean, for example, let's take January 6th. Like, January 6th had a whole host of different types of online communities that came together around that event. There were QAnon people, there were anti-vax people, there were just hardcore Trump supporters, there were militia people. I mean, there were a whole host of people, from kind of dads in golf shirts all the way through to militiamen, and very, very different - I would argue - communities that had been researched almost separately, and then everybody saw them together.

 

[00:14:24] But what I often say to people is, if you look at the information ecosystem that those groups inhabited for the two months between the election in November and January 6th, it was dynamic. It was participatory. It was nonhierarchical. People felt heard. They were told, "What evidence of election fraud did you find?" "Tag your tweets with this hashtag." "Send it to this tip line." You know, "Do your own research." "Tell us what you're seeing." It was, as I said, nonhierarchical and it was participatory, and you were hearing the same things from the president and your talk show host, but also the guy down at the local bar, people in your church. I mean, it was everywhere.

 

Claire: [00:15:02] It wasn't just some guys who saw some YouTube videos or were a bit conspiratorial; it was an alternate reality. Now, the information ecosystem that I inhabit, and probably you, and probably everybody listening to this podcast: if you actually think about our information ecosystem, it's pretty top-down, linear and hierarchical. You have the expert. We're doing it right now. "Claire, tell us, you know, tell us about your PhD." You know, the expert that then has a conversation, and then there's an audience which I will probably never hear from. If they were commenting, I probably wouldn't look at those comments. And there's this idea that I am telling people what to believe and they are listening, and you know. And so what that means is, the New York Times, the BBC, they have Facebook pages, but are they really listening? Is it participatory? Do people in our information ecosystem feel heard? No. But for people like you and I, that's OK. We've been trained to believe that that's the system. We believe in science. We believe.

 

Elizabeth: [00:15:57] Right.

 

Claire: [00:15:57] But for many people whose lives have been turned upside down, they do not feel heard. They do not feel like they have any agency. Feeling like they were part of the Stop the Steal movement was exactly what they were looking for. They felt that they were part of something, and they turned up on Capitol Hill. So when we think about it, I argue that there are these two different information ecosystems. And if we just focus on the different types of content, we're completely missing the point that they're fundamentally different structures that work in very, very different ways. And it's easy to be like, they're wrong, we're right. There's actually a lot about that other system that we should be learning from, because instead we still have a communication system that's pretty 1996.

 

Elizabeth: [00:16:34] Right.

 

Claire: [00:16:34] And hasn't updated to recognize that we have networked communication technologies, but we still use these networked communication technologies in the way that we did in 1996. Or 1986, maybe that's better; we did have some of the not-quite-Web 2.0 in 1996.

 

Elizabeth: [00:16:49] Yeah, yeah, it's interesting, because as you're describing these two very different types of communication environments, it makes me think about all of the promise of the internet, and social media in particular. Like, we had all of these great hopes about the democratizing nature of these tools. And in recent years, the kind of trope has been like, well, that didn't pan out, right? Clearly, that didn't work. But in a way, what we're seeing is there are some groups within society that are making use of these tools in very democratic ways, when we think of democratic in terms of participation and the ability to have your voice heard. Now, even within those groups, I would say there are still some voices that are much more prominent than others.

 

Claire: [00:17:37] Yeah.

 

Elizabeth: [00:17:37] It is not equal access to dissemination of your ideas, but perhaps a bit more than in the more top-down environment. It makes me wonder, is that a function of the tool itself, or the way that different people are willing to use it? Is there something that folks on that sort of top-down side that you've described would have to give up to be able to use it in that more democratizing way, something that they don't want to give up?

 

Claire: [00:18:10] Yeah, they'd have to give up their gatekeeping status. So I was at Cardiff University, and in 2008 I wrote this report for the BBC, and they were like, "Oh, do you want to come and work for us? But, like, here's 50 days," and my mom was like, "You're an idiot. You've got a job with a pension. What are you doing?" And I said, "I'm sure I can come back to academia, but I'm never going to get this chance again." Now, in 2008, if you spent time in newsrooms, there were some really innovative people using social media and these networking sites in the ways that they were designed to be used. The BBC World Service created this amazing global map called Save Our Sounds, where people were uploading sounds that were becoming extinct. BBC Bristol, in the west of the UK, created a map asking people to upload things around the city that they wanted the new mayor, who'd never been to Bristol before, to see. Now, they didn't take pictures of sunsets and bridges. They took pictures of dog poo, mattresses that hadn't been picked up, problems. And as a result, the mayor then made all these changes and improvements, and it was a real participation moment.

 

[00:19:11] By 2010-2011, senior editors had woken up to Facebook and Twitter, and they were like, "Oh no, no, that takes too much resourcing. Instead, on Facebook, say 'Tune in at 11:00.' On Twitter, just tweet a link to our article. That's the only way we should really use social media." So then we had this amazing two to three years, and then the gatekeepers kind of closed ranks and said, "Oh, we'll just use this as another broadcasting mechanism. This just allows us to reach more people on Facebook." They weren't saying this allows us to engage with more people on Facebook. They kept talking about reach. So in order for the information ecosystem that everybody listening to this podcast spends time in to change, we would have to hope that these trusted gatekeepers were OK with letting more people in

 

Elizabeth: [00:19:56] Right.

 

Claire: [00:19:56] And using the tools as they were designed to be used. But that is scary. That is hard, and capitalism -

 

Elizabeth: [00:20:04] Yeah.

 

Claire: [00:20:04] - I mean, there's a reason to get people to click on things, you know? So there's a whole host of reasons why it never really got used in the way that it could have done. But we see from the other side the kind of movement-politics people using it. Occupy Wall Street, people were using it as a movement. And that's what we see now, exactly the same tactics. It's an understanding of what these networked technologies can do. So, yeah, I think it's just important for us to recognize the differences between these two information ecosystems.

 

Elizabeth: [00:20:33] Yeah, I think that's really helpful. It does make me think about the role of fact, or truth, or whatever you want to call it, in democratic processes. Let's talk about democratic situations for now, because authoritarian regimes are a totally, totally different system. But in a democratic context, there's an assumption that we need people to have access to accurate information about how they're being represented by representatives - for us, it's the House of Commons, but wherever - and it requires people to have accurate information about what's happening across their communities, across the country. And one of the risks of those very participatory environments is that emotion, not fact, can be the thing that drives, or you can be connected for reasons that are not necessarily tied to what's actually happening on the ground. Where does that sort of assumed need - for some level of just factual information, and a shared understanding of what reality is, for a democratic system to work - fit in?

 

Claire: [00:21:43] I feel like the question is: how do we work with communities to help create the spaces that are shared? Because when there are things that bring people together, they tend to be pets, sports, faith. There are spaces where people, irrespective of political perspective, come together and share a love of golden retrievers or whatever. And if you think about the pandemic, if you spend any time with somebody who works in public health, they'll be like, "Yeah, yeah, we've always known it's about door to door. It's about working locally, it's about grassroots."

 

[00:22:15] So I suppose when I have these conversations with people, it's like, well, what does that look like in the political arena? We're trying to improve people's trust in the democratic system for the midterms. We're not going to do that by ads on the side of buses or another documentary on CBC or PBS. Like, it has to be at the local level, and it has to be finding trusted sources at the local level who look like them, so they're more likely to trust them. It's all of that stuff. So I think that when I think about participation, some of it's online, but a lot of it needs to be offline. And, you know, it is a whole-of-society approach. So we need news organizations to be more participatory. But, you know, if there were editors listening to this, they'd be like, "Yeah, good one, I've got all of this time to monitor the comments. I don't think so." But what does it look like to create spaces where people feel heard? Because we could sit here and talk for hours about content, problematic content. How do we define it? Blah blah blah.

 

[00:23:11] But it's very rare for people to say, why? Why is everybody susceptible right now? What's going on in society which makes people susceptible? And it's because their lives have been turned upside down. People are really struggling with economic insecurity, their communities look different, climate change, et cetera, et cetera. So in all of that context, they feel like their lives haven't turned out the way that they wanted them to, and they don't understand it. But nobody is listening to them. They don't feel heard. And then a conspiracy theorist comes along and says, "I'll tell you why your life hasn't turned out the way you think it was going to: because there's a secret cabal, blah blah blah blah blah." And you've got this simple narrative that explains all. So I just think there's no one solution here for the whole of society, but there has to be much more of recognizing why people don't feel heard and providing spaces where they do feel heard. And until we do that, I don't feel like we're going to be successful, because 4chan, Reddit, a whole host of other spaces make people feel heard. I mean, TikTok, for goodness' sake: it doesn't matter how many followers you have, you can suddenly have a viral TikTok, and then suddenly people are like, "Oh, I've got a platform. I feel heard." Like, there's something going on here that we're just not taking seriously enough. We really need to look at our information infrastructure and recognize we don't need more factual content to change this. It's society, it's people. It's humans. It's not just the content.

 

Elizabeth: [00:24:34] Yeah. Yeah, I think you're right, I think that makes a lot of sense. And I think that when we think about our information environment, very often we think about, well, what has social media done, and what sort of spaces are available via social media platforms. And yes, that's important, and that matters. But what I really like about what you've just said is it really goes beyond just social media in particular, to say this environment is much wider and the things that have changed. It's not just that now social media is a place where people can go and find a specific community and ignore the rest of them. It's that having that option as part of your wider environment changes the way you interact in all of these different spaces, and the way information flows across quite a variety of them, right? Like, your WhatsApp family message group does not stand alone. You also still have family dinners that you have to think about: what's that dinner going to be like if I share this in that group? And that sort of recognition of the fact that we are existing in a rather turbulent time, in a wide and varied media system, I think is helpful.

 

Claire: [00:25:41] Yeah, and also, I mean, if you speak to many people who've lost family members to QAnon, they're like, "Yeah, they work nights. They didn't have a big friendship group, but they would log on at night when they were, you know, doing their job, and there were QAnon people being like, 'Hey, how are you today?'" I mean, that kind of fundamental social glue and community dynamic is so much part of it. "But I don't understand why people believe QAnon." Because it was basically an amazing game, with clues that you had to decode, and you did it in a participatory way with other people that made you feel like you'd won when you decoded it. That's nothing to do with what the content was. It was all of those other factors that made it so compelling. And those clues have stopped now - the Q drops have stopped - and QAnon is going out of favor. That element was so critical to it

 

Elizabeth: [00:26:31] Yeah.

 

Claire: [00:26:32] Anyway. I could go on for hours about it, but we need to think about the social system much more than just focus on the content.

 

Elizabeth: [00:26:39] Yeah, I agree. Before I let you go, I do want to go back to that idea of inoculation, because I think it'd be really interesting to hear what you think about this idea. So the idea that some are proposing is that what we need to do is inoculate people against mis- and disinformation. We need to basically, like, give them a bit of an injection of disinformation, tell them it's disinformation, and then suddenly they're going to be OK dealing with it. How do you feel about that?

 

Claire: [00:27:10] I feel very good about it, because the research is very positive. So I sometimes talk about the information supply chain, because I like to think about the whole process, and I think we're disproportionately focused on the end of the process, which is, "Oops, whoops, the disinformation is out, we will now supply a fact check." And I am a big fan of fact checkers. We need to have an accurate record. I'm glad they exist. However, I think there's been a disproportionate focus on that, and what we should have done is exactly this, which is to prepare people. So there are two different ways of thinking about inoculation.

 

[00:27:39] One is pre-bunking, which is when you explicitly talk about a certain rumor. So you might say, "There's an election coming up. It is very likely that you might see a pamphlet or a meme with the wrong date on it. That is a known rumor that always circulates. Reminder that November 3rd is the date. Or you might see a ballot box in the wrong location. This occurs all the time in elections around the world. Be wary if you see that; you should be more careful, because we know it always happens." So pre-bunking is when you basically talk about specific rumors.

 

[00:28:10] Inoculation in the wider sense is when you teach people the tactics and techniques. So you say, you know, if I'm trying to build credibility, I'm going to rely on your use of heuristics, which are mental shortcuts. So I might put on a white coat, knowing that you're going to be more trusting of me if I wear a white coat. You're probably not going to Google "Is Claire really a medical doctor?" You're just going to trust it because I'm wearing a white coat. That is a tactic that you should be aware of.

 

[00:28:36] Or, some people make money from disinformation. So if you see a news story about vaccines, look to see, are they selling supplements? Because there are many anti-vaxxers who make money off of that. So you are teaching people the tactics and techniques in a wider framework, and lots of the research now shows that if you do that kind of work with people, they are much better at spotting misinformation in the wild, irrespective of their partisan perspective. Because as humans, we don't want to be manipulated. In the same way as we all watch those shows about scams and hoaxes and online dating, you know, none of us want to be manipulated, whatever we believe.

 

[00:29:13] But once you wait for the misinformation to drop and I tell you, "Don't believe what you're reading about the Ottawa convoy," all of a sudden, depending on politics, either I am pro or anti, and it's really difficult to break through that. Whereas ahead of time, the more you can talk to people about the tactics and techniques - because they don't want to be manipulated - the more effective it is. So, you know, we're currently testing a course for the W.H.O., delivered over SMS, to inoculate people, and we're doing, you know, an RCT to see whether or not the evaluation shows that it works. But we need more of those things, like bite-sized approaches, to reach more people, to teach them about the kinds of things that we've talked about on this podcast, because most people don't necessarily have access to these kinds of conversations.

 

Elizabeth: [00:29:58] That's awesome, and it's so good to hear that the research is showing it's actually working. And the way you've described it, the logic makes sense too, right? Like, you're much more likely to want to be like, "No, I'm not going to be manipulated. I'm going to do a really good job to make sure I'm not manipulated," than to say, "Oh yeah, in fact, I was wrong. I developed an opinion off of something that I shouldn't have paid attention to." Then you're like -

 

Claire: [00:30:22] Yeah.

 

Elizabeth: [00:30:22] Raising the flag like, "I've been duped. Also, you shouldn't trust me." Like that feels uncomfortable.

 

Claire: [00:30:28] Exactly. Yeah, and fact checking is so much like, is this true or false? Whereas inoculation is much more about, "have you been manipulated?" So it doesn't matter what the content is. And so a lot of that is just trying to stop people from sharing if they think there's any hint they've been manipulated, and therefore, like you say, it takes off that shame around, "Oh my goodness, I was wrong."

 

Elizabeth: [00:30:48] Yeah. All right, well, this has been a wonderful conversation. I really appreciate you taking the time. I normally end off each podcast with a little pop quiz. Although I am sure you will ace this one, because it's just a quick, short answer: how would you describe information disorder?

 

Claire: [00:31:08] All of the ways that information is causing harm. That's content that's mis- or disinformation, but it's also things like algorithmic amplification, it's about micro-targeting, it's about Facebook ads, it's about all those things. And I think when we focus too much on mis- and disinformation, we miss those other things. And actually - this is the long answer - we then get a chance to talk about the concept of data deficits. So misinformation flourishes when there's an absence of good information. So my worry is, all of this conversation ends up going towards, "Oh my goodness, misinformation. Bill Gates is microchipping people." When actually it's, well, how did we screw up during this pandemic and not give people good, credible, consistent information? So again, information disorder is all of those elements that we're not doing a particularly good job of.

 

Elizabeth: [00:31:56] Brilliant, thank you. I really, really appreciate you taking the time, this has been wonderful.

 

Claire: [00:32:00] No, I've really enjoyed the conversation.

 

Elizabeth: [00:32:05] All right, that was our episode on information disorder, I hope you enjoyed it. If you'd like to find Claire's report, learn about any of the theories or concepts we talked about today, or find other information, check out the show notes or head over to polcommtech.ca for additional resources. We've got full transcripts that are annotated in both English and French. This special season on mis- and disinformation is brought to you in part by a grant from the Social Sciences and Humanities Research Council of Canada and the Digital Citizen Initiative.