Wonks and War Rooms

Echo Chambers and Filter Bubbles with Adi Rao (re-release)

March 30, 2022 Season 4 Episode 5

Adi, a lawyer and campaigner, explains how campaigns often think about how they can crack into people's filter bubbles in order to raise awareness and find new supporters. During the conversation, Elizabeth and Adi tease apart the difference between algorithmically driven filter bubbles and echo chambers, which come about as a result of individuals' choices in their media environment.

Additional Resources:

Check out www.polcommtech.ca for annotated transcripts of this episode in English and French.

Elizabeth: [00:00:04] Welcome to Wonks and War Rooms, where political communication theory meets on-the-ground strategy. I'm your host Elizabeth Dubois. I'm an associate professor at the University of Ottawa, and my pronouns are she/her. Today, I'm recording from the traditional and unceded territory of the Algonquin people.

In today's episode, we're actually re-releasing an episode from season one on filter bubbles and echo chambers. I wanted to bring this episode back for this special season on mis- and disinformation because I think it's really important that we dig into the ways that our information systems are shaped for us and by us, and how that plays into our likelihood of encountering information that is similar to or different from our existing beliefs and existing ideas. I think this is a great episode to follow [after] that political polarization one that we had a couple of weeks ago with Sean Speer. And so, I really hope you enjoy this episode. Just remember, it was recorded back in fall of 2020, so a few things have changed. But the basic ideas of the similarities and differences across filter bubbles and echo chambers stay the same. Hope you enjoy.

 

Adi: [00:01:10] Hi, Elizabeth. My name is Adi. I am a lawyer by trade. I live in Fredericton, working for the Canadian Union of Public Employees and I used to live in Ottawa. And then before that in Alberta, I suppose I used to work also for the NDP at one point. And so that got me some campaign experience and I continue to be involved in sort of electoral campaigns, and I'm excited to talk with you.

 

Elizabeth: [00:01:41] Awesome. Welcome. Have you heard of the idea of an echo chamber before?

 

Adi: [00:01:48] I think more and more people have been talking about it. So I have heard about echo chambers. Maybe I have heard about echo chambers because people in my echo chamber talk about echo chambers.

 

Elizabeth: [00:02:00] There you go. So it's this theory that came out of political communication research that was observing that we're in this high choice media environment where people have all of these different sources and channels to get political information and news. And the metaphor comes from this idea that when you have all these choices, you can choose to surround yourself with information that confirms your existing beliefs. And so then basically everything you send out sort of echoes off the walls and comes right back to you. And the theory kind of plays into a bunch of democratic fears of fragmentation of society. And we end up worried that maybe we're all in our own little echo chambers and completely unaware of what's happening out in the rest of the world. Does that track with your understanding of echo chamber?

 

Adi: [00:02:48] I suppose so, yeah. I mean, I have a more layperson understanding of echo chambers being sort of preaching to the choir type of idea and that sort of thing. So yeah, I mean, I think that that sounds like what I've gleaned over the last few years of this conversation sort of happening in the world of pundits and politics.

 

Elizabeth: [00:03:07] Yeah. Do you feel like you're in an echo chamber?

 

Adi: [00:03:10] I think I might be, yeah. I mean, I think for lots of folks, when you do political work or campaign work, even if it's not political but it's some kind of community organizing or anything like that, you can often end up in an echo chamber. Yeah, I haven't been using Facebook as much. And I know that Facebook is very much at the center of this conversation about echo chambers, but Twitter, which I unfortunately do continue to use quite a bit, seems to also exhibit some of those traits.

 

[00:03:43] Right. I mean, I'll get these things from Twitter in my email saying, hey, this person tweeted, you might be interested. It feels like Twitter has figured out what I'm interested in and tells me when somebody whose views I might agree with has said something. And I go over there like a lemming and I look at their tweet and I'm like, "Yeah, I do like that tweet". And then I hit like, and then sure enough, Twitter knows to email me the next time, if I haven't logged on yet, to remind me to go back and check what this person has tweeted. So, I mean, I think we all sort of end up in these echo chambers online.

 

[00:04:20] But I think what's particularly fascinating for me about echo chambers is that we are in echo chambers offline, too, right? I mean, I think we like to chalk it all up to algorithms and social media and so on. But I feel like this has been something that our communities have struggled with for the entire existence of communities. Right. You find the cause that you want to work on and you hang out with the people that also want to work on it with you. And before you know it, you're only hanging out with people you agree with, or people that generally agree with you. And then you're kind of in an echo chamber. And I know that when we host events, for example, on whatever political issue, whether it's, you know, refugee rights or human rights generally, the people that show up to those events, you often see the same faces in the crowd and you often talk to the same people over and over again. And you start to wonder whether you've just been preaching to the choir, or whether you're doing something wrong, and whether the people that you want to hear your message, the so-called persuadables, are even hearing it.

 

Elizabeth: [00:05:15] Yeah, I think that's a really interesting and really important observation about the kinds of communities we end up being part of and who we end up hanging out with and who we end up engaging in discussion with. Right. We know that even outside of political communication, social science generally has observed for many, many years this idea called homophily, where we all hang out with people who are kind of like us. We tend to have friends who are a similar socioeconomic status. They tend to have grown up in similar areas. They have similar education levels. They like the same kinds of things. You know, you make friends by going to different events. The fact that you both chose to go to the event in the first place puts you in some kind of similarity spot. And so we end up in these social circles that are very similar to us, which makes it harder to get out and see different perspectives, because everybody kind of has a similar one. We also know that when you go the next step further into being actively politically engaged, whether it's through a specific political party or based on a particular issue you care about, then you're going even a step further into an area where you're around a bunch of people who've already bought into the "this thing matters" argument, right? Like, you don't need to spend time convincing them that climate change is a major issue because you all showed up to the climate rally together. Right?

 

Adi: [00:06:48] Yeah, exactly.

 

Elizabeth: [00:06:49] Yeah. Yeah. And so that is a, you know, a social phenomenon that has existed independent of what our media environment looks like. The echo chamber argument builds off of that idea: we like to be around people who are like us because it's comfortable, we like information that confirms our existing beliefs because it's comfortable. And it goes on to say that we then choose to consume information that fits with that sense of community that we're building.

 

Adi: [00:07:23] Yeah, 100%. An example that comes to mind from real-world campaign work is the Sanctuary City campaign we ran in Ottawa in 2017. The Sanctuary City campaign essentially asks municipalities to commit to ensuring that whatever services they provide and whatever services they fund will never share information that they get from people who are living in the community without immigration status, or with precarious immigration status, with border enforcement. So in other words, you can go to a food bank and you won't be asked for ID, and if you are asked for ID, that information won't be used to alert CBSA of your presence, that kind of thing. And also as it relates to police, so that police would be banned from talking to CBSA. And so we were trying to get this campaign going in Ottawa, and we had a lot of success in mobilizing the community. We had 30 community leaders and experts come to the city council and provide deputations and really speak to the realities of folks who are living without status in Ottawa. And it was interesting because there were 30 people speaking for it.

 

And then there was this one person who came to speak against it. And it turned out that this one person was a young guy, like, maybe just first- or second-year university aged, who came in and essentially just said that he didn't really like the idea. He thought it was a slap in the face to the work that Ottawa has already been doing, that kind of thing. And, you know, this kid didn't really have any expertise on the issue.

 

[00:09:11] He just sort of came to offer his opinion, as members of Ottawa, citizens of Ottawa, have a right to do. But what was interesting to me was that there was a radio station reporter covering the story, along with some other media outlets. And all of the media outlets that covered it effectively said, you know, lots of people showed up, all of these experts, and they took quotes and so on. But this one radio station made a point to only interview that one guy. And this is a right-wing radio station in Ottawa. And I think that is a very real example of this not being solely a social media problem. Traditional media does this all the time: it seeks out viewpoints for its audience, to reinforce what its audience wants to hear.

 

[00:10:06] And the tragedy of the whole thing, of course, was that we were, in the end, unsuccessful, because there wasn't political courage to take that step and institute those policies protecting people who are living in Ottawa without status. But in that particular instance, what was kind of tragic was that that kid, it turned out, was an intern in the office of one of the councillors who was opposed to the campaign. The councillor had essentially planted this person to make that intervention. But that wasn't mentioned in the coverage that that radio station gave that student. And, you know, I think that really underlines how we have been living in these echo chambers all this time anyway. Right.

 

Elizabeth: [00:10:52] Yeah. And so, you know, we rely on journalism to help us become more informed about our political systems, because not everybody can be at City Hall all of the time observing what's happening. And so what you've just described, this sort of partisan approach to selecting who's interviewed and selecting which bits of information to share, plays into the echo chamber idea. But I do want to make sure that we're teasing out a little bit the difference between the choices that are being made on the side of the media producers and the dissemination of the information. So there we could think of the radio station and the journalist who was doing that interview. We could think of Facebook and Twitter as companies that host information and help disseminate it, and they have algorithms that curate the information that gets disseminated.

 

[00:11:46] So there's a bunch of choices on that side. But the echo chamber theory really focuses on the choices that we as consumers of information are making. We choose to listen to that radio station and only that radio station, and have it, you know, confirm our existing beliefs and ideas and feel nice and comfy. That is one of the fears of the echo chamber. The other fear is you might end up having people who are like, you know what, I think politics is too stressful or it's dumb or it's boring or whatever, and just choosing not to pay attention to political information at all. And so then we have this fear of a growing gap between the people who are super informed and the people who aren't informed at all, which is a concern in the context of a democracy, where you ideally have informed voters going in and making decisions on election day.

 

Adi: [00:12:37] Yeah, 100%. I think that's a really good point. These companies are, after all, demand driven, right? Whether it's a radio station or whether it's Facebook, in the end it depends on what the consumers want. And when you make consumers feel comfortable, I suppose they want more comfort.

 

Elizabeth: [00:12:57] Completely. You know, we started out the conversation with you talking about Facebook and Twitter, kind of as if each of those might be their own echo chamber. And it plays into those sets of decisions that those companies are making. They have designed their whole business around the idea of: we can take all of this massive amount of information that's out in the world and we can curate it for you. We can give you what matters most to you. Right? And they figure out what matters most to you by looking at what you've liked and what you've shared and what you've clicked on and what you've commented on, and what people who are like you, whether it's your friends on that network, your connections there, or other people that have similar habits to you on their platform, what they've been liking and sharing and clicking on and commenting. And so all of that helps promote their business model, because the more you like the content from that site, the longer you spend there, and the more advertising dollars you make them, right?
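To make that curation logic concrete, here is a minimal sketch in Python of engagement-based ranking of the kind Elizabeth describes: score each candidate post by how well its topic matches your own past engagement and that of users similar to you. Every name, weight, and data shape here is a hypothetical illustration; no platform's real algorithm is this simple.

```python
from collections import Counter

def rank_feed(candidate_posts, user_history, similar_users_histories):
    """Rank posts by a naive engagement score (illustrative only)."""
    # Topics this user has engaged with before.
    own_interests = Counter(post["topic"] for post in user_history)
    # Topics engaged with by users who behave like this one.
    peer_interests = Counter(
        post["topic"]
        for history in similar_users_histories
        for post in history
    )

    def score(post):
        # Hypothetical weights: your own signals count double your peers'.
        return 2.0 * own_interests[post["topic"]] + peer_interests[post["topic"]]

    return sorted(candidate_posts, key=score, reverse=True)

history = [{"topic": "cats"}, {"topic": "cats"}, {"topic": "politics"}]
peers = [[{"topic": "cats"}, {"topic": "recipes"}]]
candidates = [{"topic": "recipes"}, {"topic": "politics"}, {"topic": "cats"}]
print([p["topic"] for p in rank_feed(candidates, history, peers)])
# -> ['cats', 'politics', 'recipes']: the feed leans toward what you already like
```

The toy weights only illustrate the point: both your own signals and your look-alikes' signals pull the feed toward what you already like.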

 

Adi: [00:13:54] Yeah. I think that makes a lot of sense. And we've all had that moment where you are talking to somebody about buying something and then the next minute you get a targeted ad on your Instagram or that thing and it creeps you out a little bit. And I mean, I'm sure there's- I don't know how that really works. But my guess is your phone probably isn't listening to you. It's just that they have gotten so good at predicting what you want that sometimes that happens.

 

Elizabeth: [00:14:24] Yeah. And the little kinds of cues that you send out to the world as you're navigating the Internet, you don't even realize how they're being integrated into what the various platforms you use on the Internet are then delivering to you. Right. But this actually brings up an important distinction between the echo chamber idea and another super common concern we have around filter bubbles. So filter bubbles are algorithmically driven. It's again a metaphor for describing when we end up in a context where the information that surrounds us is all kind of confirming our existing ideas and beliefs. And what happens is social media platforms, search engines and other digital tools, they develop these algorithms to filter information for us, information that we're going to like, click, comment, share, and they do so in a way that creates a bit of a bubble. You end up only seeing things that are all very similar to what you've already seen. And so if you have liked a lot of cat videos, you're going to get a lot of cat videos. If you've liked a lot of extreme exciting sports, you're going to end up with all of those kinds of videos. If you are a political wonk and you are super into reading about the minutia of parliamentary procedure, then maybe you're going to get some parliamentary procedure stuff showing up in your feed.
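One way to see how that filtering becomes a bubble is a toy feedback loop, sketched below under invented assumptions: the algorithm fills the feed in proportion to past engagement, the user engages with what is shown, and the mix of topics narrows round after round.

```python
import random
from collections import Counter

random.seed(1)  # fixed seed so the illustration is reproducible
interests = Counter({"cats": 3, "politics": 2, "sports": 1})

for round_number in range(5):
    topics = list(interests)
    weights = [interests[t] for t in topics]
    # The "algorithm" fills a 10-item feed in proportion to past engagement.
    feed = random.choices(topics, weights=weights, k=10)
    # The user engages with whatever is shown, which feeds back into the weights.
    interests.update(feed)
    print(round_number, dict(interests))
# The early leader ("cats") tends to grow its share each round while the
# minority topics fall further behind: a filter bubble forming.
```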

 

Adi: [00:15:53] I think that makes sense. So filter bubbles create echo chambers, essentially, is what happens, I suppose.

 

Elizabeth: [00:16:00] Yeah. So they can, but they don't necessarily have to, right? So we could be very much in a filter bubble on Facebook, but then make choices in our media habits to counteract that bubble effect, and end up, you know, very intentionally looking at a bunch of different sources of information from a different set of perspectives than we're normally used to. And we might draw from radio and print news and online discussion forums and Twitter. And with all of those different sources of information coming in from different perspectives, we actually avoid getting stuck in an echo chamber, even though on that one platform, Facebook, we are seeing just the cat videos over and over again.
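Elizabeth's strategy of counteracting one platform's bubble with a varied media diet can even be quantified. One common diversity measure is Shannon entropy over sources; this short sketch (an illustration, not something from the episode) shows an all-Facebook diet scoring zero while a mixed diet scores higher.

```python
import math
from collections import Counter

def diet_entropy(sources):
    """Shannon entropy (in bits) of a media diet; higher means more varied."""
    counts = Counter(sources)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

bubble_diet = ["Facebook"] * 10
mixed_diet = ["Facebook", "radio", "print", "forums", "Twitter"] * 2

print(round(diet_entropy(bubble_diet), 2))  # 0.0: a single source, the pure bubble
print(round(diet_entropy(mixed_diet), 2))   # 2.32: five sources used evenly
```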

 

Adi: [00:16:49] Yeah, that makes sense. I mean, I think that for a lot of campaigning organizations, whether it's Amnesty International, which does human rights-based campaigning, or whether it's the NDP, which does partisan political campaigning, this is both an opportunity and a hurdle, right. I mean, you try to penetrate these filter bubbles in order to get your message out to people who you think might be receptive to your message, or people that you want to convince of something. But a lot of the time, the challenge is just penetrating that filter bubble and getting your message in so that they can even see it.

 

Elizabeth: [00:17:32] Yeah. So what kind of strategies can be used for that? Have you seen any that are successful?

 

Adi: [00:17:37] Well, I think that this is an ongoing debate, right. I mean, I think that human rights organizations, which do a lot of online campaigning, have had a bit of a renaissance in campaign strategy, going from crisis-based messaging to hope-based messaging. In the hope, I suppose, that hope-based messaging will penetrate some of these bubbles, because in the end, people want to feel safe and people want to be comfortable. And I think certainly for campaigning organizations, research has shown that crisis-based messaging is not effective in trying to win people over. And there's this whole idea about hope-based communications that a guy named Thomas Coombes, a former Amnesty campaigner, has been really spearheading. The idea is that in order to get folks thinking about these issues, we have to pitch it to them in solidarity-based, hope-based messages, so that folks feel like they're on board, that they can resonate with the values that you're sharing. So the idea that it's values that matter, not so much the news, right.

 

[00:18:50] Like saying to somebody that rights are being violated somewhere, as opposed to, here's a victory for human rights. There's a different emotional reaction. The idea is that we should be talking about solutions, not the problems, and that we should be talking about opportunities and not threats. I think that's an ongoing conversation, to see whether it's going to be successful or not. But I think lots of folks pushing that idea of hope-based communications in human rights campaigning, I suppose, are hopeful that that's the kind of campaign that will penetrate some of this stuff. And to be honest, that's kind of what the right wing has been doing. Right. Like you have Ontario Proud, for example, which will hook you in with these cat videos or whatever. And then suddenly, boom. Here's a really partisan message about how Doug Ford's the person who's going to save you from the evil liberal elite, and you've been reeled in by these positive, kind of cutesy photos.

 

Elizabeth: [00:19:57] How beautiful are these Niagara peaches?

 

Adi: [00:19:59] Yeah, exactly. All of a sudden, now you're part of that ecosystem. And I don't know if this is the prevailing view or not, but I certainly think that we can't really undo the media landscape. We have to work with what we've got now. And I think the fact that it exists the way that it does means we have to rethink the way we try to get our message out. And I do think that people want to be in solidarity with one another. I think people in the end want to feel like they're part of a community.

 

Elizabeth: [00:20:31] That kind of connects back to where we started, with the idea that echo chambers, we're constantly in them because we like to be part of communities. And so it's interesting that the response to getting people out of an echo chamber or a filter bubble, depending on whether we're talking about the overarching media system or the specific algorithmic approach within one platform, you know, the way to get them out of one that is not yours is to try and reel them into yours. Right.

 

Adi: [00:21:05] Yeah.

 

Elizabeth: [00:21:05] I wanted to kind of also flip this a little bit, because we've been talking about, you know, the approaches on a communication-style level and the emotion level. I wanted to point out, though, that one of the components here, particularly for the filter bubble idea, which is the algorithmically driven one, is that a lot of what information spreads is actually very dependent on how the tech company has designed their algorithms for prioritizing content on people's feeds.

 

[00:21:39] And it's interesting in this context because emotion is one of the main things that we've seen as a driver of likes and clicks and shares. And we know that things that are extreme and unbelievable and shocking, that elicit a strong emotion, are things that get prioritized by most of the major social media companies right now in the way they've designed their algorithms for us. You know, they're trying to respond to this to a certain extent by tweaking the algorithms so it's not as easy to game them. But at the end of the day, the things that make us feel are the things that end up being spread, because of the way the algorithms have been designed to respond to us. And so there's an interesting question about the extent to which fear, versus hope, is a better motivator of online action. And by action, I mean like clicking a button.
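A hypothetical scoring rule makes Elizabeth's point about emotion concrete: if predicted engagement rises with emotional arousal, high-arousal content wins the ranking, whatever its informational value. The field names and numbers below are invented for illustration, not any company's real model.

```python
def predicted_engagement(post):
    """Toy model: baseline click rate plus a boost for emotional arousal."""
    base_rate = 0.02                         # hypothetical baseline click rate
    arousal_boost = 0.05 * post["arousal"]   # hypothetical: emotion drives clicks
    return base_rate + arousal_boost

posts = [
    {"headline": "Calm policy explainer", "arousal": 1},
    {"headline": "Shocking outrage story", "arousal": 9},
]

# Ranking by predicted engagement puts the high-arousal story first.
for post in sorted(posts, key=predicted_engagement, reverse=True):
    print(post["headline"], round(predicted_engagement(post), 3))
```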

 

Adi: [00:22:39] Right? I mean, I think that's a really good point. And the more action and more clicking that happens, the higher the revenue goes for these companies, right. And so there's an incentive to continue to foster an algorithmic system that incentivizes fear as an emotion, so that you just continue watching, you continue seeing things, you continue clicking on things.

 

Elizabeth: [00:23:09] Yeah, and I think it's also important to then bring us back out: even if we got rid of those individual filter bubbles that are algorithmically driven on specific platforms, we still have a media environment where people are choosing from a lot of different sources of information. People will still be able to choose only things that fit with their narrow view, if that's what they want to do. No matter whether or not we get rid of the filter bubble component, there's still the human choice component. And, you know, you could get rid of individual platforms, potentially. I think it would be hard. But you can't really get rid of the phenomenon, right? Like, the vast diversity of information exists because we have the Internet, and there's probably some pretty strong arguments for why we kind of need the Internet for a lot of life.

 

Adi: [00:24:02] Yeah, true. Right. And there is this weird social function that these media companies provide, which is that in this filtering of information to suit your tastes, there's also filtering that is actually in some ways helpful, to drown out a lot of the noise and kind of distill all of that information into something that you want to consume. And in some ways that's kind of helpful. Right. Otherwise we'd all just be drowning in all of the information that exists.

 

Elizabeth: [00:24:33] Oh yeah. It's hugely helpful. We need search engines, and these social media companies, which essentially are search engines just for our social interactions, like, we need them; they are very useful to us in our daily lives. But yeah, there's questions about how much they influence what information we have access to. And I think also an important question is how transparent the filtering process they are doing is, right? We need to know how it's happening so that we can be informed enough to choose to change our media diets in a way that might counteract a filter bubble. If we recognize we're in a filter bubble because Twitter keeps giving us the same stuff and we keep going back and taking it, right, if we notice that and recognize that and understand how Twitter's doing that, then we can say, okay, I'm going to make these other choices about the other sources of political information I have, to help balance it out. Yeah. All right. I have one last question for you. It's a little quiz. I am wondering if you can tell me the difference between echo chambers and filter bubbles?

 

Adi: [00:25:44] Oh, my God. Are you serious?

 

Elizabeth: [00:25:47] You can do it. I believe in you.

 

Adi: [00:25:50] Well, I think what I understood is that one of the main differences between filter bubbles and echo chambers is that filter bubbles have to do with algorithms, and sort of the idea that there are these mechanical things that have been put in place that result in filtering the information that you see. I suppose echo chambers can be a result of filter bubbles, but they don't have to be; it's the idea that you end up in environments with people who are similarly minded or have similar ideas to you, and you end up hearing what you want to hear as a result of it.

 

Elizabeth: [00:26:31] That was great. Well done.

 

Adi: [00:26:33] Awesome. Great. I'm glad. Did I pass?

 

Elizabeth: [00:26:35]  You passed.

 

Adi: [00:26:36] Oh, good. I'm so glad.

 

Elizabeth: [00:26:37] Very well done.

 

[00:26:44] Well, that was our episode on echo chambers and filter bubbles. Thanks for joining. If you'd like more information about any of the theories or concepts we talked about today, go ahead and check the show notes or head over to Polcommtech.ca. This special season on mis- and disinformation is brought to you in part by a grant from the Social Sciences and Humanities Research Council of Canada and the Digital Citizen Initiative.