Wonks and War Rooms

Big Tech and Political Campaigns with Becca Rinkevich

Elizabeth Dubois Season 5 Episode 4

Becca Rinkevich is the Director of the Institute for Rebooting Social Media and formerly the Deputy Director of Digital Strategy at the White House under President Joe Biden. This week she and Elizabeth are tackling the role of big tech in election campaigns and political advertising. They talk about the involvement of representatives from social media companies in election advertising campaigns, the changes in these relationships before and after the 2016 U.S. presidential election, and what these relationships might look like in the future. They also talk about ways to report disinformation and harassment, guidelines around political advertising on social media, digital identities in the online environment, and the role of digital literacy.

Additional resources:

  • Both Elizabeth and Becca mention Cambridge Analytica, a company that was hired by the Trump campaign in the run-up to the 2016 U.S. presidential election, and subsequently found to have misused user data. A 2019 documentary, The Great Hack, was made about the events.

  • Becca talks about the technique of social listening, a way of monitoring online information related to your brand to better understand what’s going on in that ecosystem.

  • Becca mentions a Rebooting Social Media fellow—Elodie Vialle—who is developing an escalation channel for journalists to report harassment and attacks received on social media platforms. The project is still in the works, but keep your eye on the RSM website for further info.

  • Elizabeth throws back to our last episode on the Meta Oversight Board with Julie Owono. Find it here.

  • Becca mentions this political ad tracker created by Bully Pulpit Interactive, where she used to work as Director of Political Programs.

  • Becca notes that both Facebook and Google have parameters around political advertising on their platforms. Find out more about Facebook policies here and Google policies here.

Check out www.polcommtech.ca for annotated transcripts of this episode in English and French.

Elizabeth: [00:00:05] Welcome to Wonks and War Rooms, where political communication theory meets on-the-ground strategy. I'm your host Elizabeth Dubois. I'm an associate professor at the University of Ottawa and fellow at the Berkman Klein Center at Harvard University. My pronouns are she/her. Today, we're talking about big tech, political campaigns and advertising. And I've got my guest, Becca, here to chat with us. Becca, can you introduce yourself, please?


Becca: [00:00:27] Yes, of course. My name is Becca Rinkevich. I'm the Director of the Institute for Rebooting Social Media at the Berkman Klein Center at Harvard University. Previously, I was the Deputy Director of Digital Strategy at the White House, and before that I ran rapid response for then-Vice President Biden's campaign. And previous to that, I was a longtime political and public affairs consultant.


Elizabeth: [00:00:49] Amazing. Thank you. I am so excited to chat with you today. We're going to talk about big tech, political campaigns, elections, all that fun stuff, which I know you have a lot of experience with. And so the way I start most episodes is I kind of give a bit of a background on the academic literature. So that's what I'll do now, and then we'll hop into our chat.


Becca: [00:01:08] Sounds great. I'm so happy to be here.


Elizabeth: [00:01:10] All right. So in the literature, when we're looking at the way tech companies have impacted political campaigns, a lot of the time we're thinking about things like, "Oh, they've provided this new avenue for information to reach potential voters." "They've created this new format that voters can use to connect with elected officials, with campaigns"—that sort of thing. But when scholars have tried to dig in a little deeper, they've seen that tech companies sometimes embed employees in campaigns specifically to help them develop digital strategies, but then also to do some lobbying and to sell their product, because these are firms with commercial purposes. We've also seen there's a big difference between the amount of engagement these companies have in, say, an American campaign versus a non-US-based campaign. And there have been all kinds of questions about the partisan implications of having teams in different campaigns. And what ends up happening sometimes is a tech company might actually have, like, a Democrat representative and a Republican representative in order to deal with some of those partisan kinds of issues. So there's a lot here to unpack. Where I want to start is this idea that, you know, we used to think about the role of tech companies in campaigns as providing this avenue for communication. And they were really this conduit, right? And now we kind of need to think about them as also actively shaping and changing election campaign strategies by embedding people, by offering trainings, by doing some lobbying sometimes. So let's start there. Does that make sense? Is that the lay of the land that you see? Is there anything missing in that description?


Becca: [00:02:59] No, it's a really interesting point you make. I mean, you look at these companies and in a lot of ways, they've carved out entire business operations around their political engagements, and that varies widely from company to company. There are some social media and tech companies that don't take political advertising at all, and there are some that make a huge amount of money and have a huge investment on their resourcing side to support that as a book of business. I've had a variety of experiences, and my experience running big digital operations, and specifically digital ads, has changed a lot over the course of the last decade. Even if you just look at the shift from pre-2016, what we kind of call the "Wild, Wild West" of social media companies and political ads and content, to 2022 and beyond: the policies, the infrastructure, the teams and the resourcing have changed dramatically. It is true that oftentimes social media companies will embed their partnerships or client leads into campaigns and into different outside organizations and super PACs, who at times are driving greater revenue for them than campaigns. I saw this across my work with outside organizations, and I also saw it within campaigns, across Mike Bloomberg's campaign and the Biden campaign, which ended up being mostly virtual but did have dedicated client service members. It's also true that those teams are often organized on a partisan basis. That's true across Google.


Becca: [00:04:37] That's true across Facebook. That wasn't true across Twitter at the end. Who knows what they're doing now, but it's worth calling out as we think about their structures. When I think about the influence and sort of the role of social media companies and big tech on political campaigns in particular, the one thing that I think is the most influential and also has seen the most change is the advent of, and now the iteration around, political advertising. In the pre-2016 era, there were very few guardrails around what you could or couldn't run, and there was very little requirement to prove who you were before you ran an ad. That's changed a lot since then. It's still an imperfect system, but companies have made decisions about what they're going to take and when, and they've also put some strict guardrails around who can run political ads and what kind of disclaimers they need, which has really constricted and changed the type of people who are able to run ads and might be limiting some of the good actors rather than really taking care of the nefarious actors that these companies claim to be combating. So, yes, there's a lot to unpack when it comes to politics and tech companies' business models. I think, you know, if we follow the money, it's a really, really interesting conversation and one that could go on for hours, if not days.


Elizabeth: [00:06:01] Yeah, absolutely. Well, we'll try and keep it podcast-sized for today. There's a bunch in there that you bring up that I want to get to, but for a moment, let's kind of look at what the specifics of that kind of engagement looked like in specific campaigns. So, you know, pre-2016, Wild Wild West situation: is that a situation where, like, as a member of a campaign team, you're just, like, trying to figure out how to use Twitter and Facebook and YouTube and whatever else with no input from the companies? Was there any interaction? Yeah, what was that like at that point?


Becca: [00:06:37] It's actually a very interesting story to tell. And I think the story is clearest when we use Meta, formerly Facebook at the time, as our example. It actually was a higher level of engagement, and our engagement with them was much more strategic. So our client partners at Facebook back in 2016 and the "before days", if you will, really helped us build campaign strategy. It's worth noting that the internal teams at Facebook were organized by party. So we worked with Democratic reps and they would often come into our offices. I was at a Democratic consulting firm at the time running big ad budgets. And they would sit down with us and we would share with them our audience, our targets, our messages, our goals, and they would help us to determine the best buy set-up internally on the back end for these ad campaigns and really read us into their understanding of the algorithm. It was a very hand-in-hand relationship. They were absolutely strategic partners in every way. And honestly, a lot of what I learned was from those conversations and then observing how, sort of, our campaigns have changed in terms of performance since then and making inferences from there. After the 2016 election, there was a huge change in the way that we interacted with client services and representatives from these companies. Essentially, sort of, the strategic component of their jobs was cut. And it's interesting, you actually see a mass exodus. There were maybe ten people dedicated to Democratic campaigns at the time, but a lot of those folks left because the nature of their roles changed.


Elizabeth: [00:08:20] What prompted that?


Becca: [00:08:22] I think a lot of that had to do with the heat that Facebook took coming out of the 2016 election. They knew that they had a problem with bad actors on the platform using organic and paid content for disinformation and targeted campaigns. And they knew that those partners, I'm not going to speculate on what was happening on the right, but they knew those partners were working closely with outside organizations to try to hack the system. And optically that becomes a problem when you see just how dangerous that sort of dynamic can be and what those platforms can do when they're weaponized. So at that point, they made an internal decision, I'm inferring, to change the nature of those roles. And, you know, that was a conversation that we had with our reps. They had been, you know, sort of in the trenches with us, helping us to design these programs, really invested in their performance. And they were disappointed that, sort of, their paths at Facebook were changed. So now, and since 2016, really, but especially now, your interactions with client leads are much more about policy and content guidelines and what you can and can't do, which are very important conversations, and more sort of a stakeholder engagement relationship. And it's also about, you know, lines of credit and sort of the more operational business decisions that get made as part of a huge advertising campaign and massive engagements. But it's not, sort of, that sit down, "Hey, here's what we think is behind the black box. This is what you can do to hack the algorithm." It's very much a, you know, "we're here to facilitate your business, but we're not so much your strategic partner."



Elizabeth: [00:10:06] Right. That's a really interesting shift. And the shift being connected to the heat that Facebook and other platforms all took when foreign interference and disinformation and micro-targeting all of a sudden became this huge set of concerns that were, you know, talked about constantly on the news. Everybody had an opinion about it. And to a certain extent, it's still a concern. So, yeah, it's interesting to see that sort of shift from "you're a partner, you're part of the team," which could also be thought of as like, "Yeah, but is that like campaign donations and, like, how is that ever reported? And, like, are these platforms skirting rules there?" And in the U.S. you guys don't really have the same sort of spending limits that we do in Canada, but in Canada that becomes a bigger question of, well, just how much of that kind of strategic partnership is a reasonable amount?


Becca: [00:11:06] Totally. No, it's a really, really good point. And, you know, something you just said really underscored for me how disappointing it is that, when there were issues and, you know, all of these concerns being raised about the targeting that's available on the back end, sort of the lack of structure and regulation internally on what constitutes an ad that Facebook is willing to platform, et cetera, et cetera, everything that came out of the Cambridge Analytica saga, how disappointing it is that Facebook's reaction to that was, instead of sharing greater transparency and sharing about the changes that they were making to the system, to, sort of, remove that sort of insightful relationship and those strategic managers from the equation and create more of a black box.


Elizabeth: [00:11:53] Yeah.


Becca: [00:11:54] Our experience coming out of 2016 was that a lot of the targeting that used to be available to us just wasn't. And so the way that we would sort of build out our demographic targeting totally changed. And then we didn't have our strategic partners anymore, and now we're just sort of left to inference. And I think that is a nice sort of example of the evolution of at least campaigns' relationships with big tech and social media companies over time.


Elizabeth: [00:12:24] Yeah, that's a super helpful example. And yeah, it's interesting. You know, Cambridge Analytica happened, all of these concerns, and I think, you know, rightly, there was now discussion about the role of these companies in campaigns and people were spotlighting it in a way that's helpful. But you're right, instead of then going to, "Okay, more transparency, all of that info we were giving these specific campaign teams, let's give that to journalists. Also, let's give that to researchers and folks who can use that information to check on the ethics of what's happening," instead of making it more transparent, they pulled information away. And that's unfortunate, because it's really important for us to understand where our information is coming from if we're going to make, you know, good informed decisions in a democratic system.


Becca: [00:13:18] I totally agree. I think that's very well said.


Elizabeth: [00:13:20] All right. So we've talked a little bit about the history, a bit about what it looks like now. Do you have a sense of what's coming? Do you think that this is now the new status quo, what you talked about of the teams going, "Well, I guess we've got to figure out the demographics on our own and let's just try it out and see how it goes!" Or do you see potential for the relationship between these companies and campaigns continuing to change?


Becca: [00:13:46] It's a really interesting question. I think, you know, I'm coming from a little bit of a pessimistic space, given my experience in the 2020 election. What I found in terms of our partnership with Facebook and with other platforms is that a lot of the impetus for dealing with issues on the platforms has fallen onto campaigns. So, you know, we're talking about 2016 and all of the turmoil around misinformation, disinformation, micro-targeting, nefarious acts online, the Cambridge Analytica saga. As we were preparing for the 2020 election, and in my work for the Biden campaign, we were taking all of that into consideration and thinking about, sort of, you know, the role that Facebook and other social media companies were playing and trying to proactively address those sorts of issues. Our analysis was that it was lacking. Facebook often uses the line that they're so big that they can't enforce their own content guidelines, which is quite, quite the line. And we really saw that in action. And in response to that, we built our own proactive campaigns to try to counter the disinformation that we saw online and then stood up an infrastructure to intentionally track it and mitigate it. And so the relationship that developed over the course of 2020 was one of us identifying flags and raising them to Facebook, at this point Meta, to deal with, rather than them proactively identifying issues and bringing them to us. And I think that's really disappointing. I think, you know, there's very sophisticated technology, and we designed a program on the Biden campaign that I hope is deployed across different campaigns.


Becca: [00:15:32] It was called our Disinformation Mitigation Campaign. But basically we took it upon ourselves to use social listening to track the narratives that were moving online that were harmful misinformation narratives targeted at the President. We then took those narratives, based on volume, and used traditional polling to understand the impact that they were having on voters, so how it was impacting their view of the President. And then we tested a variety of different messages targeting these persuasion audiences who we were concerned about being persuaded by these misinformation narratives. We tested messages to see how we could, sort of, mitigate the effects of that disinformation. As part of that social listening process, we, of course, came across a lot of social posts that had a lot of momentum and an enormous amount of reach that did not follow Facebook's content guidelines. And we would share those examples with Facebook and eventually they would be taken down. And I will say that our lovely partners at Facebook at the time were very responsive and very supportive, and this is by no means a criticism of the individuals who are working there. But the system is currently set up so that, again, it's too big to be able to enforce its own guidelines. And that leaves campaigns and issue organizations in a position where they're the ones who have to proactively identify these issues, and that's a problem.
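
To make the social-listening step a bit more concrete, here is a minimal sketch of the kind of volume tracking Becca describes: posts are matched against a set of tracked narratives, ranked by total reach, and anything above an escalation threshold is flagged for a report to the platform. The narrative keywords, data fields, and threshold below are hypothetical illustrations, not the Biden campaign's actual tooling or any specific vendor's API.

```python
from collections import defaultdict

# Hypothetical narrative definitions: label -> keywords to listen for.
# These are invented placeholders, not a real tracking list.
NARRATIVES = {
    "narrative_a": ["example phrase one", "example phrase two"],
    "narrative_b": ["example phrase three"],
}

ESCALATION_REACH = 50_000  # assumed per-post reach threshold for flagging


def track_narratives(posts):
    """Rank tracked narratives by total reach and flag high-reach posts.

    `posts` is an iterable of dicts like {"id": ..., "text": ..., "reach": ...}
    coming from whatever social-listening feed is available.
    """
    volume = defaultdict(int)
    escalations = []
    for post in posts:
        text = post["text"].lower()
        reach = post.get("reach", 0)
        for label, keywords in NARRATIVES.items():
            if any(k in text for k in keywords):
                volume[label] += reach
                if reach >= ESCALATION_REACH:
                    escalations.append((label, post["id"]))
    ranked = sorted(volume.items(), key=lambda kv: kv[1], reverse=True)
    return ranked, escalations


if __name__ == "__main__":
    sample = [
        {"id": "p1", "text": "Post repeating example phrase one", "reach": 120_000},
        {"id": "p2", "text": "Unrelated post", "reach": 900},
    ]
    ranked, flags = track_narratives(sample)
    print(ranked)  # narratives ordered by total reach
    print(flags)   # posts large enough to escalate to a platform contact
```

In Becca's description, the ranked narratives then feed into traditional polling and message testing, and the flagged posts are what got escalated to the campaign's client services contact at the platform.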


Elizabeth: [00:17:03] Yeah, absolutely. And you know, in the U.S. context, there are multiple teams and groups of people, and in a lot of countries there isn't even a team or a person who is that main contact. Because one of the things I want to clarify there is, when you're saying, you know, you would raise this to Facebook, what does that mean? How did you do that? Was it, and you don't need to tell me the specific person, but was it like you had a dedicated line that you called? Was it, you know, you were texting with somebody? Like, what's the relationship between the campaign and Facebook when you're trying to be like, "Hey, there's something problematic I found"?


Becca: [00:17:43] This is a really good point and one that I take for granted because it's been a part of the kinds of campaigns and organizations that I've led digital strategy for. But we had dedicated client services representatives. So one of our client service representatives was one of my former colleagues and an amazing professional with awesome strategic chops, clearly also a very good relationship manager. And she was dedicated to the hard side of the Democratic campaign space. So she was on the Biden campaign versus all of the outside organizations who were supporting his campaign but could not coordinate. And she sort of played this internal role of, one, communicating to us all of the policy updates through 2020 (there were many) and working through what that meant for our programs, et cetera, and, critically, performing this sort of triage role internally. So as we would identify, you know, perhaps a group that's going viral and is just promoting misinformation, et cetera, et cetera, or individual posts that might reach a certain level of virality and also be mis- or disinformation, we would ping her with those specific links or direction to find whatever we were talking about, and she would run that up internally to go through both the literal execution and the policy process of acknowledging that something does not meet content guidelines and taking it down.


Becca: [00:19:18] And we were very fortunate that she, one, was excellent, but, two, that Meta had set up that system for us. And that's not true even for state-level campaigns, forget about local, and to your point, that's not standard across the globe. It's worth noting one of our RSM fellows is working on a project in her time here to try to stand up an escalation channel similar to this for journalists. And you look at journalists, especially female journalists, who are heavily targeted online. They're often disincentivized from doing their jobs because of targeted campaigns against them that go far beyond harassment. It is online abuse, and they don't have the same sort of escalation channels that I had the advantage of because I was in politics on a high-profile campaign spending hundreds of millions of dollars. And that's just not how client services should work. And it's certainly not how mis- and disinformation or abuse online should be regulated.


Elizabeth: [00:20:23] Yeah, I agree with you. And just a reminder, we put stuff in the show notes, so we will link to that project and other information in the annotated transcripts that we'll be creating and linking to. So yeah, this idea that, you know, you have to be big enough, important enough, valued enough to get those escalation channels, to get that dedicated person, that, you know, kind of makes sense when you come at this from an advertising perspective, which is largely how we've been chatting about this so far. But especially when we start bringing in things like dealing with mis- and disinformation, dealing with harassment and hate speech and other forms of abuse, there's also the perspective that these platforms are creating spaces that we have really important political conversations in, that a lot of our civic life is lived through. When we see these platforms as providing this kind of social infrastructure, really, we have to think about them slightly differently. Because you're right, the dedicated person for the biggest campaigns only makes sense when you think purely from the advertising perspective. When we think from the perspective of how you exist on the Internet and do your job on the Internet, that kind of needs to shift.


Becca: [00:21:37] Of course. No, I totally agree with that. And I think, like I said, you follow the money and you sort of see the patterns of these companies, but also you follow the influence, right? You know, we weren't running big advertising budgets while I was at the White House, but we still had access to that person at a moment's notice, and they prioritized our asks above all else. Now, I see the value in that, as I think it's critically important that the White House, regardless of who's in power, has their message get out effectively and efficiently. But it's not right that, sort of, the escalation channel that's available to people in power and brands or campaigns that are spending a lot of money isn't available to a female journalist who's undergoing a vicious attack online because of something Tucker Carlson said, and, you know, he's got all the trolls fired up and now her online and offline safety is at risk. Or even your average person who's dealing with a similar kind of abuse online. And that's something that we need to reckon with, perhaps through outside organizations, or the companies themselves need to step up and figure out a way to better govern.


Elizabeth: [00:22:53] Yeah, absolutely. And on the "companies themselves" front, the last episode that we recorded was with Julie Owono, who is a Facebook, or Meta rather, Oversight Board member. And so we had a great conversation. So if you haven't listened to it yet, go back and listen to that episode, because we get into a lot of the questions around the role of these companies as they're doing moderation and thinking about what should and shouldn't be on their platforms. So yeah, interesting episode to go back and listen to.


Becca: [00:23:24] Julie's amazing and I will definitely be listening. I would love to just hear someone pick her brain all day long.


Elizabeth: [00:23:31] Yeah. Yeah, it was a very fun chat. All right. I want to switch gears a little bit and go back to something you brought up at the very beginning, this idea of, you know, how what you're allowed to advertise, and the extent to which you can target, and what information is required has shifted and changed. And a lot of that is related to this idea of having these ad registries, these repositories of all of the political ads. In Canada, that's both partisan ads from campaigns and then what we call issue ads. Each jurisdiction kind of has their own set of rules. But what I think is interesting about that example is it shows how we're pulling in, you know, government and policy decisions and how that then shifts the relationship between the company and the campaigns. And, you know, thinking about these different actors that are all involved, I think, is an important part to really get explicit about, because it's really easy to feel like, "Oh, these companies, they're huge and they're just making their choices." But government's having some impact.


Becca: [00:24:41] Of course. And just the fact that, you know, ad transparency reports exist now gives them another layer of accountability that didn't exist before. And that certainly is a response to sort of the black box of 2016: none of us know what happened, because there's no repository of the ads that were run. And I think that's really exciting. One of my former employers, Bully Pulpit Interactive, has an ad tracker, and there you can go and see, based on political party, the spend, targeting and the individual ads. And if folks haven't spent time sort of going through that sort of tool, I would encourage you to do so. It's very eye-opening. You know, that kind of transparency reporting is only available for political ads, and of course, there are other kinds of dangerous actors out there, but at least Facebook and Google have a pretty broad definition of what constitutes a political ad. It's pretty much anything issue/advocacy-related or political in nature. And I think that's a big step forward. It provides researchers the ability to go and observe and analyze the kind of content that's running for a given campaign. And of course, that then enables them to shine light on potential issues, and that can make its way up to regulation.
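
For readers who want to explore this kind of transparency data directly, Meta exposes its Ad Library through a Graph API endpoint (Google publishes its political ad transparency report as downloadable data). The sketch below shows roughly what a query looks like; the API version, parameter names and fields are based on Meta's public documentation as I understand it and may have changed, and a real call requires an identity-verified developer access token, so treat this as a starting point rather than a guaranteed recipe.

```python
import requests

# Assumed endpoint and parameters for Meta's Ad Library API; check the
# current documentation before relying on exact names or the version.
ACCESS_TOKEN = "YOUR_TOKEN_HERE"  # requires identity-verified developer access
URL = "https://graph.facebook.com/v18.0/ads_archive"

params = {
    "search_terms": "climate",                  # example issue keyword
    "ad_type": "POLITICAL_AND_ISSUE_ADS",       # restrict to political/issue ads
    "ad_reached_countries": '["US"]',
    "fields": "page_name,ad_creative_bodies,spend,impressions,ad_delivery_start_time",
    "limit": 25,
    "access_token": ACCESS_TOKEN,
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for ad in resp.json().get("data", []):
    # Each record describes one archived political or issue ad; spend and
    # impressions typically come back as ranges rather than exact figures.
    print(ad.get("page_name"), ad.get("spend"), ad.get("ad_delivery_start_time"))
```

Trackers like the one Becca mentions are typically built on top of archives like this.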


Becca: [00:26:02] The other thing that I think is worth noting, just as we're talking about how much things have changed, is obviously the advent of these transparency reports, of which the two big ones are Google's and Meta/Facebook's (I'll never be able to fully commit to Meta). The other thing that's changed a lot since 2016, and I don't think a lot of people know this, is what is required on Facebook specifically to run political ads. And that means the Facebook ecosystem: you can run ads across Facebook and Instagram that are political. You know, there's a lot of interesting conversations happening about digital identity and what that means, how we validate identity as a way to hold folks to a higher standard as they are engaging in online ecosystems. And, you know, this is a big open question when we talk about decentralized social network platforms and how we might make identity consistent across new social media platforms. The way that Facebook does this for political ads is really interesting. So I'm outdated, maybe things have changed in the last three years, but at the time that they implemented this rule, which was in the lead-up to 2020, they had a process to validate the identity of anyone who was going to run political ads.


Becca: [00:27:15] And the goal is that you prove you are a citizen of the country in which you're running ads. You can see sort of what that was in reaction to. So you share a picture of your passport with Facebook, passport or license, I'm not sure if it was one or the other, and they validate that on their end. Then they send you a code through the mail to your home address, which you then use to register your business account on Facebook, which has to be connected to your personal account. So all business that you run, all ads that you run as an advertiser on Facebook, are connected to your personal Facebook account unless you choose to make one. And then that's the whole thing, and eventually you're verified to run political ads. That doesn't last forever; it has to be re-verified. And I'll note, God bless them, they tried to stand this product up very quickly and did a great job, but there were some bugs. So in the run-up to 2020, we're all trying to get this implemented. A lot of bugs, some of them user error, some of them technical in nature. But I share that as an interesting example. Obviously very involved. I—


Elizabeth: [00:28:19] Yeah.


Becca: [00:28:19] Mean, there were some issues with mail … but you know, as we talk about digital identity, this is something Facebook is doing. It's a cumbersome process. It's imperfect. But I don't see it come up a lot in the conversation about identity, digital identity online, and sort of that as part of the solution to healthier online ecosystems.


Elizabeth: [00:28:41] That's a really, really interesting example. And I, I definitely didn't know the part about mailing a code to the address and then having to use that. Like, that's also extremely time-consuming. Most of the time we think, "Oh yeah, online advertising, you can flip a switch, you can just immediately get everything up online. It's no big deal. It's going to be super quick." And you're like, well, actually now we need to wait 2 to 3 weeks for the mail to come through.


Becca: [00:29:08] One of the biggest tech companies, the biggest tech company on Earth, maybe number two, is using snail mail as a way to verify digital identity.


Elizabeth: [00:29:18] That's... That's a lot. That's super interesting. And yeah, I will, I'm going to look up what they've done since. I remember in the Canadian election, so ahead of the 2019 federal election, there was the Elections Modernization Act, and it basically was meant to deal with some of the foreign interference, disinfo and a number of other things, and update our election laws generally. And one of the things that came out of that was, like, you probably shouldn't be allowed to pay for Facebook advertisements with rupees. You should probably have to spend Canadian dollars to buy the advertisements.


Becca: [00:29:55] You took all that to get to that conclusion.


Elizabeth: [00:29:57] Yeah, exactly, exactly. And it's, like, there are some little things that change. And then there are other things where, well, it's actually a very big problem. I remember also talking to somebody from Twitter, and they were talking about the differences between how the U.S. and Canada identify what we call third-party actors: civil society groups, PACs …


Becca: [00:30:20] Yeah.


Elizabeth: [00:30:21] People and organizations that are not political parties or candidates who are involved in election advertising. And apparently Canada and the U.S. have drastically different approaches to just, like, assigning numbers to these organizations, making it really difficult to identify them and ensure that their advertisements are in fact indexed properly. And so even just thinking about the bureaucratic aspect and the fact that nation to nation, across different states, they're all going to have different data structures


Becca: [00:30:53] Yes.


Elizabeth: [00:30:53] For these things, which is another layer.


Becca: [00:30:55] Well, and no, totally. So that's just the digital identity piece for advertisers, folks (hands on keyboard) either looking at reporting or executing the ads themselves. The other big update that Facebook made in the lead-up to 2020, and boy, this is going to give me PTSD, thinking about all the …


Becca: [00:31:14] Bugs that we ran into and, you know, all the times that our ads accidentally got shut off because they thought something was wrong with them. But they instituted a disclaimer policy for all political entities, which is what you're getting at. And based on what kind of entity you were, you had a different tax number or way of verifying the legal standing of your given organization. And then as the ads run, there is a very clear disclaimer above the content that explains what this organization is and what it does, which actually I'm a huge proponent of. Aside from all the bugs and the issues with our ads not going through, which I'm sure have been remedied since 2020, I thought that was a really interesting and helpful way to validate the kinds of organizations that were running ads. You know, that's an imperfect system. It's easy enough to get an EIN and go and run an ad campaign under a "new entity" (I'm doing air quotes), but that was a big step forward. And I think, you know, anything we can do on these platforms to make advertising, and especially political advertising, more clear is important as a way to, sort of, you know, establish that this isn't just content streaming through your feed, and make people stop for a second and assess whether something is mis- or disinformation, you know, whether it's being spoon-fed to them for a reason, they're being targeted—


Elizabeth: [00:32:46] Yeah.


Becca: [00:32:46] Or if it really is coming to them organically.


Elizabeth: [00:32:49] Yeah, I think that that's right. And, you know, we spend a lot of time talking about media and digital literacy and the importance of understanding why things show up on your screen. And these kinds of measures give people the tools.


Becca: [00:33:04] Yes.


Elizabeth: [00:33:05] Whether or not they want to use the tools, that's a different question. But they give people the tools to actually enact that digital literacy if they want to and if they've kind of been given the background on it.


Becca: [00:33:15] Yeah. And I mean, you know, the positive side of all of the issues with political advertising and digital political advertising specifically, the benefit is now folks are more aware. I think the average consumer is a little bit more critical of what they see in their feed on a day-to-day basis. And I'm really disappointed that it took, you know, the whole Cambridge Analytica cycle and sort of all of the coverage of big tech and the atrocities that they've committed to get people to that point. But I think little by little, the sort of narrative around big tech and political advertising has helped folks to become more literate or at least more pessimistic.


Elizabeth: [00:33:57] Yeah, I think, I think you're right. I think it has shone a light in really important ways. And it's been really interesting to chat with you about this, sort of, shift over time of how that relationship has changed and modified with the circumstances.


Becca: [00:34:17] Absolutely. Yeah. And you know, in part due to PR campaigns, in part due to technical infrastructure that needs to be built as we get more sophisticated with our advertising and, frankly, also our organic social media strategies. But it'll be really interesting to see where we are come 2024. And my hope is that as the companies all work to be a little bit better, there are also outside actors and new tools that can be implemented to mitigate these issues even further.


Elizabeth: [00:34:48] Yeah. Yeah, that makes sense. This has been a wonderful chat. We could keep going for ages, but unfortunately, we don't have time. So I will end with one final question. It's a little pop quiz, as I do for all my episodes. So for this one, I want to know whether or not you think the relationship between tech companies and campaigns is different depending on what campaign you're with. Is it national, state, local? Is it based on partisanship? Those kinds of things? Are there differences in that relationship between tech companies and campaigns based on the campaign?


Becca: [00:35:31] Absolutely. I think, as with most things with tech companies, they resource according to revenue. And the bigger the campaign, the greater the revenue, and teams are staffed as such. I have only ever worked on the Democratic side of things, but I think there is an equal if not greater chance of making money on the Republican side. And so I operate under the assumption that each of these companies staffs the two sides of the aisle equally. But absolutely, there's a direct correlation between the attention, the resources, the staff allocated to an account or a campaign and the amount of money that they're spending.


Elizabeth: [00:36:19] Wonderful. Thank you so much. This was a great conversation.


Becca: [00:36:22] Thank you. I'm so excited to follow along with your series and to have more folks from the BKC and Institute for Rebooting Social Media team on the Pod.


Elizabeth: [00:36:34] Yeah, me too. We've got a great lineup for the next little while, so it's going to be a good—a good season.


Becca: [00:36:40] It will. Thank you.


Elizabeth: [00:36:41] Thanks. All right. That was our episode, looking at big tech, political campaigns and advertising. I hope you enjoyed it. As always, you can check the show notes for more resources or head over to polcommtech.ca to find annotated transcripts that are available in both English and French and have tons of links to a bunch of useful resources. Thanks so much and have a great day.
