Wonks and War Rooms

Technological Affordances with Rachel Aiello

November 24, 2021 Season 3 Episode 7

This week Elizabeth chats about technological affordances with Rachel Aiello, an online politics producer for CTVNews.ca and a member of the Parliamentary Press Gallery. They chat about how technology is changing how journalists report and how audiences receive information, from politicians on social media to journalists working from their phones. They also talk about technological determinism to highlight why it is important to think about affordances in the first place.

 

Additional Resources

Elizabeth uses this article by Bucher & Helmond and this article by Nagy & Neff to build her definitions of the different types of affordances she discusses in this episode.

Rachel discusses how phones have become a key tool for journalism; check out the Mobile Journalism Manual to learn more.

Elizabeth mentions Authenticity with Kevin Parent from Season 1, Episode 7.

Rachel and Elizabeth talk about the changes to the news cycle and how news is consumed. This study from Verizon shows how often people watch videos on mute.

 Elizabeth also discusses technological determinism. Here is an overview of the theory.

Check out www.polcommtech.ca for annotated transcripts of this episode in English and French.

Elizabeth: [00:00:04] Welcome to Wonks and War Rooms, where political communication theory meets on the ground strategy. I'm your host Elizabeth Dubois and I'm an associate professor at the University of Ottawa. My pronouns are she/her. Today, I'm recording from the unceded and traditional territory of the Algonquin people.

 

[00:00:18] In today's episode, we're talking with Rachel Aiello about technological affordances. Rachel, can you introduce yourself?

 

Rachel: [00:00:25] Yeah, I'm Rachel Aiello, and I am the online politics producer at CTVNews.ca, and I've been a member of the Parliamentary Press Gallery since 2014.

 

Elizabeth: [00:00:34] Wonderful. Thank you so much for being here, I am so excited to talk to you. We're talking about affordances today, and I thought that you would be particularly interesting to talk to about affordances because you have dealt with so many different channels of communication and forms of information.

 

[00:00:51] So, [here's a] quick rundown on what affordances actually [are]. In academic literature, the idea of affordances is actually used in a bunch of different disciplines. What it boils down to is affordances are the things that objects allow you to do. And so we're going to talk today about technological affordances, which are essentially the technological features that enable or constrain different kinds of actions. So we might have something like WhatsApp allowing us to auto-delete messages or Twitter constraining how many characters we can use [or] a news website allowing or disallowing comments on an article. Those are all different kinds of affordances.

 

Rachel: [00:01:34] Cool.

 

Elizabeth: [00:01:35] Is that making sense so far?

 

Rachel: [00:01:36] So far, so good.

 

Elizabeth: [00:01:37] Excellent. Ok, so there is a bunch to unpack with affordances (like any academic theory), but let's ease our way in. I'm wondering if you see any connections [between] this idea of affordances [and] the kind of news production and sharing work that you do?

 

Rachel: [00:01:56] Absolutely. So I think [it connects] in a couple of ways. One of them for me [is that] I mainly do online reporting, so we are watching what people say online, but also using our website to put out content. But I think I noticed this the most on the election campaign trail—that what I was able to do was constrain[ed] to my phone in a lot of situations. I was using it in multiple different ways. It was my main channel of communication to my desk [by] using email and Slack to message my team to figure out what's going on. But also being able to use it as a recorder and recording what's happening in press conferences. I'm also holding it in my hand with notes for questions to ask at press conferences. So it's kind of using all of these different elements of it, and basically my entire reporting and job was connected to the device in my hand.

 

Elizabeth: [00:02:44] That's such a useful example, and it's so true. We have these devices that afford us so many different things. They can do so many different things and that has changed as the technology has evolved too, right? Like a phone used to be [that] it afford[ed] you calling someone.

 

Rachel: [00:03:01] Right. And so now [though], I can take a video or a photo of the press conference [and] immediately tweet that out. I can then go into Safari, grab the link and share that. Now there's an app to record so you can record your transcripts on there and grab through, listen to it, stick in your headphones, and you're on a call or you're listening to that. So I think the only limitation in the phone for me in my job is the publishing to our website—that stuff still is "laptop required". But in pretty much every way, my phone can be basically my main reporting tool now. That isn't usually the case; most days I'm sitting at a desk with a computer. But absolutely when you're on the Hill or when you're on the road traveling [or during] election campaigns or [when] you're out covering a story. There's a lot that you can do just [with] the thing in your hand or in your pocket.

 

Elizabeth: [00:03:53] Yeah, that is pretty cool, and it has definitely changed the way somebody preps to go out in the field.

 

Rachel: [00:04:01] Absolutely. So battery packs are necessary! But the up[side]s and downsides of it [are that while] you can get things out really quickly because you have that ability right in your hand, [you are] feeding the news cycle necessity of needing to, you know... people know you have that ability, and then so there's that pressure sometimes to get things out quickly or to be able to, in the moment, be feeding things. And I think there is a merit and value in that. But there's also those times where you do need to kind of sit down, take a breath and look at the whole picture.

 

Elizabeth: [00:04:33] Right. Yeah, because it creates this expectation [that] we've talked about in other podcasts [about] what a news cycle actually looks like and how technology might be affecting that. And moving from, "OK, well, there's a twenty-four hour cycle and there's daily deadlines for things." And that's not a thing anymore because we've got these tools that afford you the ability to do things on the go and quickly and edit stuff really fast.

 

Rachel: [00:04:59] Yeah, and it's not even just us, the journalists in the election campaign (to continue that example), that have this ability. The political people who we are covering as well also do. So they have the affordance—or the ability—to get out their own messages that way or conduct their own media promotions, social media, [and] all of those things on their own. Obviously, I think there's a different role for the content you would then get and see from us on the road. But [the political actors] do have the ability to then take that and do it themselves. One of the best examples I saw of this was [that] all of the leaders use[d] social media in different ways during the campaign. Obviously, Jagmeet Singh and the NDP were big on TikTok, and he was pulling his phone out at every stop to promote his events. But Justin Trudeau was doing... I was with them the first week and I noticed after events he would go out or off to the side with whatever candidates or people who were standing behind him and [they] did their own version of the announcement that they just carried on national television [but for] Instagram. And so it was like, "I'm here with Doug Eyolfson at this grocery store in Manitoba, and we just made this announcement about ~this~." And [Justin Trudeau was] able to then share the message of what the announcement was, [and it was] usually obviously something smaller and rarely, depending on the day, was [it] the actual headline or what we would get out. So there is that ability for them to use their phones and get that stuff [out]. I'm sure that went through layers of editing before it actually got posted. But he was recording it himself on his phone selfie-style.

 

Elizabeth: [00:06:32] Right, yeah, and it gives that ability to do it right there on the road, and it gives that ability to create a sense of personal connection, which is...

 

Rachel: [00:06:41] Authenticity.

 

Elizabeth: [00:06:41] Yup, [authenticity] is totally core to a lot of comms strategy. We did an episode in our first season of Wonks and War Rooms on authenticity with Kevin Parent of Ottawa Public Health, and from crisis comms to election comms to whatever it is, that authenticity is a core goal right now.

 

Rachel: [00:07:01] And I think one of the big changes that the pandemic brought was our audiences are a bit more forgiving or generous when it comes to the quality [now]. Not in the factual [way] or [in] the knowledge [aspect] of it, but if your video is a little shaky or it got cut off weirdly or something like that, it's because [of] the nature of [the pandemic]. Or your cat jumps into your screen [on] Zoom—like these are just the limits of the technology that we have been using in the pandemic or out on the road or whatever situation. So I think audiences have become a bit more desensitized and less requiring [of a] complete TV studio setup and the polished edited [coverage]. There's a lot more native video going around that people are just more used to seeing, and they don't always think that a news piece or a news item or what you're getting from me on Twitter or Instagram or something like that [is] necessarily what's going to go to air or be in my story. It's different perspectives on things. And so, I think that's also a message [about] reading [from] a bunch of different places or getting your news from more than one spot. But absolutely, having the ability to use your phone or other technologies—[journalists] can do it, our audience can do it. And there's this new layer of relatability.

 

Elizabeth: [00:08:17] Yeah, yeah, I think you're right. And it's interesting because when we're thinking about affordances, originally, the theory was talking about what the technology offers, and it was often thought of in terms of what a tech company building a piece of technology was intending to create and intending you to use. But in reality, affordances are whatever the technology enables you to do, and it's usually in a social context. And so this idea of creating this relatability and this personal connection that you can do via having a weird cut off video that's a little shaky—that was probably not what people who created Instagram lives were going for, but...

 

Rachel: [00:08:59] Right, in the same way that the Facebook we've ended up with was not the Facebook, I think, they probably started out intending to build, maybe. But... And I think that's just the nature of technology and people pushing the boundaries and trying to figure out what else can be done with things. Like, for sure using your phone as a reporting tool has not necessarily been new, but I am curious to see where things go from here. [For example,] you can use it to go live and do hits or Facebook things or FaceTime, things like that. There was some of that we were seeing on the campaign trail as well. What would before be, "You have to set up a tripod, you have to dial into our system and do all of that," [is now] just like, "Do you have headphones? Is your battery charged and are you at a decent angle?"

 

Elizabeth: [00:09:43] Yeah.

 

Rachel: [00:09:44] And [if so] let's just talk to you. So...

 

Elizabeth: [00:09:46] Yeah, and like maybe don't have sun right behind you. [Laughs]

 

Rachel: [00:09:48] Ideally, but some days, you know, if the news is big enough, we'll get you on whatever way it looks.

 

Elizabeth: [00:09:52] Yeah, I think that's a really interesting shift because we often think about the value of news as being professional and expert and trained—and it is still all of those things—but [now] we don't have the polish requirement the same way we might have before.

 

Rachel: [00:10:15] Yeah, and I think it's definitely a challenge for reporters to keep in mind the content of [news] because it's very easy [to forget about it] when you're on the fly and there is a value in being casual sometimes. But at the end of the day, our job is to be factual, accurate, fair, [and] balanced. And so, some reporters have an ease of picking up the phone and doing six things at once, [but] not everyone is that way. And that doesn't make you any better or [worse] of a reporter, it's just a different style. But it is absolutely a delicate dance, and I know journalism schools are probably having to grapple with the, like, "OK, we need to teach you the basics," right?

 

Elizabeth: [00:10:47] Yeah.

 

Rachel: [00:10:47] Like C[anadian] P[ress] style and grammar. But also you need to be able to use your phone, do video, do this or that, because that's going to be the expectation when you go into the field: that you have the ability to juggle those things and still do the core part of your job, which is reporting what's happening.

 

Elizabeth: [00:11:03] Right, and so reporting what's happening then requires all kinds of digital literacy. You need these skills and understanding of how these tools work technically, but also how people are going to interact with them and what people will trust [and] what people won't trust.

 

[00:11:18] I want to go back to [when] you mentioned thinking about the content. One of the other ways that affordances might play in is in the way content needs to be shaped to play on these different kinds of platforms. And I know that you do stuff, obviously for CTV online, but also [you do] podcasting and [stuff] on television and across...

 

Rachel: [00:11:40] Newsletter.

 

Elizabeth: [00:11:40] Social media. Newsletter—yeah, the newsletter! So, do you think about creating your content differently for those different platforms?

 

Rachel: [00:11:50] Platforms. You have to in some ways. Like again, at the end of the day, the story is [the] story, but we do have to think about the audience and how where they are reading it is going to change things. So in the case of the newsletter, I know that most folks are probably opening it up on their phone. It's called Capital Dispatch—during the election it was daily and we called it Election Dispatch [and] it was basically a recap of what happened that day on the campaign trail. And so instead of presenting you with a full thousand-word story, it would be briefer with links back so if you did want to read more into it, you would be able to do that. And so that whole experience is thought about in the way that people are reading it, whether they've opened it up on their computer or not. It's an email format, so you don't have that long. I try to keep it tight. So in that sense, it's definitely the one-line kind of [approach, like], "This is what you need to know. Follow through on our website."

 

[00:12:50] One of the joys of being a mainly online reporter is I don't have to worry about time limits. Word counts [do matter] in the sense of at a certain point, folks stop scrolling so you have to hit your sweet spot and keep them engaged. But you do have the ability to go longer—it's not like my TV colleagues who have like, "OK, you've got a minute 30, so you have to fill a minute and 30 seconds." So the way that they draft their stories is different in the sense of you have to write to pictures; you have to think about how are we going to put video or visuals on the screen that match what I'm saying? And if I'm talking about a dull policy thing on the Hill, sometimes you don't really have... it's like, how many Parliament Hill stock [images/videos] can we put on the screen?

 

Elizabeth: [00:13:32] And will they actually be useful for the person viewing it?

 

Rachel: [00:13:35] Right, and because the other part of this is so much of the news now people are watching on mute. So you have to have that factor as well of conveying the story [even] if they're not listening to it.

 

Elizabeth: [00:13:44] So even television, there's an assumption that people watch on mute?

 

Rachel: [00:13:48] In certain situations. Definitely on a daily news channel, I'd think. You know, you see [them playing on mute] in airports or barbershops, that kind of stuff. So the banners and the things you see across the bottom are deliberate in the sense of conveying the story. And sometimes [...] you kind of have to put [the video] into that context. National news is a different story. Obviously, that's programming TV; people are sitting down and watching. But even in a lot of our social media now, we're keeping in mind the mute factor. You know, you're scrolling through Instagram or Twitter [and] you want to watch the video but you're at work or you're somewhere where you can't just let the music play (or the audio). So we are putting captions on our videos and trying to think about... Yeah. So a roundabout answer to your question is we absolutely do think about the platform in which people are accessing the media. It doesn't necessarily change the content, but it probably changes the structure.

 

Elizabeth: [00:14:40] Right, that makes a lot of sense. And I love the mute example because that's a very clear way we can think about affordances, right? Like both television and our phones [or] wherever on the internet allow for the audio, but they also allow people to turn the audio off. And so you have to plan for that. You have to plan for the choices that these people might be making when they're presented with, "These are the options."

 

Rachel: [00:15:05] Same way that you can put six headlines on a front page, and if nobody reads past the headline, we can't do much about that. They have the ability to click through and read it. We hope that they always do, but that's not always the case. So you do... There are always those thoughts in the editorial process. Like, the headline is the headline for a reason because you're conveying the news as quickly [as possible] because you know that not everyone's going to go all the way through the piece. And that's why, generally, news stories are structured the way they are—all the good stuff at the top and the more background and details as you go down.

 

Elizabeth: [00:15:38] Right, exactly: get those top level things; make sure that the stuff that people are less likely to opt out of gets them the core info they need.

 

Rachel: [00:15:48] Yeah, exactly.

 

Elizabeth: [00:15:49] Cool. So one of the reasons that I find the idea of affordances so useful is because it helps us deal with this thing called technological determinism. Have you heard of technological determinism?

 

Rachel: [00:16:00] I mean, it sounds familiar, but I couldn't tell you exactly what it means so I'm looking forward to you explaining it.

 

Elizabeth: [00:16:07] Excellent. So technological determinism is basically this idea that the technology determines how we're going to interact with it and what the outcomes are going to be. So an example here is [that] when the internet was first created, a bunch of people truly believed it was necessarily democratizing because of the structure of the internet. Because of the decentralized nature of how it was created, it was like, "Look, everybody is going to have equal access. Everybody's going to be able to give their opinions. We're going to have true democracy because of this technology."

 

Rachel: [00:16:42] And free speech.

 

Elizabeth: [00:16:43] And free speech. Exactly. Yeah. And like, obviously, that's worked perfectly. [Laughs] Right? Like, [the internet is] not necessarily a democratizing tool. It's true that you can use the internet to serve pro-democratic desires... But also anti-democratic, right? There's ways to use it however you want. So, the technological determinist would say, "Because the technology is here—because we got Facebook—we were always destined to end up wanting to see really short videos that we could turn the sound on or off [of] and we were always destined to end up with captions and that was predetermined [for] us. We are just humans who ended up having to follow the will of the technology."

 

[00:17:28] On the flip side, there's the idea of social constructivism [which says that] actually we all use tools just however we want, and it's all about the social context of them. Which also is not entirely true. [For example,] we can go back to [the constraint on] the number of characters you're allowed in a tweet.

 

Rachel: [00:17:43] Right, yeah, or algorithm[s] serving you up deliberate things for deliberate reasons.

 

Elizabeth: [00:17:47] Exactly, exactly. So the idea of affordances kind of allows us a middle ground between those. We're able to say, "Yeah, technology shapes the choices that we have," but also, "We are making choices and there's social context to those choices that help[s] us figure out what the actual impact of technology is going to be. What the actual impact on how we get and share information is going to be."

 

Rachel: [00:18:09] Right, so [if you were] the user on the phone in that [earlier] example, you could download multiple news apps and check them regularly, you could download Twitter. Or, you could not. And, you could be a radio news listener or watch [news] on TV at night. You have the ability to become wildly educated on every news [story] possible at any moment of the day, or you could choose to not, or you could choose to pick one and only listen to, watch, or read that one. So that is an interesting general life/societal conundrum...

 

Elizabeth: [00:18:39] [Laughs] Yeah.

 

Rachel: [00:18:39] ...of my business, [or] I guess you could say, in the technology of journalism?

 

Elizabeth: [00:18:44] Yeah, and I think that it connects back to this idea of digital and media literacy too, in a lot of the ways that you were just describing. You could choose a bunch of different sources ([this is called] lateral reading). You could choose to click all of the links that you have in your newsletter to read the rest of the stories. You could choose not to do those things. You could choose to go and use a search engine to verify something that you saw in a TikTok by somebody who you think is really funny but you don't actually know if they are a trustworthy source of information. And having all of these options available can also get a little bit overwhelming because, you know, "I am afforded all of this access. And now what do I do?"

 

Rachel: [00:19:28] Yes. Or, "I only have a few minutes in the day to follow the news and the way that it was presented in this Instagram picture that was really artfully designed is the only facts I need." And that's not always the case, but... [Laughs]

 

Elizabeth: [00:19:39] Yeah, exactly. And so that there is a really good example of how the social context then impacts the affordances we have because we are making these choices based on how much time we've got, and how pretty the Instagram picture was, and it all starts to layer on top of each other. It makes it really difficult then, obviously, to deal with a problem like, "How do we deal with disinformation?" Because like...

 

Rachel: [00:20:04] Yeah.

 

Elizabeth: [00:20:05] It's multifaceted.

 

Rachel: [00:20:07] Well, and also in the sense of the conversation a lot of people are having right now about access to journalists online and the hate and attacks that journalists receive. [As a journalist, you] need to be, to a certain degree, on social media for your job. But more and more, we're seeing that that is also the place where the audience is coming back to you in ways that are... "not positive" would be the biggest understatement of the day. But that is also a factor in our jobs now. You have to keep in mind or consider the feedback element and the hate and the, you know... Even if someone didn't like your article or [if] it's to the extreme of inciting violence and a whole lot of nastiness. But that is the, I would say, probably downside to having this authentic posting ability all the time. It's [that] they're also able to respond to you or get in your face in that way.

 

Elizabeth: [00:21:04] Yeah.

 

Rachel: [00:21:05] That is a factor that we didn't necessarily have even that long ago.

 

Elizabeth: [00:21:09] Yeah, absolutely. And it's interesting because that example there is another way in to understand the role of tech companies, right? Because... let's take Twitter, for example. Twitter is the site of a lot of this kind of harassment and hate speech. And Twitter, over the years, has been trying to figure out, "OK, how do we deal with this? Do we want to deal with it at all? And if we do, how do we deal with it?" And they've created new affordances to try and test out, "Does this help?" So like, mute buttons and block functions and...

 

Rachel: [00:21:42] You need to read the article before you tweet it—or they at least ask you to. Which, for me, I laugh every time because I'm sharing my own article, [so] it's like, "I wrote the thing!"

 

Elizabeth: [00:21:49] Yeah. Yeah.

 

Rachel: [00:21:50] For sure, I'll click on it.

 

Elizabeth: [00:21:51] It's like, "All right, yes, I want positive reinforcement for this." [Laughs] Yeah, that's exactly it. And so building in those different kinds of affordances—which, to varying degrees, are successful or unsuccessful—[is] an example of how these tech companies actually have a whole lot of power, right? Like I talked about technological determinism, and generally we want to avoid being too technologically deterministic, but recognizing the power of a company like Twitter to determine whether or not its users are going to have any sort of recourse when they're being harassed—that's an important, powerful role.

 

Rachel: [00:22:30] Yeah, well, and they also have played a role in taking people away. Like de-platforming people or having them be removed and not have that ability. So it isn't entirely a democratic choice.

 

Elizabeth: [00:22:41] Yeah, yeah, exactly.

 

[00:22:42] So we are getting close to time, but I wanted to touch on two other types of affordances that are talked about in the academic literature. One is the idea of imagined affordances, which is essentially when the users start to perceive affordances whether they're there or not. And you start to imagine what a tool can be used for and why it should be used and in what context it should be used. And so, sometimes as users, we actually kind of constrain ourselves, even if the technology hasn't done that. And [conversely,] sometimes we imagine that a tool is going to be able to do something that it actually can't do. Like, you know, how many times do you say, "Well, like Google probably knows the answer to that," and then when you actually go searching for the information, you're like, "OK, well, maybe... maybe it's not an all-knowing machine."

 

Rachel: [00:23:36] Yeah.

 

Elizabeth: [00:23:37] The other kind of affordance that sometimes gets talked about [is] hidden affordances, [which are affordances that] might be there, but we don't even know it's happening. So an example is Facebook's News Feed. You can actually go and toggle how much of different kinds of information you want to show up, but most people aren't aware of it. And then beyond that, there's also the fact that Facebook's algorithm feeds you information that you're likely to like and share and click on, and we might not know about that either. Those kinds of things, which then impact your news diet pretty substantially if you're using social media, are just kind of under the radar.

 

Rachel: [00:24:17] Yeah, and I guess that is—to the first point about perceived or imagined affordances—it would be when you don't realize that you're in a news silo or in a bubble and you're only getting news from one perspective. Or you think that, because you've seen a headline from a news outlet on this, that they haven't also done a story on y, and so there is that requirement for you to go and do that checking. But one of the more tricky ones is figuring out algorithms in the sense of how... Even us as journalists, we are news consumers and [need to be] making sure that we continue to get a broad range of things, but also keeping an eye on the people we are meant to cover. So you do have to kind of structure [your information consumption]. Like, I make deliberate choices—I make lists of MPs, for example, so I can go and know that...

 

Elizabeth: [00:25:06] Like Twitter lists kind of thing.

 

Rachel: [00:25:08] Yeah, and so you can keep track of certain groups that way. Otherwise your main feed can probably get fairly echo chamber-y, depending on what you're reporting on or following. And that is, again, one of those things that, because [media literacy] is core [to] my job, that's the thing I'm aware of. But absolutely, every user probably doesn't think through [Twitter] in that way and they just assume that the people they follow are going to give them all the news they need to know. And that's not always the case. There's tons of stories that not everyone is covering or that are in an ethnic media market—[something that is] a major story there that we had a blind spot to. There are so many examples of the ability of technology [to allow] that news to get out there, but there are still limitations in that. There isn't just a firehose of information coming in that we get to pick and choose from. It's deliberate and [like] turning on the right taps, I guess you could say.

 

Elizabeth: [00:25:59] Yeah. Turning on the right taps is a really good metaphor to use because there is this flood of information all of the time, and so recognizing that you're only getting it from one tap and then figuring out which other ones you might want to turn on—that's digital and media literacy.

 

Rachel: [00:26:17] Well, and also even in our job of covering politicians—having to keep in mind what taps they are and aren't turning on and figuring out when they're using new technologies. Or, you know, using phone banking or telecalling/teleconferences—how are they using these things? Who are they speaking to and what new technologies or new evolutions in political campaigning or messaging or all that are they now exploring that we have to be literate in as well?

 

Elizabeth: [00:26:41] Yeah, yeah, and learning how those tools technically work, but then also how these political figures are choosing to use them, right? Like you talked about Jagmeet Singh's TikTok: one part of the literacy needed around that is understanding how TikTok works, and then there's a whole other layer of "But how is he as a politician using it?" Because he as a politician is going to use it differently from me filming my cat and putting [up] TikToks of my cat.

 

Rachel: [00:27:10] I mean, some days they would be probably fairly similar.

 

Elizabeth: [00:27:13] [Laughs]

 

Rachel: [00:27:15] And I think the broader conversation then is... We had a lot of attention on, "OK, he's using social media in this way," or, "Trudeau's using social media in this way," or, "Erin O'Toole is doing these tele-town hall style things." [But] the broader, bigger conversation is about how effective those were in the end.

 

Elizabeth: [00:27:32] Mm hmm.

 

Rachel: [00:27:32] Obviously, the way that you're using technology has become a key part of campaigning—especially in a pandemic. But, it [doesn't] predetermine people's votes. Like, you don't just think because you've got a bunch of followers on TikTok or you reached tens of thousands of callers in B.C. one night [that you are being effective]. There is so much more to creating the democracy in Parliament and all of those things than just what technology allows. At the end of the day, you are still pencil-to-paper putting your vote down and [that vote]'s based on people at your door getting you out to vote or conversations with other people face-to-face. It's an interesting thought of how technology has become kind of all-consuming, but there still is that require[ment] to pull your head out of it and realize that you can put it down.

 

Elizabeth: [00:28:16] Yeah, I think that's a really good point. We spend a lot of time thinking at the micro level and individual level, but, when you get out to a macro stage...

 

Rachel: [00:28:26] Yeah. Well, and it's been so necessary in the last year and a bit, having to talk to everyone virtually—Zooms and all of that—but more and more now, hopefully, as things get better, we're able to have face-to-face conversations. Even just thinking [back] to the last group of parliamentarians on the Hill, I don't think I have the same kind of recognition of them all because, for a good chunk of their time [in office], they were a face on a Zoom screen in the virtual House of Commons—which is another technology that has now been allowed in this hybrid sitting [that] I won't get into... And that's a whole other conversation about democracy and doing it from your office at home instead of in the Chamber.

 

[00:29:06] But hopefully now, being able to, more and more, get into a place where [...] you [can] walk down the hall and [ask] someone, "How's your day been?" and you just have that [conversation]. It's [a] different dynamic, and I'm sure that everyone's experiencing that in their own professional lives in different ways. But it is, I guess, a good reminder [that] it's really easy to put your head down and get into the tunnel, but there is a lot more to our conversations and society and our lives than just what came through a screen.

 

Elizabeth: [00:29:36] Yeah. And I love what you've just said too because it's like, "Yeah, there were affordances to face-to-face conversations and physical co-presence" (as we say in the academic literature). Like, those were affordances too. They're not technological, but they were meaningful in the way that we interacted. And as we can do them more [again], it shifts the need for different technology and whatnot. Anyway, I think we could go on for ages...

 

Rachel: [00:30:07] [Laughs]

 

Elizabeth: [00:30:08] ...but we don't have time. So instead, I'm going to end it with the classic pop quiz.

 

Rachel: [00:30:14] Alright.

 

Elizabeth: [00:30:15] Can you tell me what affordances are?

 

Rachel: [00:30:16] So affordances are things that allow you to do other things. It's a tool or a device or a mechanism that would permit you, or not permit you, to engage in a way. So in technology, I think, the example of your phone or a platform would be an affordance. Yeah?

 

Elizabeth: [00:30:38] Very close! I would say your phone has affordances.

 

Rachel: [00:30:41] Okay.

 

Elizabeth: [00:30:41] Yeah. So it's like: the phone is the object, but then it affords you to do all kinds of things from calling somebody to texting to recording to posting on Instagram, whatever it is.

 

Rachel: [00:30:54] Right.

 

Elizabeth: [00:30:55] Awesome. Thank you.

 

[00:31:02] All right, that was our episode on technological affordances. I hope you enjoyed it. If you'd like to learn more about affordances or any of the other theories and concepts we talked about today, you can check the show notes or head over to polcommtech.ca.

 

[00:31:13] This special season on media and digital literacy is funded in part through a Connection Grant from the Social Sciences and Humanities Research Council of Canada and the Digital Citizen Initiative.