The Humanities District is a podcast about creativity and community. I talk to academics and alumni in Springfield, Missouri about their work.
Hello and thanks for visiting my Substack! I’m posting transcripts of podcast conversations here for those who prefer reading over listening. You can always listen on any podcast app, or here. Enjoy!
-Jay
Episode Title: Post-Truth Truth with Heather Walters
Conversation recorded: February 15, 2024
Episode published: February 26, 2024
Jay Howard: Hello and welcome to the Humanities District, a podcast about creativity and community and higher education. I'm Jay Howard, a senior instructor in communication. I like to talk to academics and alumni in Springfield, Missouri about their work. And today our topics are AI and post truth. My guest today is Heather Walters. Heather is also a senior instructor in communication, and my office neighbor just down the hall in Craig Hall. Heather, welcome to the podcast.
Heather Walters: Thank you so much, Jay. I'm so excited to be here to chat about those important topics.
Jay Howard: Yes, those and other topics. You know, longtime listeners to the podcast may notice that I've had at least one episode focused on AI earlier in the feed. And that was with Stacey Rice, who's an educational... Oh, darn, I can't remember her exact title.
Heather Walters: Yeah, she works in the faculty center for teaching and learning. And I listened to that episode of the podcast and also found it fascinating.
Jay Howard: Oh, yes, absolutely. Well, and the amazing thing too about Stacey is, I mean, I had just learned what chatbots even were and was immediately starting to panic about it. And there was Stacey already creating explainers for us. So I was really appreciative of her efforts. And similarly though,

You know, ChatGPT just came on the scene. I don't know what its first birthday was, but it's not very old.
Heather Walters: Yeah, it was basically November 2022. Yeah.
Jay Howard: And when was your intersession course on the same topic offered? That would be like Christmas of last year.
Heather Walters: Right, so I taught this course for the first time in the winter 2024 intersession. So just this past January.
Jay Howard: Nice. So, I mean, there's not that much time between when this really dawned on the popular consciousness in a real way and when people started to develop courses and trainings for students. And I see your courses as on the forefront of that. As an academic advisor, I'm getting emails from students saying, hey, is there a certificate in AI? Are there classes I can take? And I was able to point them right to your intersession class. So I'm excited to talk about that today.
Heather Walters: Yeah. Well, my inspiration for creating the intersession was to be able to engage the students in a topic where the level of excitement about it is obviously palpable.
I mean, people know that AI has the potential to reshape a lot of what people know and experience in our society. And I just think it's really important that we have the opportunity to talk through the implications the technology will have on our future. And I think students are really interested in those topics.
Jay Howard: Yeah.
Heather Walters: So, and those conversations can really make a difference in how the technology is ultimately used and deployed in society. Like, people being able to feel like they are participants in the kind of global conversation about the role AI should play in our lives will be important to how we all eventually decide to integrate it more fully into our daily lives.
Jay Howard: Yeah, pedagogically, it was a perfect storm of an interesting topic, a timely topic, and a useful topic. And as far as the content, before we even get to how it should be used, the fundamental questions of what even is it, how does it work, what are its capabilities, people just need information at all levels about what it is, how to use it.
And so how is this or how did you approach it as a communication course? Is it about how people should interact with AI or how we use AI itself to communicate? And also for listeners, I should specify this is a special topics intersession comm class using our beloved course code COM397.
Heather Walters: Yeah, so it was a week-long class, so we were a little bit limited on the scope of what we could talk about, but we addressed how uses of AI could affect communication and media professionals in the workplace, like what role it might have in job replacement, or how the nature of the communication or media professions will change as AI becomes more dominant in the workplace. We chatted about the potential risks of AI. We chatted about how AI influencers are becoming more popular and what role we think those individuals or characters should play in society. And I tried to kind of create a breadth of topics that would be interesting to students and highlight for them both what the technology can do, benefits and risks. And I received a lot of interesting comments about the class. Like, people said things like, “I would define my experience with this course as enlightening,” or “my knowledge was limited about AI, and this course completely changed my perspective,” or “I learned a lot about how society is evolving and I'm blown away about what the technology can do.” And I thought that was really fulfilling, to be able to engage students in talking about something that they are…
Jay Howard: That's awesome.
Heather Walters: really interested in and excited about. It seems like basically an educator's dream to have a whole group of people who are just excited to be there to talk about something that they think is gonna be really relevant to their daily life moving forward. What more could we really ask for?
Jay Howard: I couldn't agree more. Yeah, that's definitely a win-win. You said AI influencers, and then you used the word characters. Is it like, there are influencers who are just total chatbots, or?
Heather Walters: Yeah, totally created by AI. And the potential benefit of them to companies is that you can work 24/7 if you are an AI-generated influencer. But…
Jay Howard: Wow.
Heather Walters: At the same time, the ability to regulate what an AI influencer can say is limited because, in the end, they are not actual people that laws and regulations can apply to. So I just wonder whether people realize that it's possible for companies to advertise using AI, and what those individuals or chatbots could be telling them, because they look real.
Jay Howard: Goodness. Yeah. This isn't like Clippy from Microsoft Word is the hype man for Hershey's chocolate or something. This is like an actual avatar of a person.
Heather Walters: Mm-hmm, it actually looks like a real person. It looks like a real person is telling you that this product or service is the best possible thing you could buy or invest in. And they are not real.
Jay Howard: Yeah. Well, we emailed a little bit about the horsemen of the apocalypse in preparation for our conversation today. And I'm just gonna count that as one of them. I'm not comfortable with chatbots running around telling me what to think and buy and do, although…
Heather Walters: haha. But I was glad for the opportunity. Like if somebody doesn't know that is a possibility, if I could tell someone and spread that message, I felt like kind of my work here is done. I've done something positive for the future because I warned somebody that this was possible.
Jay Howard: Yeah. You're raising awareness that the storm is coming. At least those are my words. And this is…
Heather Walters: Yeah. That there is a horseman out there.
Jay Howard: Yeah, and you know, we are in an election year, and we have kind of this new technology with, I mean, I'm sure we'll get into it, deep fakes. And I mean, I just feel like we're in a Black Mirror episode of what's the craziest thing we can think of that could derail us, and history will show that our imaginations were just not up to the task of predicting the crazy things that we're about to see, I'm afraid. Yeah.
Heather Walters: Well, I kind of think that is where our role as educators becomes even more important. There's a whole world of possibility and of risk, but…
The role that we can play in shaping that future can hopefully make it more positive than negative. We can explain what could happen if we let things go too far in one direction, and we can harness all of the ways that this tool can maybe improve our lives.
Jay Howard: Yeah. That's so important because I do think I get carried away focusing on the potential negatives or afraid that the negatives will outweigh the positives without really giving due consideration to the positive ways, prosocial even, ways that these technologies could be used.
Heather Walters: Oh, we also did AI-oriented assignments in the special topics class, which kind of helped highlight for students what AI is currently good at and how it could be an addition to the learning environment.

But also some things you might not want to use the technology for as a part of your education, you know. And allowing them to kind of see how the technology works even introduced them to the potential academic risks and benefits of it. People are usually just worried about plagiarism, you know, as…
Jay Howard: Okay. Cheating. Yeah.
Heather Walters: and cheating as it relates to student use of AI, but there are ways that we as educators can integrate it into what we do to make education practices even better.
Jay Howard: Yeah, let me ask you more about that. And kind of the context is, you know, when I was looking at the courses that you teach, I saw, going back through the semesters, as many as 15 different course codes. Some of those were the same class on the graduate and undergraduate level. But...
Do you favor the online teaching format or the in-person teaching format, or do you do a little bit of both? Where would you put your primary pedagogical focus right now?
Heather Walters: For the past few years, I have taught primarily online and I enjoy teaching online. I appreciate how online education allows less traditional students or students in other environments, states, countries to take advantage of the opportunities that Missouri State has to offer.
And I think that there are a lot of positive benefits to learning online. For example, when I first started doing it, I thought, wow, students are more engaged with the textbook than they are in my live classes, because in a live class people might expect you to tell them what was in the book, but in an online class, if they don't read the book, no one is really reading it to them. And they would make more comments about the quality of, and what was in, the examples that were in the book, and I thought that was fascinating. And I do think that, when properly delivered by people who have online experience, being in an online class is just as or more valuable than a traditionally seated class.
Jay Howard: I enthusiastically agree with you on that. I know people have all kinds of opinions on online education, whether they want to take them or not as a student, whether they want to teach them or not as a teacher, but I do think there's definitely an important role for online education in that it can be equivalent to or exceed the quality of a seated class. And in your case, you have had the experience of writing a textbook that is new, several textbooks, actually. What I'm thinking of right now is Communication Ethics: Promoting Truth, Responsibility, and Civil Discourse in a Polarized Age. And the benefit of having a brand new book that you're intimately familiar with as a producer is that there are robust online components to this book, or a whole ebook section or version.
I'd love to learn more about it. And I don't know, I just, I've been out of the classroom for a little bit doing my academic advising thing, but I wanna get back into teaching as well. And I want to learn how to do what you're doing. So what was the process like of developing this book? And I'm interested in the online components as well. I've heard things about it from students where they have said, exactly what you were saying, that like, they're getting more out of the text because it sort of forces them to read it. And I don't know how, like it sounds like you've done a magic trick. Ha ha ha.
Heather Walters: Mm-hmm.
Heather Walters: Well, I'd been teaching communication ethics for quite a while, and then decided I was interested in creating a textbook about it after writing my argumentation textbook. And we were studying disinformation and its role in creating a barrier to objective argumentation. And then I started thinking about it more, and I was like, in many ways, disinformation is just another form of theoretically unethical communication.
Jay Howard: That's a good point.
Heather Walters: So I started thinking more about the role that ethical communication could play in just improving the quality of society. Like, if more people knew how to navigate the situations that are created by factors like disinformation, then we could help overcome the theoretical crisis that could be created by stuff like that. And ethics books were notoriously traditional and dated. I was finding that there weren't a lot of current case studies in ethics textbooks that described how communication ethics was relevant to the current daily life of students, considering technology and all of the different things that are happening in the world. So I wanted to create a book that was very current, that had case studies as a part of it, and that allowed students to apply traditional ethical concepts to current events.
So the beginning of every chapter of my book presents a current issue or topic. And then the rest of the chapter kind of gives them some theoretical tools and suggestions for how they might address that topic, which are also applicable to lots of other parts of their life. And it was also true, when I started researching current ethics books, that hardly any of them had an online or interactive component.

Heather Walters: No online tools. So in creating the ethics book, I included interactive lessons where students have to, like, go in and, you know, match words, or explain what concepts apply to what theories, or fill in the blanks of paragraphs that help illustrate course concepts. And there are interactive videos that they watch, and questions pop up as they watch them. And the videos are about…
Jay Howard: Okay.
Heather Walters: topics that are ethically oriented, or maybe about the case study that is relevant to the chapter. And so they watch videos and then get to interact with them, and I think that really helps the text reach the students where they are. And that was what I wanted to do, because ethics is something that you can think is very traditional and very theoretically ancient.
But you have to bring that to life, to today's current student. And you can't do that without injecting some form of current event and use of technology into the process.
Jay Howard: Ah, the videos that have questions throughout. And that must be what the student meant in terms of, like, forcing them to actually watch the video. I assume... can they speed it up, slow it down? Do they have that option?
Heather Walters: Haha. Maybe not as much as they would want, but it's all in the educational platform of the publisher. So after they have heard a segment of the video, it asks them a question about something that they've heard so far. And then they answer the question, and the video continues to play after that.
Jay Howard: Ah, it doesn't have that. Yeah. I'm assuming, and correct me if I'm wrong, that you're an early adopter of Brightspace. Are you teaching in Brightspace now?
Heather Walters: I have been to many trainings about Brightspace and feel comfortable with it, but my current courses aren't on Brightspace. That's primarily because I was worried about students having to interact with multiple platforms at once, and I didn't want to make that more difficult for them. Since I teach some of the core intro classes in the major, I didn't want students to face a barrier with navigating different platforms in different classes. So I stuck with Blackboard for now, but I'm excited to use Brightspace when it becomes available or mandated.
Jay Howard: I'm in a Brightspace learning group at the Faculty Center for Teaching and Learning right now, led by Katie Hogan, and I'm learning a lot, and I'm excited about it. One of the things that I would grumble about as an instructor in semesters past was that the publishers' online software didn't integrate very well into Blackboard, the learning management system.

And so, you know, I'd have to manually input the grades on the quiz over in MediaShare or whatever it was called, and put them into Blackboard, or take some sort of action to facilitate it, instead of it being seamlessly integrated. I know that that's always changing and there's all kinds of different services. What is your experience, or what's your take, on that ball of wax?
Heather Walters: I wish every publisher could have a seamless integration with a learning management system, but I experienced the same thing, actually. My books are published with Cognella, and their platform doesn't sync directly with Blackboard. So I've been doing what you mentioned for quite a while. I can organize the assignments in Cognella's platform as one module for students, though, so they can go one place and see everything to do on the platform, and then I transfer the grades. But I also use a lot of other tools from McGraw-Hill that integrate…
Jay Howard: That's cool.
Heather Walters: a little bit better with learning management systems. And I have also started using Packback for discussions, which also integrates pretty well with Blackboard. So while it's not 100% seamless yet, I look forward to the day when all of it will be easy.
Jay Howard: Okay. Nice. Very nice. Well, between now and my second-block effective listening class coming up next semester, I need to learn what Packback is, and Perusall, and Brightspace, and figure out what's going to work for me. And I assume anyone listening who is also teaching is trying to wrap their head around a landscape that's changing, where the things they're learning are becoming antiquated the moment they learn them because now there's this new cool thing,

where all our heads are spinning. But I think the key is just not to get overwhelmed and, like, pick one or two things, figure them out, and just do baby steps semester by semester.
Heather Walters: I agree with that largely. Last semester I spent most of my time, you know, in trainings and learning how I will use Brightspace in the future.
This is the first semester I've integrated the Packback tool into my classes. So now I'm kind of focused on how I can best incorporate that for students. But I think all of these tools are things that, if used effectively, can improve the learning process for students and make them feel comfortable and engaged in the class, and that's my goal.
Jay Howard: Yeah. That's the goal.
You have taught the intersession course code quite a lot, going back over past semesters. I was surprised at how many I had seen. I thought I was, like, doing something special creating a special topics class a few semesters ago. Do you teach the same topic over and over, or do you have different ones each time?
Heather Walters: Different ones each time. And what really speaks to me about the special topics classes is that I really like being able to incorporate current events into what I teach. I think that, you know, accessing things that people are interested in and that are relevant to daily life is so important, and special topics courses give us a way to do that in a very particular sense. And so past ones I've taught include Reviving Civility in America.
Jay Howard: That's what they're for, yeah.
Heather Walters: I've taught that one a few times. It was in response to some of the previous election drama and how people should be able to engage in civic conversations.
I've taught public relations law as a special topics class where we can investigate like current issues about intellectual property defamation and those all have a really current component to them like how should we update our practices as public relations professionals related to those important topics and I enjoy, I have a real just like personal interest in current events. I like to read news and be constantly learning things so I think I'm a more engaged teacher when I can speak about topics that I think are just very relevant to our daily life.
Jay Howard: Very cool. Well, and that's built into the life of a debater as well, right? Or a debate coach. Ha ha ha.
Heather Walters: It is. I imagine that's where I kind of got it from. Or maybe it's just what attracted me, but it definitely developed out of that.
Jay Howard: Very nice. Well, before we get into the post-truth argumentation book, let's get into a little bit of your background, especially with debate. So I see you have a BS from Missouri State, and then a JD from the University of Maryland School of Law, and then an MA from Missouri State as well. Is that right?
Heather Walters: Yeah, I have an MA in COM from Missouri State, and I also have a master's degree in educational administration, K-12, from Missouri State. I just recently finished that degree, actually. So I'm just kind of a lifelong learner, as it were.
Jay Howard: Oh, wow. Well, that's awesome. Congratulations. I aspire to also be a lifelong learner. So when did you first get involved in debate and forensics?
Heather Walters: That is a pretty long story, but my parents helped me discover my passion for speech and debate. Basically, when I was younger, some teachers were reporting to my parents that they were concerned that I was extremely shy.
Jay Howard: Okay. Oh, interesting.
Heather Walters: And so my parents' response to that, when I was in eighth grade, was to require that I enroll in a speech class when it was offered as an elective, which I thought was gonna be horrible. I was not excited to do that, but that class was the spark for me for debate. We had an assignment in the speech class to participate in a debate, and the teacher, after we did it, was just like, Heather, I saw your eyes light up in a way that I've never seen before when you were debating. And that's when I learned that debate was an activity that valued intelligence and knowledge about current events and constantly evolving to learn new things, and also rewarded competitiveness and stubbornness. And I guess those were all qualities that I possessed, and so I was instantly, basically, drawn to it. And then right after we did that assignment in the eighth grade speech class, the high school debaters came and visited to recruit us to join the high school debate team. And they also mentioned that debate was great training for being a lawyer, if that was something you were interested in. And for as long as I could remember, when I was little, I always wanted to be a lawyer. I would watch all kinds of court-TV-oriented shows and true crime shows. And I was like, I want to be a lawyer. So when my teacher had said that, and someone else told me that if you want to be a lawyer you need to do this, I was all for that. Also, I ended up taking Latin, because someone at one point told me that if I wanted to be a lawyer, I needed to know Latin. So I was like, okay, if that's what I need to do to be a lawyer, I will do that.
For most of the beginning of my life, I was doing all of these things to prepare me to go to law school. So I debated throughout high school. One of my assistant coaches in high school was a college debater at Emporia State, where I'm from, Emporia, Kansas. And he was coming to join the coaching staff at what was then Southwest Missouri State, and said, you should consider debating in college, and you should consider doing it at Southwest Missouri State.

And I had gotten some academic scholarship offers and, at the same time, had someone saying I should do college debate. And so even though I hadn't heard of the school prior to receiving mail from them and talking to someone about debate at the school, I ended up here as an undergrad
and participated in debate the entire time I was here. It was foundational to my experience as an undergrad. I mean, I basically think that no other single activity has been more beneficial for or impactful on my life as my participation in speech and debate.
Jay Howard: Hmm. Yeah, like you said, it cultivates so many things we think of as virtues. You know, not only, like you said, stubbornness and competitiveness, but also, would you say, empathy as well? Because it forces perspective-taking; sometimes you're arguing for things you might not personally, you know, agree with. So you have to put yourself in the mindset of someone who might be on another side of an issue.
Heather Walters: Definitely. I think empathy, I think tolerance. I was able to experience traveling to so many new places for speech and debate competitions in high school and college, and just getting out of your comfort zone and where you are directly from, and seeing the way other people live, even if it's just in the United States, which is where all the debate tournaments I've been to have been. But even that was so profound in my experience: seeing big cities for the first time, flying on an airplane for the first time. I've been to all 48 contiguous US states now, or whatever, primarily based on participation in debate tournaments. That alone broadened my perspective so much over what I would have otherwise experienced. So, so valuable.
Jay Howard: Yeah. Communication does attract some pre-law students, people who are interested in that, as I'm thinking about the advisees I see, and a lot of them are curious as to whether debate is good prep for law school. Would you recommend it? And also, did Latin turn out to be of use?
Heather Walters: I highly recommend debate as preparation for law school, a little less so Latin. But I'm sure it didn't hurt. It definitely gave me a foundation in key terms, but debate was profound in its ability to prepare me for law school.
Jay Howard: Okay.
Heather Walters: One story I tell about that is just what you were referencing earlier, like being able to see both sides of an issue. When I went to try out for the trial team at the University of Maryland, which is like something that people generally think is an honor in law school, people want to be a part of it, at least they did, at the University of Maryland.
They had us come in and prepare a closing argument on one side of a case, and the first time we got to pick what side we wanted. And so everybody came in and delivered their closing argument. And then they had a round of cuts. And then for the second round, they said, okay, now you need to prepare a closing argument on the other side, not the one you did before. And I saw some people

kind of crumble, honestly, because they picked the side they agreed with, and they had not as much capacity to give a closing argument for the side that they didn't agree with. Whereas the people I knew in law school, like me, with debate experience, were able to do that basically seamlessly, to give a very similar-in-quality performance on both sides, because we could understand the value of, and had the ability to speak on, both sides of a question.
Jay Howard: Yeah. I mean, that exercise, we may take it for granted, but it really is mind-blowing when you encounter the task the first time. I teach the public speaking class, and they do that sometimes in the persuasive speech unit. And it can be a profound experience to, like…
Heather Walters: I think it's really such important training for getting people to see the other side of issues. And I think that could really benefit society as a whole. So many of the disputes or controversies that exist in the world could be minimized or tolerated if people just understood the other perspective, and nothing helps you understand the other perspective as much as, like, I have to speak in favor of it.
Jay Howard: Yeah. It's like I'm going to, you know, I'll humor this person and imagine for one second that they are not a satanic cannibal, you know? Yeah.
Heather Walters: Yeah, they're not the most evil person ever. They are just a person with a different opinion, and you have areas of agreement. And if you can see that, then maybe you can have a conversation with them about the small areas of disagreement that might exist between the two of you, and overall our communication practices will improve.
Jay Howard: There's a term, I think it's from Kenneth Burke, where he talks about equipment for living. And I think about just mental technology, or just having concepts to think with. And as I was reading through the detailed table of contents of the post-truth book, I mean, there were a bunch of sections in there where I was like, man, I need to take this class. I know what inductive reasoning and deductive reasoning are, but, like, I don't think there used to be a thing called abductive reasoning.
Heather Walters: There wasn't. It's relatively new. I thought it was important to include.
Jay Howard: So it was like, we got, yeah, new reasoning types. And that's just one example off the top of my head of all kinds of really meaty, interesting things. So, like every other class that our discipline has, it should be a required class. Not just for our major, but for every major.
Heather Walters: Ha ha. And more people could take it. In the lifelong learning that we do, we can all learn from each other.
Jay Howard: That's true, I agree. Well, so yeah, let's dive in. I think maybe a possible title for this episode is Post-Truth Truth. And, I don't know, just something hopeful about the concept of standing up for the idea that facts are still real, even in a world full of faux-tography, which is another new word for me, faux spelled with an X.
Heather Walters: Okay, yeah. Yep, F-A-U-X.
Jay Howard: So, yes. So, post-truth, it's a term that's thrown around a lot. How would you define it and what is its, you know, recent history?
Heather Walters: Okay, the title of the book is Understanding Argument in a Post-Truth World. And I think what we meant by a post-truth world is one in which society tends to value opinions, beliefs, and appeals to emotion over objective facts.
And the book is an attempt to help students navigate a world where opinion and belief and emotion are dominating discussion, and to be able to infuse some objective truth into those conversations. When we do workshops about the book, a section of the workshop is often a pedagogy of truth. And I'm a believer that the reinvigoration of a focus on truth is important and vital to solving a lot of society's communication problems.
Jay Howard: Ah, this is like a breath of fresh air. I'm just really… So, yeah, I know that you mentioned you love case studies. Can you give us an example? Because I know there's endless examples, both sort of on the right and on the left, of, like, we're valuing emotion more than what the facts on the ground may be. But…

So, I don't know, what's an example of a time when post-truth has sort of reigned supreme in the narrative? Does that question make sense?
Heather Walters: A little. So one article that I've worked on, some research that I'm presenting at the Western States Communication Association conference next week, is about climate change discourse.
Jay Howard: Okay.
Heather Walters: and the role that disinformation has played in our ability to make effective policy regarding carbon emissions and reactions to global warming, where there is arguably scientific consensus on the issue that human-induced behaviors are causing climate change really effectively act to reduce that because they are concerned about disinformation or opinions that happen on the other side.
Jay Howard: Mm-hmm. Yeah, that's a really good example. And it shows how, for so many of these big issues, our stance on them is an identity marker, where it's like, if I identify as a certain, I don't know, political persuasion, for example, I'll be more likely to believe something that substantiates my preexisting beliefs and less likely to believe.
I'll be more critical of it if it challenges them. But that's not on the basis of critically evaluating the information, it's just based on how I feel, or how I feel like it validates myself as a person, and the commitments I've already made, you know?
Heather Walters: But yeah, I think what you're talking about is kind of the role partisanship has played in our society or how society has become more polarized on the basis of.
Jay Howard: Okay. Yeah.
Heather Walters: political affiliation, where some people will say, if someone has a different party affiliation than me, they are a bad person, or they believe things that are totally unacceptable to me. One of the videos that I show as an interactive video in one of my classes is about a study finding a decrease in people's ability to spend an entire Thanksgiving dinner with family members, because political controversies come up. And polarization means that people have to think certain ways about those controversies, so they just decide to have Thanksgiving dinner be, like, an hour instead of the hour and a half it used to be on average, or whatever. And what that says about the state of society probably isn't positive. But if we could, like we kind of mentioned earlier, understand that people have more similarities than differences, then maybe we could avoid the need to divide every issue into, you know, this party believes this, or this group of people believes this, based on identity. We could actually overcome some of the policy paralysis that exists on some issues and make positive changes that need to happen.
Jay Howard: Right. The focus on common ground is, I mean, it's an important thing to insist upon because as I was doing some reading on polarization, I ran across the term affective polarization where it's not that we think differently, it's that we feel like the other person thinks differently. Or I just, if I'm strongly polarized, I have automatic negative feelings towards someone with a different label, regardless of what they believe or don't believe.
regardless of the truth of what they might believe or don't believe or how different or similar it is.
Heather Walters: Yep, you're so right. As it relates to what we were talking about earlier about AI, I'm working on a book chapter about AI activities to engage students in the democratic process. And so one of the things I thought about was, given the rest of my background, was AI-enabled civic debates.
Jay Howard: Hmm.
Heather Walters: And something that AI can do really fast is summarize content and point out issues of similarity, and then it is effective at saying, these are the outliers, these are areas of difference. And one thing that I hope, theoretically, that can do is demonstrate to students, if this were an activity in a class, that
Jay Howard: Yeah.
Heather Walters: There are more areas of agreement than disagreement. Here are the things that we can note as areas of disagreement, and we can talk about those. But if we start from the premise that there are lots of things that a lot of people agree on, we can navigate conversations in a totally new way.
Jay Howard: Hmm, that's really interesting, especially applying that to real-time, live public forums.
Heather Walters: And that would be really hard. Like a lot of even political debates that people have watched on TV end up devolving over a disagreement about a particular fact. But AI could both like maybe answer the question about that fact and highlight for people that there are areas of agreement.
Jay Howard: Right.
Heather Walters: And so the traditional and stereotypical vitriol that people think needs to exist in order for a debate to happen doesn't need to exist, because we can evolve into having conversations that discuss both areas of similarity and areas of difference.
Jay Howard: You know, an email came along at some point asking for ideas for things that a potential digital humanities lab could do and equipment that such a lab should contain. There could be a connection here between what we're talking about and the digital humanities. It doesn't have to just be X-raying an old scroll to decode what it says. It could also be the sort of democracy-promoting materials that you're talking about.
Heather Walters: Yeah, and I think that all of that is evolving in such interesting ways. I've heard, you know, political candidates are making chat bots of themselves to like help interact with constituents and answer their questions and then, you know, so that's a use of the technology.
Jay Howard: Wow. Surrogates.
Heather Walters: But another use is, like, both helping constituents answer questions about political issues and helping us have potentially more civil policy discussions about issues using AI platforms. Cause lots of those will be like, you know, I'm not gonna say mean things about the other side; I will report what has been happening, what you've said, the... points in favor or against an issue. But AI, unless it's programmed to be as such, isn't mean. It's like a polite, neutral platform that can help us have conversations that, left to our own human devices, might become more controversial than necessary.
Jay Howard: That's really cool. Well, that's all very noble, but I'm now possessed of the idea of like, I want to figure out a way to build and market chatbots that will say mean things about my opponent.
Heather Walters: Well, there are AI platforms like that, in the horsemen-of-the-apocalypse sense. People have created LLMs that will only feed disinformation out to people.
Jay Howard: Yeah, yeah. And do they masquerade as some source if they're not or something to get more credibility?
Heather Walters: Yeah, they masquerade as, like, you know, I'm going to give you accurate information, but they're 100% programmed to give inaccurate information. And, you know, people with the skills can create LLMs that hold all that information pretty quickly. And unless people know where they're getting their information from, in a political sense.
Jay Howard: Yeah.
Heather Walters: Those are the things that kind of put us on the track to more risky AI interventions than potentially what I'm hoping for, which is the use of it to promote rather than destroy democratic engagement.
Jay Howard: Yeah. Hehehehe. Yeah. So much of it is benign, too. You know, as I scroll through my social media feeds, I see a lot of, like, images with quotes over the image that are mostly, like I said, benign. They're not political in nature, but I don't think a human being ever made any of them. I think they're all generated by AI, you know, just to get engagement. Same with the videos. But, yeah.
I don't know, as we go into the election year, kind of to return to that topic, my first thought was that I should just sort of reduce my consumption of screen time to sort of fortify my soul against the corrosive vitriol to come. I do think it's gonna get bad, you know?
later on this year, because we've lived through some election cycles that have been bad, and that was before there was AI like there is now. But just as many people are working to make the world a better place, trying to use the technology in a way that will move things forward in a more positive direction.
And some of these are really savvy campaign operatives who are trying to safeguard us, put the bumpers up on the bowling alley for us. To kind of get to a question, and I don't think I've asked this yet, except for maybe before we started recording: What do you expect as we get into the election? What should we guard against? How can we help? And, you know, asking for myself.
Heather Walters: I think that is a great question, Jay. You're right. I think the research shows that 4.2 billion people will go to the polls in 2024 in 64 countries. So basically 49% of the world is having elections or participating in elections, right after we've had all of the excitement about especially generative AI and what it can do. There are lots of headlines that you've probably read too, that say, like, elections and disinformation are colliding in 2024 in ways they never have. I read a study from the World Economic Forum that said that misinformation and disinformation from AI will be the top global risk over the next two years. And they evaluated that as even ahead of climate change and war and other big problems. Because, yeah, because generative AI has given us the capacity to produce material like deepfake video and audio.
Jay Howard: Nuclear proliferation, pandemics.
Heather Walters: and we can mass produce it. It can be generated really fast for really cheap. It's very sophisticated. It's hard for the average human to discern what is and is not true or real.
Jay Howard: I can't tell.
Heather Walters: And that is really why I think that the focus on ethics is so important for society moving forward. Because the only way that we are going to be able to counter some of the problems that deepfakes and other disinformation could pose to the election process is if we have educated people to look out for potential disinformation, so that there are more people like you who are educated and worried about what they're consuming and where it's coming from. Because it could be very hard for people to know the difference.
And if we have conversations about these things, then maybe regulation can happen faster about how AI works. Governments and companies are working on things now, like provenance, being able to say when something is authentic and not authentic. And that could be a move forward. Because if media outlets that traditionally provide objective information can mark their content as definitely authentic, and you can tell that versus the people who are unwilling to mark their things in similar ways, that could help. But I think people are worried about 2024 because a lot of the regulations or ways to deal with disinformation haven't been totally formulated yet. So there is going to be a role the average citizen has to play in being watchful, in thinking about what information they are consuming and where it's from, and doing their own research to
Jay Howard: Absolutely.
Heather Walters: determine what their position is on issues. And letting people know that vehicles exist to do that is important, and explaining to people why that's important is also necessary. And that's what I also kind of hope my focus on ethics does.
Jay Howard: Absolutely. And just, yeah, as an educator, that's so important. It brings me back to this article that I mentioned before we got on mic. This is from an Education Week Classroom Technology, I don't know, article. And the quote here says, if educators, this is a quote, if educators aren't already thinking about teaching students about deepfakes, they really should be, because this is the water that their students are swimming in every day. And so I think all of us who are educators at all levels, we bear a responsibility for cultivating digital literacy related to our discipline. And I don't wanna belabor the point, but I was just scrolling through Instagram the other day and I ran across this video of a duck in a pond. Beautiful orange, iridescent green, yellows and blues on the feathers of this duck. And I was like, I think this might not be a real duck. And so that set me on this mission to find out, is there really a species of duck that looks like this, or did some AI video generation create it? Turns out the duck was real. So I was actually too skeptical. And there are things that beautiful in the world that I just didn't know about. But.
Heather Walters: haha
Jay Howard: I mean, we don't have time to Google every single thing we see or hear. And so even those of us who are skeptical and vigilant are still going to get some fake stuff in our heads, in the shoebox of things that we think we know to be true. So yeah, I really believe strongly that we need to take it seriously and put in some quality checks, as much as possible, for our information.
Heather Walters: So I think we were talking about AI influencers earlier, which is kind of a different version of deepfakes. And one thing that was notable about that to me, in my special topics class, is the students who didn't know that was a thing. And so, what you talk about, skepticism, that can get overwhelming for people, but
Jay Howard: Yeah.
Heather Walters: telling students and making sure that they know that it's no longer exactly true anymore that like what you see is what you believe because there are risks inherent in that and we need to be skeptical of content that we consume and try to be vigilant.
Yeah, we can't Google every single thing. But I think that's also where another strategy that researchers are using like pre-bunking comes in, that if we know that there's going to be a message that is given to the public that is inaccurate, countering that in advance, like giving a clear message in advance that this is what people will tell you. This is what is wrong about it. This is what you might consider as an alternative perspective to that information is what researchers say is a form of inoculating the public to kind of the inevitable disinformation that they might encounter in the process. And that has been shown to be more effective than telling people just after the fact that what you saw might not have been a duck. But if you tell people in advance that you might see this picture of a duck and it is not a duck, and here's why it's not a duck, they react better to that. But once people are convinced it's a duck, then you can't do anything about it.
And that's not 100% accurate as an example, because that duck really was a real duck. But I think that it makes the point pretty well that there is a long list of things that educators and others can do to promote digital literacy, and everything that we do to help move that along is going to hopefully help somebody not be fooled by information, whether it's in regard to election 2024 or just other forms of sponsored content or inaccurate information that you might encounter on the internet. Just always having an eye toward, you know, what is this information? Where did it come from? Can it be verified? That's a little bit of what we just have to do as citizens now. And providing students the tools to do it is part of our obligation as educators, hopefully doing that while encouraging people to be ethical communicators in the process.
Jay Howard: Absolutely. I mean, it's both a realistic message and a hopeful message. Yeah, just staying ahead of it, staying on top of it, until and alongside the coming of, you know, regulations. Because
Heather Walters: I mean, did you read that? I think there was a CNN story just yesterday about how one fifth of adults in the United States believe Taylor Swift is involved in a government effort to help Joe Biden win the 2024 presidential election, even though there's no factual evidence to support that theory, but it's moved into the public discourse for...
Jay Howard: Isn't it strange how things take on a life of their own? Yeah.
Heather Walters: Yeah, you know, but the popularity of Taylor Swift right now, and her relationship to the NFL. And then, she did say people should vote, you know, and in the past has supported President Biden. But whether that means she's a part of a broad conspiracy to get one candidate elected over another is, like, taking it to a whole other level, right? But some people obviously believe that. So if we think about what that means to the future of election 2024, you know…
Jay Howard: Yeah. It's bad news. Yeah, I saw what purported to be a video of Taylor Swift talking about endorsing political candidates. And I have no idea whether it was real or not. If it was real, it was from 10 years ago, maybe, but I think it probably just wasn't real. And I just scrolled past it, because, like I said before, I can't Google everything. But one of the things that is hopeful to me is like,
Since when did Googling something become, like, the weapon against misinformation? Cause there was a time when we were very scared that Google would be one of the horsemen of the apocalypse. Because I remember, this was many years ago, when we were teaching credible sources to students and we were like, you know, be skeptical of Google and Wikipedia, don't use them. But as years have gone by, these services have matured.
You know, especially Wikipedia as an example, where things that were once new and scary are now normal and helpful, and I can't imagine the world without them. And I assume that like chat bots will get to that point as well, just as long as they don't tear the world apart first, which is always kind of iffy, but you know, we made it this far.
Heather Walters: So I think what you say brings up two interesting things. One is, what you're basically describing is the strategy of lateral reading, which is, like, when you hear a fact, confirming that someone else, besides the one website that you found on Google, believes that. If it's in multiple places, then that's a better sign that it could be accurate.
Jay Howard: triangulate it.
Heather Walters: And then, this is also back to what I was referring to earlier about the pedagogy of truth: that there are traditional, valued media sources that don't spread disinformation. They are largely accurate.
We have used them for years. They have fact checkers. They take great care to make sure that their media professionals are giving us accurate information, and people need to know that about them. You know, there are trusted media sources, and you can and should go to them in some instances; and then there are others that are less trustworthy. Knowing the difference is important, but so is knowing that there are places you can go that are going to likely tell you the truth. The truth exists.
Jay Howard: Yeah.
Heather Walters: Places will tell you it and give you the tools to get accurate information. You just have to be a part of it.
Jay Howard: Sounds very X-Files-y. The truth is out there. We can find it. Yeah, it's just overcoming the sense of overwhelm or learned helplessness or disorientation. I mean, I remember feeling that way about trying to understand what my tax return would eventually show, whether I'd get a refund.
Heather Walters: It's up.
Jay Howard: It just seems unknowable, like, until I get to the end of the wizard that tells me, oh, here on this screen, you're getting a refund. The next screen, yep, you've got something coming. Oh, this screen, you owe, like, $1,000. Go back, go back.
Heather Walters: Oh, whoopsies. And you wanna hit back, back.
Jay Howard: So yeah, we're all embedded in systems that seem so complex that they're just beyond us. But when it comes to the basic consensus reality of understanding what's real, and knowing enough to be able to make informed decisions, the kind of decisions we need to make in a democracy, or even for our own health and the well-being of our communities, there is a basis in reality that still matters, and we still do have access to it, especially if we hold ourselves accountable, our media accountable, our institutions accountable.
Heather Walters: And you're right, Jay, one of the motivations for creating that argumentation textbook was also information overload. We know that technology exists. We know that it can be an overwhelming amount of information for students. So to try to teach them skills like arguing when you're not addressing some of the barriers to their ability to participate in those conversations is very difficult. So.
The point of the book in some ways is, like, yeah, we know technology exists. We know you're getting a lot of information coming at you from places like social media. We know that you get so much more information thrown at you every day than even when we were kids or college students, and you need to be able to navigate that. And that needs to be talked about in classrooms and in the books and literature that students read, rather than just, like, a paragraph about the internet,
which is what we found in some older argumentation books. And so we need to update it for the life of the current college student, and recognize what they're dealing with, and help them respond. And so ideally, we've incorporated a lot of these strategies that we've been talking about into the curriculum to help students benefit from them.
Jay Howard: Yeah. Absolutely. Well, thank you for doing that on behalf of your students and the world. And that brings me to the question of, speaking of, you mentioned an interest in current events, and that motivated 397 this most recent time. You mentioned a variety of articles that you're writing. So the question is,
Heather Walters: Yeah. That world.
Jay Howard: What's on the horizon for you? Is there anything you'd like to plug or talk about in terms of upcoming classes or projects?
Heather Walters: Well, I do hope to teach the AI class again in a future intersession. So if any Missouri State students are out there listening, and you didn't get to take it the first time, I hope that you can again.
Jay Howard: So you think the AI thing is gonna catch on? It's not just a flash in the pan. Ha ha ha.
Heather Walters: I think it's going to catch on. And, you know, you were mentioning things like other additions that it could bring to our curriculum in the department, whether it be certificate-program oriented or otherwise. I really hope that I can participate in some efforts to get things like this more embedded in our curriculum moving forward, because I think it's so important, not only to the average person, but especially to people who will be creating future content as communication and media professionals. So I'm excited to teach that again. I think I mentioned I am in the process of completing a book chapter for a book that is about encouraging college student democratic engagement in an era of polarization. And the chapter I'm writing focuses specifically on AI activities that can be a part of that democratic engagement. I also have been writing about how generative AI will affect the future of democracy, particularly in the sense of disinformation. So I'm presenting at the Broadcast Education Association Conference, in the research area of the impact of disinformation and misinformation on a democratic society. And my paper is Automating Deception: Generative AI, Disinformation and the Future of the Liberal Public Sphere, which is going to be part of a published collection after the conference.
And I also worry about how generative AI might negatively influence the future of communication if we don't harness it. So I hope to be able to expand some of my ethics research into, you know, how we can use generative AI more ethically, what will drive societal moves toward laws or regulations that help that, and then, if those things won't completely resolve the problem, what we can do to infuse ethics more into the discussion about how AI as a whole will be deployed in the future of society.
Jay Howard: That's some fascinating stuff. And, um... I'll have to ask you for that in writing so that I can itemize it and watch for them as they come out. Like I said, this has been a much more hopeful conversation than I thought it would be. I feel inspired instead of sad.
Heather Walters: Well, I understand and I'm worried about the risks that the new technologies and just the situation society is dealing with, how it could be like one of the horsemen of the apocalypse or multiple of them.
Hopefully higher education can play a more enhanced role in kind of navigating people toward the positive rather than the negative. But I do think that it requires us as educators to take on a big responsibility: helping society navigate that.
One thing I talk about is how a lot of the traditional institutions that used to teach ethical behavior have declined, and what the role of us as educators should be in helping to re-infuse or revitalize a pedagogy of ethics. I think that will be a topic that becomes even more relevant in the future as well. And I hope to be a part of those conversations, which is what is motivating a lot of my teaching and writing right now.