What if I were to stand up here for 11 minutes and lie to you? You’re the president. And I’m done. But seriously, though, what if I spent my time just making it up? It could be kind of fun, actually. I could regale you with stories from my life: the time I made those two free throws to win the conference championship. I could tell you about my Pulitzer Prize. That night in Amsterdam. I could tell you all about the data, over the course of my career, that supported every major hypothesis I’ve ever had. The truth is, though, I could probably get away with it. Now, granted, I probably wouldn’t be invited back to Frank next year, or the year after. But for these few minutes, you’d engage with a little bit of what I had to say, and I probably wouldn’t be pulled off the stage. That’s probably good, because who knows how that would happen here at Frank. I can imagine some kind of dangerous circus contraption pulling me up through the roof like Lady Gaga or something. In fact, we could talk about my friendship with Lady Gaga for a while. But the truth is that this is a bit of a problem for us, and it’s a problem we might think about on a grander scale. Think about the mass communication system that we have right now, that we’ve built together. It’s not just a matter of pernicious lies. It’s also a matter of earnest journalists sometimes making mistakes. Errors get made. You need no further evidence of this than the corrections section of a daily newspaper to know that people trying to do their jobs occasionally make mistakes. And we have an information environment that’s just rife with noise, with information we might label as false. Also, as humans, as part of the animal kingdom, we’re in good company with lots of beings that have the potential to be deceived. In fact, it turns out there’s even psychological research to suggest that pigeons can be deceived. I wonder how that work happened, actually.
I can sort of picture sitting in the park somewhere, faking out a pigeon. Kind of cruel, but… There’s a larger idea, though: there’s noise in our system, and maybe we label it misinformation. There’s a distinction that Habermas makes between misinformation and disinformation. With disinformation, we might think about unethical intent, the intent to deceive. But I actually think we need to worry more broadly about this umbrella notion of misinformation as well. There’s a lot of information out there that’s maybe not as factual as we’d like it to be. And this is not a new phenomenon. We can go back in U.S. history and find lots of examples. Think about the run-up to the Spanish-American War and the concerns people had about whether the news coverage of the sinking of the USS Maine was as accurate as it should have been, or whether it was propaganda to lead us into that war. Move ahead a couple of decades, and radio listeners one evening in October heard about the invasion of New Jersey by aliens. I’ll leave aside any comment as to whether that would be a plausible story, for my friends from New Jersey. But now think about the current media environment: we’ve got celebrities touting information about vaccines that doesn’t seem to be based on scientific fact. You can even go online and find all kinds of seemingly useful information that, I can assure you (I’ve checked this out), turns out to be useless. You can’t charge your iPhone with an onion, right? Not something that’ll work. And you think about the past calendar year, and it seems like almost every day was filled with examples of misinformation. What I want to argue here briefly, and what I want you to walk away with, are three ideas, three reasons why this is a problem we need to pay attention to, and a problem that’s not going to go away on its own. The first has to do with our very human nature.
We are biased toward acceptance of information; I’ll talk about that in just a second. Second, the regulatory structure that we have set up in a democracy like the U.S. is predisposed to monitor and to detect rather than to prevent. This post hoc detection aspect of our regulatory structure is something we wouldn’t want to do away with, and I’ll comment on that, but it is nonetheless not geared toward prevention. Third, the remedy that we might apply to misinformation is challenging. It’s hard to do. It’s possible, but it’s not going to happen easily. Okay, so on to this first point. Back in the 17th century, there was a philosophical debate between a guy named Spinoza and this other guy, Descartes. They argued about lots of things, but one of the things they argued about, through letters and writing, is what happens when you encounter a piece of false information. Descartes made the argument that people are actually pretty good at screening out, at detecting, false information. Spinoza argued something slightly more nuanced: that what actually happens is you accept information wholesale, and only then tag it as being true or false. That opens the door for some acceptance of misinformation. It opens the door for fatigue, for distraction, for other things to get in the way, and for people to walk around accepting information that was otherwise false. And it turns out that the empirical support of the last decade or so is starting to back Spinoza. We seem to process information in one part of the brain and to assess its validity with another. We’re also, as humans, emotional beings, and this is a wonderful part of our existence, but it also complicates things. It complicates information processing. In fact, there’s good evidence to suggest that when you are angry, you’re more likely to accept information that backs your preexisting ideas.
Sounds kind of relevant to our current political environment. We also live in a world where we’re connected to other people, and we know there are certain recipes for spreading rumors that sometimes characterize our wider information environment. Put people in a situation of a vacuum and uncertainty, and I can guarantee you that unless there’s good corrective information out there, misinformation will spread; rumors will spread. So we have these vulnerabilities as humans. As for the regulatory structure we have: think about what the Food and Drug Administration or the Federal Trade Commission does. They detect examples of violations and ask for remedies, but they don’t prevent the violations from occurring in the first place. We don’t live in a system with censorship, one that sanitizes the information environment. That’s a good thing. I don’t think we would want anything other than that. But just remember that it leaves the door open for some joker to get on stage and lie to you for a few minutes. And that’s okay, but we need to embrace that this is part of our system. Well, it turns out that there are remedies; corrections are possible. But I want to point out that correction seems to work only when there’s direct rebuttal, when you’re saying, well, we’re going to correct the fact here, not just clarify something. It also means you need to fight fire with fire. If a multi-million-dollar advertising campaign is out there spreading misinformation and you expect to counter it with a post on your website, let me tell you, it’s not likely to work, simply because of exposure differences. And we don’t pay nearly enough attention to that. I don’t care how great your message is, and you all are capable of designing wonderful messages: exposure matters too. We have to have the resources to fight fire with fire. Okay.
So where are we going with all this? I’ve painted, hopefully, a picture that’s somewhat convincing. I wasn’t lying to you, I promise, about what I had to say. But this is a challenge. It’s a challenge that’s going to be with us, and it’s going to require work from more than just academics. There is a robust academic arena starting to percolate, and I’ve had the honor of contributing to it in a couple of ways. A couple of years ago, we put together a special issue of a journal called the Journal of Communication, all about misinformation. More recently, we put together a book, coming out this year, called Misinformation and Mass Audiences, which I can tell you was a great opportunity. It was a bit more of an academic exercise when it was first articulated, and now, all of a sudden, it seems to be quite timely, for better or probably for worse. But nonetheless, this is a problem that’s going to take more than academic research. It’s certainly going to require work in the professional arena, and there’s a lot of great work happening at factcheck.org and other fact-checking organizations. We have the honor of having Claire Wardle here to talk about some of what’s happening in the journalistic arena, so you should stay tuned for that talk as well. But it’s also going to require that we build trust, between each other and between ourselves and our institutions, so that we can check each other, so that we can do this monitoring, and so that we have places to turn to verify information when we have concerns about it. We also need to worry about the proliferation of the term “fake news,” which has been used as a political weapon in a way that’s really problematic, given all that I just said. But we can get to that in a different discussion.
So if you’re interested in talking more, I’d be happy to talk with you after Frank, but I actually have the great opportunity to spend some more time talking with you here as well, because there’ll be another scrum on misinformation happening right here in this room, I believe, at 10:30. But in the meanwhile, thank you very much for your attention. Thank you.