
Why Myth Busting (Usually) Doesn’t Work
Behavioral Science, Creativity, Education, Emotional Intelligence, Problem Solving, Technology
Transcript
All right. Well, thanks so much for having me here. I'm really excited to talk to you today about the problem of misperceptions, which is what Liz was talking about. So there's this problem we have: on so many issues, people believe things that aren't true or aren't supported by the best available evidence. And I come to this problem as someone who tried for many years to correct those myself, as the founder of one of the first fact-checking websites. But I'm now a social scientist, and so I study these questions using data. What I want to show you are the results from some research I've done over the last few years with my co-author Jason Reifler and other collaborators to try to understand why it is so hard to correct the misperceptions that people have.

So the problem I'm interested in is that people believe so many things about the world that aren't true, from the belief that President Obama wasn't born in this country, to the conspiracy theory that 9/11 was an inside job, to the myth that vaccines are poisoning our children. These beliefs are out there, they're widespread, and they're often very confidently held. This is different than simply not knowing things. We've spent a lot of time showing that people know very few of the facts we might like them to hold, for instance in my field of politics. But what I think we haven't appreciated until recently is how strongly people can come to believe in false information and how difficult it is to correct that information once people believe it, especially when it relates to some aspect of their identity or belief system that is important and meaningful to them, or when it involves a controversial issue or political figure. And that turns out to be a really difficult problem. I'm going to show you some examples of how trying to correct those sorts of misperceptions might not have the effect you expect. In some cases, it's not only ineffective, but it can actually make the problem worse.

Abby's going to be my slide-advancer. All right. Abby, can you move us forward? Oh. Oh, no. You overshot. Back, back. Okay. Okay, we're going to try again. Okay. Woo.

All right. So the problem we have is this. We often proceed under what's sometimes called the deficit model. The idea is that people lack the facts they need, and so we just need to give those facts to them, right? People have an incorrect belief; they're just waiting for the right information; if we bring it to them, they'll change their mind. Now, I think very few of us would put it as boldly as this cartoon, but this kind of assumption underpins a lot of what we do in health, in science, in medicine, and in journalism. And I would love to live in this world. This is a world I think we'd all like to live in, but it's not actually the world we do live in. The world we live in looks more like what's in the next slide. This. Right? When you give people information that they don't want to hear, that's unwelcome, that contradicts some attitude or belief they already have, they can be incredibly resistant to that information. And I should say we all do this. We all do this. This is a human tendency. It's a psychological response called disconfirmation bias. When you challenge views people hold, they tend to try to defend those views. They tend to be skeptical of the information you're giving them that contradicts the belief or attitude they already have.
And the result is that when we use corrective information, we may not be generating the effect we expect if we provoke this sort of response.

All right, so if we go to the next slide, what you can see is just how prevalent these sorts of misperceptions are. In a 2006 poll, more than 40% of Democrats said they believed that the Bush administration assisted in or allowed the 9/11 attacks. In a 2010 poll, more than 40% of Republicans said President Obama wasn't born in this country. So we're talking about millions and millions of people who believe things that have been widely and repeatedly debunked, for which no credible evidence exists. This is an incredibly widespread phenomenon. And of course, I just want to underscore how often these sorts of misperceptions correspond to people's value systems. You can see that each of these misperceptions has a steep partisan gradient, where the partisan group that's most likely to believe in the myth holds that view at a much higher rate than independents, and correspondingly for the opposing partisans. So there's a very steep relationship between which party you're in and what you believe in both of these cases. Next slide, please.

So I mentioned earlier that people often hold these views very confidently, and I want to illustrate just how deep that can go, with data from a study I did looking at misperceptions about health care, which Liz mentioned in the introduction: the most prominent myths about the two major efforts at health care reform of the last 20-plus years, the Clinton health care reform effort of 1993 and 1994, and the Obama health care reform effort of 2009 and 2010. The left panel has predicted probabilities of endorsing the most prominent myth about the Clinton health care plan, which is that it would take away your choice of doctor even if you paid out of pocket. The right panel has predicted probabilities of believing in the most prominent myth about the Affordable Care Act, which is that it would create death panels. What's on the x-axis here is people's self-evaluated understanding of the plan. So pollsters asked them, in the polls taken during these two debates, how much do you think you know about the plan in question? And what's really striking is that in both of these cases, as you move from people who say they know less about the plan to the ones who say they know more, among the partisan group that was predisposed to believe in these myths, which was Republicans in this case, the folks who thought they knew more were actually more likely to be misinformed rather than less. So they think they know more, and they actually know less. And you can imagine why it might be hard to correct these sorts of misperceptions. These are people who think they're very well informed about the bill in question. Next slide, please.

So what this means is we should take seriously the idea that correcting misinformation might not have the effect that we anticipate. It might be ineffective or even counterproductive. And that includes both the kinds of fact-checking efforts you'll often see in coverage of politics, like PolitiFact or some of the other fact-checking operations that are out there, as well as the kind of myth-busting approach you'll often see in health care and other areas, which features myths and then tries to say, this is the myth and this is the fact, like this CDC flu vaccine facts-and-myths flyer on the right.
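A minimal sketch, on simulated rather than the study's actual data, of the kind of analysis behind those two health-care panels: a logistic regression of myth belief on self-rated knowledge, partisanship, and their interaction, with predicted probabilities computed from the fitted model. All numbers here are invented to mirror the pattern described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "self_knowledge": rng.integers(1, 5, n),  # self-rated understanding, 1 (low) to 4 (high)
    "republican": rng.integers(0, 2, n),      # 1 = the group predisposed toward the myth
})
# Simulate the striking pattern from the talk: among the predisposed group,
# higher self-rated knowledge goes with MORE myth belief, not less.
latent = -2 + df["republican"] * (0.4 + 0.5 * df["self_knowledge"])
df["believes_myth"] = (rng.random(n) < 1 / (1 + np.exp(-latent))).astype(int)

# Logistic regression with a knowledge-by-party interaction
model = smf.logit("believes_myth ~ self_knowledge * republican", data=df).fit()

# Predicted probability of endorsing the myth at each knowledge level, by party
grid = pd.DataFrame([(k, r) for k in range(1, 5) for r in (0, 1)],
                    columns=["self_knowledge", "republican"])
grid["p_myth"] = model.predict(grid)
print(grid)
```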
Okay. So let me give you an example from my research. This is looking at one of the most widespread misperceptions of the Bush administration era, which is that there were weapons of mass destruction in Iraq before the U.S. invasion. Now, of course, this was a claim that was widely made before the invasion itself, so it's understandable why people came to believe it was true. But as we all hopefully remember, after the invasion there was no evidence of an active weapons of mass destruction program or of the weapons themselves. This was the conclusion not only of independent observers, but of the Bush administration's own panel, which, if we go to the next slide, issued a report called the Duelfer Report that officially reached these conclusions. Okay.

So my co-author and I, wait, you're jumping ahead on me. Somebody will have a working clicker at frank. My co-author and I experimentally varied whether people saw corrective information or not in a realistic mock news article, because after the invasion, President Bush was talking about the supposed threat from weapons of mass destruction as if they had been there. So we experimentally varied whether, after seeing coverage of those quotes, people saw corrective information saying that, actually, there were no weapons of mass destruction and no weapons of mass destruction program. And then we looked at what effect that had on people's belief in this myth. Okay, next slide, please.

Okay. So the response to that varied depending on people's ideology. Among liberals, who were of course happy to hear that what President Bush was telling them wasn't true, giving them that corrective information reduced belief in the WMD myth, exactly as you'd hope. But they were pretty unlikely to believe in it to begin with. When it came to conservatives, who were predisposed to believe in the myth and who generally liked President Bush (this was conducted in 2005), belief in that myth more than doubled. This is what we call the backfire effect. The idea is that when you provoke people in this way, when you challenge them with corrective information, that disconfirmation bias we talked about earlier can cause them to counter-argue that information, to think of reasons why it might be true that Iraq had weapons of mass destruction. And in the process of doing that, people can come to believe in that myth more than if they hadn't seen that corrective information in the first place. All right, next slide, please.

So let's move from the domain of pure politics to the domain of health. Okay. This is an issue that's been in the news a lot lately, which Liz was talking about in the introduction: vaccines. Some of you may unfortunately be familiar with this article up here, a now-infamous article, since retracted, that was published in the Lancet in 1998, suggesting that the measles, mumps, and rubella vaccine, the MMR vaccine, causes autism. When this came out, it was a sensation, especially in the UK, where the lead author was a practicing doctor, and it had a tremendously detrimental effect. The UK's measles, mumps, and rubella vaccination rates dipped significantly below the levels we need to maintain herd immunity, and that contributed to a vulnerability to measles that continues to this day.
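For reference on the herd immunity levels mentioned here: the standard threshold formula is 1 - 1/R0, and the basic reproduction number R0 for measles is commonly estimated at roughly 12 to 18, which is why coverage needs to stay in the low-to-mid 90s. A quick back-of-the-envelope check (an illustration, not part of the talk):

```python
# Herd immunity threshold: the fraction of the population that must be
# immune so each case infects fewer than one other person on average.
for r0 in (12, 15, 18):  # commonly cited range of R0 estimates for measles
    threshold = 1 - 1 / r0
    print(f"R0 = {r0}: roughly {threshold:.0%} coverage needed")
```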
They recently had a measles outbreak in the UK, and the cohort that came of age at the time this article came out was especially hard hit, because they hadn't been covered as well as cohorts that came of age at other times. Okay. Now, we haven't had that same result here in the United States; our MMR vaccination rates overall are still quite good. But as you've seen in the coverage of the measles outbreak that started at Disneyland in December and continues to this day, there are places where vaccination rates are lower, where herd immunity is potentially compromised if enough parents opt out of the standard vaccine schedule. And so the question is: is misinformation driving vaccine hesitancy, and are the efforts we're undertaking to counter that misinformation effective? All right, next slide, please.

So there's reason for concern. One study found that 25% of American parents agree or strongly agree that some vaccines cause autism in healthy children. Twenty-five percent. So if even a fraction of those parents opted out of the standard vaccination schedule, it would endanger herd immunity, the level of vaccination we need across the population to prevent the spread of communicable disease. So this is a really serious potential concern. Now, most of these parents are still vaccinating, but there's a real question of whether this misinformation is causing more people to opt out of the standard vaccine schedule. Okay, next slide.

So what do you do? Well, after this article came out, a bunch of scientists went out and did a lot of studies. And what did they find? They found no evidence that vaccines cause autism. You can fill a shelf full of studies finding no relationship between the MMR vaccine and autism. So if you go to the CDC website as a parent and you want to see what effect vaccines have on autism, you can come to this page, and it lists a series of studies telling you that vaccines don't cause autism. Here are a bunch of studies. And that's the standard kind of approach: throwing facts and science at people. But what my co-authors and I wanted to find out is what effect this has, right? Is giving people this information going to be effective at changing their minds about the hesitations or concerns they have about vaccines, and make them more likely to vaccinate a future child? So again, we did a nationally representative survey experiment of U.S. parents with kids at home age 18 and under. We randomized whether they saw this information and compared them to a control group. There were a series of other messages in the study that I won't talk about now, but the primary comparison of interest for us right now is what happens if you give people that corrective information, compared to a control group that didn't see anything about vaccines. Okay, next slide.

So first the good news. I'm sure I've depressed you horribly by this point. The good news: belief in the autism myth went down. And that response didn't vary by how parents felt about vaccines. So the corrective information was effective in reducing people's misperceptions about the autism myth. This is an encouraging result. Next slide.
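A minimal sketch of that treatment-versus-control comparison. The proportions below are invented to mirror the direction of the reported result, not taken from the study, and the significance test is an added illustration rather than the authors' method:

```python
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(1)
n_treat, n_control = 600, 600
# Simulated outcomes: the correction lowers autism-myth belief overall
believes_treat = rng.random(n_treat) < 0.18      # saw "vaccines don't cause autism"
believes_control = rng.random(n_control) < 0.28  # saw nothing about vaccines

count = np.array([believes_treat.sum(), believes_control.sum()])
nobs = np.array([n_treat, n_control])
z, p = proportions_ztest(count, nobs)  # two-sample test of proportions
print(f"myth belief: treatment {count[0] / n_treat:.1%} vs "
      f"control {count[1] / n_control:.1%} (z = {z:.2f}, p = {p:.3f})")
```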
The problem, however, is what happens when we ask people how likely they would be to vaccinate a future child. And here it really mattered how people felt about vaccines, just as before, when liberals and conservatives responded differently to the corrective information about weapons of mass destruction. In this case, depending on how parents felt about vaccines, which we measured in a prior wave of the study, they responded very differently to the information telling them that vaccines don't cause autism. We divided the population of U.S. parents into three approximately equally sized groups. The middle and right panels here are the parents who had favorable and very favorable attitudes towards vaccines. For those groups, debunking the myth had no significant effect either way; they're overwhelmingly likely to vaccinate no matter what. But the group of most concern is the group on the left, the parents with the least favorable attitudes towards vaccines. This group is not largely hardcore vaccine skeptics; they're folks who have questions or hesitations, who have mixed feelings about vaccines. And again, this is about a third of U.S. parents. It's not the tiny fringe; it's a broader group of parents with more mixed feelings. If we zoom in on that group in the next slide, what you can see is that their self-reported intention to vaccinate a future child dropped dramatically when we gave them that information. Dramatically: more than 20 percentage points. And these are, of course, the parents who are most likely to opt out of vaccinating their children in the first place. They're entering with more mixed or negative attitudes towards vaccines, and giving them that information is making the problem worse, not better, by a significant margin.

Okay, so what's going on? They didn't seem to resist the information about the autism myth itself, but our interpretation of this result is that people are thinking of other reasons to question vaccines, other concerns or hesitations they have about their safety or efficacy. And in the process of bringing those to mind, they're ending up in a more negative place towards vaccines than they otherwise might have been.
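A minimal sketch of the tercile analysis just described, again on simulated data; the attitude measure, group labels, and effect sizes are assumptions for illustration only:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 1800
df = pd.DataFrame({
    "prior_attitude": rng.normal(0, 1, n),  # vaccine attitudes from an earlier wave
    "correction": rng.integers(0, 2, n),    # 1 = saw "vaccines don't cause autism"
})
# Split parents into three approximately equal groups by prior attitudes
df["attitude_group"] = pd.qcut(df["prior_attitude"], 3,
                               labels=["least favorable", "middle", "most favorable"])
# Simulate the finding: the correction lowers intent to vaccinate only in the
# least favorable tercile (numbers invented to mirror the reported pattern)
base = 0.55 + 0.15 * df["attitude_group"].cat.codes
drop = np.where(df["attitude_group"] == "least favorable", -0.20, 0.0)
df["intends_to_vaccinate"] = (rng.random(n) < base + drop * df["correction"]).astype(int)

# Effect of the correction on vaccination intent within each attitude group
effects = (df.groupby(["attitude_group", "correction"], observed=True)
             ["intends_to_vaccinate"].mean().unstack("correction"))
effects["effect_of_correction"] = effects[1] - effects[0]
print(effects)
```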
Okay, so next slide. So what can you do better? Right? This is the question I get most. You'll be hearing more from me in approximately 21 minutes about what else you can do. So let me close with some recommendations based on our research: things you might want to consider doing or not doing.

The first thing is, it's important to test your messages experimentally. When I entered this field, I thought, of course all the messages we use about vaccines have been tested. Right? Medicine is all about randomized controlled trials; surely they've tested all of these messages. And I kept looking around and waiting for these studies that I couldn't find, and it turned out that by and large they don't exist. Okay? So you need to experimentally test your messages rather than just assuming that throwing facts and evidence at people will have the effect you anticipate. Okay? Second. Oh, let's go back to that one for one second. Second, it's important not to reinforce the myths you're trying to debunk. This flyer here is another example of that kind of myth-busting framework, and it actually makes the myths more prominent than the correct information. Right? If you just scan it, you might come away thinking, oh, well, smoking is just a choice. Right? Filters make cigarettes safer. Right? And there are lots of reasons to suggest that not only could this sort of format provoke the kind of backfire that we found in our research, but it makes these myths more familiar. And other research suggests that just making these things more familiar can cause people to be more likely to think that they're true. Okay? So this kind of format should be used very carefully. I'm not prepared to say it should never be used, but it should be used very carefully, and it should be tested, because it may have unanticipated effects. Okay, next slide.

All right. Third, it's important to think about the credibility of the sources of the information being provided to people. Okay? If you're trying to reach a skeptical audience, there may be sources who are more credible to them than the ones people typically draw on. The information we gave to people cited institutions like the Centers for Disease Control and the American Academy of Pediatrics. Those are reputable institutions, of course. But when you ask parents who they trust most for information about vaccines, they overwhelmingly say their children's doctor. Overwhelmingly. That doctor, their pediatrician, has a relationship of trust, hopefully, with that family, and could be a much more effective advocate for vaccines than other sources of information. Okay? So healthcare providers can play an important role in the vaccination domain. But they're not the only kind of credible source we might be able to draw on, if we can go to the next slide. This is an article I like a lot. It came out during the death panel controversy. There were a lot of stories about the death panel controversy that basically broke down along partisan lines: Democrats say this and Republicans say that. And of course, if you give people those kinds of tribal cues, they're going to sort themselves in a tribal way. Okay? What I liked about this article is that it took a really different approach. It said, even experts who don't like this bill agree that there are no death panels in it. So in other words, if you don't like this bill, this person agrees with you. They're speaking against their interest in opposing the bill when they concede that there are no death panels in it. That's an example of a credible source. You can imagine, for instance, Bob Inglis, who has been a very courageous advocate for climate action on the Republican side of the aisle. There's a now very famous climate scientist who's an evangelical Christian who can speak credibly to that community in a way that the typical atheist university professor can't. So the sources we use can matter a lot.

If you take anything away from this presentation, I hope it's this, if we can go to my last slide. Throwing facts and science at people just isn't likely to be the answer. This deficit model, where it's simply a matter of delivering the information to people and assuming it will have the anticipated effect, has failed. It's failed over and over again. We have to keep learning and re-learning this lesson. It's time to think more carefully about the kinds of messages we use and the ways we try to deal with these misperceptions. Thanks a lot.
