Fresh From the frank Stage

Standout talks from the most recent 2023 gathering, featuring bold voices, urgent truths and unforgettable moments.

Amahra Spence

Liberation Rehearsal Notes from a Time Traveler

Shanelle Matthews

Narrative Power Today for an Abolitionist Future

Nima Shirazi

Irresistible Forces, Immovable Objects

The Speaker


Claire Wardle, Associate Professor of Communication at Cornell University

An Associate Professor at Cornell University, Claire Wardle is a leading expert on social media, user-generated content and verification. Formerly executive director of First Draft and co-founder of Brown’s Information Futures Lab, she has also held key research and leadership roles at Harvard, Columbia and the UN.






More than fake news: Understanding the Disinformation Ecosystem

Communications, Education, Journalism, Problem Solving, Public Relations, Storytelling

Transcript


Hello, Frank. It is so lovely to be here. I'm an academic by background, but I've been in industry for the last eight years, so I like to call myself a boundary spanner. So I feel I'm here in a home where that's really taken seriously. It's been a wonderful three days. I've learned so much, and I hope I can add something back into this space. My name is Claire Wardle. I run First Draft. We are a non-profit. As Liz said, we've been around for the last two years, desperately trying to get people to care about this. Now I kind of wish they didn't care as much, because my life is a tornado, but it's amazing. And it feels a very difficult problem to solve, a wicked problem, as people say. And the only way we're going to solve it is by working together. So we actually have a partner network of over 100 global newsrooms. All of the social networks are working together with us. We also have an academic network. So we're trying to do exactly what Frank tries to do as well, which is bring people together to find solutions. I actually went to graduate school with Brian Southwell. We did our PhDs together. So when I saw his slide, I was like, hmm, do you think we went to the same grad school? But this distinction between misinformation and disinformation is such an important one. I'm going to spend my next 10 minutes talking about the whole scale of the ecosystem. This phrase, fake news, as we know, is completely pointless now. And the point I'm trying to make is that we have to understand the whole ecosystem. Part of that is misinformation. Part of that is my mum clicking on a link and sharing it without checking. Part of this is disinformation, which is systematic propaganda campaigns. All of that is in the same ecosystem, and we need to understand both of them. So I talk about the types, I talk about the motivations for creating this kind of content, and the dissemination strategies.
So because Frank is so organized, about a month ago they said, can we have your slides? I was like, OK, I'm a good girl, I'll send them in on time. And I created this slide, which I'm about to show you. And then last weekend I was at a conference (there are a lot of fake news conferences right now) called MisinfoCon, up at MIT. So I sneakily used this slide early. And somebody tweeted it, quite a few people tweeted it, and it got picked up by Brian Stelter from CNN. And it ended up on Reliable Sources on Sunday. I am not a designer, so this PowerPoint slide ended up on CNN. I was like, I thought you had designers in your newsroom. Anyway, the point of this is to say that when we think about the whole ecosystem, satire is part of it. People don't necessarily mean to cause harm, but it's part of the system, all the way through to fabricated content, which is deliberately designed to deceive and is 100% false. But within that, there's a whole host of other types of content, part of it created by the news industry themselves. And they need to take responsibility for that. So clickbait headlines are part of the issue, all the way through to just old content that gets recirculated. It's not false content. It's not fake content. It's just used in the wrong context. So let's go through some of these examples. You probably saw this: Sean Spicer retweeted a satirical video by The Onion about Sean Spicer, and he didn't realize what it was. I felt like we should just all go home and give up. But we're not going to, because we've got a big problem to solve. So that's satire and parody. Misleading connection: "First-time voter waited 92 years to meet Trump. What happened next is amazing." I doubt it was amazing. But as we know, there's a lot of this about, and we need to take responsibility for it. Misleading content. So you may know that we've actually seen a huge growth in hyper-partisan sites on both the left and the right. So things like Occupy Democrats.
Eagle Rising. Actually, BuzzFeed did a great piece this week showing that a lot of those sites are owned by the same people. Hashtag capitalism. But what these sites are amazing at doing is using Facebook and its algorithm to get huge reach. They know exactly that you need a really captivating headline and a beautiful visual, and it doesn't matter how misleading you are, because nobody's going to read your text anyway. So often the text actually isn't that misleading; it's the captions and headlines that are. False context. So this was Donald Trump's first campaign ad, and this was footage that appeared within the ad. Now this footage is real, but it was actually migrants and refugees crossing over into Melilla in North Africa, which is owned by Spain. But of course it was used in the ad to purportedly show people crossing the border into Texas. Of course it wasn't. The footage is real, but the context is false. It's part of the ecosystem. Imposter content. This is increasingly a problem, because as we know, eight-year-olds are very clever and can use iMovie and Photoshop in a way that makes our eyes water. So the minute you have a logo, you can create whatever you want. NowThis News had a real problem during the election with people taking their logo and creating videos that looked like NowThis News. And they don't have a website; they rely entirely on social. They had to push out the correction, and as we know, nobody retweets the correction. But all they could do was say, that wasn't us that created that video. And on the other side, those of us who have a lazy brain, as Lisa said, don't look at that carefully enough, but it's abcnews.com.co. We don't look at URLs. The logo is not close enough for ABC to have any legal standing around this. So we have this imposter content. We even have journalists whose bylines get taken and used on completely false articles. Manipulated content.
So this was a photo that went viral during the election. It is a composite of two real photos. The first photo, of the people standing in line, was from Arizona during the primary in March. The right-hand one you can find on Wikipedia if you just Google "arrest." They put those two things together. If you're scrolling through on your phone very quickly, and these are ideas that you agree with, you're not going to critically use your brain to say, how odd, there's an arrest taking place and people are just looking at their phones. Of course you know that that's not true, but you're not using those critical skills, as we have heard from our amazing social scientists. And finally, fabricated content. So we know about the fake news sites created by Macedonian teenagers in their basements. But this is the stuff that really worries me. Visuals: we are a lot less critical of visuals. And these were targeted specifically to certain communities, surprise, surprise, Hispanic communities and African Americans, who were told, you can stay at home and just vote for Hillary via SMS. This is the stuff that worries me. Yes, of course I don't like fake news sites, but if we don't actually broaden our definition, we're in trouble. So to go back to those types, this is why I think we have to think about all of these different types. And we also have to cross-reference that with motivations. So I like to think of a kind of graph with two axes. There's a big difference between those people who are doing this for profit, versus those who are doing it for political influence, versus those who are just punking. The Ps only really work in the American context, but as you can see, I love the Ps. And this is still a work in progress. But for me, we have to understand those two axes and see how they fall and how they work. This is critical to finding solutions. And finally, dissemination methods. There are people unwittingly sharing false information.
The weekend of the so-called Muslim ban, a lot of liberals were retweeting a lot of false information because they were scared and they were angry, in the same way as we'd seen just before the election. Journalists are making mistakes every single day still, and there is no excuse for sharing information that's false. Things that really concern me: loosely connected partisans, passionate individuals, or trolls. If you haven't read this piece by BuzzFeed about U.S. teenagers trying to influence the French election, read it and weep. If you have teenagers, check that they aren't one of them. Basically, young, mostly boys, connecting using things like Google Forms and Dropbox, creating meme shells. So these teenagers are saying, we don't speak French, it's okay, we're creating meme shells, which are obviously images and quotes. They just take them from this kind of Dropbox platform and dump them into the French Twitter system, connecting with people that they think might be influenced by that. And then, if you're not terrified by troll factories and bot networks, please be terrified. We're in the middle of an information war, and whilst I'm really pleased that the spotlight has been shone on this, the rest of the world is like, welcome to the party, America. You know, Azerbaijan, the Philippines, Bahrain, they're like, yeah, yeah, yeah, we've been here. Bot networks are terrifying, and trying to map that is something that we really have to get our heads around, and get around quickly. So visuals, visuals, visuals. Anybody who heard this This American Life clip: "We memed him into the presidency. We memed him into power. We shitposted our way into the future. This is true. This is true. Because we, we directed the culture." This was recorded at the DeploraBall the night of the inauguration, and they're basically talking about how they used memes to direct the culture.
So let's take a look at Breitbart for a second, and I hope many of you do look at Breitbart. We need to. And I'd actually say, as a communications professional: learn from them. Don't tweet that. But here's what I want to show you. Here's a breakdown, by a friend of mine who's a researcher, of Facebook posts on Breitbart. In the donut on the left, you can see that the majority of what they post are links. If you look at the other donut, it's what people share. And people share videos and images because they have an emotional reaction to them. So the term meme actually isn't that helpful; we need to think about that definition. Memes are playful and fun, but memes are increasingly being weaponized, on the left and the right. This is the Trump meme generator, where you can basically write your own executive order for Trump, take a screenshot, and tweet it out. So the role of visuals in this space is something that I'm absolutely obsessed by. So, solutions. We are working on a project in France called CrossCheck. Please Google it. This is our workflow. It's pretty exciting. We're working with 35 different French newsrooms to create a collaborative real-time verification platform. We're debunking collectively and then pushing out debunks with multiple logos on them, hoping that this might improve trust. It might not, but I'm sick of just talking about this. We have to actually do something in the field, start researching it, and work out whether this is going to make a difference or not. So any researchers in the room, let's talk, because we've got a lot of stuff to do. It doesn't matter if you're an ethnographer or a social psychologist, let's test this stuff. However, all of this fact-checking, I worry, is for nothing, because we're all essentially toddlers on a supermarket floor right now, and we are angry and we are emotional and we cannot be reasoned with. So a big part of this is: how do we change the way that people think about this?
And as Craig Silverman said in a great piece when he was interviewed for WNYC's On the Media, we need to learn emotional skepticism. If you consume a piece of content and you feel angry, or you cry, or more importantly, you feel smug, something should go off in your brain that says, uh-oh, I need to engage my brain. So in the same way as I like to wait 20 minutes before my second helping of food, because I wait for my stomach to catch up, I would love Facebook and Twitter to say, you want to retweet that? Wait two minutes. Come back after you've checked it out. They're not going to do that. We can have a separate conversation about the platforms. And finally, we need to find a way to make it socially unacceptable to share false or misleading content. We need to talk to the people who've been successful in anti-smoking campaigns. We need to understand how this works. And very finally, this was a beautiful thing that played out on Facebook with a friend of mine. They gave me permission; it's since been featured in a piece. This was a guy who posted an Infowars article. He's now in New Mexico; his friend from high school lives in New York. And the friend in New York said, so now you're posting false content. And he was like, well, unfollow me if you don't like it. Back and forth, back and forth about PolitiFact. I don't trust PolitiFact. I don't trust the New York Times. Back and forth, back and forth, back and forth, back and forth. But then Nate says, the only reason I'm having this conversation is because I care about this country, and I hope you do too. Back and forth, back and forth. And it ends with, anyway, looking forward to that beer one of these days. So how do we equip people in their communities to ensure that we're using peer-to-peer persuasion to actually change people's minds? That's the key to all of this. And that's why this shouldn't just be, you know, Facebook's problem or CNN's problem.
It's everybody's problem. And we can go home and stop crazy Uncle Bob from spreading this stuff today. Thank you very much.

Watch Next