Ctrl+Alt+Facts

Season 2: Episode 7

From campaign bots to conspiracy videos, it’s harder than ever to discover the truth online. In conversation with The New York Times’ Sheera Frenkel, Data For Democracy Policy Lead and Mozilla Fellow Renee DiResta, and DisInfoMedia founder Jestin Coler, we navigate the age of disinformation. It’s the season finale of IRL, recorded live in San Francisco on March 18th, 2018.
Published: April 3, 2018

Show Notes

A recent Gallup survey found that most Americans feel that it’s harder today to be well-informed than ever before. But each of us can play a part in stopping the spread of misinformation. Learn more.

Transcript

Veronica Belmont: Hello everyone, it’s Veronica… this episode is all about misinformation and it’s our last one for this season… but also it’s the first episode we’ve ever taped in front of a live audience. We recorded it a couple weeks ago and since then, a lot has happened on this very topic—in real life. To catch up on the latest, head to our show notes. In the meantime, enjoy this episode. Here it is.

Micah White: Well you know, I’ve been an activist my entire life, and one of the things that you have to learn, I think, is to not get sucked into things that are totally sketchy.

Veronica Belmont: It’s 2016, and Micah White gets an email. Micah is one of the creators of the Occupy Wall Street movement. The email, it’s from a guy who calls himself Yan Big Davis. Yan wants to interview Micah about protest movements in America. He says it’s for a website called Black Matters. He tells Micah they have about 200,000 followers. There’s something about the email that just doesn’t sit right. The wording, it seems a bit strange, but it seems harmless enough, so …

Micah White: I was like, “Sure, we can do the interview.” Immediately, it was super weird because, first of all, when I got on the phone, it felt like he was calling me over a really terrible internet phone call from very, very far away, definitely international, and he had this really strange accent. His way of speaking and his accent made it clear he was not a fluent English speaker.

Veronica Belmont: A week or so goes by. Yan sends Micah a link to this finished piece.

Micah White: The website he sent me is called blackmattersus.com, which is so suspiciously similar to something like Black Lives Matter that I was like, that’s kind of weird.

Veronica Belmont: Things just get more weird from there. Now Yan has made a connection. He’s supposedly established his credibility. Yan Big Davis starts asking for favors.

Micah White: And then in the days and weeks ahead, he would email me and say, “Oh, can you help me with this protest? Can you promote this protest that I’m working on,” and I just didn’t reply.

Veronica Belmont: It’s annoying, but Micah doesn’t think much of it until a year later. His phone rings.

Micah White: A Russian investigative journalist got in touch with me and said, “Oh, I’m doing a story about this Russian troll farm, and did you know that you actually did an interview with the website, one of the websites that they had,” and I was like, “No, I didn’t know that. That’s so fascinating and strange.”

Veronica Belmont: So, that’s how Micah finds out that a Russian agent tried to trick him into becoming a tool to spread disinformation online, to get him to promote Russian-sponsored protests on US soil, to generally muck around with polarized and politicized American activists. Fortunately, Micah dodged that misinformation bullet.

Micah White: I think that disinformation and misinformation is a kind of mental pollutant because it destroys our ability to see what is true. It destroys our ability to even believe that there is a truth.

Veronica Belmont: As an activist, Micah is trained to spot a fake, and yet here he was, a media-savvy, politically aware person, missing the clues that he’d been targeted by a disinformation factory. If that’s the case, what chance do the rest of us have?

Micah White: Oh, geez. How do we live in a world of misinformation and disinformation? I mean, that is a really … I don’t know if I have the answer. I mean, I think that this is just the new reality.

Veronica Belmont: It feels like online we’re all adrift in a sea of false facts, a tidal wave of deceiving data, and we don’t know what to hold onto. The information age became the misinformation age when nobody was looking. All season long, we’ve been talking about how online life shapes who we are. Our identities are made up of the beliefs we hold, so today’s crisis of misinformation is messing with us on a really personal level. Still, for all the talk about living in a post-truth world, we care about finding a clear signal through all that noise. Today, on IRL: living in an age where we must Ctrl+Alt+Facts. Live from the Commonwealth Club in the heart of San Francisco, it’s IRL, In Real Life. Thank you. I’m Veronica Belmont, and this is IRL, Online Life is Real Life, an original podcast from Mozilla. Let’s start by testing the limits of our live audience’s ability to tell real from fake. It’s a little quiz segment we’re calling-

Audience: Really Real, or Really Fake.

Veronica Belmont: Nice. All right. So, we need some volunteers from the audience to play this game with us, so raise your hands and our mic handlers will find you. So, I have here a collection of headlines pulled from the web. We used factcheck.org and snopes.com to verify these headlines, so some of the headlines are from real stories, and others are from fake stories that might seem real to you. Let’s find out. I’m gonna read them aloud, and then it’s your job to use your gut, not your Google, not your phone, not your technology, to separate real fact from real fiction. All right. So, we have - it looks like we might have our first person here. Okay. What’s your name?

Steven: Steven.

Veronica Belmont: Steven? Okay. The headline is McDonald’s is flipping its iconic golden arches upside down in celebration of women, and people are freaking out. Is it really real, or really fake?

Audience: Really real.

Steven: It could be real, really real.

Veronica Belmont: Really real?

Veronica Belmont: That is exactly right. Yay. Alright. That was a real headline. That was for International Women’s Day, I believe. All right. So, we’ve got our next one. What’s your name?

Sierra: Sierra.

Veronica Belmont: Hi, Sierra.

Sierra: What up?

Veronica Belmont: So, we’ve got the headline, Oakland coffee shop refuses to serve police officers, and cops say it’s a teaching moment.

Sierra: Really real.

Veronica Belmont: You went right for it, you guys. You feel good about this?

Audience: Yeah.

Veronica Belmont: Really real?

Veronica Belmont: That is exactly right. Perfect. Let’s try another one. Okay. We’ve got someone over here on this side. What’s your name?

Stephen: Stephen.

Veronica Belmont: Stephen, okay. The headline is pet owner is shocked to find out that the two cute puppies he bought and raised for two years are actually bears.

Stephen: I have to say really fake.

Veronica Belmont: Really fake?

Stephen: Yes.

Veronica Belmont: Alright. That’s his final answer. Really fake?

Veronica Belmont: You’re the opposite of correct. I mean, when you hear that, how many of us at home have bears and don’t even know? I mean, at this point anything is possible. One more headline today. Hi.

Marcus: Marcus.

Veronica Belmont: Marcus, okay. The headline is man dressed as Elsa from Frozen pushes police wagon out of snow. Really real, or really fake?

Audience: It’s real.

Marcus: Really real.

Veronica Belmont: Really real? That’s your final answer?

Marcus: My final answer.

Veronica Belmont: Okay.

Veronica Belmont: That is exactly right. Have you all seen the actual pictures of this?

Audience: Yes.

Veronica Belmont: They’re amazing. All right. Well, that was Really Real, or Really Fake. I hope your gut did you proud. Nice work, everybody. We live in a world where misinformation and disinformation share top billing alongside actual facts and true stories, so how are we supposed to navigate our way through all of that? To help sort out that thorny problem, I’d like to invite our three featured guests to join me onstage, and a round of applause for our guests as they come up. Hi. Hello. First, here’s Sheera Frenkel, a journalist with the New York Times. Hi, Sheera. So, I’ve asked each of our panelists to tell us three things about themselves, and in keeping with the theme of today, I asked them to make sure that one of those things is a total fabrication. So, we will uncover the lies at the end of the show. So, Sheera, two truths and a lie. Who are you?

Sheera Frenkel: So, I have a piece of jewelry made out of a piece of chandelier from Muammar Gaddafi’s palace, I was once kept in an airport prison for 48 hours, and I am excellent, like world’s greatest, stick shift driver.

Veronica Belmont: Nice. All right, and do you live in San Francisco?

Sheera Frenkel: In Oakland.

Veronica Belmont: Okay, and you drive stick. Nice. That’s impressive. All right. So, your Twitter profile says you cyber all the things for the New York Times, and as a young person in the ‘90s online, that meant a very different thing, I think, than it does now. So, how much does misinformation, disinformation, fake news get in the way of you trying to unearth actual true, real stories in all that cyber?

Sheera Frenkel: It’s getting harder every single day. I would say that I spend a good, I don’t know, two thirds of my time trying to discern what’s real and what’s not, and that goes to everything from people sending me emails with all sorts of claims, and sources who want to pass things off as real that are not.

Veronica Belmont: And our next panelist has been called the fake news king because he used to make a living spreading fake stories. Jestin Coler, thank you so much for coming.

Jestin Coler: Thank you for having me.

Veronica Belmont: All right. Let’s hear your two truths and a lie.

Jestin Coler: Mine are very boring now, of course. I am a happily married father of two children. I’m originally from the Midwest, but I’ve lived in 10 cities and now call Southern California home. In my free time I enjoy jogging and long walks on the beach.

Veronica Belmont: Nice. Setting a romantic scene. I like it. So, Jestin, for people in the audience who may not be familiar with you, could you walk us through how much of your life was about making fake news before you quit?

Jestin Coler: To be honest with you, it was a very small fraction of my life. I mean, I have a day job that I’ve had for the past decade. I have, like I said, two kids at home. That might’ve been the lie. So, not a lot of free time, necessarily, but this was really something I did as a hobby, and it didn’t start with “let’s make fake news.” It started in a very different place than it ended, and it was the wildest ride along the way.

Veronica Belmont: So, what led you to stop?

Jestin Coler: Well, in January of 2015, Facebook made some changes to their algorithm that really changed my business model. At that point, I’d kind of built this group of contributors that I cared for, kind of a family, and we decided at that point, let’s just ditch this fake news stuff and get back to what we had started with, doing things that were entertaining, slapstick, satirical, just entertainment kind of stuff. So, after a while I just walked away. It wasn’t any fun anymore.

Veronica Belmont: And last on our panel is Renée DiResta. She’s a leading voice in the fight against propaganda and disinformation. She’s also the newest fellow in residence at Mozilla. Hello, Renee.

Renée DiResta: Hi.

Veronica Belmont: You know the drill, two truths and a lie.

Renée DiResta: Yeah. So, I once failed a Petco honesty test, I climbed Mount Kilimanjaro, and I have three kids.

Veronica Belmont: Did you say a Petco?

Renée DiResta: Petco. That might be an East Coast store. It’s like where you go to buy your goldfish.

Veronica Belmont: I just didn’t know they had honesty tests. That was the new part to me.

Renée DiResta: I was applying for … Well, that might be a lie, but …

Veronica Belmont: You helped the U.S. Congress prepare for hearings when they confronted social media platforms about Russian disinformation. What was the most important thing you wanted to get across to Congress, the most important questions they had to ask?

Renée DiResta: Well, the thing that I wanted to get across was that this was a systemic problem. Russia was one actor out of a whole litany of folks who could use the same systems to do similar things, and since it was a systemic problem it required a systemic solution. What they wanted to know about really ran the gamut. It was everything from “teach me about YouTube ad rolls, are they monetizing terrorist content” to “walk me through the specifics behind how this particular campaign spread.”

Veronica Belmont: So, as we explore this today, I’m going to inject a few perspectives to motivate the conversation and see where it takes us throughout, so let’s start right away and hear from Mandy Jenkins. She’s the editor in chief at a company called Storyful. Storyful’s job is to fact check social media stories for their clients. Here’s what she has to say about why we make mistakes when trusting our social media sources.

Mandy Jenkins: I often joke that the misinformation age has been good for business. Social media has made it so much easier for anyone to spread information or misinformation to wide audiences. It’s that one-to-many ability that really captures people’s imaginations and makes them want to, in turn, take something they learned and pass it on to their big groups of people; much like a virus, in a way. Maybe your brother, your cousin, your college roommate sent that to you. Why would you assume that they didn’t know where that originally came from? Trust is a very complicated and difficult goal these days, because it seems to mean something different to everyone.

Veronica Belmont: So Sheera, Mandy says that trust is a difficult goal because it means something different to everyone. Does that ring true for you?

Sheera: Yeah, I suppose it does. I do think that social media has injected some interesting questions into our discussions of what to trust, and when, and who to trust. We recently had a story about Facebook and how they were going to be removing a lot of news platforms from within the main Facebook feed. You were going to be seeing more content from your friends and family. Facebook was doing this because they were like, ‘This is what people want. They trust their friends and family. That’s how the platform came about. It’s going to bring us back to the old days of Facebook.’ What we did is we looked at countries where this has already happened. In those countries, what people noticed was that there were more falsehoods on the platform because their aunt, or their grandmother, or their best friend who’s prone to believing conspiracy theories, shared all this kind of misinformation.

Veronica Belmont: We all have that aunt.

Sheera: Right; and because the news organizations weren’t there to counter it, there was no voice of reason. And so the newsfeed just became this ongoing barrage of the most fantastical, sensational headlines people had curated and then shared.

Veronica Belmont: And Renee, how much of the problem lies with outright propaganda machines?

Renee: I think it depends on what you mean by ‘propaganda machine.’ Propaganda has historically meant content put out by a state government. In some parts of the world, propaganda is a very big issue. In others, like what we saw with the Russia thing, that was not propaganda in the traditional sense of the word, because it wasn’t overt. It was a subversive effort to put out content that looked like it was coming from organic pages, that looked like it was coming from your neighbor, right? From your friendly Texas secessionist down the street. Then there’s this notion of the fifth estate, the citizen blogger, the citizen journalist who’s just authentically reporting what he sees. They’re not yet tainted by the kind of aspersions that have been cast on media as partisan entities in quite the same way.

Veronica Belmont: So, Jestin, why do you think you were able to capture an audience the way that you were?

Jestin: Confirmation bias is really key to it all, really just feeding people what they already want to hear. Once you’ve studied what audiences want and you’re writing to those biases, creating those emotional connections with the reader, it’s quite simple, to be honest.

Veronica Belmont: Can you give me an example of a story that you did that really took off based on some of that research that you did?

Jestin: Well… let’s see, several stories. I guess the one that probably landed me in this seat was a conspiracy-type story that we ran on a site called Denver Guardian right before the election. It was a story about an FBI agent who had been found dead. He was investigating Hillary’s email. He was found dead in a murder/suicide. It was one of those stories where, again, there’s this large group of conspiracy types on the internet who want to believe that there’s a Clinton crime machine running around killing all their foes. That was just a very simple example of that, just giving somebody something that they already wanted to read.

Veronica Belmont: And Renee and Sheera, how much of this is about confirmation bias, about these ideas being propped up over and over by people you trust?

Renee: There’s an interesting corollary to confirmation bias called majority illusion, which is the idea that if you believe that a majority of people in your community hold a belief, you’re more likely to feel that there’s some credence to that belief. So one of the things that’s interesting about social media is you can fake majority illusion by having a sufficient number of people pushing out a message, the idea being to manufacture the perception that a large number of people believe that a story is true, or something like that.

Veronica Belmont: Yeah, we saw that. We talked a lot about that, actually, in our bots episode, about how you can get something trending and then suddenly hundreds or thousands of people are tweeting the same hashtag. And you go and search it, and they look like real people. You start to think, ‘Oh, this is actually … This seems like a real thing that’s going on.’ Like, ‘Wow, I had no idea.’ That just keeps going over and over.

Sheera: Something else I was going to say is that we’ve seen what’s happened in societies where there aren’t strong journalistic institutions that people trust. I point to Myanmar as a really good example of this, where the independent media was decimated over decades of military rule. When Facebook entered the country and that became the way people connected and got information, and anybody could become a source of news, there was a rampant spread of misinformation, largely against the Muslim community in that country. Now we’ve seen a great deal of violence against the Muslim community as a result. I think that you have interesting examples out there of where this road leads us. If we decide that we don’t trust our established news sources, if we have a continual onslaught by politicians on institutions that really try hard to be objective, and as a society we’re convinced, ‘Okay, we don’t want to believe these. We’re just going to believe our local celebrity, or the Facebook page that speaks to me,’ or whatever it is that supports my confirmation bias, where does that take us as a society?

Veronica Belmont: We really can’t talk about this subject without acknowledging the fake news elephant in the room. People, of course, upset with the election of Donald Trump will often blame disinformation and misinformation tactics for influencing the vote. Andrew Guess wanted to know if that was actually true, so he ran a study. Andrew is an Assistant Professor of Politics and Public Affairs at Princeton. Have a listen to what he learned.

Andrew Guess: For Trump supporters, we found something like 6% of the news-related pages that they were visiting were related to fake news. And for Clinton supporters, fake news made up under 1% of their overall news diets. What these proportions of people’s media diets tell me is that fake news is being consumed by people who read a lot of other stuff about politics and news. I think that people seize on fake news as the explanation for what happened because it was surprising, and fake news was a new phenomenon that seemed to offer an easy explanation for the outcome. But I think its impact has definitely been overstated. I mean, misinformation is a big problem. We’re just trying to show that the people who are being exposed to misinformation the most are also the people who are probably least likely to change their views as a result.

Veronica Belmont: So Jestin, how surprised were you by the election results?

Jestin: I was surprised, absolutely. To that point, you have to wonder how much of the fervor around this subject is because of the election of Trump. These things have been going on for several years. I’ve been working on this since 2013; 2012 was when I first started noticing that there was a problem with this kind of information on Facebook. I know that the platforms are familiar with these issues. They had not done anything prior to this large public outrage and this narrative that was born that this kind of information had influenced the election. Again, I think that the studies will show, and have shown so far, that this kind of information doesn’t necessarily change votes. I would be open to a discussion about whether or not it had anything to do with turnout, but as far as changing someone’s actual vote one way or the other, I think the effect is very insignificant.

Veronica Belmont: Andrew suggests that we’re disproportionately panicking about the impact of misinformation. Sheera, how do you react to this?

Sheera: I think that there is a great deal of panic specifically about Russia and Russia’s role. I think that it’s an easy boogeyman to point to. It does concern me as a journalist when people’s default reaction is, ‘Oh, well, Russia’s coming for us.’ My answer to them is: not exactly. Russia has been doing this a long time and in a lot of countries, more successfully than in the United States. You know what? The US has done this to some extent. There are a number of other countries I can list that have done it. We should be worried about misinformation as a society, but maybe not for the reasons that we’re currently talking about.

Veronica Belmont: Renee, to me, this suggests that the real challenge is filter bubbles or echo chambers. How likely is it that misinformation spreads less because of the content itself and more because of the way we’ve built our information platforms or our social media platforms to spread any and all information?

Renee: To tie your prior question to this, when we evaluate impact there’s a critical piece, which is absorption: what metrics can we use to gauge how much of an impact it’s having? This is where I think filter bubbles are, in my opinion, a problem. Facebook’s been actively promoting its groups feature. It thinks of it as a way to build communities and encourage people to find each other around common interests. These are great ways for people to connect, but every now and then, unfortunately, you start to see what we saw in Myanmar and other places, where the recommendation engine begins to suggest things that are actually highly toxic and corrosive, or blatantly false. We were talking backstage a little bit about the YouTube recommendation engine, where you start out wondering if 9/11 was an inside job and wind up with the lizard people. It’s a conspiracy correlation matrix that they’ve built, and they just keep serving, and serving, and serving it. That’s because the recommendation engine is phenomenally effective and, at the same time, a little bit dumb. It doesn’t know, perhaps, that ethically we might not want to be serving anti-government conspiracy theories to people who are prone to be receptive to them, that sort of thing.

Veronica Belmont: I am curious about the audience here. With a show of applause, who in the audience has ever shared a story, and be honest, without fact checking the source? … Or, even better, how about sharing a Facebook news article without reading much past the headline? … So yeah, I mean, I’m willing to bet we’ve all done that at least once or twice, especially in the past few years. So how much responsibility lies with the consumer? Jestin, is it a buyer-beware world out there for online information?

Jestin: I think it should be known by now that not everything on the internet is true, right?

Veronica Belmont: No.

Jestin: I think that’s sort of the thing we’ve gone through for a long time now. My wife and I talk about this often. She brings it up to me in this way: you could go to McDonald’s every single day and have every single meal there. We all know what that will do to your body and your health, and content online is kind of similar. You can go to the garbage sites. You can read all the garbage stuff that you would like all day long. But you have to take some responsibility if you want to maintain good personal health. Fact checkers right now seem extremely ineffectual. By the time a fact check is actually completed, the story being fact checked has likely gone far more viral than the fact check ever will.

Veronica Belmont: Yeah, so Sheera, that feels really true to me, that the fact checkers just can’t keep up with the rate of the news. I know we’ve seen some of that happening in real time during debates, and things like that. Does that ring true for you as well?

Sheera: Yeah, definitely. I’ve been involved in those fact checks. I’ve been in the newsroom fact checking a debate as it’s happening in real time, most recently during the 2016 campaign. It’s hard, man. By the time you’ve written those two paragraphs and you’ve checked your facts to make sure … The worst thing, the worst thing really, is to have to fact check your fact check. By the time you’ve gotten it up, you’re lucky if five or ten percent read your fact check, as opposed to the sensational headline that was spread.

Renee: We need to be more responsible consumers of information and it’s only going to get harder. I don’t know if people here have recently been reading about how video can be manipulated, how audio can be-

Veronica Belmont: It’s terrifying.

Renee: Right.

Veronica Belmont: Yeah.

Renee: We’re just on the precipice of all this. What do you do when you can’t believe what you see with your own eyes? That’s the world that I think about my daughter growing up in and how am I going to teach her to be a responsible consumer of news?

Veronica Belmont: Yeah. You said something that really spoke to me as well, which is the sensational headlines, because even the same news article or the same source can be so twisted just by changing the headline and cherry picking the information from the same article. I’m sure, Renee, you’ve also experienced this kind of phenomenon before, where they literally take the same facts and make it about something completely different.

Renee: In the real world, we go through life with an expectation of trust. We don’t constantly assume that the person speaking to us is lying, because you can’t make it through life if you assume that every interaction you have is false. If, when you’re in Starbucks, you’re like, “Well, is it really one shot or is it two shots?” I mean, these are the kinds of things where it sounds trite, but society functions in part because of that expectation of truth, and I feel like we’re increasing the cognitive load to the point where the assumption is that every single thing you see, everything you read is fake, every video you see is fake, the person you’re speaking to is fake. I remember AOL, and my parents telling me, “Everyone online is lying to you. They’re all child molesters, and thieves, and you can’t tell them your name-

Veronica Belmont: I learned that they were just dogs-

Renee: This was like … Right.

Veronica Belmont: On the other side of the computer. Yeah, that’s my understanding.

Renee: Dogs, criminals, God knows what, you don’t trust anything. Wikipedia, God knows you don’t cite Wikipedia. Now we have YouTube deciding that it’s going to use Wikipedia articles as a fact check, right? We’re in this weird place where the platforms don’t want to take responsibility, and they continue to use the line, “We don’t want to be arbiters of truth.” But I think this is not a matter of truth so much as a matter of information integrity, and for some reason we are abdicating the idea that it is even possible to ascertain what is true, with YouTube saying, “Well, is the earth flat? I don’t know. I mean, maybe. Who are we to tell them what to think?” This is where I actually do feel like there needs to be a pushback against this, just because you can’t have a society that functions when the base assumption is that everything is fraud.

Veronica Belmont: Sure. You’ve reported recently on this really fascinating situation in Sri Lanka, where social media platforms, Facebook in particular, were actually banned because authorities felt that they were spreading too many lies. How effective is that kind of crackdown?

Sheera: It’s not. I mean, it’s effective in the short-term, sure. Keep people off of Facebook and Twitter for a couple days because you’re in the midst of an election, or whatever, but Facebook has already saturated the market in most of the world, and where they haven’t saturated it already, they’re trying to very, very quickly. Maybe five or 10 years ago we could have had a discussion on whether it was smart to let Facebook grow at an unchecked rate the way it has, but we’re past that. There are many parts of the world where Facebook is the internet. People rely on it for business. They rely on it to connect to their friends and family. We’re very privileged here in the West. I can go to my phone right now and use Signal, WhatsApp, you know, I can name 10 other mediums to communicate with other people, but in much of the rest of the world it’s just Facebook. We’ve gotten past the point of saying, “Let’s ban Facebook,” and we need to get to a point where, as I said before, it has to be multifaceted. It’s not just educating people; as Renee said, these platforms themselves need to figure out what their responsibility is. As a society, I hope at least that we’ll continue to press them for answers.

Veronica Belmont: Jestin, some governments, Canada’s for example, have threatened social media platforms: fix the problem or we’ll do it for you. How do you feel about threats of regulation like that?

Jestin: That would make me nervous, absolutely. I certainly wouldn’t want any administration to be in charge of that. I’m not sure that we all trust the other side enough to make a good-faith attempt at any sort of regulation that would matter.

Veronica Belmont: Renee, what do you think? Oh yeah, go ahead Sheera.

Sheera: I wanted to push back for a second, because Europe has regulation coming down the pipeline right now. In a couple of months’ time, the GDPR will go into effect, and a lot of tech companies, especially Facebook, are having to prepare for that. We’ve seen them begin to introduce things like the right to privacy, which in and of itself is like, “Sure, right to privacy. That sounds super vague,” but actually it’s having real consequences on what Facebook can and can’t do. I think we’re beginning to see what regulation might look like.

Jestin: I think that’s fine, but that doesn’t address the monkey in the room here, right?

Sheera: The fake news?

Jestin: Yeah, privacy is certainly an issue, but with regard to the information on the platform, that’s where I’m not sure whether regulation will come into play.

Sheera: One of the things that’s been interesting to me reporting this is that when they started to talk about privacy, they started to talk about what kind of data Facebook can collect on you. The less Facebook knows about you, the less effective its algorithm is at targeting you specifically. If Facebook is not allowed to know that I’m a new mom, that maybe I had a baby a month ago and so maybe I’m having postpartum depression, how do they market happy pills to me? How do they market baby products to me? By giving me some semblance of privacy over my own data, you’re potentially limiting their ability to find me as an individual. I don’t know if that’s good or bad; I’m just saying that’s one interesting model that we’re seeing emerge.

Renee: Yeah, because I think a lot of the conversation we have here is that fake news has always existed, but the vector for reaching people is different, in the sense that these are virality engines with remarkable targeting capabilities. The platforms have a First Amendment right to moderate content on their platforms; unfortunately, a lot of the narrative around that has been that moderation is censorship. The only difference between YouTube taking down the Parkland conspiracy theory videos about crisis actors and YouTube not doing the same thing with Sandy Hook is that people are pissed off about it now. We’ve seen this for a long time. This is not new. I think there is a very plausible case to say that the election was what led to a lot of soul searching by the engineers at the companies. You saw that on election night, Twitter engineers saying, “What did we build? What did we do?” I think the great reckoning around what we want the platforms to do actually did come out of the fact that the election really just blew people away.

Veronica Belmont: Jestin, would you say that we’re trying to basically kill a cockroach? Is this problem unkillable?

Jestin: I think that it is a hard problem to address, absolutely. If you’re not able to take the financial incentives out of this sort of thing, you’ll never stop it.

Veronica Belmont: Yeah. Sheera, you touched on this a little bit earlier, where all this is headed technologically. How do we know what’s real anymore? Is a functional democracy at this point even possible?

Sheera: Everyone should definitely trust the New York Times. Yeah. Look, there are no easy answers here. I don’t think we’re past the point of no return. I just think that all this pressure is good. I think that people coming here and having these discussions is good. I think Congress holding hearings is good. We’re finally having a conversation about something that we should’ve probably been talking about a long time ago. Looking forward, I think it’s really important that we’re having this conversation ahead of the 2018 midterms, because if we thought that disinformation, fake news, was bad in 2016, I think what’s going to happen in 2018 is going to be so much more interesting. Interesting from a journalistic perspective, probably terrible from a democratic perspective. It’s not just going to be one-sided, “Here’s a sensational story.” It’s going to be three or four people on your social media platform, Facebook or Twitter, having a discussion that looks real but will all be manipulated, all be created, and they’ll argue both sides, and they’ll get on each other’s nerves, and they’ll inflame all the emotions that they can possibly inflame, and the goal will be that you at home watching will either feel apathy, or you’ll lose your temper and chip in yourself. I mean, it’s going to be a real nuanced campaign to influence our democratic institutions.

Veronica Belmont: Well, thank you to all three of our guests for this incredible conversation. Before we wrap up, though, we have one last order of business to get to, your two truths and your lie biographies. All right, so Renee, spill it.

Renee: I forgot what I said. I think I said …

Sheera: I know which one.

Renee: I don’t have three kids, I have two. Is that what I said? Yeah.

Veronica Belmont: Okay. That’s fair. Jestin?

Jestin: I despise jogging and long walks on the beach.

Veronica Belmont: I was like … In my mind I was like, “Oh, I hate jogging.” Who even likes it anyway? Perfect. Sheera?

Sheera: I cannot drive a stick shift and I would never-

Veronica Belmont: Really?

Sheera: Drive a stick shift in San Francisco. That’s a horrible idea.

Veronica Belmont: I had so much faith in that one. I was really impressed.

Jestin: She really has the Gaddafi earrings.

Sheera: I really have a Gaddafi necklace.

Jestin: Or necklace.

Veronica Belmont: That’s awesome.

Sheera: Yeah.

Veronica Belmont: I mean, I think that’s awesome, right? That’s pretty awesome.

Sheera: It’s a broken thing. I found it on the floor, it’s not a real … Yeah, sorry. I didn’t steal a necklace from Gaddafi’s palace.

Veronica Belmont: You’re just making that … The lie of that truth was better; I thought that was pretty cool. Some applause for our guests please. Thank you. This episode marks the end of our second season, but we’ll be back a little later this year with plenty more storytelling for you. In fact, you can help us make the show. We want to hear from you. What kind of burning questions have you had about how the internet works, or doesn’t work? The web is a deeply woven part of our lives. What do you need to understand better about how we can build a healthy internet for all? Head to the IRL website and let us know: IRLPodcast.org. And thanks to all of you, our audience, for coming to IRL’s first live podcast taping. I’m so glad you guys all came out. I want to leave you with one last thought. Though it’s not always easy to separate fact from fiction in social media, it’s not impossible to stop the spread of misinformation, at least from your own account. Check out the show notes for more on what you can do to promote information integrity online.

IRL is an original podcast from Mozilla, the not-for-profit behind the Firefox browser. I’m Veronica Belmont and I’ll see you online until we catch up again, IRL.

You guys got to hear a mess up, wasn’t that cool? It was like really real. It’s cool, we recorded another version earlier. It’s fine.