Wednesday, February 1, 2017

UC Berkeley hosts panel on fake news









- Hello, everybody. I think we should get the show on the road. I want to welcome you, give you some information, and admonish you to please, if you have smartphones, turn off your ringer. Even if you are going to be using Twitter or Facebook Live during the event, which is okay, it's a public event, our Twitter handle is @FakeNewsPanel, that's capital F, capital N, capital P, fake news panel. This event is being recorded by C-SPAN and by others, and it is also being live cast. There'll be a video of it, and there will also be a podcast that will be part of the brand new Graduate School of Journalism podcast series, which is called On Mic, that's M-I-C, mic, and that is available on iTunes or wherever you get your podcasts. I'd like to start by saying that this event was a collaboration between the Office of Public Affairs, the library, and the Graduate School of Journalism. I'm Deirdre English
from the Graduate School of Journalism, and I worked with Kathleen McClay and Cody Hennessy to invite our panelists, and invite all of you. We really wanna thank Marlena Telvick for helping to invite the press here, a lot of members of the media are here, and Julie Hirano, who just does everything with the publicity and the logistics and all the hard work of getting us to all come together as we have. Let me welcome the panelists now. We have a really distinguished panel. Right here to my left, the first panelist, is Laura Sydell. She is a well-known voice on National Public Radio, its Digital Culture Correspondent, and I hope many of you heard, or will listen to, her amazing story on Disinfomedia, one of the stories that really brought this issue alive in my mind. She tracked down a company with many fake news sites, and that aired for the first time last November and has been listened to many times since. Adam Mosseri, we're very happy to have somebody here who is very high up, is the Vice President of News Feed at Facebook. 1.8 billion people are using Facebook now, and Adam manages the team responsible for delivering relevant content, news content, to all those Facebook users. And recently, Facebook has taken some important steps to address the problem of fake news on their platform. And we're delighted to have his presence. We have Craig Newmark with us. Craig is a web pioneer, the founder of Craigslist. He's a speaker and a philanthropist who often introduces himself, modestly, as a news consumer, and can also claim to be one of the internet's best-known nerds.
(gentle laughter).
But all of this comes right out of his own self-description. He recently generously donated a million dollars to the Poynter Institute in order to promote verification, fact-checking, and accountability in journalism. So, as much as anyone I know, Craig has taken steps to address the problem. And we're joined by two members of the UC Berkeley faculty as well. Catherine Crump is a law school professor, and she's the co-director of Berkeley Law's Samuelson Law, Technology & Public Policy Clinic. She specializes in First and Fourth Amendment and media issues, and all about censorship and what you can and cannot do. And Jeffrey MacKie-Mason is UC Berkeley's university librarian, and he is a professor at the School of Information. His scholarly work focuses on the economics of the internet, online behavior, and digital information creation and distribution. Finally, our moderator is Dean Ed Wasserman. He's a professor and he's the dean of the Graduate School of Journalism, and his specialty is media ethics. He blogs at a perhaps very appropriately titled blog called Unsocial Media (laughs), and you can find that at ewasserman.com. And I wanna thank you, the audience, for your interest in this hot topic. With that, Ed Wasserman. - Thank you, Deirdre.
And thank you all for coming out tonight on this chilly evening. (applause).
I want to also welcome a number of tech reporters in the audience from Reuters, the New York Times, Mother Jones magazine, The Guardian, KQED, and The Daily Californian. We have a strong interdisciplinary panel here tonight, and thank you all for participating. The format: we have roughly an hour and a half to play with, and I figured we'd divide it approximately in half. We'll spend 45 minutes with the discussion confined to the panel. I'm hoping for a lively discussion, not necessarily an orderly one, so you're welcome to talk to each other, interrupt each other, to move the conversation along. I'll be tossing out questions and goading you when I'm not happy with your answers. Then, after 45 minutes or so, we'll open the floor to questions. Opening the floor, as Neal Conan observed in a talk here not long ago, is always a troubling concept in seismically active California. Let me just kick this off with an opening thought, cuz I was thinking back to when I started getting interested in the media, and this was late '60s, early '70s, in the shadow of McLuhan and a great deal of very excited and very much utopianist talk about the world
of democratized discourse that the media would enable. And if you had told me then that 40, 50 years hence I'd have this device that would give me access to bigger audiences than the widest-circulating newspaper on Earth had, and would give me access to more information than the best-sourced reporter on Earth had, I would say, "Well, that sounds like paradise. It sounds like that would be what a democratized communications sphere looks like, when people are communicatively enabled, and we would have then exceeded paradise." And instead, here we are, and we're finding that there is a dark underside to that. We're finding when we look around that people are, in fact, laboring; more people believe things that are not true than perhaps ever before. And more people are acting on beliefs they either misunderstand, or understand and are untrue, than ever before. And we find that this wondrous world of technologically-enabled communications paradise has now turned around and is biting itself in the backside. So lemme start by asking, and I guess I would end with, we're finding more people than ever are enthralled by the shadows on the cave. So what do we do?
Lemme start with this question; I'm gonna invite Laura Sydell to weigh in on it to get us started. Fake news now has become a big, messy topic. There's not even really agreement as to what it is. In fact, it's being brandished as an all-purpose slogan to describe everything from errors to deliberate falsehoods. It no longer is agreed upon as identifying a unitary phenomenon, so what are we talking about, and what conclusions can we draw about the way the term is now being fought over, and the elastic way it's being applied? So Laura, why don't you start us off? - Well, I guess I wanna say there's a difference of intent. And there is a big difference. People who are in the fake news business, they know what they're doing, they know it's fake, as opposed to when a journalist who's trying to get it right makes a mistake.
So I would argue, for example, some people have said, "Well, Judith Miller's reporting on the weapons of mass destruction was fake news." It wasn't fake news. She made a horrible, I mean horrible, mistake. But the guy that Deirdre mentioned, that I found, this is real fake news, and it's very profitable. I mean, we decided we would take one story, this was in a meeting and I got the assignment, to take one story and trace it all the way back. One fake news story that got a lot of attention, and in this case it was the story of an FBI agent dead in an apparent murder-suicide, and supposedly this FBI agent had been investigating Hillary Clinton's emails. And so the implication was that somehow this was part of, if you know something about the alt-right conspiracy theories about the Clintons, they murder people off. And this appeared on a site called the Denver Guardian, which sounded like a legitimate site. It was not.
So trying to find where this came from was the idea. Who was it that was behind this? It was, initially, not that easy, because usually you can go to GoDaddy and you can discover that there's a website, and that website, somebody owns the domain name. In this case it was anonymous. I enlisted a very smart techie to help me basically look at the internet a bit like a paleontologist, just looking for fossils. And he was able to eventually get me a name, we got an address, and I decided the best thing to do was just to go knock on his door. It turned out he was in Huntington Beach, California, and I had no idea what we were gonna find. I took a male intern with me, cuz I was a little nervous about this, but we went to his door and I held the story in my hand and there he was, his name is Jestin Coler. Knocked on the door, and I said, "Did you write this? I'm from NPR, we wanna know if you wrote this," and he said, "No." I said, "Do you own the Denver Guar—" And he closed the door in our faces, and we left him an email. Turns out he's an NPR fan. (laughter)
Seriously. He gets back to us and says, "Alright, I'll talk to you. Yes, I know about it, and yes, I do own the Denver Guardian website." And he absolutely knew he was doing fake news. In his case, he was a Hillary Clinton supporter too. He said he started this whole thing as kind of a joke. He wanted to show how crazy the alt-right was, and how easy it was to spread fake news in the alt-right echo chamber. However, as I did point out to him, it was lucrative. In fact, he told me he was making between 10,000 and 30,000 dollars a month. And he had a whole little empire. It wasn't just this, he had a whole bunch of other websites, too, where he was putting this stuff out there. But it was absolutely intentional. Everything, he said: "Yes, everything about that Denver Guardian story was totally false and we knew it was totally false." That is fake news.
And I really do think there is a big difference between a reporter making a mistake and what this gentleman was doing. I guess lastly on this topic, I would say I feel like one of the things that's going on is there's a sense of wanting to make everybody confused. And I think that works to some people's advantage, to have the world be confusing. And I have heard people talk about Steve Bannon's interest in certain far-right groups in Europe and Russia who actually do use this tactic. It is a political tactic. And so, I'm not saying he is, but I think it's something to think about. What is fake news, what is it about, what is its intent? And I think it comes down to that. - I wanna come back to how you make money with fake news, but first, you have identified a pure case of deliberate fabrication. - Yes.
- Which everybody can agree is fake. But the term is being applied far more broadly, to capture a sort of underlying, simmering dissatisfaction with the quality of information and the trustworthiness of information people are getting. And I'm wondering how this is now playing into the political arena in somewhat unforeseen ways, and I wonder what sense we make of that. Jeffrey, you have thoughts? - Well, I don't disagree with what Deborah said, but I do think that for a lot of purposes, when we're talking about information distribution, and people wanting to get information out there as providers of it, and people wanting to take information in as consumers, it's often useful just to think about quality as being the dimension. And there's high-quality news, there's low-quality news, or information. It's a spectrum, of course. For some purposes, I sometimes think of there being negative-quality news. There's certain cases where people are intentionally manipulating, intentionally, as you say. But even there, there's a little bit more nuance to it. I think in the case you
just described, he said, A, it was a lark, and B, he was making money on it. It doesn't sound like he was trying to actually persuade anybody to change their behavior; he wasn't trying to manipulate people. But sometimes people are trying to manipulate, and trying to use lies, essentially fraud, to manipulate. So there's a malevolent intent that can matter. But I think, first, we think of it, especially if you're a platform provider, for instance. As a platform provider, you care about the quality of the news or the information that's being distributed through your platform, and you want more good quality, because you want people to come to your platform, and you want less bad quality. That spectrum is very hard to draw any lines on. And sometimes platform providers want different things than their consumers. We might say the platform provider is in it for the money, they just want eyeballs, and as long as they can attract eyeballs they're selling those eyeballs to advertisers, so they may care about a different aspect of quality. On the other hand, they also want repeat eyeballs. They care about reputation, and if they keep delivering bad information, they're not gonna get repeat eyeballs. So to think about how to design systems and how to understand behavior in this business, I think first, I like to think about it just as a spectrum of quality, with certain special cases where the problem is not just that it's low-quality, but it may actually have a malicious or negative quality. - But you're not suggesting it's quality that's driving the traffic? - Well, to some extent. I mean, people want information for different reasons. Some people want information just for entertainment, in which case they may want things that are actually fake; they find it more
amusing and entertaining. So it's not a single dimension, but I think in repeated use there is a correlation, certainly, between quality and what's driving the traffic, that people are going to recognize that certain sources are more reliable than others. And the content provider, if it wants to develop a significant business and keep that going, is going to care about that quality, yeah. - Can I just interject one thing, if I may, which is that part of the problem is Facebook, because it is an environment in which you are looking at all kinds of things that your friends share. And so it's not the same as going to the New York Times website versus going to Breitbart. You're in an environment that feels comfortable and safe. And I didn't mean that - It's okay. - just as a total criticism, but that's part of the issue, is that you're not now going to all these separate, credible publications. - I'm going to stand up for the platforms, and I'm not one of them in any sense. I'm just acting as a news consumer, and I just would like news I can trust. These are really tough problems. One part of it is trolling, harassment. I've been trying to deal with that on a professional basis for over 20 years.
All the platforms are taking steps to address this, it's just really tough. For example, Facebook is working with the International Fact-Checking Network and is trying to work with people who are signatories to that agreement, like PolitiFact and Snopes. Google is working with The Trust Project, which is about means by which news organizations can say, "Hey, here's what's trustworthy behavior." And to oversimplify that, it's about having a code of ethics and being serious about it. I've spoken with Twitter directly about the problem of dealing with trolling and harassment. These are really tough problems, the platforms are standing up to them, and hopefully in the really near future I'll be able to announce with Wikipedia new steps and serious funding about dealing with harassment and trolling. So the platforms are standing up, but these are really old, really tough problems to deal with. Last week someone reminded me about a fake news attack from Octavian, who faked a will from Mark Antony because he wanted to raise military funding and support to go after Antony and Cleopatra. This is not new stuff, it is really tough, and they are actually serious about it and doing something. You also wanna be quiet about how you talk about it, cuz when you talk about techniques, the bad guys are listening to what you're saying. You'll see it pop up in black hat discussion boards, so you really don't wanna leak stuff before you're ready to do something. - I take your point, no, it's not new, and I wanna hear from Facebook, but what has changed? In 2004 we had the Swift Boat versus the Bush National Guard story. Both were stories that had some factual basis, they were important, they were fiercely disputed, the veracity was disputed, and yet each side accused the other of proffering phony, fake news. What has changed now? What is different in the news environment now from 2004? - Well, I think some parts of this are new and some parts of it are old. The problem of gullible people is timeless.
There have been gullible people for a long time; there will always be gullible people. Anyone who has email and has received a forward from a relative understands this, right? It's hard to get those things to stop. But I agree with Laura. I think one of the things that's new here are the platforms and the ease with which someone can create a news story which, although it may sound fantastical to many of us, appeals to people's, a Trump supporter may be inclined to believe things that enhance a particular narrative, and you can easily create something that enhances that narrative, which then gets propagated. And I think the speed with which that can happen is something that's new, and we don't have the same gateways controlling the media that we traditionally had. - So Adam, you've been mentioned. How does it look from Facebook's side? - Two different things. One, on what's changed, I think the nature of how people consume information is continuing to change. In news, specifically, you're seeing more and more publishers; there's less and less barriers to entry. The cost of distribution is going closer and closer to zero. And there's more and more competition, too, and it's anybody. The guy, where was he? Somewhere in Southern California? - Yeah, he was outside L.A. - Yeah, so you can do that in a way that was harder 12 years ago and much harder 12 years before that. And that's continuing to change. But I do think, in general, it's important to separate issues, cuz there are a bunch of different issues. Fake news is an issue. I think what we're really talking about here, confirmation bias, is another issue. Hateful speech, which we almost touched on a second ago, is another issue.
And so, in terms of how we think about things, at a high level we're trying to nurture an ecosystem, so that means to create value for people, but also to create value for publishers, so that it can be symbiotic in some way. On the people's side, we try to connect people with stuff that they find interesting, which is sort of our definition of quality. And on the publisher's side, we try to create tools, and that's the Facebook Journalism Project recently, to create value there. But in pursuing both, there's really two sides. There's trying to nurture the good, right? Helping people find stuff that they find meaningful by ranking things better, better design, helping people connect with sources, this guy hopefully was following NPR on Facebook, but also to reduce the negative. So fake news is one type of negative content. There's clickbait, there's nudity, there's hate speech, bullying, violent content, et cetera. And so we try to divide things, or we think about things, in those two different ways, and then we pursue those problems very differently, because the nature of how you make progress is very different.
- So lemme ask a crude question: Does Facebook make money from what we would consider fake news? - No. I think there's three things to be concerned about from Facebook's perspective around the financial side of fake news. I actually think it's super important, because from what we can tell in our research, a lot of fake news publishers are financially motivated. They're spammers; they actually sometimes switch from one party to another. So one thing that we worry about, but doesn't seem to be a real issue, is people don't use Facebook to advertise fake news very much. It's just not an effective advertising platform for fake news. The cost of advertising's very different, et cetera. Two, we also wanna make sure that they don't use our ad networks to sell ads on their sites. That also doesn't happen very much, cuz we have strict policies and people in place. We could actually manually approve advertisements, which is what we do.
The thing where I think the financial value gets shifted to the fake news publishers using Facebook, and this is something we need to further reduce, is getting free distribution. So posting something that's crazy, getting a lot of clicks on it, that takes a bunch of people to a website that's, I mean, you guys have probably seen this before, maybe it's a paragraph and then 80 or 90% ads. We think of those as ad farms. And that's not financially benefiting Facebook, but it is shifting financial value to fake news publishers, which is a bad thing. So we need to do what we can to reduce the distribution that fake news publishers get as close as we can to zero. And that's what we were starting to try to do in December, and we have more work to do. - Can I just add something on the financial front? This was an interesting thing that Jestin Coler told me, which was, one of his sites was caught by Google and they stopped running ads on his site, but the minute that happened, his inbox was filled with literally hundreds of offers from other places that would run ads on his sites. So, unfortunately, the opportunity to run ads on your site is vast, it's not just the big companies. - You can tell it's profitable because of the secondary effects. For example, there's a group, I think it's Sleeping Giants; they've identified what they think is a fake news site, and every time they see an advertiser pop up on it, they contact the advertiser asking them to stop advertising there, and they claim it's working. A lot depends on how you define what fake news or a fake news site is, but that seems to be working. Plus the ad networks, the bigger ones like Google's, in particular, are being asked to stop allowing advertising to be placed on fake news sites. There's a new ad network, an aggregator that's focused on avoiding this thing, that Ken Doctor just reported on. I'm giving Ken credit because I forget the name of the network, and so things are happening which are improving things. I hate to be so critical as to name ad networks by name, but I'm really tired of seeing ads from Taboola and Outbrain, and if those stopped appearing in my reading on my phone I'd be pretty happy.
- So help me. Somebody on the panel help me with this. I wanna understand: if I'm an enterprising young person in Macedonia, and I wanna make a bundle, so I come up with, I find some trending terms from Google, some things that are clearly of interest to vast numbers of people, and so I run a few stories, and one has Kanye West and Hillary Clinton in possibly a love triangle, somebody else I can't think of at the moment, and I post this story, and it's a complete fabrication, nicely done, though, I get pictures, I can do that too, and next thing you know I have 500,000 people streaming there through somewhere. And at that point, I have a serious footprint. Who's making money from that? Is this Google, Google AdSense sending this? Is there some automated mechanism? It would be helpful, I think, if we all got to the same point in understanding the mechanics of how illicit gains are made on the internet thanks to fake news.
- I can take a pass at this. If you are trying to make money off of fake news, you actually probably won't start a website, you'll start many websites. You'll create many pages on Facebook, you'll create many accounts on Twitter, et cetera. You do this to diversify your risk, cuz if you get shut down in one place, you don't get shut down everywhere else. You then try and create essentially an engine that churns out a lot of content. It's usually very short, it's usually very sensational, often it's deliberately fake and false. You actually can sometimes, there are markets for this, you can actually go and pay $20 for a paragraph. And then you use an ad network, which is basically a middleman between you and advertisers, and you basically then use that ad network to get ads on your webpage. Usually very low (mumbles), usually very low-cost ads, we're not talking about a brand advertisement for Coca-Cola— - If you go to a page and there's a story and the ads are for special face cream that Ellen is using or, I dunno, just weird - Weird, wild stuff. It's actually low-quality advertising. And then what you do is you just keep creating content. You do that in a bunch of different ways, and you try to build up followings, and get clicks any way you can, on any social media platform, or through email chains, which we don't talk about a lot, and other things. What you need to do is, on average, you need to make more money per visit than it costs you to create content for all the visits you get for that piece of content. So if I paid you 20 bucks to write something crazy about Kanye and Hillary Clinton, or whatever it was, and that costs me $20, I need to make more than $20 from all the people who visit that piece of content. On average. It's just a machine.
And what you're always looking for, this is Craig's point before, is how to game all the platforms you can. So it's a somewhat adversarial relationship. It's spam, actually, is really what it is. And so, like any other spam, if you prevent one type of behavior, they usually come up with another type of behavior. So one thing that could happen, theoretically, is if we managed, all the platforms, managed to completely reduce fake news to zero, it's not like the incentives would go away. They'd just find new ways of making money. That might not be fake news, that might be some other form of problematic content. So it's an ongoing, never-ending relationship. Is that— - It's a help. It sounds as if what you're describing are elements that are fundamental to the way the internet pays its way. These are not, the fake news purveyors have identified things that are not just incidental, they are integral to the way the internet is monetized. If I could just read this quote from Evgeny Morozov in The Guardian: "The problem is not fake news, but the speed and ease of its dissemination, and it exists primarily because today's digital capitalism makes it extremely profitable, look at Google and Facebook," sorry, "to produce and circulate false but click-worthy narratives." - I'd say it's a bit more general. The cost of distributing information has gone almost to zero. And by and large that's a good thing, right?
You can learn about, I have a kid, he's about to turn one, he was colicky. I actually spent a lot of time on the internet figuring out how you soothe a colicky baby, which, by the way, is not possible. So there's all sorts of good— - How bad was the advice you found? - Well, eventually I got to some poor, I think it was a father, who was just like, "Look, you're just gonna have to deal with this. By the way, it gets way better at three months, you'll be fine, so just hang on," which was the best advice I got. So in general I think it's good that information is easier to access. But there are also negative repercussions, to your sort of introduction. So the question is, how do we address the negative without reducing the positive effects, which I think are also very real. - I think this is one of the fundamental things that's different now. We've talked about what's different. Fake news, misinformation, disinformation, that's been around forever, and it always will be. What has changed, I think, is precisely the fact that the cost of distribution has gone to zero.
Basically, silicon and sand now cost us nothing, and that's what we make CPUs and fiber optics out of. And so we can distribute information, and what that's enabled is that anybody can be a publisher. The world now is, anybody can have a platform, and be a publisher, and distribute their information to anybody in the world at essentially zero cost. Not quite zero, but very low cost, and that's created a number of things. Let's think about the big platforms. I don't actually think the small fake-news websites are that big a problem, because they don't actually make that much money, and they're probably not having that much influence. It's when they start to get distribution through the bigger, more reputable networks, when they start to take advantage of Facebook and others, Twitter and others, to get their distribution, that's when we start to be concerned, I think, much more. Whether you call it Web 2.0, social media, user-contributed content, those platforms that depend on the users, as Laura said, bringing the content to them, that's really different than the way publishing used to be done. And it's different in an important way, because the content platform providers now want to actually lower the barriers for people to bring content to them. They want to make it as easy as possible for people to publish. They're basically providing open-publishing platforms where you can publish anything you want for free, and you want to attract that content. If you didn't have that coming in, you wouldn't have anything. At the same time, you want to keep out the manipulations, the spam, the disinformation, but telling the difference is very hard. It's very costly. That's why I say quality is so important. You need human intervention, and when you've got 1.8 billion people putting content on the platform, figuring out how to screen out the bad content is very difficult.
And that's what's changed. - It seems to me, you have, what I was trying to say about Facebook being the problem, it's more that it's got an environment that's kinda squishy and nice, and you got the baby pictures and the dog pictures, and then somebody posts a fake news story amidst this very friendly, warm environment. And I think people's guard is almost down in an environment like that. Because it feels friendly, right? You got your friends there, you got your family there. And I don't know what you do, what Facebook can possibly do, when it is meant to be a platform where you can share things with your friends. And if you happen to be somebody who has bad information... it can easily spread like wildfire. - I'm focusing right now less on fighting fake news, and more on supporting trustworthy journalism. There are trustworthy news sites out there which do a good job.
There's ProPublica. Mother Jones is actually much better than people know, and even more centrist than people know. Consumer Reports is really good, and I'll disclose that I'm on that board. So on the one hand, you do what you can to support trustworthy journalism; on the other hand, there are pragmatic things you can do to strike at fake news. Again, the Sleeping Giants approach is one approach to fighting fake news, depriving fake news operators of advertising dollars. Another thing you can do, in other ways, is, frankly, cutting the cord with respect to cable TV. There are fake news networks which rely not only on advertising, but on cable franchise fees. And if cable franchise fees, which sometimes run into the billions of dollars, if they don't have access to them anymore, then that deprives them of a big source of revenue, so that fake news is no longer as profitable as it used to be. In the process, we need to help reporters and news organizations provide trustworthy news. That's part of my relatively new obsession about helping protect reporters from harassment and cyber bullying. We also need to help trustworthy news organizations, the smaller ones, in the case of media lawsuits, and so people are beginning to float the idea of much more affordable media lawsuit insurance. It's not a very exciting topic intellectually, but if you're a reporter who's sued, or potentially sued, by bad actors, you really want affordable insurance. And I'll stop there, even though I can go on and on. - I'll add two real quick, just one to speak
directly to your question, and one that's related. One thing you can do, my feed is a lot of baby pictures, cuz I have a kid, I have a bunch of pictures of my kid, et cetera. But actually, how people use Facebook varies a lot from market to market, from community to community. So it's not always that. Either way, though, I think one thing platforms can do is provide more context about what people are reading. And fundamentally, I think that's part of what we're trying to do with the third-party fact-checking program, which is, let you know that Snopes disputed this, but there's more types of context that we can surface to help people make informed decisions about what to trust, what to share in the first place. So that's an area we're gonna continue to work on. Another, though, is to try to go further upstream, and try to prevent the quantity of fake news from entering the system in the first place. And I think this is where, and Craig just touched on this, disrupting the economic models is so important. If you can make it uneconomical, most of these spammers will go away, they'll do something else. And there's a bunch of different ways that I think platforms can make it uneconomical for fake news publishers. A lot of them use different tactics like domain spoofing, like abc.co instead of abc.com - That was actually one of Jestin Coler's sites. - Yeah, there you go. Or redirect cloaking, so it says one link, and it takes you to another, takes you to another, so those types of tactics, I think you can build policies around and automate. The other thing you can do, which we haven't done a lot of yet but it's an area I'm excited about, is take a look at the landing pages. If you go to a page, and it's actually just 90% ads, that's a sign that it's probably not a real publication. The Denver Guardian isn't actually a real publication, it was just a made-up website. So these are the types of areas where I think we've done some work, but we've got a lot more work to do, and I think the other platforms are looking at it in a similar way. - Lemme pivot off some of the things that you've been saying and let me suggest this: Facebook and these various sites that we're talking about and deploring, they can post stories, they can draw readers, but they can't set agendas, and they have been reliant on mainstream, actual established news media to essentially weaponize fake news, and to give it significance, and give it public importance. And I wonder if we could talk a little bit about what the media,
by media I mean the media that we at the Graduate School of Journalism train our students to take part in, the news media, what can they do, what are they doing wrong with respect to fake news, how should they be handling it differently from the way they are, and how can they avoid being unwitting accomplices in the pollution, if you like, contamination of public discourse?
- I think one problem I see in the media is that sometimes when something is floating around, they report on it, or they even give it attention, and I think that doesn't help. This is my opinion, that you actually start to give it credence when you report on it. So that would be one place. But I think part of the problem, too, is you have a public that wants to believe certain things. Did fake news sway the election? Or did the people who read those stories— - That sorta depends what you mean by fake news. - Yeah, well, I mean stories like the one that I tracked down, that fed into this narrative that's out there that the Clintons were responsible for the deaths of all these people, which is in fact a narrative that is floating around on the right in this country. And if you're inclined to believe that and you see this story, it just reaffirms what you believe. Or, if these stories weren't out there, would you no longer believe that? Were you gonna vote for Trump no matter what? And that's a question that I still have, y'know? What impact is this phenomenon actually having? - Yeah, I have to say I tend to think that our concern over this is overblown, and driven in part by the fact that these stories, like the ones you reported, are just so shocking. It's novel that someone can be so morally bankrupt and have such a power to influence the media narrative. I think the other thing that's worth paying attention to is what we really want companies like Facebook to be doing here.
I happen to like Facebook's relatively gentle approach to this, to try to label certain stories as potentially fake without taking them down, because I think if we'd been having a conversation about Facebook and free speech a year ago, the conversation we would have been having would be quite different. It would be about how much power do we want a company, a corporation, with an algorithm that's not public, to manipulate what the public can see. And now we're all sitting here in Berkeley with concern over conservative media fake stories potentially influencing the election in a way that we may not like. But is that the bigger concern? And then, how much do we really want corporations to use the tremendous power they do have over what people read, to manipulate that content because of what it says? - But haven't they been doing that forever? I mean, we've gone from basically a world where there were gatekeepers and that's the way it was, to one where there are no gatekeepers, and so I'm not really sure that, what's new is that we don't have gatekeepers, and I think we're trying to look at what is the impact of that? - I think part of the problem is that we spend so much time wishing or thinking we're still in that world where we have gatekeepers, and so we look to the gatekeepers for solutions. Ed, you asked, "What can the traditional media do?" So, for instance, a responsible journalistic organization like NPR, I think there's some things they can do, but I'd actually like to expand that to, for instance, what can the Graduate School of Journalism or Berkeley do?
We've been talking about the news providers and the industry that flows information. I think a lot of what we need to address is the consumers, the readers of information. We're going to have fake news always, and because of the zero cost of information distribution, it's not going to go away. We can always raise the costs of providing fake news and lower the benefits and moderate it to some extent, but there's always going to be disinformation out there, and manipulation, and infomercials, and so forth and so on. What we need to really do is educate folks much better to be better consumers of information. We have not been, in this country, addressing information literacy nearly as much as we need to, given the flood of information. We've gone from a world of scarce information controlled by largely responsible gatekeepers, whose reputations depended on it, to overly abundant information where everybody has to now be their own filter, has to be their own editor. And we haven't been teaching our students, at any level, our population, to be good self-editors. There's a Stanford study that probably most people are aware of, that came out a few weeks ago, that looked at high school students and found that, going into college, most of them, 85% of them, couldn't tell the difference between a genuine news story and a paid promotion. There are always going to be paid promotions out there, we can't make those go away. We have to make sure that citizens can distinguish between them and recognize what is paid content and what is actual journalistic reporting. - That's a good example. There's a whole industry out there that's devoted to obfuscating that distinction. So that kind of ignorance on the part of the reader is a produced outcome. - Speaking on behalf of news consumers, I wanna be able to pay for news that I can trust. What I'd like to see in a news aggregator is, let's say, a check box which says "only show me news from news organizations that have publicly committed to trustworthy behavior." That would be like an ethics code, a diversity policy,
and committed to a good accountability and corrections policy, cuz people do make mistakes no matter what happens. And then I want an organization of fact-checkers, maybe an international network like the one run by folks like the Poynter Institute and the American Press Institute. So I wanna be able to say, "only show me stuff from organizations that promise to do trustworthy news and that have a good record," and that's enough for me, speaking in a simple-minded way as a news consumer. That's what I wanna pay for. I already do pay for that, and I'm looking to pay a lot more for that in a number of different ways which I don't want to prematurely announce. I would do things like sponsoring more in the way of pledge drives, if only they would use my favorite theme, but the idea is that I am putting my money where my mouth is. A lot of other media-impact funders are looking to do this in conjunction with the API code, where they're looking into the ethics of funding non-profit journalism. That's actually a thing, and it's a very recent thing, weeks ago. But the idea is that I do think people are willing to pay for trustworthy news, and frankly there's a lot of people who are willing to put their money where their mouth is in a big way, and that's supporting groups like ProPublica and NPR. - But some people think Breitbart is trustworthy news, or Fox is trustworthy news. People have different ideas. And other people think NPR is trustworthy news. - Then let's see who will put their money where their mouth is. On the other hand, again, there's the folks, including Sleeping Giants, who are telling advertisers, "Hey, do you wanna be associated with untrustworthy news?" Let's see how that works out, and again, let's see how well a pledge drive works out, particularly if KQED will allow me to use my favorite theme. - So Craig, the idea is— - Can you hum it?
- The idea is to threaten advertisers with reputational harm— - Doesn't anyone wanna ask you what that joke is? (crowd chattering) - [Person In Crowd] What is your favorite theme? - Well, what I proposed to KQED and others for pledge drive, my theme would be "Please, dear God, make it stop." But no one'll go for it. Yet. - So the idea of Sleeping Giants is to threaten advertisers with some sort of reputational harm in exchange for advertising on these undesirable sites. - Nothing that traumatic or negative. They're just saying to news sites, "Hey, do you really wanna be associated with this?" And that's constructive and positive. And I'm Mr. Positive. - But it does sort of open the door for criticism and action taken against sites because you don't like what they're doing, you don't like the messages they're showing, not because those messages are corrupt, or flawed, or deserving of a lack of trust. People would go after Breitbart, but a good bit of what's on Breitbart is reported. It's reported out, it looks like journalism, tastes like journalism, it comes from a different ideological perspective, but that doesn't mean it is something that should be destroyed. It's part of the landscape of public discourse.
Yet, I can imagine a good many people, some of whom are here tonight, who would disagree with that, and would think that shutting down a site like Breitbart is an awfully good idea. So I worry about the ethical, I worry about using ethics as an instrument of political reprisal. - Ed, you seem to have this obsession with Breitbart, and you're the one who's brought it up. Me, I'd say boycotts of any sort, that's a two-edged sword. People do have to make ethical decisions about that. Me, after doing customer service on the net for over 20 years, I can assure folks that there are a lot more people of good will than there are with bad will out there. You'll forgive my faith, it's somewhat naive, but the deal is that I've been observing human behavior on the net for a long time, and I actually have a lot of confidence in humanity. But the thing is that we need to give people the tools that they need to act out of good will. - Understood.
Before we go to questions from the audience, lemme ask about the dangers of overreaction. And I think, Catherine, you brought up the possibility this is sort of overblown. I'm not sure I entirely agree, but I have some fear when I read that Google says it will take steps to keep its ads off, quote, "pages that misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the site." Now, if you go back and carefully kinda parse that sentence, that's a fairly broad mandate to basically perform capital punishment on sites that might expose Google to criticism, or might embarrass it in a corporate way. And I just worry a little bit that there might be such a broad brush, and the kind of public unhappiness with what they're seeing in fake news might motivate and might propel a reaction that goes considerably farther than any of us are comfortable with. - Boy, you're good at the sound bite. "Capital punishment" on websites. (Craig laughing)
- If you're on the third page of Google results, forget about it. If you're not on the first page, you're in trouble. - Well, now you're talking about where you're located in the results. But if you're talking about the advertising model, as was pointed out earlier, there are lots of advertisers out there and lots of advertising channels. Google is not the only one, and if they start excluding large swaths of content from their advertising, there are gonna be plenty of other people who swoop in, as long as people want to go to those sites. So as long as it's private individuals and private organizations exercising their right to decide what information they're going to value, that doesn't worry me. What worries me is if we start to say, "Oh, the government should come in and should decide what news is good for us." We've seen countries that operate that way; I don't want to be living in one of them. - I had a Chinese general, I was in China, and I got an opportunity to have an off-the-record conversation with this Chinese general, and I asked her, "How much does the People's Liberation Army concern itself with social media?" She gave me some vague answers, and then looked at me and smiled and said, "Ha ha, what do you think of Twitter revolution now? Ha ha ha ha ha ha!" And I think from the perspective of some of these other countries, they are terrified of exactly what this unleashes. And there's part of me, I have moments where I'm like, "I kinda know what they're saying." I mean, there is a sense that this does create an awful lot of instability, and uncertainty of what's true, and it can be used against people. This is a hell of an issue. I think it's a really difficult, difficult, challenging issue for our time. - If you would put up your hands, we have two people with microphones and we'll call on you for questions. In the front, one.
I'm gonna do three at a time. (Craig laughs) Two, and three, in the back. Yeah, you, the one turning, yeah, you. And while you do this, I just wanna share something that the Chinese stated, their cyberspace authority stated, issued a policy: "It is forbidden to use hearsay to create news, or use conjecture and imagination to distort the facts." Which is,
- Hearsay.
- Pretty broad. Yeah, hearsay. Okay, yes.
- [Woman In Front] I was hoping you could discuss the fact that the term "fake news" is already being perverted. President Trump accused some major media outlets of releasing fake news. There's been a sort of a, now the term "fake news" is being used for articles that are written that a politician doesn't like. And I think that makes everything more complicated, because it's just a way of sort of eliminating people's paying attention to real fake news, and just being able to dismiss hard reporting going on. And I think in some ways that's even more of an issue than fake news on the internet. Because, unlike what Laura said, I think that people who get fake news on Facebook, sometimes in their communities they're alerted to the fact it's fake news. I don't think consumers are 100% stupid, I think they recognize it a lot, so I don't think it's, clearly it's a big issue, but I do think people are aware of it out there and are taking steps to point it out. But with Trump going around saying "bad news about me is fake news," I think that's a huge issue. - A constructive approach is, again, to promote the trustworthy stuff. A couple weeks ago I remembered from Sunday school, "It's better to light a candle than curse the darkness." So the term "fake news" is abused. It has emotional resonance that doesn't come from, let's say, "gullible news" or anything like that, so we're still gonna use the term "fake news," but let's support the trustworthy stuff. Like pledge week.
- I think that this is a tremendous problem. It's sort of like, I was saying this earlier, my older sister used to play this game with me where I would say something and she'd say it at the same time, and then it would just make you shut up. And I feel a little bit like it's like, "You say 'fake news,' I'm gonna say 'fake news' too," and now everybody's saying "fake news," and that is kinda crazy. I think some of it, I would like to see, perhaps, a different approach sometimes to coverage. We tend to move from event to event to event, and recently I was with another group of people who were talking about this phenomenon, and maybe we need to think about sticking with stories and covering them as they unfold instead of just jumping from that day's thing. So if we're going to follow the Health and Human Services Department, just make that an ongoing thing that we're covering, rather than just going for the day's soundbite. And maybe some of this is going to involve a different approach to news, and rethinking how we cover it as well. - One small thing I'd add is, I think that it's definitely true the term or the phrase "fake news" is being wildly distorted, it's being used for more and more things, and your example's a good one, but there's actually many I've seen over the last couple weeks or months. I would say that it's not totally new. One of the things we look for, so on Facebook we let people report things as fake news; we've done that for a long time. And if you go in and look at the reports, people have reported things that they disagreed with long before this election. So that distortion, though, I think, significantly larger today, to your point, is not new entirely. So I think what's important is that if you are involved in the issue in some way, as a publisher, as a platform, or as a consumer, it's just critical that you're very clear about what you mean when you say "fake news," and that you don't let all of the conflation prevent you from making progress in whatever it is you feel is your responsibility. Because it is being distorted, but it is still an issue, and I think that each of us in our different ways needs to continue to pursue it. - Go to the second question. - [Man In Audience] Actually, I was gonna say something very similar to what that other person just asked, but just a quick comment and a question. The comment is, I too have some optimism and faith that people who innocently pass along fake news will get better at not doing it. Just like we don't send emails anymore about Nigerians wanting to send us fortunes.
As we become aware of the issue, I think it will get better. The question that I had is similar to what that person said, that when Trump called out CNN, and called it a "fake news site," I thought that was really dangerous. Clearly, the story that he was referring to, the one about the Russian report of Trump's behavior while in Russia, the story was, in a large sense, true. That is, there was such a report, the origins of the report were as CNN stated it, the general nature of the allegations were what were stated. The idea that it was unverified and possibly, if not probably, largely untrue was clearly stated, and so there wasn't really anything untrue about the story. But by characterizing it, by focusing on the fact that the report itself might have been untrue, and saying therefore it's fake news, that sets us in a situation where now, not only are you muddying the meaning of the term "fake news," but you're possibly, to be extreme about it, castrating a news site like CNN. To the extent that Trump can get people to believe that CNN is culpable of fake news, it then makes the entire network untrustworthy. And you can then say, "Oh, they're the network that posts fake news, we don't have to believe anything they say," and it becomes very difficult then to decide who you can believe.
I guess my question is, what can you do about that? Especially when it's the president that's involved. - I agree that this is really awful, and it goes to the problem I mentioned before, I think. We're going to have disinformation and manipulative information and bad persuasions always. In fact, we're going to have more of it, because it's getting cheaper and easier to distribute it. What we need are discerning, critical-thinking citizens, people who actually pay attention to where the news is coming from, where the information's coming from, and make judgements about that. And something like what Trump did is anti-literacy. It's telling people, "You believe what you wanna believe." Standing up representing an institution, or about to represent an institution that's very highly trusted, the US government, he's saying, "Don't worry about it. If you don't like it, it's not true." This is a really serious problem that we've been facing for some time in this country. Anti-science is another part of it; we have an enormous number of people who tell us "We should be anti-scientific because we don't like what science is telling us, so it must be false." This is terrible. If our institutions are telling us to be anti-literate and anti-science and anti-knowledge, that's really dangerous. The solution isn't for the government or any other institution to come in and tell us what news is correct or what information's correct, it's to help people actually value and celebrate information literacy and critical thinking, so that people learn early in school and throughout their lives to actually make judgements and not accept these statements. And for our own institutional leaders to be actually trying to undercut that, I think, is horrifying. - I would just dissent a little bit, in the sense that what Trump was saying was that the underlying veracity of the report the intelligence people were passing along was non-existent.
So he can't very well welcome the fact that they were briefing the president-elect on a reality that did not, in his view, exist, and that the media has some responsibility to determine not just whether somebody said something to someone else, but whether what they said was true. And he's taking the position it wasn't true, so he, y'know, he didn't have a very strong hand in this one, but he tried to play it. - I think America's foremost media ethics critic provided a commentary and solution to this. In the one case, he points out that when you're talking about politics in the press, it's kind of like visiting the monkey cage at the zoo. I won't deliver this very well, but, y'know, you look at the monkey cage and they're flinging feces at each other, and you think, "Well, they shouldn't do that," but what you really think is that what the zookeeper should say is, "Bad monkey." And that's the role of the press in this environment. He got to the point where he provided the solution to the problem in a segment called CNN Leaves It There, where he pointed out that a politician just came out and lied to the reporter. It was already well-known to be a black-and-white lie. The reporter was taken aback, obviously knew that this was a lie, but he said, "Well, we gotta leave it there." And so you can look up this segment called CNN Leaves It There, Daily Show, about eight years ago. And yeah, this guy saying this is probably the most effective media ethics commentator in the country.
And if you don't get the joke, it's Jon Stewart. - I'm Tracey Taylor from Berkeleyside, a local news site, which I hope would qualify as not fake news. Catherine brought up the question about Facebook, that if we'd had this conversation a year ago we'd be talking, maybe, about them censoring or curating our news in some way. I think that argument passed a long time ago because, as has been well documented, we all only see very tailored, customized news in our own feeds. What I, as a Hillary Clinton supporter, was reading on Facebook was totally different from what a Trump supporter was reading, for example. But I'm interested to know from you, when did Facebook start seeing this rise in fake news on Facebook, how long ago before the election, and why did it take you so long to actually address it? I think, at least publicly, you only started saying you were addressing it after the election.
- Sure. We've been working on fake news as part of a broader effort around quality and integrity work on News Feed for a long time. So, for instance, you've been able to report something as fake news for, I think, years. I think two years— - I did a report on it, so I know this is true, yeah. - [Craig] This is true news. - He's not just spinning it, it's true. - We can fact-check in real time. - I appreciate it.
I think about a year and a half ago, or actually about two years ago, we made a change to try and address fake news, though we didn't announce it publicly. Now, that said, the amount of attention in the wake of the US election has been enormous. And we always try to listen to our community, so the amount of intensity and the amount of work has increased. I just wanna be super transparent about that. But in terms of how much we've seen, we actually haven't seen a ton of increase around the election. The amount of fake news on the platform, actually, and I'm not trying to diminish the importance of the issue, is relatively small. It's a very small percentage of what people see. It should be smaller; we should get it as close to zero as possible. I agree there will always be some, so I wanna be realistic, but I think as long as there's an issue we should try to address it as much as possible. But more broadly, I think the question is how we can make sure that we're creating value for the people who use our services on a regular basis, and creating value for publishers. I think fake news is an important issue, but it's part of a bigger set of challenges that we have. Though I appreciate the amount of attention fake news is getting, cuz I like the fact that there is scrutiny, and that motivates us, that makes us excited to go to work. And it took us a few months, about two months after this swell of activity, to really get something out there. I also want to make sure we don't lose sight of the other things that are important. And then to address your specific question about why it took so long: for us it was actually two months in the wake of the election, or even less, because it was before the break. The election was the 9th, the 11th, does anybody remember? - [Catherine] 9th.
- We wanted to be really careful: getting third-party fact-checking organizations in line, working out that the Poynter Institute would have the right policies that people would have to adhere to, making sure the system would actually do the right thing and we wouldn't just start marking the wrong things as fake news. All of that was stuff we wanted to be really careful about. So we were trying to balance speed and responsibility, and that's always our challenge. And the last thing I'll add is, there are two sides to all these things, right? Which I think you were alluding to. If we start having more stringent policies around what content is shared, we start to get dangerously close to impinging on speech and other issues. And I actually don't think that concern has gone away. We received a lot of criticism just a few months ago about mistakenly taking down the Terror of War photo from a Norwegian publisher, I believe. - Which photo was this? - It's a historical photo of the war in Vietnam. - Oh, okay. - It's a series of children, and one of them is— - The Nick Ut picture. - Yeah, and so, again— - Won a Pulitzer Prize. - Took it down because it was a naked girl. - A naked underage girl. - Right. - And there was a lot of criticism on the other side just recently. So we're always gonna try to balance that, but we're gonna err on the side of letting people express themselves, because we're concerned about the same things that you're raising.
But we're also gonna try to be as responsible as we can about addressing problematic— - Was that taken down by a person or by an automated response? - The way it works is very different than publications, I think. At a publication, like The New York Times, the decision about posting that photo would be made in the page one meeting at 9:30 in the morning, probably, and there'd be 10 or 20 people arguing about it. For us, that photo gets posted, could be tens of thousands of times; in this case, no, just a few. Then it gets reported, and then we have people actually review the reports and make sure the posts actually violate our community standards, which are public and cover things like no nudity, no violence, et cetera. That can happen relatively quickly, whereas changing the policy itself can take a little bit longer. So in this case it was reported, it was taken down, and we've since essentially changed our policy to make a newsworthy exception for photos like this one. And so we're always learning, and we wanna hear more, and we're always trying to get better at these things.
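The flow described above — users report a post, human reviewers check the reports against published community standards, and the standards themselves change separately and more slowly (for instance, by adding a newsworthiness exception) — can be sketched roughly as follows. This is a minimal illustrative sketch in Python; every class, field, and rule in it is hypothetical, not Facebook's actual system.

# A minimal sketch of the report-then-review flow described above.
# NOT Facebook's actual system; every name here is hypothetical.
from dataclasses import dataclass, field

@dataclass
class Policy:
    """Public community standards, plus exceptions added over time."""
    banned_categories: set = field(default_factory=lambda: {"nudity", "violence"})
    newsworthy_exempt: set = field(default_factory=set)  # e.g. specific photo IDs

    def violates(self, post: "Post") -> bool:
        if post.content_id in self.newsworthy_exempt:
            return False
        return post.category in self.banned_categories

@dataclass
class Post:
    content_id: str
    category: str          # label a reviewer would assign, e.g. "nudity"
    reports: int = 0
    removed: bool = False

def user_reports(post: Post) -> None:
    """Anyone can flag a post; flags only queue it for human review."""
    post.reports += 1

def human_review(post: Post, policy: Policy) -> None:
    """Reviewers act on reports quickly; only posts that actually
    violate the current policy are taken down."""
    if post.reports > 0 and policy.violates(post):
        post.removed = True

policy = Policy()
photo = Post(content_id="terror_of_war", category="nudity")

user_reports(photo)
human_review(photo, policy)                    # removed under the old policy
print(photo.removed)                           # True

policy.newsworthy_exempt.add("terror_of_war")  # slower step: change the policy itself
photo.removed = False                          # reinstate
human_review(photo, policy)                    # no longer removed
print(photo.removed)                           # False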
- Can I just ask a quick follow-up on that? Didn't Facebook take people out of the equation after there was criticism from conservative groups that there was a bias within Facebook toward taking down conservative news? They took people out of the equation and just brought it back to algorithms? - Not for News Feed, not for problematic content and reports. What happened is, for Trending, which is different than News Feed, it's that sort of box on the left side of the web page, we used to have people write these summaries for each topic, and we tried to find a way to do that more algorithmically. So what we do now is actually source a headline from an actual publisher, as opposed to having people write those headlines themselves. - Questions.
One, two, and three. So Sherilyn first.
- Okay, my question is sort of like what Francis said at first. It really seems like an incredibly brilliant strategy, and I hate to even say the T-word, but the idea of saying, "Oh well, this is fake news," is almost like a playground strategy: someone's doing something and you call them that, and then they call it back at you, and if you react you're doing the same, and it's just this sort of cycle. So it seems like the only way it's being responded to is that the people who are on most of our sides of the aisle, we're appalled, right? We're all appalled, we're saying what I'm saying, and on the other side, they just completely believe it. So I suppose media literacy, is that the way to do it? Can that happen quickly? Fast enough?
It's just that everyone's talking to their own side, and of course there is confirmation bias, but they're really playing on that big-time. So that was my question; I don't know if there are further comments on it, but there are other questions as well. - I think I see a tremendous hunger in the news industry to do this. The thing holding it back, maybe, is a bit of fear of harassment and bullying, and also, well, you could call it harassment and bullying in the legal system, the litigation stuff. So I see people beginning to move on it. I think it's happening faster than I thought; that's the silver lining in the electoral cloud. The news industry has the idea that something needs to be done. There's one guy who was involved in an early phase of this who said that "if we don't all hang together, we're gonna hang separately," and so I do see hope with this. Things can happen, for that matter. One of the efforts in this, again, is The Trust Project, and if anyone here wants to talk to Sally from The Trust Project, she's running it, Sally Lehrman, we can get you an interview with her fairly quickly, especially if she takes that empty chair up there. Things are happening, and that's why I'm beginning to obsess about the issue of harassment and cyber-bullying, not only in news media, but in general, and in news media in particular. - Monica.
- I wanna pick a tiny bone with you, Ed, because I think what happened at the Trump presser last week was actually something qualitatively new on this front, and it seems to me that at this moment, on January 19th, 2017, we're dealing with a lot of things we've dealt with in the past, but also new problems. What happened there was: CNN and Mother Jones had covered the story well before the election, and reported on the existence of these memos from this former intelligence professional. They did not say what the memos contained, but said that they existed and that they had been passed to the FBI. BuzzFeed published the memos. What Trump did at the presser, when Jim Acosta got up to ask his questions, was say, "Not you. Not you, you are fake news. Your organization is terrible. Quiet." All these things in a row sort of deserve a little unpacking. And so what I'm curious about from each of you is what you identify as the qualitatively new thing that's going on in this space, and whether you see a qualitatively new, doable remedy. Craig has already laid out his, so he gets a pass, but I'm curious what the rest of you have to say.
- Something came to mind when we were talking about it. The fact that he called CNN out in a press conference was definitely new. However, the Obama administration initially didn't wanna give interviews to Fox News, as I recall, because they didn't like Fox News, and people were all over the Obama administration for this. So in that sense, it's just a degree of difference. I don't think we've ever seen that, where a president called somebody out like that, and from what I'm hearing, CNN at one point came to Fox News's defense over something with Trump, and Fox News has come back and been supportive of CNN. So it seems like there is an effort right now for journalistic organizations to stick together to defend each other's right to report the news, and that seems to me, right now, a very important thing that is going on at this moment: to resist that kind of singling out that Donald Trump did. - I guess I agree. It is true, of course, that administrations have played favorites forever. It did feel as if what Trump was doing was criminalizing editorial judgment, and in a way banishing: "You're off the table, I'm not considering you a journalist anymore, you're proffering something else." And that seemed to me to ratchet up the combat, if you like. Question? Yes.
- [Student] Ma'am, you mentioned China a minute ago, and I recently went to China with other students from the journalism school on an exchange with other journalists and entrepreneurs. One of the things they talked about was the great firewall, where Google and Facebook are blocked within that entire country; you can't even access that type of information. And earlier we spoke of groups and organizations that would help filter, or create filters for, the consumer on what media they may or may not be able to consume. Where is the line between getting the right media to people and trampling on our own First Amendment rights, in trying to ensure that we're blocking the wrong actors and letting the right ones in? - Well, one of the things that I found heartening about this debate is that virtually no one has suggested that a government-imposed solution is a good idea here. We have robust First Amendment protection, and although there are narrow categories of speech that are unprotected, the Supreme Court has said, quite recently, in a case dealing with the Stolen Valor Act, a case involving claiming that you have won an honor that you did not win, that speech cannot be outside the First Amendment merely because it is false. There has to be something in addition to that.
And I think anyone who doubts that the government should stay out of this should at least try to go through the exercise of figuring out what exactly it is they want the government to do. Do you want Congress to pass a law prohibiting falsehoods in certain areas? If not, do you want them to say false speech is impermissible, leaving it to prosecutors to decide what to do with that? When you go down that thought exercise, I think it's clear that a government solution doesn't make a lot of sense. And so we're left with intermediaries trying to deal with this. And I personally like the approach that Facebook has taken of labeling speech rather than censoring it, because I think we do have a different First Amendment tradition here. - Just to goad you a little bit on this: if my website is starved of traffic and forced to shut down because of something Google does, how much comfort am I supposed to derive from the fact that it wasn't the government that did it? - I think, or at least I hope, I'm saying that that's a concern. I think we should be concerned, as I was saying earlier, about how much we rely on intermediaries and how much we want them to suppress speech in the name of combating false news, because they are choke points for speech. Internet intermediaries, or for that matter payment card companies, are the same way, if they're used to try to cut off payments to certain news sources, as was done with WikiLeaks. So yeah, does it matter to you as the website owner whether it's the government or someone else? You still can't get the traffic that you were looking for. - Three more questions. You, you, and in the back, yes, the man with the glasses. Thank you.
Sort of like a precision machine. - It's hard. - If you have a microphone, go ahead. - [Man In The Back] In his 2015 book The Devil's Chessboard, David Talbot, the founder of salon.com, uses declassified public records to trace a long history of The New York Times basically allowing the CIA to plant stories. For example, they hand-picked a journalist for The Times to send to the Congo to cover Patrice Lumumba, who invented stories about torture chambers and political assassinations and stuff like this. I think you could also draw a pretty straight line from their coverage of the Gulf of Tonkin incident to their coverage of WMDs in Iraq, for example. So I was wondering, first of all, on what grounds can you really draw a distinction between those falsified and planted stories and what we're discussing here today as fake news, and secondly, what can we do to combat or push back against false stories coming from mainstream or trustworthy news organizations? - Well, that's a good question. Who wants to have at it? - If a news organization signs up for The Trust Project to follow through with corrections and accountability, you tell 'em, "Hey, this was not factual, here's the evidence," and then you see the results, give them a chance, and then publish it. Normally, of course, you'll just be publishing something which says how wonderful they were about fixing the article. Cuz I'm Mr. Positive.
- You've identified an area of real frailty and vulnerability of the press. If you've been lied to by a source you've pledged confidentiality to, blowing the whistle on that source becomes ethically extremely problematic. And so government lies, delivered under those terms, become extremely difficult to expose. But it's still worth, I think, distinguishing between what you're talking about and the fake news that we've been discussing. That falls within the daily work of reporters negotiating with sources and trying to confirm the veracity of the information they're getting. What we're talking about now is the deliberate fabrication of information by, essentially, the equivalent of reporters acting for personal benefit, because that fabrication is extremely profitable, extremely useful, and as a civic consequence of that, a great many people go around believing things that aren't true. So the ultimate result of what you're describing and what we've been discussing is very much the same, which is people believing things that aren't the case. They have a common dysfunction; I agree with that. But I think that reporters, even though they are hoodwinked, misled, and deceived, are in a far better position, because that's their job: to try to determine the veracity of the information they're given. So I see where you're going with it, but it's kind of a different and continuing problem that journalism faces. - Can I add to that real quick? I think it's fundamentally a more challenging issue. It's a related issue, we can argue whether it's the same or different, but (mumbles) the most important question is how you address it. I think what you're pointing out is that saying The New York Times is trustworthy wouldn't have helped in this case, which just points to the fact that we need multiple approaches to the same problem. I think information literacy is an important one, right? A skeptical reader can ask questions, look at what the sources are, which were cited, et cetera. Skeptical journalists, as well, whether they're blowing whistles or just raising questions, are also an important piece. And then I think dialogue: you can talk with people you know or people you think are experts, whether we facilitate it on a platform like Facebook or just in person, about why you may or may not wanna believe this thing, and an active debate there is really, really healthy. Those are parallel tracks to trustworthy labels or reputations at the publication level. And a lot of this is because you need a multi-pronged approach. - Question.
Does anybody have a microphone? - [Man On The Left] Yes, I wanted to know if it's true that right-wing fake news generates more revenue than left-wing fake news. - I will say that Jestin Coler, the gentleman that I tracked down, claimed that it did, that he tried to come up with fake left-wing news stories and they just didn't do as well, so he stopped doing them. How do you prove that? I don't know. That's just what he told me. - [Man] Hello.
Beyond the financial motivations, there can be broad political and intellectual impacts of fake news, so it can extend virally far beyond any retraction or correction that's issued afterwards. So one issue, or one idea, is to message people who saw the original story or a derived story to try to curtail the impact of fake news, and I was wondering if you had any ideas about this or plans for trying to help with this issue. - Is this for me? - [Man] Yeah. - Yeah, I actually think there are two ways to approach that. One is to let people know retroactively, maybe if they read it, maybe if they shared it, et cetera; we're looking into ideas like that. The other is to try and move further upstream: either react more quickly, or prevent it from entering the system in the first place by disrupting the financial incentives. So I think you wanna make sure as little comes into the system as possible, and then, when it happens, you need to react as quickly as you can, and if you didn't find it until later, then you need to consider letting people know, and the question is who and how. I don't know if we'll do that, but it's certainly something that we're considering.
- [Sally] I'm Sally Lehrman and I lead The Trust Project, thanks for the shout-out, Craig, he's a funder, and we're trying to work on the positive end, in the sense of helping people identify quality news. But what I wanted to ask you about is that today BuzzFeed reported on a survey they had done of over 1,000 people, asking where they got their news and whether they trusted it. Most people in the survey, as you might expect, do get their news from Facebook. Most people also said that they do not trust the news they get on Facebook. So my question is, is this a good or a bad thing? Could this be part of what is undermining trust in the news? And if it is, or either way, what can we do about it? - I can take it, but— I'm happy to.
- Well, I mean— - I think you have an opinion. - No, I don't, actually, on this. I'm not really sure; it's an interesting conundrum. - I don't know what to make of survey results like that, because there's clearly cognitive dissonance here: "This is where I get my news, but I don't believe it. What kind of a fool do you take me for?" And I wonder whether that's become a cultural trope: "Of course I consume lots of news, but I don't believe a word they say. And yet it shapes my actions, shapes my worldview. It has all the effects of news that I would consider trustworthy." Somehow, anybody who would say, "Of course I believe what I read in The New York Times. Great newspaper," people would look at you, they'd walk away. If you're in a bar they'd move down a couple of stools, cuz they'd think you're a fool. So I just wonder whether there's a larger kind of mistrust in major institutions that you're seeing a portion of, and it doesn't have to do with the fact that they fundamentally disbelieve what they read on Facebook. - [Sally] I should clarify, that's an important point, but the numbers, the percentages of people who trusted news from their newspaper and from their television news, were much higher. And I agree, I think the survey had some faults, but still, there's something there that seemed noteworthy. - Yeah, I'd love to speak to it. I think there are two sides.
A lot of people are consuming news on Facebook. I think it's important to note that we don't write news. We're a platform, and we connect people with sources of news that they find interesting. I think that, by and large, it's a good thing, because I think we've helped a lot of people discover a lot of content that they may not have discovered otherwise. We're not the only reason this happens; I think the internet has done that more broadly, in a much bigger way, and I think that's good. The publications I read are not from San Francisco. In the middle of the 20th century, you basically could only read one, two, or three papers if you lived in most cities in the country. So I think by and large it's a good thing. And I think skepticism is a good thing, and so I don't really think trust is binary; there's a certain amount of skepticism, a lot of skepticism or a little skepticism. I don't want Facebook to be a place where people don't trust anything, so that means we've got work to do, but I also think the fact that we have a skeptical set of people who use our platform is good, to your points about information literacy. - Yeah, I think that's actually terrific news, to the extent that it was a good survey and was measuring what it says it's measuring. A large fraction of our population gets more of their news stories from Facebook than from anything else. It's hardly surprising, because that's where they're spending their time reading. So, that they see more news there than elsewhere: completely unsurprising. That they actually recognize that it's not an editorial platform, that Facebook isn't selecting the news or verifying it, that its reliability is lower, and that you should be more skeptical about it, that's also great news. I wish I believed it more. But I think it's great news that people recognize that there are different qualities of information, that they should pay attention to the source, and that if they see it on Facebook, just like we tell our students, if you read it on Wikipedia, it may be right, but don't assume that it is. Go and check a little further if you're going to actually rely on it to make a decision or a judgment.
- Jeremy. - [Jeremy] Yeah, this is for Adam. So many people get so much of their news through Facebook, and for a large contingent of the population that's all they receive. While most people here might visit news websites or watch television news and get kind of a package, a nice diverse mix of stories, some oppositional to what they feel, some feel-good, some upsetting, Facebook's algorithm seems to be designed for pleasurable experiences. You're seeing things that reaffirm your beliefs, and because content is so atomized, based on just sharing stories, if I go on Facebook and all I see are things that I agree with, it seems that's what's incentivizing fake news and the economic model of fake news. How does someone like Facebook, how do you present content to people that's maybe upsetting or oppositional to their beliefs? I mean, I agree, I wouldn't even wanna see things I disagree with on Facebook, but I have to admit there might be a certain value to that. - I think a couple of things. One is that our mission on News Feed is to connect people with stories they find meaningful, not stories they find pleasurable. But we can't really know, for every individual,
what they find meaningful. Some of the things we look at are essentially proxies, like "Do you like it?", which is gonna correlate with "you agreed with it." But some of them aren't. And we're moving more and more towards what I'd call longer signals: did you spend a lot of time reading the article? Did you have a conversation about the article? By the way, people on the internet have long conversations about things they disagree with all the time, so we find it can go either way. For videos: did you go full-screen, how much of the video did you watch, et cetera. More and more, we're moving to those longer signals.
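The "longer signals" idea — giving more weight to dwell time, conversation, and video watch than to a quick like, precisely because those signals also fire on content people disagree with — can be pictured as a simple weighted score. A minimal sketch, with hypothetical signal names, weights, and caps; this is not Facebook's actual ranking code.

# Toy "meaningfulness" score built from the kinds of longer signals
# mentioned above. Names, weights, and caps are all hypothetical.
from dataclasses import dataclass

@dataclass
class StoryEngagement:
    liked: bool                        # quick proxy; correlates with agreement
    seconds_reading: float             # longer signal: dwell time on the article
    comment_thread_len: int            # longer signal: did a conversation happen?
    video_watch_fraction: float = 0.0  # longer signal for videos (0..1)

def meaningfulness_score(e: StoryEngagement) -> float:
    """Weight longer signals more heavily than the quick 'like' proxy,
    since long reads and long conversations also happen around content
    people disagree with."""
    score = 0.5 if e.liked else 0.0
    score += min(e.seconds_reading / 60.0, 3.0)     # cap dwell-time credit
    score += min(e.comment_thread_len / 10.0, 2.0)  # cap conversation credit
    score += 2.0 * e.video_watch_fraction
    return score

# A story someone argued about at length can outscore one they merely liked,
# which is the point of moving toward longer signals.
argued = StoryEngagement(liked=False, seconds_reading=180, comment_thread_len=25)
liked_only = StoryEngagement(liked=True, seconds_reading=5, comment_thread_len=0)
print(meaningfulness_score(argued) > meaningfulness_score(liked_only))  # True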
That said, there are multiple forces at play here. One is that people do self-select into the friends they have on the platform and the publishers they follow, and they're gonna self-select into somewhat like-minded friends and publishers, the same way they do offline. That is a force towards less diversity, not always, but overall. There's also a force from the other direction, which is that people tend to be friends with a lot of people, hundreds of people on average, and they tend to follow lots of publications. So, for instance, in Europe the average person has over 50 friends from outside their country, and it's hard to find 400 friends, 50 of whom live in another country, who are all like-minded. And so that pushes toward diversity. And we find, by and large, these forces roughly net out, as best we can tell, and we continue to look into this cuz it's a really important question, so that people are exposed to about as much content they disagree with on Facebook as they do off Facebook. That's important, and we wanna make sure it stays that way. - We have about five minutes left, so we'll try to get a couple more. - She's really been waiting in the back there. She might hurt the guy next to her if she doesn't get to go. - [Woman In The Back] Thank you. I think fake news is a really important topic and I'm glad we're talking about it, but it also seems kind of premature to me to talk about fake news when millions of people in America don't have any trust in the media establishment at all. So I guess my question is, is our concentration on this a little bit misguided? Should we be focusing on fake news, or should we instead be focusing on building up trust in the media in America in general?
- And let me ask you, do you think that fake news is another way of describing this larger sense of mistrust in the media? Is that the reason it's taken on, that it's been so widely adopted by people and applied to wrongdoing that has nothing to do with fabrication? - [Woman In The Back] Well, I think the issue is that some people might not trust anything any mainstream media establishment has to say. They don't care if Facebook or somebody else flags it as fake; they think, "Okay, if it's coming from a quote-unquote establishment news source, I'm not gonna trust it, because I don't trust the establishment." So what I'm wondering is, why are we focusing on this kind of small subset, and why aren't we looking at the bigger picture? - I actually think it's a really important question, and I think, unfortunately, we haven't talked much about the problem of financing media right now. There used to be armies of local news reporters who were within their communities, out there representing their local community, and so you could have a relationship with them. We're at a moment when local newspapers are going under, and I feel like somehow, to address this question, we have to come up with a way to have more local news that's interactive with its community, that has boots on the ground. I mean, the first news I reported was going to the local community board meetings and looking at issues like double-parking. But that kind of being there in the community is really important, because then people feel like it's their media. I live in the Mission in San Francisco, and Mission Local has actually become a source about my community. It has flaws or whatever, but I kinda trust 'em cuz they're right there. And I think that problem is huge, and we need to do something to address it, and it's probably bigger than and beyond this panel. - One more question.
- Just claimed it. - [Young Man] Hi, I wanted to ask about public data and accountability. My data analysis actually showed an increase in fake news shares from hundreds of thousands per day in May to millions per day in November, and that's using what data I was able to find. And I don't know if people trust me or Facebook, right? On a broader level, Facebook and Google can instantly restructure the entire set of incentives around the media ecosystem, right? They can decide what news gets attention directed toward it. Do they have a responsibility to provide transparency about how their code impacts that? For example, they could provide aggregate data about how much attention is being directed to what content and sites. That would be one example of that sort of accountability and public data.
- I should probably take this. I think we have a lot of responsibility to be as transparent as we can, most importantly to communicate our values and our standards, which guide all of our decisions. That's important not only for public accountability, but also from a (mumbles) for publishers, so they know what we're doing and what we care about and can decide to align or not align, et cetera, and for our users, to your point about literacy, so they can understand how the system works. I think that when it comes to issues like fake news it gets challenging. You can do things like publish aggregate data, but it becomes challenging because of the adversarial relationship: the more specific we are about what we're doing, the less effective it becomes. And so it's a balance. And I think we're getting better. In my time at Facebook over the last three years, we've announced every major ranking change proactively, which wasn't the case when I started at Facebook eight or nine years ago. So we do put a lot out, but I think where we have the most room to improve is figuring out how to more effectively scale that outreach, because I regularly meet people who think things are confidential that are not. So we have to figure out how to more effectively scale our outreach, and it's a problem that I'm taking pretty seriously, starting last year, but definitely this year. - I'm talking to people who are doing similar work analyzing networks of fake news, and for that matter, bad actors, harassers, and all that. That stuff's ongoing. What a tech platform can say about that is sometimes constrained by law or regulation, particularly in Europe. I'll add, since, again, this is based on talking to people who are doing this: be careful, because if you're doing analytical work which can expose networks and some bad actors, they fight dirty, and so be careful. - Well, with that we've run out of time. I wanna thank our panel. From my left: Laura Sydell, National Public Radio. Adam Mosseri, Facebook. Craig Newmark, founder of Craigslist, internet pioneer. Catherine Crump, litigator and Professor of Law at the University of California, Berkeley. Jeffrey MacKie-Mason, economist and Professor of Information, University of California, Berkeley. I'm Ed Wasserman, Dean of the Graduate School of Journalism at UC Berkeley. Thank you very much.
