Yamiche Alcindor (00:00):
Welcome to the Washington Week Extra. I'm Yamiche Alcindor. Tonight let's continue the conversation about Facebook whistleblower Frances Haugen's testimony before Congress. She told lawmakers the company puts profits before public safety.
Frances Haugen (00:14):
The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people.
Yamiche Alcindor (00:25):
She also told lawmakers that the company, Facebook, knows their products harm kids and teens.
Frances Haugen (00:31):
It's just like cigarettes. Teenagers don't have good self-regulation. They say explicitly, "I feel bad when I use Instagram, and yet I can't stop." We need to protect the kids.
Yamiche Alcindor (00:43):
Now, Facebook founder Mark Zuckerberg responded in a note to his employees posted on his public Facebook page. He wrote, quote, "We care deeply about issues like safety, well-being and mental health. It's very important to me that everything we build is safe and good for kids."
Now joining us remotely, Cecilia Kang, Technology Reporter for the New York Times and co-author of An Ugly Truth: Inside Facebook's Battle for Domination. And with me here at the table, Nancy Cordes, CBS News Chief White House Correspondent; Eamon Javers, CNBC Senior Washington Correspondent; and Marianna Sotomayor, Congressional Reporter for the Washington Post.
Thank you all for being here. Cecilia, you get the first question because of course you are a Facebook expert at this table. Tell us a little bit more about this whistleblower. Who is she? What did she say? And what's motivating her, you think?
Cecilia Kang (01:32):
Yeah. Frances Haugen spent nearly two years at Facebook on a team called the Civic Integrity team. That is a team that basically tries to fight off misinformation and other harmful content. Her background and her expertise is actually in the technology behind the newsfeed and how the company determines what it wants to rank highest and lowest in terms of engagement. So she's really deep into the system. She understands the technology, and she's also a Silicon Valley veteran. She's worked at Google, Pinterest and Yelp as well.
And what motivated her was a decision in December 2020, when Facebook decided to disband her Civic Integrity team. This was right after the election, when there was still certainly a lot of unrest in the country about the election results, and to her that was the clear sign that the company was not serious enough about protecting its users and making sure that misinformation about the election, as well as a slew of other types of harmful content, was not on the site.
And she was seeing internally practices and a struggle with really important issues that the company was not admitting to the public. So what she did was she quit in December, and before she left, she copied off tens of thousands of documents of internal research, material that's actually available to many employees. And this is the kind of research, like the teens and Instagram research that you mentioned earlier, Yamiche. She decided that she would take those documents once she left, and she brought them to a reporter at the Wall Street Journal. And the Wall Street Journal has since begun a series of stories. They and other journalists are now continuing to report on all these documents that the whistleblower has brought to the public.
Yamiche Alcindor (03:20):
And Cecilia, one of the first times I really understood the sort of backdoor things that happen at Facebook is when you started reporting on the company, when you wrote your amazing book that everyone of course should get. I wonder if you can talk a little bit about how your reporting connects to what this whistleblower is saying?
Cecilia Kang (03:36):
Yeah, we really feel like the whistleblower's testimony, and certainly the reporting from her documents, absolutely confirm the main theme of our book. The book's theme and title, An Ugly Truth, come from a memo by a very senior executive named Andrew Bosworth. It's called "The Ugly," and in it he says Facebook believes so much in connecting the world that even though there will be a lot of collateral damage from that quest, and that damage can be terrorist attacks, it could be bullying, it could be deaths even, in the end the goal of connecting the world will be better for the world, it will be net-net good, and we're willing to absorb those costs. That's the calculus that the company has.
That's sort of the thrust of what the whistleblower's documents show: that growth is the most important thing. The memo said connecting the world, but we've come to realize that's actually sort of a euphemism for growth and engagement and profits. And the whistleblower's main argument is that the company is so bent on growing and keeping its site very relevant that it is making decisions that have not just small collateral damage, but enormous collateral damage.
Yamiche Alcindor (04:51):
Mm-hmm (affirmative). And Cecilia is talking about this sort of idea of Facebook putting profit before everything. Eamon, Cecilia is also talking about how we rely on Facebook. What did this outage this week show, which, if people don't realize it, took down Instagram, it took down WhatsApp, it's Facebook. So when we say Facebook, we're talking about multiple platforms. What did that outage show about how much people rely on Facebook, especially around the world?
Eamon Javers (05:15):
Well, in multiple countries around the world. And also you're talking about businesses that do all their advertising on Facebook, that communicate with their customers through WhatsApp. I think Facebook is the service that we use to keep in touch with those people that we went to high school with, who are too lazy to actually pick up the phone and call, but actually a lot of business is done on Facebook. And you saw this enormous impact globally on all of those people. And take a minute to step back and realize the impact of what the whistleblower did here.
I mean, first of all, serving as sort of an undercover anti-Facebook agent inside the company, taking those documents, Facebook says those are stolen documents, then leaking them out to the Wall Street Journal in a very tactical way for a devastating series of blockbuster articles in the Journal day after day after day with revelations, then coming forward on 60 Minutes with a big reveal of her own identity, and then two days later, Capitol Hill testimony that riveted the country. This rollout of what the whistleblower did, this operation undercover inside of Facebook, was devastating for Facebook. This was a very tough week for them.
Yamiche Alcindor (06:15):
And Nancy, you're nodding your head. I want to bring you in here. I was going to ask you, what does President Biden think about all this, but really Eamon just also talked about this PR rollout that I hadn't really even put together.
Nancy Cordes (06:26):
It was impressive. [crosstalk 00:06:27]. And I want to know who was behind it because they're going to get a lot more business.
Eamon Javers (06:30):
The word is Bill Burton was behind that. I mean, there's some Washington insiders who might've had a hand in this.
Nancy Cordes (06:36):
And they know how the Washington ecosystem works, certainly. I think the President and the White House have made no secret of their disdain for Facebook, right? I mean, didn't the president have to walk back his comments after he said that they were killing people? And then he clarified. He said, "Well, no, it's not Facebook itself that's killing people, it's people who post on Facebook."
But they've been very outspoken about the fact that they think that a lot of social media platforms, but Facebook in particular, have a responsibility that they're not meeting right now. The problem is, and Marianna really hit on it earlier, that they've got a very crowded agenda. They've got a lot of things they'd like to accomplish. And so while this is one of those issues on which Democrats and Republicans agree something needs to be done, you wonder when it is going to rise to the top of the agenda, especially because, I don't know if you've noticed, but some lawmakers tend not to be all that technologically savvy. You've noticed that?
Eamon Javers (07:35):
That's a very generous way of putting that.
Nancy Cordes (07:36):
In some of their questioning in hearings before. So they know something needs to be done, but they're sometimes a little bit tentative to say definitively, "And this is what I think should be done. These are the new regulations I want to see."
Eamon Javers (07:51):
When are you going to ban [Finsta 00:07:53], was one of the questions, right?
Nancy Cordes (07:53):
Right. Exactly. So that's another reason why you'll continue to see a lot of agreement that something should happen. When we will actually see that happen, that's an open question.
Yamiche Alcindor (08:03):
Marianna, what are you hearing on Capitol Hill from these lawmakers about Facebook, their timeline for trying to regulate this, and also just their understanding of what needs to be done?
Marianna Sotomayor (08:13):
Yeah. There have been many years of these oversight hearings, not as blockbuster as this one, where you do have members and senators who, you can tell, don't really know which way to question someone. Like they get-
Yamiche Alcindor (08:28):
In that exact tone.
Marianna Sotomayor (08:30):
Yeah. Exactly. There's a lot of hesitancy of, "I hope I'm getting this right." But then you get the Finsta commentaries and things like that. So there's still a lot of people who are looking at this. And one thing to note too is that there are probably going to be more investigations or hearings before there will be any kind of legislation proposed. And one thing to note is the January 6th Committee, for example. They really want to talk to this Facebook whistleblower, because she has also mentioned the fact that Facebook had a role in potentially allowing, or not doing enough oversight to prevent, these insurrectionists communicating on all these different devices and social media networks. So it's likely that in a couple of weeks or so she might come back and testify before that committee behind closed doors.
Yamiche Alcindor (09:22):
Yeah. And Cecilia, it's a question that my producers and I were thinking through. What makes Facebook so different than other social media platforms? When you think about Twitter or other things, what sets them apart, what possibly makes them worse than these other platforms?
Cecilia Kang (09:37):
Well, I think one very distinguishing factor is that the company is basically Mark's company, it's Mark Zuckerberg's company. He owns 55% of voting shares. He makes the decisions and Frances Haugen, the whistleblower said the buck stops with Mark. And I think that's absolutely true in my reporting.
The other thing that's really different, in relation to the research that you mentioned, Yamiche, on teens and Instagram and the harms, the toxic harms, and sort of the negativity that a lot of teenagers feel from using the platform. One really interesting finding from that research, Facebook's own internal research, is that Facebook believes that Instagram is different from, and in some ways worse than, TikTok and Snapchat. In a very small, interesting way: Instagram has these sort of beauty filters, and there's also this culture of trying to curate this vision of who you are in your life. There's a lot of focus on the full body. And by the way, TikTok and Snapchat definitely have their problems. They're not completely immune to problems.
But TikTok is much more of a sort of performance-based fun app. It's what a lot of the teenagers who took the surveys for Facebook said, they feel like it's a little bit more humorous, just different kinds of challenges, dances, a lot more lighthearted.
Snapchat, interestingly, has these face filters that are really sort of goofy, cartoon, animated filters that are just supposed to also be fun. And the focus is on the face. And so there are the body image issues that Instagram users reported to Facebook in its own research: one in three teenagers said that because of using Instagram, they feel worse about their body image. Fourteen percent of teens in the UK said that they had suicidal ideations and they could trace it back to Instagram use.
I mean, those are the kinds of feelings and anxieties and really harmful responses that didn't exist with these other apps. And I thought that was a really important distinguishing factor. The last thing I would say is that Twitter is, very interestingly, more willing to experiment with ways to try to fight misinformation and also to try to protect its users. And one thing that they do, I'm sure we've all gone through this: when you try to retweet a story that you haven't read and actually opened up, you get a pop-up box that says, "You sure you really want to retweet this? It looks like you haven't read it." Facebook doesn't have that kind of feature. And that feature is known as friction. It provides friction between you and sharing, in other words, between you and amplifying more of that content, and Facebook just doesn't do that. So they're not making the same kinds of decisions as some of their competitors, decisions that arguably could be good solutions to at least start solving this misinformation problem.
Yamiche Alcindor (12:27):
It's such a comprehensive answer, and one that I think so many people really need to hear about, just the difference of Facebook from all the other social media platforms. Eamon, I'm going to come to you for the last word here. Is this all about money? Does this all, at the end of the day, end up being about profits, and where do we go from here?
Eamon Javers (12:43):
Look, Facebook has grown so fast over such a relatively short period of time, and you think of the past 15 years or so. The question for Facebook is how can they keep growing? I mean, the law of large numbers suggests that once you have almost everybody on planet earth who's connected to the internet as part of your service, how can you continue to grow, right? And so one of the things that they're trying to do is keep all those people on the service for even longer amounts of time. That's what engagement is. And the idea is that all these angry things that we are seeing on Facebook are enticing people to stay on the service for a longer period of time, and that represents more ad dollars, more revenue for Facebook.
So the more engagement they get, the more profit they make. And in a world where it's going to be very hard for them to find new customers, because they already have just about everybody on the planet, well, engagement is the answer. And so if they dial back on some of these things and dial back on some of the angry content, they're also going to be dialing back on profits. And that's a real problem for a public company.
Yamiche Alcindor (13:36):
Yeah. We'll have to leave it there tonight. Thank you so much to Cecilia, Nancy, Eamon and Marianna for joining us and sharing your reporting. And make sure to sign up for the Washington Week Newsletter on our website; we will give you a look at all things Washington. Thank you so much for joining. I'm Yamiche Alcindor, goodnight.