© 2024 Hawaiʻi Public Radio

How Facebook Is Trying To Control Election Disinformation, Part 1


The social media giant Facebook has spent much of the 2020 election cycle trying to control the spread of disinformation on its platform. It's also spent a lot of time responding to criticism that it doesn't do nearly enough to stop the flow of that disinformation. Sheryl Sandberg is Facebook's chief operating officer. We spoke to her about this earlier today.

AUDIE CORNISH: You know, the management of the pandemic and the management of the economic fallout as a result have made this a very high-stakes election. And I want to ask you about Facebook's role in that because you have rolled out some new policies. So no new political ads a week before the election, for example, is one that was announced by your CEO earlier this month. Here's a sample of an ad that will stay up. It features the president's son.


DONALD TRUMP JR: Their plan is to add millions of fraudulent ballots that can cancel your vote and overturn the election. We cannot let that happen. We need every able-bodied man, woman to join army for Trump's election security operation at...

CORNISH: So this morning, you had civil rights activists saying this is a call to arms. This is voter intimidation. And it's one that Facebook is enabling.

SHERYL SANDBERG: So I want to be really clear where we stand on this election, which is that we are doing all we can to protect this election. We are taking action on anything that is voter suppression. This ad is claiming that the election mail-in ballots might be invalid. When anyone says that in an ad or a post, we're saying, mail-in ballots have been (ph) accurate. We expect the same in this election, and we're linking to the bipartisan voter information center.

I think what people are worried about in this ad is that he says army of supporters. If - and there have been other ads, by the way, where people have said, we're calling for violence in a direct way. We have taken those down. We believe the language army of supporters is not really calling for an army but is calling on people who are normal campaign volunteers. But there have been other instances where people very senior have called on real violence. Those come down immediately.

CORNISH: So you're making a judgment call there - right? - that in this sense, army doesn't mean real army. As the president makes more and more allusions to fraud as he talks about the need for people to go to the polls, when does it start to cross a line to you?

SANDBERG: So it crosses a line very often. And again, we've done something we've never done before, which is we are linking from every single post that's about voting to the official voter information center. We are...

CORNISH: But why not just take it down, I guess? - because you hear an ad in which there are very specific calls about the election, and then you see a link that just says voting by mail has been pretty good the last couple of years. It doesn't seem like a balance in information there. And I guess I'm not - if it's bad enough for you to label, why isn't it bad enough for you to take down?

SANDBERG: Well, we're labeling. So let's be clear. We are taking a lot of things down. We obviously have to let candidates speak. The other thing we've done is we're worried about two things - making sure people don't get bad information or calls to violence, but we're also very focused on people getting accurate information.

We've done something we've never done before. We've put up the voter information center. We are linking from the top of Facebook, the top of Instagram to help register people to vote. And we're getting accurate information. We're not just saying, oh, this looks bad. We're saying, here's how you vote. Here's your state's link. Here's how you can make sure your ballot's counted. And we're doing that very aggressively across the platform.

CORNISH: One question I have to ask following this idea - if there is unrest, which is something that Mark Zuckerberg has referenced, has said that he has a concern that that's a possibility - if there's unrest or violence in the wake of an unclear election outcome, are you able to shut down Facebook - meaning, like, is there a kill switch or a circuit breaker, which is, you know, something even the stock market has to prevent crashes?

SANDBERG: Well, look. We have found that in moments of crisis, particularly violent moments in different countries, actually, Facebook is a great place for people to get the information that keeps them safe. We would do everything we can to protect people. And we've already said if any candidate does inaccurate information, inaccurate ads after that election, we are going to take them down. We are going to link to the Reuters center, which is the official poll results. And we are very focused on getting all mentions of hate and violence down before the election and after election. And we are on high alert.

CORNISH: It's interesting because you have a climate information center. You have this voter information center. There was a COVID information center. Does it feel like whack-a-mole trying to keep up? It feels like essentially you guys have to keep coming up with good sources of information, so to speak, because so much false information spreads so quickly.

SANDBERG: Yes, it is whack-a-mole, and that's why we're learning. We learned from coronavirus to set up the voter information center, to set up the climate information center. Audie, I don't know what's next. I don't think you do either. I don't think any of us do, but we know that...

CORNISH: But at your scale, can you do it? Like, is this actually reflecting the fact that you're so big? Like, is Facebook truly governable in that way?

SANDBERG: Well, when people get to say what they want, there's going to be good, and there's going to be bad. And we need to enable the good, enable the good people want to say to each other. And there is an awful lot of good that happens on these services. And our goal is to take down the bad and continue - and to enable the good. And that's what we're going to keep doing.

CORNISH: I don't think anyone disputes that good can't come from Facebook, right? I think people are starting to question whether the bad is starting to outweigh it, especially when they are feeling like the democracy is fragile here in the U.S. And I know that Mark Zuckerberg has said that he believes the democracy is strong enough to withstand this challenge, deliver a free and fair election. But a lot of people are seeing straining at the edges of our institutions and look at Facebook and say it's not helpful when it comes to truth and trust.

SANDBERG: Any technology that's ever been used enables both. We have to - and I think we have taken very aggressive steps to get the bad down and very aggressive steps to put out the good. So our voter election information - again, we're going to register, we think, 4 million people to vote. That's going to be the largest effort of its kind in history. That is enabled by a service where people post and where we can link them to their states to register.

CORNISH: Sheryl Sandberg, thank you so much for speaking with us.

SANDBERG: Thank you.


That was Facebook's chief operating officer, Sheryl Sandberg, speaking with our co-host Audie Cornish. And we should note that Facebook is among NPR's financial supporters. Hear Part 2 of their conversation on how the pandemic is affecting women in the workplace on tomorrow's ALL THINGS CONSIDERED. Transcript provided by NPR, Copyright NPR.
