'Misinformation' On Facebook: Zuckerberg Lists Ways Of Fighting Fake News

Facebook CEO Mark Zuckerberg says his company is responding to sharp criticisms over fake stories appearing in its news feeds. He's seen here speaking Saturday at the APEC CEO Summit, part of the broader Asia-Pacific Economic Cooperation (APEC) Summit in Lima. (Rodrigo Buendia / AFP/Getty Images)

Facebook could start labeling stories that might be false, company founder Mark Zuckerberg says, laying out options for how the site handles what he calls "misinformation." Other ideas include automatic detection of potentially false stories and easier flagging by users.

"While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap," Zuckerberg wrote in a posting to his Facebook profile last night.

Zuckerberg outlined seven projects his company is working on that could undermine fake news stories. The approaches range from consulting with journalists and fact-checking organizations to disrupting the flow of money in the often-lucrative online fake news business.

"We are raising the bar for stories that appear in related articles under links in News Feed," Zuckerberg wrote of one initiative. Of another, he said, "A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection."

The idea of using software to classify misinformation is sure to generate discussion. Zuckerberg says it would bring "better technical systems to detect what people will flag as false before they do it themselves." He didn't specify what the effects of that determination might be — whether it would mean the removal of the content from certain news feeds or from the site altogether.
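
Zuckerberg's post doesn't say how such a detection system would work under the hood. For a sense of the general idea, here is a minimal, purely illustrative sketch in Python, assuming a supervised text classifier trained on stories users previously flagged; the headlines, labels and library choice are our own placeholders, not anything Facebook has described:

# Illustrative sketch only -- not Facebook's system. Assumes a supervised
# classifier trained on stories that users previously flagged as false.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: story headlines paired with whether users flagged them.
stories = [
    "Celebrity endorses candidate in shocking secret video",
    "City council approves new budget for road repairs",
    "Miracle cure doctors don't want you to know about",
    "Local school announces updated lunch schedule",
]
was_flagged = [1, 0, 1, 0]  # 1 = flagged as false by users, 0 = not flagged

# TF-IDF text features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(stories, was_flagged)

# Score a new story before anyone reports it; a high probability could mean
# a warning label or reduced distribution rather than outright removal.
new_story = ["Secret document proves outrageous claim about election"]
print(model.predict_proba(new_story)[0][1])

The point of the sketch is the ordering Zuckerberg describes: the system tries to predict what people "will flag as false before they do it themselves."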

Several of the highest-rated comments on Zuckerberg's post were positive, with this idea from George Papa ranking near the top: "If people had a bit of brain and did some research on their own when they read something that does not sound right...we would not have this problem."

Together, the projects signal another step in Facebook's evolution from its start as a tech-oriented company to its current status as a complex media platform. The company has come under criticism that its news feeds and ad payment systems are too welcoming of fake news, particularly after a contentious presidential campaign season that culminated in last week's upset win by Donald Trump.

Trump's Nov. 8 election left many pollsters and pundits mystified. It also prompted social media users to complain that Facebook and other sites had kept people in bubbles of like-minded opinion; some also said that fake news had influenced the vote.

Days after the election, Zuckerberg sought to allay those complaints, saying that fake news makes up a "very small volume" of the content on Facebook, as NPR's Aarti Shahani reported. And he said hoaxes existed long before his site went online.

"There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news," Zuckerberg said last week.

As Aarti reported Thursday, Facebook has long relied on users to flag suspicious or offensive stories — and it relies on subcontractors in the Philippines, Poland, and elsewhere to make quick yes-no rulings on those cases, often within 10 seconds.
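
NPR's reporting doesn't detail how those flags reach reviewers. As a rough, hypothetical sketch of that kind of pipeline, drawn only from the description above (users flag a story, and a reviewer makes a quick yes-or-no call on it), the structure and names below are our own assumptions:

# Hypothetical sketch of a user-flag review queue; not Facebook's actual code.
from collections import deque
from dataclasses import dataclass

@dataclass
class FlaggedStory:
    story_id: str
    url: str
    flag_count: int  # how many users have reported the story

review_queue = deque()

def flag_story(story):
    """A user reports a story; it joins the queue awaiting human review."""
    review_queue.append(story)

def review_next(decide):
    """A reviewer pulls the next story and makes a quick keep-or-remove ruling."""
    if not review_queue:
        return None
    story = review_queue.popleft()
    return story.story_id, decide(story)

# Example: a simple reviewer rule that removes heavily flagged stories.
flag_story(FlaggedStory("abc123", "http://example.com/story", flag_count=40))
print(review_next(lambda story: story.flag_count >= 25))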

With last night's announcement, Zuckerberg gave a glimpse of how Facebook is wading into an area that's often fraught with controversy: verifying or censoring content.

"The bottom line is: we take misinformation seriously," he wrote. "Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information."

Here's the list of steps Zuckerberg laid out (we're quoting his post directly):

"- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.

"- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.

"- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.

"- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.

"- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.

"- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.

"- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them."

Several items on the list hint at how daunting the task of silencing fake news may be.

Exclusive breaking news stories, for instance, could have trouble getting the green light from either an algorithm or an independent fact-checker; and both the reporting and warning features could become new tools in advocates' fights to push their own views — and reinforce the bubbles that have prompted Facebook users' complaints.

Zuckerberg has spoken about the difficulty of bursting those bubbles in the past. As Aarti reported last week, "The problem, he says, is that people don't click on things that don't conform to their worldview. And, he says, 'I don't know what to do about that.' "

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Bill Chappell is a writer and editor on the News Desk in the heart of NPR's newsroom in Washington, D.C.