
We started fact-checking in partnership with Facebook a year ago today. Here's what we've learned

By Aaron Sharockman | December 15, 2017

A year ago today, PolitiFact joined a coalition of fact-checkers who agreed to work with Facebook to try to slow the spread of misinformation in people’s news feeds.

The short version: we’re making progress, but at a rate that probably isn’t satisfying to anyone.

To refresh your memory, our partnership with Facebook is part of a push by the social media company to clean up its news feed and become a more trustworthy platform. Facebook introduced a new tool after the 2016 campaign that allows users to mark a post as a "false news story," and if enough do, the post is sent to fact-checkers like PolitiFact, Snopes and Factcheck.org.

If the fact-checkers find the story to be "false," a warning label is attached to the original post in Facebook’s news feed, and Facebook’s algorithm makes it more difficult for the disputed post to spread virally.

PolitiFact has been doing that work for a year now, and we have attached that false label to at least 1,722 individual URLs. That’s nearly five articles debunked a day, every day for a year.

Yet the project has seen substantial criticism. It has been called censorship of the Internet (it most certainly is not). It has been seen as unable to deal with the scale of the problem (a point that has merit, and that I have some thoughts about). And it has been labeled ineffective at coping with the lightning speed at which false news stories start and spread (more work is definitely needed).

As we mark the conclusion of the first year of this effort, I wanted to update you on our progress and share our perspective.

 

Strengths  

Before this all started, we did try to identify propaganda and false news appearing on the Internet. But without Facebook’s cooperation, we had little way of knowing which stories needed to be debunked and which we’d only be amplifying by covering. It’s not enough to find and fact-check a false news story; we need to find the stories that are spreading.

The Facebook partnership is an excellent lead generator for stories that are appearing in your news feed and might need debunking.

In the hours after the Las Vegas shooting, a spate of false news stories started appearing. As they populated Facebook’s news feed, users flagged them as potentially fabricated. And indeed they were.

The same happened in the days surrounding the landfall of hurricanes Harvey, Irma and Maria.

And it happened just in the past week as false news spread surrounding the Alabama special Senate election.

In the early days of PolitiFact, we hired interns whose job was to scour different sources looking for claims in need of fact-checking. This partnership essentially outsources that work to the Facebook community, which reduces the time it takes, allows our interns to work on other tasks and exposes us to a tremendous amount of content. Which brings us to …

Weaknesses

The reality is, there is too much content for us to check, and we imagine there is plenty more material in need of fact-checking that we aren’t seeing.

As I type this column, there are 2,300 possible false news stories that our team of fact-checkers could be scrutinizing. Of that number, many of the stories are not so much "false" as misleading or partisan.

Regardless, the fact is that fact-checkers are only scratching the surface of the material people are asking us to review.

Let’s do an exercise: For most of 2017, there were five fact-checking organizations participating in this project in the United States. (It’s also running in France and Germany.) Let’s say they each fact-checked as many URLs as we did -- 1,722. That’s 8,610 articles fact-checked a year.

Now let’s say the universe of false news stories is 1,000 a week -- that’s 52,000 articles a year.

In that scenario, the best we’re doing is covering roughly 16.6 percent of the field. In all likelihood, the percentage is lower.
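For readers who want to check the math, here is a minimal back-of-the-envelope sketch in Python. The five-partner count and the 1,000-stories-a-week universe are the assumptions stated above, not measured figures.

```python
# Back-of-the-envelope coverage estimate, using the assumptions above:
# five U.S. fact-checking partners, each matching PolitiFact's 1,722
# debunked URLs, against an assumed 1,000 new false stories per week.
partners = 5
urls_per_partner = 1_722           # PolitiFact's first-year total
false_stories_per_week = 1_000     # assumed universe of false news

checked_per_year = partners * urls_per_partner      # 8,610
false_per_year = false_stories_per_week * 52        # 52,000
coverage = 100 * checked_per_year / false_per_year  # ~16.6 percent

print(f"{checked_per_year:,} checked / {false_per_year:,} published "
      f"= {coverage:.1f}% coverage")
```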

In addition, we know little about what happens to a story after we debunk it. Facebook has said that a story’s future impressions drop by 80 percent after it is debunked by fact-checkers. But Facebook also says the process takes "over three days" before a story gets debunked. What happens in the meantime? Well, the story and the misinformation spread, and the author makes money.

The time lag is particularly important.

Opportunities

The opportunities here are in many ways the whole ball game, and they’re why we’re participating in this project -- and why we’ll continue doing so in 2018.

As has become quite clear, misinformation travels farther and faster than a single fact-check. One of the ways fact-checkers can begin to keep up, however, is to try to attach their fact-check to the falsehood in the same medium.

Say you’re watching a campaign ad on television that makes a misleading claim, and your TV presents you a fact-check with the correct information. (It’s closer than you might think.) Or you’re reading a website and a spurious claim appears at the same time as a fact-check. (We can do that today through annotation tools.)

The same thing can happen on Facebook.

What if you shared or liked or engaged with posts saying the Pope endorsed Donald Trump? Facebook could alert you that the information you read or shared might be incorrect. What if, along with likes and hearts and sad faces, you could ask for a fact-check with a single click of a mouse or tap on your phone?

What if Facebook could teach its algorithms to spot when the same false or misleading story is being republished on dozens or hundreds of different websites?

These are the types of conversations we need to be having, with the companies and platforms that could amplify our work and mission 1,000-fold.

Threats

The biggest threat is over-reaction to these efforts.

We’ve already seen what I would consider three significant over-reactions in one year.

The last one is the newest, and worth a bit more explanation. To join the Facebook fact-checking group, an organization must agree to work toward the principles of the International Fact-Checking Network, an organization housed at The Poynter Institute, a school for journalists based in St. Petersburg, Fla. (The Poynter Institute owns the Tampa Bay Times newspaper, which in turn owns PolitiFact.)

The principles include things like fact-checking in a nonpartisan manner, being transparent in your process and methodology, and detailing your funding sources.

The Weekly Standard has met those criteria, according to the IFCN. PolitiFact, too, has met them. As part of that certification, both fact-checking sites need to operate with a bright line between themselves and their more opinion-minded colleagues. For PolitiFact, that’s the editorial and opinion writers at the Tampa Bay Times.

By the way, the two most recent fact-checks published by the Weekly Standard debunk a Fox News tweet about the Alabama special election and offer a nuanced look at whether Trump can legally reduce the size of national monuments in Utah.

As we look ahead to 2018, it’s important to remember how quickly this all came together. The Facebook fact-checking program was up and running five weeks after the 2016 election, and four weeks after fact-checkers openly asked Facebook to do more.

I say that because 2017 was in many ways a beta test.

We learned a lot and still have so much more to learn.


Our Sources

Linked in the story.
