“Muslims laughing” at Notre Dame and fact-checking photos in the era of false news

Within hours of the first spark of a fire that damaged Notre Dame Cathedral in Paris, people on social media started sharing an image of two men ducking under what looks like police tape as the cathedral burned behind them.

One Facebook post, which was shared more than 2,000 times, had this caption: "Muslims laughing while Notre Dame is burning." Commenters piled on.

"Islam sucks," one person wrote. "Burn their A-- out," someone else said.

That post was flagged as part of Facebook’s efforts to combat false news and misinformation in its News Feed. Editors at PolitiFact, which has a partnership with the social media company, decided we should fact-check the claim that Muslims were laughing in front of the fiery cathedral.

I started to look into the claim the same way I check other questionable Facebook photo posts. I did reverse-image searches for the picture in Google Images and TinEye. Sometimes those websites can lead fact-checkers to evidence that a photo was manipulated; we’ll find the original, unedited image in the search results. Take this photo that appears to show President Donald Trump in blackface, for example. The real photo sans blackface is on the Getty Images website.
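Reverse-image engines like TinEye and Google Images can find copies of a picture even after it has been resized or brightened, because they compare compact "perceptual" signatures rather than exact pixels. The toy sketch below (a simplified "average hash," not the proprietary method any of these services actually uses) shows the idea on a made-up 64-value brightness grid: a brightness-shifted copy produces an identical signature, while a different image produces a very different one.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is at or
    above the image's mean brightness. Real engines use far more robust
    features, but the matching principle is similar."""
    mean = sum(pixels) / len(pixels)
    return [1 if p >= mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes: low means 'same image'."""
    return sum(x != y for x, y in zip(a, b))

original = list(range(64))                 # a simple brightness gradient
brightened = [p + 30 for p in original]    # same image, globally brightened
different = list(reversed(original))       # a visually different image

h_orig = average_hash(original)
h_bright = average_hash(brightened)
h_diff = average_hash(different)

print(hamming(h_orig, h_bright))  # 0  — brightening doesn't change the hash
print(hamming(h_orig, h_diff))    # 64 — every bit flips for the reversed image
```

The brightness shift moves every pixel and the mean by the same amount, so each above/below-mean bit is unchanged; that invariance is why such signatures survive routine re-encoding.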

In this case, though, we only found instances where the Notre Dame photo appeared elsewhere. It was posted on the website of Sputnik, a news organization run by the Russian government. The image of the men, which has a Sputnik copyright symbol below it, is part of a 9:29 p.m. update about the fire that says "evacuation underway at Notre Dame Cathedral, Paris’ Ile de la Cite."

We tried to find information about the men in the photo and discovered nothing that confirmed they were Muslims. The image had been taken from Sputnik’s site and was wrongly being used to support claims that the cathedral fire was a terrorist attack, fueling anti-Muslim rhetoric, and it lacked context about what was happening in the picture. For these reasons, we decided it warranted PolitiFact’s Pants on Fire rating.

But what about the photo?

After we made that determination, we heard back from one of several digital forensic experts we reached out to. The National Center for Media Forensics at the University of Colorado Denver pointed us to this version of the image, which its researchers said appeared online before the Sputnik story. Analyzing that image, the researchers concluded that the men were inconsistent with the rest of the photograph.

"The two front persons are inserted," Catalin Grigoras, the center’s director, told PolitiFact in an email. He also noted that zooming in on the right cheek of the man standing on the right reveals obvious editing.  

But after our fact-check published on April 16, Sputnik responded in a story saying that "the picture in question was merely one of several photos snapped by a Sputnik correspondent covering the tragedy."

We started to hear from other news organizations that were looking into the photo’s authenticity. Nieuwscheckers, a fact-checking organization from Leiden University in the Netherlands, published a story reporting that PolitiFact incorrectly called the photo false. They said they believe the photo to be authentic after consulting with a photography expert who assisted the World Press Photo jury in checking submissions for manipulation. Lead Stories, a U.S. fact-checker, also believes the image is authentic.

We also heard from a man who said he was a lawyer for the men in the photo, and AFP published a May 6 story quoting the men.

"‘How could we rejoice in the Notre Dame fire?’" the headline reads, "Two victims of online hate share their story."

The men are not named in the article. AFP said it did not use their names to protect their identities. But the story says the pair were smiling because the security tape caught on one of their faces as they were going under it.

We at PolitiFact have updated our fact-check. The rating is still Pants on Fire, because we found no validity to the claim suggesting Muslims were laughing at Notre Dame as it burned. But we can no longer support the claim that the photo was manipulated and have removed from the story the suggestion that the image was doctored.

Fact-checking questionable photos

We try to get things right, and in the wake of this fact-check, we wanted to better understand how things can go wrong. We reached out again to Edward Delp, a professor of electrical and computer engineering at Purdue University who originally declined our request for help analyzing whether the Notre Dame photo was authentic. The university doesn’t verify images for third parties, he said, because "the potential liability issues are too high."

"The downside is too grave for us," he said. "We don’t want to get involved in exactly the situation you’re in."

But, he said, confirming the authenticity of an image or video will be an ongoing problem for news organizations.

Tools to determine whether something has been manipulated aren’t widely available to the public, and they’re generally complicated, Delp said. Meanwhile, tools to make fake videos and images are widely available to the public — "anybody can do it now."

"I don’t believe an image I see anymore unless I know the exact provenance of the image," he said.

There are many ways to doctor an image, Delp said. And how quickly he can determine whether something has been manipulated can range from a few seconds to a couple of hours.

Because the sensors inside cameras, even cell phone cameras, put a sort of fingerprint on the photos they take, an image with multiple fingerprints is a good indicator that it was doctored, he said. The video that appeared to show Nancy Pelosi slurring her words was also an obvious manipulation, he said. "It’s an old type of attack that’s been around a long time," he said. "It doesn’t require machine learning."

As photo manipulation tools improve, fact-checking gets harder

Still, the tools to manipulate images are "getting better and better," said Hany Farid, who studies digital forensics at Dartmouth College. "Just about every image you see online has been manipulated, but not necessarily in nefarious ways."

When a photo is uploaded to Facebook, for example, it loses metadata that can help experts figure out if an image has been doctored, Farid said. When the social media platform automatically resizes a photo as it’s uploaded, he said, "they destroy evidence, essentially."

Some photos look like they’ve been altered, but they’ve just been compressed, causing color changes or anomalies like lines on the image.

"One of the hardest things of the authentication game is we can’t actually authenticate an image," Farid said. Forensic analysis is "highly technical," he said, and requires "a fair amount of expertise." If he and colleagues find compelling evidence of manipulation — if the shadows or lighting is wrong, say — they can say they think something was manipulated with a certain degree of reliability. Or, they can say that they found no evidence of tampering.

"With enough time and with the right tools, we’re pretty good" at looking at images and videos and determining whether it’s likely they were doctored, Farid said.

But the sheer volume of new images and videos posted online is daunting. Every minute, 500 hours of video are uploaded to YouTube, according to Farid. On Facebook, 250,000 images are posted every minute. Because most digital forensic examinations are done manually, it’s impossible to keep up. Now, he said, the average person on Reddit has access to sophisticated editing tools and an easy delivery system: social media.

He wonders what would happen if today, amid an explosion of machine learning and artificial intelligence, the recording of Trump making lewd comments to an "Access Hollywood" host had been released. When it became public more than two years ago, in October 2016, Trump apologized. Now he might label it "fake news." Because, Farid said, there’s plausible deniability for everyone.