How Meta, TikTok, Twitter and YouTube plan to address 2024 election misinformation

The Facebook logo is seen on a mobile phone, Oct. 14, 2022, in Boston. (AP)

By Sevana Wenn July 6, 2023

If Your Time is short

  • YouTube said June 2 it will no longer remove videos that include false information about the 2020 presidential election. Some advocates warn that the policy change could have ripple effects across social media platforms and allow misinformation to spread more easily.

  • PolitiFact examined election misinformation policies at YouTube, Twitter, TikTok and Meta. We found that although most policies haven’t changed since the 2022 midterm elections, enforcement has been inconsistent and layoffs have hindered fact-checking operations.

  • Experts suggest that YouTube’s policy change, coupled with layoffs at other companies, signals a broader shift away from misinformation regulation ahead of the 2024 election cycle. 

As the 2024 presidential election comes into focus, tech giants have struggled to balance curtailing misinformation with protecting users’ political speech. A recent YouTube policy change, which affects how the platform handles election-related misinformation, is an example of this tension.

In June 2023, the video-sharing platform officially changed its election misinformation policy, announcing that videos promoting election falsehoods would no longer be removed. The move has concerned digital misinformation experts, who warn that misinformation originating on YouTube could infiltrate the broader information ecosystem, especially in 2024.

PolitiFact examined the state of election misinformation policies at YouTube, Twitter, TikTok and Meta, which owns Facebook and Instagram. 

We found that although most policies haven’t changed since the 2022 midterm elections, enforcement has been inconsistent, and platforms including Twitter and Meta have reduced content moderation staff.

Experts also suggest that these personnel cuts, coupled with YouTube’s policy change, signal a broader shift away from misinformation regulation ahead of the 2024 election. 

What’s YouTube’s role in the information ecosystem? 

Researchers at New York University’s Center for Social Media and Politics found that stricter policies implemented by YouTube in the weeks following the 2020 election had positive effects across social media platforms.

"YouTube is an integral part of the information ecosystem," said Megan Brown, the study’s senior researcher. "It is consistently one of the most popular domains shared across all platforms." 

The researchers tracked election-related content reposted from YouTube on Twitter. From Dec. 8 to Dec. 21, 2020 — about a month after the election — the proportion of false election content reposted from YouTube dropped below 20% of all election-related Twitter content for the first time since the election. The figure fell again after YouTube’s January 2021 announcement of its three-strike system, which expels channels that violate community guidelines three times. By Inauguration Day, the proportion of false election content from YouTube reposted on Twitter had dropped to about 5%, with similar trends reported on Facebook.

Brown said these findings were observational, and researchers could not confirm a direct link between YouTube’s policy shift and the decrease in election misinformation shared across platforms. However, given the spread of misinformation in 2020, "we would expect things to play out similarly in 2024 if YouTube is not moderating this content."

Right-wing media platforms such as Parler, Gab and Truth Social could also pose a unique challenge for stopping misinformation because they do not regulate election-related content. Brown said misinformation could fester on these platforms and spread across others.

Former President Donald Trump, who has falsely denied his defeat in the 2020 election, owns Truth Social. 

YouTube’s recent policy change 

YouTube’s June 2 announcement cited the importance of open political debate as a key motivation for its policy change. 

"We find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm," the company’s press release said. 

The platform will keep other guidelines in place that were adopted in 2020, such as prioritizing authoritative search results and removing misinformation about the time, location and validity of voting options. 

The news release said little about how YouTube will approach 2024 election misinformation beyond noting that the company would adjust its strategy as needed. YouTube spokesperson Ivy Choi told PolitiFact, "We'll have more details to share about our approach towards the 2024 election in the months to come."

Twitter’s handling of misinformation 

Twitter’s approach to misinformation has also shifted since its 2022 takeover by Elon Musk, who calls himself a free-speech advocate but has flip-flopped on some free speech-related decisions.

Twitter has reduced its workforce responsible for handling misinformation. In November 2022, the company laid off about 15% of its trust and safety team, which moderates content and regulates hate speech. In January 2023, further cuts affected at least a dozen members of those teams.

The platform’s written election guidelines haven’t changed since PolitiFact’s last report in August 2022. The guidelines say that people may not use Twitter to manipulate or interfere with elections, and that posts containing misinformation could be deamplified or labeled using the community notes feature, which allows users to add context and sources to inaccurate posts.

However, an analysis conducted for The Associated Press found that after Trump’s May 10 town hall, the 10 most widely shared tweets promoting a "rigged election" narrative circulated unchecked, amassing more than 43,000 retweets and receiving no community notes.

Twitter did not respond to a request for comment.

TikTok’s approach to election integrity 

TikTok’s election integrity policy addresses misinformation in three ways: removing content, redirecting search results and reducing discoverability. Claims that seek to erode trust in public institutions, such as false information about election dates or attempts at voter suppression, will be removed, according to TikTok’s website. 

TikTok has partnered with 15 global fact-checking organizations, including PolitiFact, that assess content for the platform’s moderators and policy teams. TikTok then uses that information to apply its misinformation policies.

The company’s February 2021 transparency report said that in the U.S., from July 1 to Dec. 31, 2020, 347,225 videos were removed for election misinformation, disinformation or manipulated media. An additional 441,028 videos were ineligible for recommendation into users’ For You feeds after being vetted by fact-checkers. 

For context, in a three-month span in late spring 2022, 11 billion videos were uploaded to the platform, The Washington Post reported.

A recent study showed that election misinformation is still an issue on TikTok. In 2022, researchers at Global Witness, a human rights organization, and the Cybersecurity for Democracy team at New York University’s Tandon School of Engineering created dummy accounts to submit ads containing political disinformation to social media platforms. TikTok fared poorly compared with other social media giants.

Although TikTok doesn’t allow political advertising, the platform accepted 90% of the disinformation ads, according to the study. By comparison, Facebook accepted about 20% of English-language ads and 50% of Spanish-language ads. YouTube rejected all of the advertisements and blocked the dummy account. 

"TikTok performed the worst out of all of the platforms tested in this experiment, with only one ad in English and one ad in Spanish — both relating to COVID vaccinations being required for voting — being rejected," according to the study. "Ads containing the wrong election day, encouraging people to vote twice, dissuading people from voting, and undermining the electoral process were all approved." 

TikTok did not respond to a request for comment on the study’s findings. A Meta spokesperson directed us to a previous company statement included in the study: "These reports were based on a very small sample of ads, and are not representative given the number of political ads we review daily across the world." 

Meta and fact-checking

In a fact sheet published ahead of the 2022 midterm elections, Meta said it would prohibit advertisements that encourage people not to vote or question the election outcome’s legitimacy. It also said it had "banned more than a thousand militarized social movements, taken down tens of thousands of QAnon pages, Groups and accounts from our apps, and removed the original #StopTheSteal Group." 

The AP reported before the 2022 midterm elections that Meta was "quietly curtailing some of its safeguards designed to thwart voting misinformation." The AP also said the platform had shut down the accounts of researchers examining political ads on Facebook, and that CrowdTangle, a tool newsrooms and researchers use to identify trending posts and misinformation, "is now inoperable on some days."

In the U.S., 11 third-party fact-checking organizations, including PolitiFact, are partnered with Meta. Posts that are labeled as misinformation are made less visible in users’ feeds. 

Meta’s third-party fact-checking program does not apply to active politicians, including Trump. 

Like Twitter, Meta has significantly cut its staff in recent months. CNBC reported in May that Meta laid off about 200 content moderators in early January 2023. Meta also laid off at least 16 members of Instagram’s well-being group and more than 100 positions related to trust, integrity and responsibility, according to CNBC. 

Asked whether Meta will implement any new safeguards or make changes to existing regulations before the 2024 election, a spokesperson said, "We continue to enforce our misinformation policies." 

Looking ahead to 2024

Advocates warn that YouTube’s policy change might contribute to election misinformation spreading unchecked. Nora Benavidez, digital justice and civil rights director at the nonprofit Free Press, said in an email that YouTube’s decision "threatens our democracy," and should be reversed. 

Experts in social media disinformation have also questioned the rationale behind the policy change. 

YouTube didn’t provide evidence about "whether or not the previous policy curtailed risk, and in that way, this seems arbitrary," said Darren Linvill, a Clemson University professor and a lead researcher at the university’s Watt Family Innovation Center Media Forensics Hub. The new policy likely will benefit YouTube, though, "as years of research support the fact that false stories spread faster on social media than true stories. And that means higher view counts."

Linvill sees YouTube’s move as part of a broader cultural shift away from misinformation regulation, and noted that market factors could be at play. 

"On one hand, you have new platforms like TruthSocial that have been created especially with free speech and lack of regulation in mind," he said. "On the other, we see mainstream platforms like Twitter and Meta actively cutting staff, including in moderation and site integrity. Perhaps we are just seeing a pendulum swing, but it may also be a response to an increasingly competitive marketplace."  

Kate Starbird, a University of Washington associate professor of human-centered design and engineering who co-founded the university’s Center for an Informed Public, also views YouTube’s move as part of a bigger trend. 

"It is interesting to note that, after false claims about voter fraud helped to motivate and mobilize the Jan. 6 (2021) attack on the U.S. Capitol, the reaction from platforms has been to move away from moderating such claims, rather than doubling down on moderation," she said. "We could attribute that to the success of the effort to rhetorically equate social media moderation to ‘censorship.’" 

Starbird added, "All of these changes will make it even more challenging to track and report on rumors, misinformation and disinformation in 2024." 


Our Sources

YouTube Official Blog, "An update on our approach to US election misinformation," June 2, 2023.

YouTube Official Blog, "Supporting the 2020 U.S. election," December 9, 2020.

The New York Times, "YouTube’s stronger misinformation policies had a spillover effect on Twitter and Facebook, researchers say," October 14, 2021.

Twitter, "Civic integrity misleading information policy," accessed June 7, 2023.

Associated Press, "False claims of a stolen election thrive unchecked on Twitter even as Musk promises otherwise," May 18, 2023.

Meta, "Preparing for elections," accessed June 7, 2023.

Meta, "Misinformation," accessed June 7, 2023.

TikTok, "Election Integrity," accessed June 7, 2023.

TikTok, "TikTok’s H2 2020 Transparency Report," February 24, 2021.

NBC, "Days before the midterms, Twitter lays off employees who fight misinformation," November 4, 2022.

Bloomberg, "Twitter cuts more staff overseeing content moderation," January 7, 2023.

Meta, "Our approach to newsworthy content," August 25, 2022.

Email interview with Ivy Choi, policy communications manager at YouTube, June 14, 2023.

Email interview with TikTok spokesperson, June 14, 2023.

Email interview with Nora Benavidez, senior counsel and director of digital justice and civil rights at Free Press, June 14, 2023.

Email interview with Kate Starbird, associate professor at the University of Washington, June 14, 2023.

Email interview with Megan Brown, research engineer and data scientist at New York University’s Stern Center for Social Media and Politics, June 14, 2023.

Global Witness, "TikTok and Facebook fail to detect election disinformation in the US, while YouTube succeeds," October 21, 2022.

Meta, "A list of our independent fact-checking partners, by country," accessed June 15, 2023.

The Washington Post, "Sorry you went viral," October 21, 2022.

Email interview with Darren Linvill, June 22, 2023.

CNBC, "Tech layoffs ravage the teams that fight online misinformation and hate speech," May 26, 2023.

Email interview with Corey Chambliss, public affairs at Meta, June 30, 2023.

Poynter, "Trump won’t be fact-checked on Facebook, per Meta," November 21, 2022.

Associated Press, "Meta quieter on election misinformation as midterms loom," August 5, 2022.
