Two women and a child holding an Iranian flag walk toward the Imam Khomeini Grand Mosque to attend Friday prayers in Tehran, Iran, Friday, March 20, 2026. (AP)
If Your Time is short
- During the U.S.-Israeli war on Iran, Iran has released AI-generated imagery through state-affiliated channels and employed inauthentic social media accounts to spread favorable messaging.
- Iran has a history of conducting disinformation campaigns, but fabricating information is only part of its yearslong practice to influence public sentiment.
- Iran’s government tactics also include using fake news websites, hacking and imposing internet shutdowns.
Amid false information spread during the Iran war, the Trump administration warned that the Iranian regime was misinforming people with the help of generative artificial intelligence.
"They are a country that for years — I didn't know this until recently — they're a country based on disinformation. And now they're using disinformation plus AI. And that's a terrible situation," President Donald Trump said during a March 16 event.
On separate occasions, Trump and Defense Secretary Pete Hegseth spoke about fake footage that purported to show the USS Abraham Lincoln aircraft carrier on fire.
"These AI-generated images are meant to make it look like something's happening when the exact opposite is," Hegseth said in a March 19 press briefing. "So they make up fake reports and fake images to lie to their own people."
In 2018, during Trump’s first term, the U.S. State Department’s Global Engagement Center launched the Iran Disinformation Project to surface deliberately false messaging in Iran’s "official rhetoric, state propaganda outlets, social media manipulation and more." The project was shelved in 2019 after its official Twitter account began targeting and trolling journalists, activists and academics. During Trump’s second term, the administration has gutted other offices meant to combat foreign influence efforts.
Although Iran’s efforts to influence and manipulate public opinion involve disinformation, experts told us the regime’s operations are more expansive than that. There have been cases of Iran-backed accounts and outlets publishing fabricated content, but the wave of misinformation on social media cannot solely be attributed to Iran’s influence operations. Many accounts that spread misinformation have not been linked to the Iranian regime.
Emerson Brooking, strategy director of the Atlantic Council’s Digital Forensic Research Lab, said Iran distributes state propaganda using covert tactics: through fake news websites, inauthentic social media accounts and proxy media networks that perpetuate talking points from the regime under the guise of independent reporting.
"The content is biased and covertly placed, but it is rarely wholly invented," Brooking said. "Iran is a country that has made clandestine propaganda a core instrument of national security policy. That is a serious problem, but it is a different problem than disinformation in the way most people understand the term."
Mahsa Alimardani, associate director of technology, threats and opportunities at the global human rights organization Witness, said Iran’s mechanisms for controlling information also include censorship and internet shutdowns.
Iran’s influence operation capabilities go back to 2010
Iran’s tactics include disinformation campaigns, propaganda and influence operations. These terms mean different things, but they can overlap. For example, it is disinformation when an official government account posts an AI-generated image with the intent of misleading people into thinking the image shows a real incident.
Influence operations involve attempts to manipulate public opinion through inauthentic accounts and inaccurate information.
The Atlantic Council described Iran as a pioneer in the construction of digital influence capabilities. Iran has operated inauthentic accounts on Facebook and Twitter, now X, going back to 2010. After the 2009 pro-democracy Green Movement, which was nicknamed the "Twitter Revolution," Iran started building its influence operation infrastructure. By 2011, it had recruited thousands of members trained in blogging, content production and multimedia design, and started creating bots and Facebook and Twitter accounts to spread its message without disclosing they were state-run.
"These campaigns create social media accounts by hand, integrate those accounts into specific online communities, and then leverage the influence they gain over time to push an Iranian agenda and divide the populations of their geopolitical rivals," said Darren Linvill, Clemson University professor and Media Forensics Hub co-director.
In March, the Media Forensics Hub identified a network of about 60 accounts on X, Instagram and Bluesky linked to the Islamic Revolutionary Guard Corps, the primary force protecting Iran’s regime. These accounts built followings based on false identities, such as Latina women from Texas, California, Venezuela and Chile or people from the British Isles.
They shared content on political issues such as immigration and Scottish independence. After the Feb. 28 strikes, they switched to pro-regime messaging.
Iran also pushes propaganda through a state media apparatus that reports in Farsi, Arabic and English, Linvill said. Most of it contains pro-Iranian and anti-American, anti-Saudi and anti-Israeli messaging, occasionally mixing in disinformation, he said.
Iranian disinformation has played a role in current war, but it’s not the full picture
Iran has used AI to spread disinformation in this war. The Iranian embassy in Austria posted an image of a bloody children’s backpack, linking it to the strike on a girls’ school in Minab, Iran. Although preliminary investigations show the U.S. was likely responsible for the bombing that officials said killed more than 170 people, mostly children, the backpack image turned out to be AI-generated.
The state-controlled Tehran Times shared an AI-generated image claiming to show an American radar in Qatar destroyed in an Iranian strike.
Projecting an "oppressed yet militarily victorious" nation is central to the regime’s war narrative, Alimardani said.
"The irony is devastating: the regime illustrated real deaths with fabricated imagery, and the identification of those fakes now provides ammunition for people denying the actual bombing occurred," Alimardani said in an email.
These tactics have precedent: During the 12-day war between Iran and Israel in June 2025, Iranian state media shared an AI-generated image of a downed F-35 jet. Such content "attempts to increase public confidence in its defense capabilities," said Max Lesser, senior analyst of emerging threats at the Foundation for Defense of Democracies.
Meta removed an Iran-linked influence operation comprising 294 Instagram accounts, eight Facebook accounts and two Facebook pages, the company said in March. The personas behind these accounts included an American political scientist, a women’s rights activist and an Albanian satirical cartoonist. Meta said it has not observed new campaigns linked to the U.S.-Israel war against Iran, but such operations typically take time to set up.
Iran employs other strategies to control public perception of the war’s events. Alimardani observed three forms of misleading information: real events with a government-approved spin, state-generated AI disinformation and accounts spreading regime narratives using AI.
She said it’s hard to know the motives of that last group; they could be genuine sympathizers or contracted operations.
Lesser said he has observed accounts in India and Pakistan spreading false claims about the war. Some appear to be monetized and thus commercially driven, he said.
The advent of generative AI has also made it easy for people to dismiss authentic evidence as fake. For example, some dismissed a real photo of the burial site for the Minab school victims as fabricated.
"The claim that something ‘looks AI-generated’ has become a low-effort, high-impact way to discredit real documentation, requiring no actual forensic analysis to deploy," Alimardani said. "The signals we need for truth are largely failing."
How these tactics have targeted U.S. elections, other global issues, and Iranians
The Islamic Republic targets its audiences using its ideological framework, weaving in political narratives of anti-imperialism, solidarity with Palestine and resistance to Western dominance, Alimardani said. Social media posts elevating those values gain traction with Global South and far-left Western audiences.
After Hamas attacked Israel on Oct. 7, 2023, the pace of Iran’s influence operations surged, Microsoft’s tracking showed. In December 2023, an Iranian cyber group hijacked streaming services in the United Arab Emirates, Canada and the United Kingdom with a broadcast featuring an AI-generated anchor and claiming to show Palestinian injuries and deaths.
The U.S. has typically seen these operations around elections. In 2024, for example, an Iranian group that researchers dubbed "Storm-2035" operated four websites that posed as American news outlets. Some of the sites used AI to repackage information from legitimate news outlets.
The Islamic Revolutionary Guard Corps also hacked the Trump campaign in 2024 and tried to leak the stolen campaign material to journalists.
Alimardani co-authored a 2021 study analyzing more than 9.3 million tweets linked to Iranian influence operations between 2008 and 2020, and found that the primary target was the Arab world, not the United States. The researchers also found that more than 86% of the Arabic tweets did not receive interactions.
"The most consistent target of Iran's information operations is its own population during moments of domestic unrest," Alimardani said.
She said that during demonstrations across multiple years, including the 2026 protests, Iran’s regime conducted campaigns to discredit protesters and make false claims about foreign instigation. The regime also used internet shutdowns to control narratives and provide web access only to those offering favorable messaging, leading Iranians to rely on the government’s version of the news.
On March 10, Iran’s government spokesperson said internet connectivity would be provided to "those who can carry our voice further."
Brooking said Iran also targets Iranians who are based outside the country. "Threatening and surveilling Iranian Americans is part of the same apparatus that runs the social media campaigns," he said.
During Trump’s second term, the U.S. government’s capacity to monitor foreign influence operations has been weakened. The administration shuttered the Federal Bureau of Investigation’s Foreign Malign Influence Task Force and the State Department’s Global Engagement Center.
"We have effectively made ourselves blind to this threat, even as the White House seems increasingly set on linking any setback in the war effort to Iranian disinformation," Brooking said.
Our Sources
Email interview, Emerson Brooking, strategy director of the Atlantic Council’s Digital Forensic Research Lab, March 19, 2026
Email interview, Mahsa Alimardani, associate director of technology, threats and opportunities at Witness, March 19, 2026
Email interview, Darren Linvill, Clemson University professor and Media Forensics Hub co-director, March 18, 2026
Email interview, Max Lesser, senior analyst of emerging threats at the Foundation for Defense of Democracies, March 18, 2026
YouTube video by The White House, President Trump Participates in a Lunch with the Trump Kennedy Center Board Members, March 16, 2026
Truth Social post by Donald Trump, March 15, 2026
PolitiFact, Videos do not show missile strike on USS Abraham Lincoln, March 2, 2026
YouTube video from The Associated Press, LIVE: Pete Hegseth holds Pentagon briefing as war with Iran intensifies, March 19, 2026
Iran Disinformation Project, About Us (archived copy), dated June 4, 2019
The Guardian, US cuts funds for ‘anti-propaganda’ Iran group that trolled activists, May 31, 2019
EU Disinfo Lab, Disinformation glossary: 150+ Terms to Understand the Information Disorder, March 30, 2023
Atlantic Council, Iranian digital influence efforts: Guerrilla broadcasting for the twenty-first century, Feb. 11, 2020
Pew Research Center, Iran and the "Twitter Revolution", June 25, 2009
Clemson University Media Forensics Hub Reports, From Texas to Tehran: A Multilingual, IRGC-affiliated Influence Operation on X, Instagram, and Bluesky, March 11, 2026
The New York Times, Iran’s Revolutionary Guards: The Spine of a Militarized State, March 8, 2026
X post by IRAN Embassy in Austria, Feb. 28, 2026
The New York Times, Analysis Suggests School Was Hit Amid U.S. Strikes on Iranian Naval Base, March 5, 2026
CBS News, United States was "likely" responsible for bombing of girls' school in Iran, per early U.S. assessment, March 9, 2026
NPR, Microsoft detects fake news sites linked to Iran aimed at meddling in U.S. election, Aug. 9, 2024
X post by Tehran Times, Feb. 28, 2026
BBC, Israel-Iran conflict unleashes wave of AI disinformation, June 20, 2025
Meta, Adversarial Threat Report, March 2026
The Record, Iranian influence operation using fake personas to deceive US Instagram users disrupted, Meta says, March 11, 2026
X post, March 2, 2026
X post, March 3, 2026
The Atlantic, The Fake Images of a Real Strike on a School, March 13, 2026
Microsoft, Iran surges cyber-enabled influence operations in support of Hamas, Feb. 26, 2024
The Guardian, Iran-backed hackers interrupt UAE TV streaming services with deepfake news, Feb. 8, 2024
U.S. Justice Department, Three IRGC Cyber Actors Indicted for ‘Hack-and-Leak’ Operation Designed to Influence the 2024 U.S. Presidential Election, Sept. 27, 2024
Mona Elswah and Mahsa Alimardani, Propaganda Chimera: Unpacking the Iranian Perception Information Operations in the Arab World, Oct. 5, 2021
Carnegie Endowment for International Peace, Iran Wields Wartime Internet Access as a Political Tool, March 18, 2026

