
By Daniel Funke July 15, 2020

ISIS is diminished, but Trump has not banned it from the internet

Donald Trump's campaign promise to close parts of the internet to the Islamic State has not materialized. However, the terrorist organization's loss of territory, combined with moderation by social media platforms and international law enforcement action, has reduced its online activity during his presidency.

In December 2015, Trump called for "closing that internet up in some way" to impede the recruitment of ISIS fighters. He floated the idea of having Bill Gates, the co-founder of Microsoft, look into the possibility.

Nearly five years later, there have been several successes in the effort to defeat ISIS. By January 2018, the terrorist organization had lost 93% of its territory in the Middle East. In October 2019, Abu Bakr al-Baghdadi, the leader of ISIS, died in a raid by U.S. forces.

Still, ISIS has thousands of fighters and continues to recruit and publish propaganda online. Unlike lawmakers in some other countries, Congress has not passed legislation that would require internet service providers or social media platforms to remove terrorist propaganda (although some legislators have introduced bills that would address it).

But many platforms have removed such content anyway.

Since Trump's election, artificial intelligence has helped social media platforms remove content from terrorist organizations. Facebook has reported that it automatically removes more than 99% of ISIS and al-Qaida content published on its platform. YouTube reported that, between October and December 2019, about 90% of videos that violated the company's policy against violent extremism were removed before they had 10 views.

In May 2019, prompted by the terrorist attack at mosques in Christchurch, New Zealand — which was live-streamed on Facebook — several governments and online service providers adopted a non-binding pledge called the "Christchurch Call to Action to Eliminate Terrorist and Violent Extremist Content Online." Tech companies like Twitter, Facebook and Google pledged to take additional measures to address terrorist content on their platforms, such as updating terms of service and improving technology that detects terrorist content.

Although it has endorsed similar intergovernmental initiatives, the U.S. did not sign the pledge "due to policy and legal concerns," according to a 2019 State Department terrorism report.

"It is true that the big tech platforms have somewhat cleaned up their acts in this regard, most especially Twitter," said Cori Dauber, a communication professor at the University of North Carolina at Chapel Hill, in an email. "But none of that, as far as I know, is a function of federal action — the platforms were responding to pressure, sure, but they've been under pressure on these issues for years."

Even if the U.S. did pass legislation forcing internet platforms to ban or block terrorist propaganda, the First Amendment could stand in the way. Federal law also generally shields internet service providers from legal responsibility for what users do or say on their platforms, a contrast with European countries like France, which has passed legislation forcing tech platforms to remove terrorist content.

Trump has tried to challenge those protections — not to counteract terrorists but as a threat to ensure that conservative voices aren't silenced by content moderators. In a May 2020 executive order, the president called for limitations on Section 230(c) of the Communications Decency Act, the law that grants tech platforms their legal immunity. The order argues that, if platforms restrict certain voices, their protections should be revoked. Legal experts have said that, absent legislation from Congress, the order would not be legally binding.

Finally, even if Trump did manage to ban ISIS from certain parts of the internet, it's unlikely that would stop the organization's recruitment.

In April 2018, authorities in eight countries took part in an operation that seized ISIS servers that hosted some of its propaganda apparatus. A year and a half later, Europol, the European Union's law enforcement agency, announced that it had stripped ISIS propaganda from platforms like Twitter and Google. Telegram, an encrypted messaging service, hosted the most offending material.

However, shortly following the takedown, the BBC reported that ISIS supporters were talking about moving to other, more underground platforms to evade authorities and promote propaganda. And ISIS is known to use the darknet, parts of the internet that are anonymized and not searchable through traditional search engines like Google.

"Going after terrorist communications networks is kind of like treating a gunshot wound with a Band-Aid," said Patrick Eddington, a research fellow at the Cato Institute. "The real key to defeating them is defeating the ideology."

Despite American tech companies' in-house efforts to moderate and remove terrorist content, as well as international efforts to take down propaganda, it's clear that ISIS is still using parts of the internet to publish propaganda and recruit new members. And the U.S. government has not passed legislation that compels internet service providers to ban content from terrorist organizations.

"You can't really 'shut down' or 'close down' the internet," said Chelsea Daymon, a terrorism and political violence researcher at American University. "While you can remove content and shut down accounts that are pro-ISIS (which will slow the spread of content reach), the concept of shutting down parts of the internet shows a misunderstanding about what the internet is and how it works."

We rate this promise Stalled.

Our Sources

BBC, "Europol disrupts Islamic State propaganda machine," Nov. 25, 2019

BBC, "France gives online firms one hour to pull 'terrorist' content," May 14, 2020

Business Insider, "ISIS is taking full advantage of the darkest corners of the internet," July 11, 2015

Combating Terrorism Center, "Selling the Long War: Islamic State Propaganda after the Caliphate," November 2018

Congressional Research Service, "Terrorism, Violent Extremism, and the Internet: Free Speech Considerations," May 6, 2019

Cornell Legal Information Institute, 47 U.S. Code § 230 - Protection for private blocking and screening of offensive material

Email from Chelsea Daymon, a terrorism and political violence researcher at American University, July 15, 2020

Email from Cori Dauber, a communication professor at the University of North Carolina at Chapel Hill, July 15, 2020

Email from Patrick Eddington, a research fellow at the Cato Institute, July 15, 2020

Europol, "Islamic State propaganda machine hit by law enforcement in coordinated takedown action," April 27, 2018

Facebook, "Community Standards Enforcement Report, November 2019 Edition," Nov. 13, 2019

Facebook, "Hard Questions: What Are We Doing to Stay Ahead of Terrorists?" Nov. 8, 2018

Google Transparency Report, Featured policies, accessed July 15, 2020

Homeland Security Today, "Is ISIS Still Alive and Well on the Internet?" Jan. 14, 2019

NBC News, "Donald Trump Calls For 'Closing That Internet Up,'" Dec. 8, 2015

NPR, "Islamic State 'Not Present On The Internet Anymore' Following European Operation," Nov. 15, 2019

NPR, "Stung By Twitter, Trump Signs Executive Order To Weaken Social Media Companies," May 28, 2020

Politico, "U.S. military fears pandemic could lead to ISIS resurgence in Syria," April 2, 2020

PolitiFact, "Donald Trump: ISIS territory losses near 100 percent," Jan. 30, 2018

PolitiFact, "Mike Pence wrong that ISIS has been defeated," Jan. 17, 2019

Twitter, "Addressing the abuse of tech to spread terrorist and extremist content," May 15, 2019

U.S. Congressman Max Rose, "Rose Introduces Raising the Bar Act to Address Terrorist Content on Social Media," Nov. 21, 2019

U.S. Department of Defense, "U.S. Forces Kill ISIS Founder, Leader Baghdadi in Syria," Oct. 27, 2019

U.S. State Department, "Country Reports on Terrorism 2019"

The White House, "Executive Order on Preventing Online Censorship," May 28, 2020

By Allison Colburn September 21, 2017

In wake of London train attack, Trump doubles down on campaign promise

Shortly after news broke of the Sept. 15 Parsons Green bombing in London, President Donald Trump reiterated his campaign promise to close down portions of the internet to curb the spread of terrorism.

"Loser terrorists must be dealt with in a much tougher manner," Trump said in a tweet. "The internet is their main recruitment tool which we must cut off & use better!"

Prior to the election, he suggested U.S. antiterrorism officials would work with leaders in the technology industry to shut down parts of the internet that terrorist groups such as ISIS, or ISIL, use to recruit and share information.

U.K. Prime Minister Theresa May shared similar views in June after a different terrorist attack occurred on the London Bridge. Google, Twitter, Facebook and Microsoft responded to criticisms by vowing to improve detection of extremist content.

But cracking down on unwelcome or violent speech has proven difficult for those companies. A ProPublica investigation released in June found that Facebook's algorithm for identifying hate speech wound up removing some posts for nonsensical reasons and allowing other posts that should have been removed. Likewise, Twitter says it has shut down nearly 1 million suspected terrorist accounts since August 2015, but the company has struggled to find the "magic algorithm" for identifying terrorist content.

Even successful crackdowns on terrorist-related content appear merely to shift the problem elsewhere. Recent research shows that suspected terrorists have moved their online presence from Twitter to Telegram, a messaging app that was created in 2013.

The extent of the U.S. government's power to shut down online terrorist propaganda is unclear. When former Federal Communications Commission Chairman Tom Wheeler was questioned on the topic in November 2015, days after a mass shooting took place at a concert in Paris, Wheeler responded that the commission doesn't have the legal authority to shut down individual websites.

Even if legislation were passed to ban or block terrorist propaganda online, First Amendment rights could stand in the way. In addition, United States law generally protects internet service providers from legal responsibility for what the companies' users say or do on the internet, a marked difference from European countries such as France, which holds internet service providers responsible for removing terrorist propaganda.

Even though internet providers and websites have certain rights, there are still legal methods of taking down online content, according to Daphne Keller, director of intermediary liability at the Stanford Center for Internet and Society. U.S. Immigration and Customs Enforcement has used some of those legal avenues in the past to remove sites hosting child sex abuse material or large-scale piracy.

"These kinds of blocks have been condemned by human rights bodies because they are poorly targeted and often take down legal speech with the illegal," she said. "I have not heard of this being done for terrorism, though it might be."

There might also be legal ground to compel third-party platforms, such as Twitter and YouTube, to remove content uploaded by known terrorists and terrorist organizations, but Keller said this has not yet been court-tested.

Meanwhile, Secretary of State Rex Tillerson recently accepted $60 million from Congress to continue the United States' efforts to counter Russian and terrorist propaganda (after sitting on the funding as lawmakers pressured him to take it).

For his part, Trump signed a joint statement with other G7 leaders in May to pressure technology companies to eliminate online extremism.

Trump has reiterated this promise as president after subsequent terror attacks, but the chances of shutting down parts of the internet don't look great in the United States, given tech companies' stumbling efforts so far and the First Amendment's speech protections. We rate this promise Stalled.

Our Sources

Perspectives on Terrorism, "IS and the Jihadist Information Highway – Projecting Influence and Religious Identity via Telegram," Vol 10, No 6 (2016)

Federal Bureau of Investigation, "ISIL Online: Countering Terrorist Radicalization and Recruitment on the Internet and Social Media," July 6, 2016

Fortune, "Donald Trump Wants Bill Gates To 'Close That Internet Up,'" Dec. 8, 2015

The Independent, "Facebook, Google and Twitter respond to calls to do more in wake of terror attacks," June 5, 2017

CNBC, "Facebook, YouTube, Twitter and Microsoft join to fight against terrorist content," June 26, 2017

ProPublica, "Facebook's Secret Censorship Rules Protect White Men From Hate Speech But Not Black Children," June 28, 2017

The Washington Post, "One GOP lawmaker's plan to stop ISIS: Censor the Internet," Nov. 17, 2015

International Business Times, "France steps up cyberwar on terrorism with new website blocking law," March 5, 2015

Politico, "Tillerson moves toward accepting funding for fighting Russian propaganda," Aug. 31, 2017

U.S. News and World Report, "G7 Leaders Pressure Tech Firms on Removing Terror Propaganda," May 26, 2017

Twitter Inc., "An update on our efforts to combat violent extremism," Aug. 18, 2016

Twitter Inc., Transparency report - January to June 2017, accessed Sept. 20, 2017

Newseum Institute, "Combatting Terrorism in a Digital Age: First Amendment Implications," Nov. 16, 2016

Legal Information Institute, 47 U.S. Code § 230

Donald Trump, Tweet, Sept. 15, 2017

Email interview with Daphne Keller, Director of intermediary liability at the Stanford Center for Internet and Society, Sept. 20, 2017
