BravenNow
The AI child exploitation crisis is here
| USA | general | ✓ Verified - nbcnews.com


#AI-generated CSAM #child sexual abuse material #artificial intelligence #National Center for Missing and Exploited Children #deepfakes #cyber crimes #digital exploitation #open-source AI

📌 Key Takeaways

  • NCMEC received over 1 million reports of AI-generated CSAM in nine months
  • Reports of child exploitation involving generative AI in the first half of 2025 rose 600% over the previous two years combined
  • Only 36 criminal court cases related to AI-generated CSAM identified in past three years
  • AI-generated CSAM is becoming increasingly realistic and difficult to differentiate from real images
  • Offenders exploit various platforms including specialized AI websites and open-source models

📖 Full Retelling

The National Center for Missing and Exploited Children (NCMEC) received more than one million reports of AI-generated child sexual abuse material (CSAM) in the United States between January and September 2025. Law enforcement is struggling to keep pace with a crisis fueled by rapid advances in artificial intelligence, which let offenders create increasingly realistic imagery of both real and nonexistent children.

Officials saw a 600% increase in reports of child exploitation involving generative AI in the first half of 2025 compared with the previous two years combined. Michael Prado, deputy assistant director of Homeland Security Investigations' Cyber Crimes Center, noted that AI-generated CSAM is now commonly found mixed with traditional material featuring real children.

Despite the volume of reports, only 36 state and federal criminal court cases related to AI-generated CSAM have been identified across 22 states in the past three years. Every closed case has ended in a guilty verdict, yet these prosecutions represent only a fraction of the problem, as investigators struggle to match the overwhelming volume of reports to specific individuals.

🏷️ Themes

Child Exploitation, AI Technology, Law Enforcement Challenges, Digital Safety

📚 Related People & Topics

National Center for Missing & Exploited Children

American children's rights organization

The National Center for Missing & Exploited Children (NCMEC) is a private, nonprofit organization established in 1984 by the United States Congress. In September 2013, the United States House of Representatives, United States Senate, and the president of the United States reauthoriz...


Original Source
The AI child exploitation crisis is here

The National Center for Missing and Exploited Children said it received over a million reports tied to AI-generated child sexual abuse material in just nine months. AI has added a confounding element to child sexual abuse cases for law enforcement.

Justine Goode / NBC News; Getty Images
Feb. 28, 2026, 7:00 AM EST
By Bruna Horvath

The rapid advancement of artificial intelligence has made it easier than ever for bad actors to create child sexual abuse material, leaving prosecutors and lawmakers struggling to keep up. Despite efforts by tech companies, law enforcement and activists, offenders consistently exploit system loopholes, open-source AI models and ready-made sexual exploitation platforms to generate imagery of both identifiable and nonexistent children, according to experts and law enforcement officials who spoke with NBC News.

Between January and September of 2025, NCMEC’s CyberTipline — the official online sexual exploitation tip line in the U.S. — received over a million reports related to generative AI, according to Fallon McNulty, the executive director of the center’s exploited children division.

“We often see bad actors at the forefront of leaning into those types of advancements in order to sexually exploit children online,” McNulty said. “The almost indistinguishable nature of the content that is being generated makes it extremely difficult for victim identification efforts.”

Law enforcement officials have found that child sexual abuse material created with generative AI can take on many forms. Sometimes people photograph children in public settings or use already-public photographs, and then use AI systems to turn them into CSAM. Other times, people create entirely new sexually explicit material that involves no real child or recognizable face and is completely AI-generated.
The material is becoming more realistic and harder to...
Read full article at source

Source

nbcnews.com
