RoboNewsWire – Latest Insights on AI, Robotics, Crypto and Tech Innovations
5 takeaways from CNBC’s investigation into ‘nudify’ apps and sites

By GT | September 28, 2025


Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.

Jordan Wyatt | CNBC

In the summer of 2024, a group of women in the Minneapolis area learned that a male friend had used their Facebook photos, combined with artificial intelligence, to create sexualized images and videos.

Using an AI site called DeepSwap, the man secretly created deepfakes of the friends and over 80 women in the Twin Cities region. The discovery created emotional trauma and led the group to seek the help of a sympathetic state senator.

As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever for people to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet, with many promoted via Facebook ads, available for download on the Apple and Google app stores, and easily accessed through simple web searches.

“That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.

CNBC’s reporting shines a light on the legal quagmire surrounding AI, and how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.

Here are five takeaways from the investigation.

The women lack legal recourse

Because the women weren’t underage and the man who created the deepfakes never distributed the content, there was no apparent crime.

“He did not break any laws that we’re aware of,” said Molly Kelley, one of the Minnesota victims and a law student. “And that is problematic.”

Now, Kelley and the other women are advocating for a state bill, proposed by Democratic state Senator Erin Maye Quade, intended to block nudify services in Minnesota. Should the bill become law, it would levy fines on the entities enabling the creation of the deepfakes.

Maye Quade said the bill is reminiscent of laws that prohibit peeping into windows to snap explicit photos without consent.

“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said in an interview with CNBC, referring to the speed of AI development.

The harm is real

Jessica Guistolise, one of the Minnesota victims, said she continues to suffer from panic and anxiety stemming from the incident last year.

Sometimes, she said, a simple click of a camera shutter can cause her to lose her breath and begin trembling, her eyes swelling with tears. That’s what happened at a conference she attended a month after first learning about the images.

“I heard that camera click, and I was quite literally in the darkest corners of the internet,” Guistolise said. “Because I’ve seen myself doing things that are not me doing things.”

Mary Anne Franks, professor at the George Washington University Law School, compared the experience to the feelings victims describe when talking about so-called revenge porn, or the posting of a person’s sexual photos and videos online, often by a former romantic partner.

“It makes you feel like you don’t own your own body, that you’ll never be able to take back your own identity,” said Franks, who is also president of the Cyber Civil Rights Initiative, a nonprofit organization dedicated to combating online abuse and discrimination.

Deepfakes are easier to create than ever

Less than a decade ago, creating explicit deepfakes required AI expertise. Thanks to nudify services, all that’s required now is an internet connection and a Facebook photo.

Researchers said new AI models have helped usher in a wave of nudify services. The models are often bundled within easy-to-use apps, so that people lacking technical skills can create the content.

And while nudify services can contain disclaimers about obtaining consent, it’s unclear whether there is any enforcement mechanism. Additionally, many nudify sites market themselves simply as so-called face-swapping tools.

“There are apps that present as playful and they are actually primarily meant as pornographic in purpose,” said Alexios Mantzarlis, an AI security expert at Cornell Tech. “That’s another wrinkle in this space.”

Nudify service DeepSwap is hard to find

The site that was used to create the content is called DeepSwap, and there’s not much information about it online.

In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote from Penyne Wu, who was identified in the release as CEO and co-founder. The media contact on the release was Shawn Banks, who was listed as marketing manager. 

CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.

DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”

However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong. 

AI’s collateral damage

Maye Quade’s bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake that they generate in the state of Minnesota.

Some experts are concerned, however, that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts.

In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.” 

Kelley hopes that any federal AI push doesn’t jeopardize the efforts of the Minnesota women.

“I’m concerned that we will continue to be left behind and sacrificed at the altar of trying to have some geopolitical race for powerful AI,” Kelley said.

WATCH: The alarming rise of AI ‘nudify’ apps that create explicit images of real people.
