Jessica Guistolise, Megan Hurley and Molly Kelly speak with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces, made by their mutual friend Ben using the AI site DeepSwap.
Jordan Wyatt | CNBC
In the summer of 2024, a group of women in the Minneapolis area learned that a male friend had used artificial intelligence to combine their Facebook photos into sexual images and videos.
Using an AI site called DeepSwap, the man secretly created deepfakes of more than 80 women in the Twin Cities area. The discovery caused lasting emotional trauma and led the group to seek help from a sympathetic state senator.
As CNBC's reporting shows, the rise of "nudify" apps and sites has made it easier than ever for people to create nonconsensual, explicit deepfakes. Experts say these services are all over the internet: many are promoted via Facebook ads, available for download in the Apple and Google app stores, and easily accessible through simple web searches.
“That’s the reality of where the technology is right now, and that means any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.
The CNBC report sheds light on the murky legal landscape surrounding AI, and shows how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.
Here are five takeaways from the investigation.
The women lack legal recourse
Because the women were not minors and the man who created the deepfakes never distributed the content, there was no apparent crime.
“He didn’t break any laws that we’re aware of,” said Molly Kelly, one of the Minnesota victims and a law student. “And that’s the problem.”
Kelly and the other women are now advocating for a state bill, proposed by Democratic state Sen. Erin Maye Quade, that aims to block nudify services in Minnesota. If the bill becomes law, it would impose fines on the entities that enable the creation of the deepfakes.
Maye Quade said the bill is reminiscent of laws that prohibit peering through windows to snap explicit photos without consent.
“We just haven’t grappled with the advent of AI technology in the same way,” Maye Quade said in an interview with CNBC, speaking about the speed of AI development.
The harm is real
Jessica Guistolise, one of the Minnesota victims, said she continues to suffer from panic and anxiety stemming from the incident last year.
Sometimes the simple click of a camera shutter can cause her eyes to well up with tears, her breath to shorten and her body to tremble. That's what happened at a conference she attended a month after first learning about the images.
“I heard that camera click, and I was quite literally in the darkest corners of the internet,” Guistolise said. “Because I’ve seen myself doing things that are not me doing things.”
Mary Anne Franks, a professor at George Washington University Law School, compared the emotions and experiences the victims describe to those of victims of so-called revenge porn, the online posting of a person's sexual photos or videos, often by a former romantic partner.
“You feel as if you don’t own your own body, as if you can never regain your own identity,” said Franks, who is also president of the Cyber Civil Rights Initiative, a nonprofit organization dedicated to combating online abuse and discrimination.
Deepfakes are easier to create than ever
Less than a decade ago, a person needed to be an AI expert to make explicit deepfakes. Thanks to nudify services, all that's required is an internet connection and a Facebook photo.
Researchers said new AI models have helped usher in the wave of nudify services. The models are often bundled into easy-to-use apps, allowing people who lack technical skills to create the content.
And while nudify services may include disclaimers about obtaining consent, it's unclear whether any enforcement mechanism exists. Additionally, many nudify sites market themselves merely as so-called face-swapping tools.
“There are apps that present as playful, but are actually primarily meant as pornographic in purpose,” said Alexios Mantzarlis, an AI safety expert at Cornell Tech. “That’s another wrinkle in this space.”
Nudify service DeepSwap is hard to pin down
The site that was used to create the content is called DeepSwap, and there isn't much information about it online.
In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote from Penyne Wu, who was identified as CEO and co-founder. The media contact for the release was Sean Banks, who was listed as marketing manager.
CNBC was unable to find any information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.
The DeepSwap website currently lists “MindSpark AI Limited” as its company name, provides an address in Dublin, and states in its terms of service that they are “governed by and construed in accordance with” Irish law.
However, in July, the same DeepSwap page made no mention of MindSpark, and the references that now say Ireland instead said Hong Kong.
AI collateral damage
Maye Quade's bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake they generate in Minnesota.
However, some experts are concerned that the Trump administration's plans to bolster the AI sector could undermine states' efforts.
In late July, Trump signed executive orders as part of the White House's AI Action Plan, underscoring AI development as a "national security imperative."
Kelly said she hopes the federal government's AI push doesn't jeopardize the Minnesota women's efforts.
“I worry that we will continue to be left behind and sacrificed at the altar of trying to win some geopolitical race for powerful AI,” Kelly said.
Watch: The alarming rise of AI ‘nudify’ apps that create explicit images of real people.
