Apple's App Store and Google's Play store host dozens of "nudification" apps that can take a photo of a person and use artificial intelligence to generate a nude image, according to a Tuesday report from an industry watchdog.
An investigation of the two app stores conducted by the Tech Transparency Project in January found 55 nudify apps on Google Play and 47 on the Apple App Store, according to the organization's report, shared exclusively with CNBC.
After being contacted by TTP and CNBC last week, an Apple spokesperson said Monday that the company had removed 28 apps identified in the report. The iPhone maker also said it has warned other app developers that they must address guideline violations or risk having their apps removed from the App Store.
Two of the apps removed by Apple were later restored to the store after their developers resubmitted new versions that addressed the guideline concerns, a company spokesperson told CNBC.
“Both companies say they are dedicated to the safety and security of their users, but they host a collection of apps that can turn innocuous photos of women into abusive and sexualized images,” TTP said in its report on Apple and Google.
TTP told CNBC on Monday that its own check of the App Store found only 24 of the apps had actually been removed by the company.
A Google spokesperson told CNBC that the company has suspended several apps mentioned in the report for violating app store policies and will investigate any reported policy violations. The company declined to say specifically how many apps it removed, as an investigation into the apps identified by TTP is ongoing.
The report comes after Elon Musk's xAI faced backlash earlier this month when its Grok AI tool responded to user prompts by generating sexualized images of women and children.
The watchdog identified the apps by searching the two stores for terms like "nudity" and "undressing," then tested them using AI-generated images of clothed women. The project tested two types of apps: those that use AI to render an image of the woman unclothed, and "face swap" apps that superimpose the woman's face onto an image of a naked body.
"It's very clear that these are not just 'dress-up' apps," TTP director Katie Paul told CNBC. "These are definitely designed for the non-consensual sexualization of people."
In a report published in September, CNBC investigated the dangers of nudify apps and websites.
In that investigation, CNBC tracked a group of women in Minnesota whose public social media photos were fed into a nudify service to create sexual deepfakes without their consent. There was no obvious crime because the women were all adults and the man who created the pornographic deepfakes apparently never distributed them. More than 80 women were targeted.
CNBC found that new AI models are making it easier than ever to generate deepfake nudes and explicit content, with services bundled into user-friendly apps like those found by TTP.
According to TTP, 14 of the apps reviewed were based in China, which Paul said raises additional security concerns.
"China's data retention laws mean the Chinese government has the right to obtain data from any company operating in China," Paul said. "So if someone is creating deepfake nudes of you using one of those apps, those images are in the hands of the Chinese government."
After xAI came under scrutiny for the nudify functionality, Grok acknowledged a "deficiency in security" in a reply to an X user and said it was "immediately fixing it."
The European Commission announced on Monday that it had opened an investigation into Grok’s distribution of sexually explicit content.
xAI sent an automated response to CNBC’s request for comment: “Legacy Media Lies.”
In August, the National Association of Attorneys General sent letters to payment platforms such as Apple Pay and Google Pay expressing concern about services that produce non-consensual intimate images and asking the platforms to remove such services from their networks.
In a letter this month, Democratic senators from Oregon, New Mexico and Massachusetts asked Apple and Google to remove X from their app stores, saying the mass production of non-consensual sexual images violates the stores’ distribution terms.
The Google Play Developer Policy Center states that the platform does not allow "apps that claim to undress people or see through clothing, even if they are labeled as prank or entertainment apps." Apple's app review guidelines prohibit material that is "overtly sexual or pornographic."
In total, the apps identified by TTP have been downloaded more than 700 million times worldwide and have generated $117 million in revenue, the report said, citing app analytics firm AppMagic. Apple and Google both take a cut of the revenue generated from apps distributed through their stores.
“The fact that they are not adhering to their own policies that are meant to protect people from non-consensual nude images and non-consensual pornography raises a lot of questions about how they can demonstrate that they are a trusted app platform,” Paul said.
–CNBC’s Jonathan Vanian and Katie Tarasov contributed to this report
