The gatekeepers of the global mobile ecosystem are facing a mounting crisis of accountability. A report released this Wednesday by the Tech Transparency Project (TTP) reveals that the controversy over "nudify" apps on Apple's and Google's stores is far from over in 2026. Despite both tech giants' strict policies against "sexually explicit" or "degrading" content, their app stores continue to offer, and even suggest, software designed to "undress" people using AI. While the companies have removed flagged apps in the past, dozens of similar tools have resurfaced, generating an estimated $122 million in revenue. The report further finds that the companies are not merely hosting these apps but actively directing users to them via search autocomplete.
The ‘Undress’ Search: How Autocomplete Leads Users to Harm
Much of the "discovery" of these apps is facilitated by the platforms themselves. Searching for terms like "nudify" and "undress" in the Apple App Store and Google Play Store gives users instant access to software designed for nonconsensual image alteration, which the report argues shows the companies are failing their own review processes.
The TTP also found that both companies' autocomplete features suggest more explicit names of nudifying apps as users type, meaning the platforms effectively act as a "recommendation engine" for harmful AI tools.
The companies additionally run paid advertisements for similar apps in their search results, so they are not only hosting the content but directly monetizing the intent to find undressing software. Store safety, in other words, is being outweighed by search-engagement algorithms.
$122 Million in Revenue: Profiting from Nonconsensual AI
How big is the business of "nudifying" apps? The apps identified by the TTP have been downloaded a staggering 483 million times, making the trend a significant financial pillar for certain rogue developers.
Revenue estimates from market researcher AppMagic suggest these tools have generated $122 million. With standard platform commissions of 15% to 30%, Apple and Google have likely pocketed millions from these violations.
Revenue Snapshot (AppMagic Data):
- Total Downloads: 483 million
- Estimated Revenue: $122 million
- Primary Model: High-cost weekly or monthly subscriptions
- Status: Several apps were removed only after the TTP's work was published
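A quick back-of-the-envelope calculation illustrates the scale of the platforms' likely cut. This is only a sketch: it assumes the full $122 million flowed through in-app purchases subject to the standard 15%–30% commission, a breakdown the report does not provide.

```python
# Rough estimate of the platforms' commission on reported revenue.
# Assumption (not in the TTP report): the full $122M flowed through
# in-app purchases at the standard 15%-30% store commission rates.
total_revenue = 122_000_000          # AppMagic estimate, USD
low_rate, high_rate = 0.15, 0.30     # typical Apple / Google commission tiers

low_take = total_revenue * low_rate
high_take = total_revenue * high_rate
print(f"Estimated platform take: ${low_take:,.0f} to ${high_take:,.0f}")
# Prints: Estimated platform take: $18,300,000 to $36,600,000
```

Even at the lower 15% tier for small developers, the implied platform revenue comfortably supports the report's "millions" claim.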
An AppMagic spokesperson noted that publicity surrounding the report has finally prompted some apps to change their user policies, suggesting that public shaming remains more effective than the companies' internal moderation tools.
Apple vs. Google: Comparing the ‘Nudify’ App Inventory
How does the inventory break down across the two dominant stores? The TTP identified 18 apps with nudifying capabilities in the Apple App Store, showing that Apple's "walled garden" is just as vulnerable as Google's more open system.
The Google Play Store was found to host 20 such apps, implicating both companies equally in the failure to enforce their anti-objectification policies.
Store Breakdown:
- Apple: 18 apps found; 15 subsequently removed after Bloomberg's inquiry
- Google: 20 apps found; investigation ongoing for several suspended titles
- Flagged App: PicsVid AI Hot Video Generator featured explicit templates before its removal
Apple told researchers that six other identified apps were "at risk of being removed" but had not yet been taken down, extending the developers' grace period even after the harm had been documented.
The ‘Prank’ Loophole: Why Generic Image Generators Pass Review
How do these apps get approved in the first place? Many present themselves as "generic image generators" or "AI photo editors," and the report describes the review process as "uneven and largely opaque."
Professor Anne Helmond of Utrecht University notes that visibility is shaped by ranking systems that reward engagement, so controversial uses can actually increase an app's prominence in search results.
Policy Loopholes:
- The 'Prank' Label: Some apps claim to be "entertainment" to bypass pornography filters
- Secondary Features: The "nudify" tool is often hidden inside an otherwise legitimate face-swap app
- Keyword Stuffing: Using terms that surface the app only for specific searches like "undress"
Google's policy specifically bans apps that claim to "see through clothing," even if labeled as a prank, so the presence of these apps is a clear failure of enforcement rather than a lack of rules.
DeepFace and Face Swap: When ‘Everyone’ Ratings Hide Explicit Content
Consider the specific case of the Video Face Swap AI: DeepFace app, which advertised the ability to swap the face of actress Anya Taylor-Joy onto other characters, highlighting the nonconsensual celebrity dimension of the crisis.
Bloomberg found that inside the app, users could paste faces onto video templates of women in "sexually suggestive" poses, making its "E for Everyone" rating a complete misrepresentation of its actual content.
The developer, Okapi Software, claimed it had removed content "uploaded by users" and that it did not permit explicit generation. The "user-generated content" defense is thus being used as a shield for what is essentially a sexualized AI tool, and the app's more than 1 million downloads underscore the failure of age-rating systems.
Take It Down Act 2025: The New Federal Hammer Against Tech Giants
What is the legal response? President Donald Trump signed the Take It Down Act into law in 2025, making nonconsensual intimate imagery a matter of federal criminal law.
The act criminalizes the publication of nonconsensual sexual content and compels websites to remove such posts promptly, meaning Apple and Google could potentially face liability for facilitating the distribution of these tools.
Legal Implications:
- Criminalization: Publishing nonconsensual AI imagery is now a federal offense
- Compulsion: Platforms are required to maintain "robust" removal mechanisms
- Liability: The TTP report may serve as evidence that the companies were "knowingly" profiting
A growing chorus of regulators is calling for the companies to do more than reactive removals, and the 2026 legislative session may bring further pressure to fine tech giants for every hour a "nudify" app remains live.
UK Regulation 2026: Prosecuting Executives for AI Moderation Failures
On the international front, the UK government is preparing legislation for late April 2026 that targets tech executives personally, which could result in the first-ever prosecution of a Big Tech leader for AI moderation failures.
The UK law would compel companies to take down nonconsensual images within a strict timeframe, meaning the "uneven and opaque" enforcement noted by Professor Helmond would no longer be legally acceptable in the British market.
The EU, meanwhile, is reviewing its Digital Services Act (DSA) to include specific deepfake-safety clauses, leaving Apple and Google facing a regulatory pincer from both sides of the Atlantic. The global outcry described in the report is quickly turning into a global legal trap.
Tech Transparency Project: Katie Paul on the ‘Systemic Failure’
What does the report's director make of the tech giants' response? Katie Paul, director of the project, stated in an interview that the companies are "actually directing users to the apps themselves," framing the failure as active participation rather than a passive mistake.
She argued that the companies are failing to appropriately review these apps and continue to profit from them, exposing the automated moderation that Apple and Google often tout as insufficient against modern AI.
Katie Paul's Main Points:
- Algorithms: Autocomplete is helping "nudify" apps go viral
- Profit: Apple and Google are making millions from the "undress" trend
- Opacity: The review process is inconsistent and relies on media flagging rather than proactive scanning
The companies insist they are taking "appropriate action," and the gap between the TTP's findings and those statements is shaping up as the primary battleground of 2026.
Common Questions Answered
What is the 2026 "nudify" apps scandal? A report by the Tech Transparency Project shows that Apple and Google continue to host "undress" apps that create nonconsensual sexual images, despite policies against them.
How many of these apps were found? The TTP identified 18 apps in the Apple App Store and 20 in the Google Play Store, many of which were being promoted through search autocomplete and ads.
How much money are these apps making? They have generated an estimated $122 million in revenue and have been downloaded 483 million times.
What is the Take It Down Act? It is a US federal law, signed by Donald Trump in 2025, that criminalizes the publication of nonconsensual sexual content and forces websites to remove it.
Did Apple and Google remove the apps? After being contacted by researchers and Bloomberg, Apple removed 15 apps and Google suspended several, but researchers say dozens of similar apps resurface within months.
Can an app with an 'E' rating be dangerous? Yes. The report found that apps like DeepFace, rated "E for Everyone," contained categories where users could paste faces onto sexually suggestive video templates.



