Stock Markets | January 27, 2026

Apple Removes Scores of 'Nudify' Apps After Watchdog Flags AI-Generated Nude Images

Tech Transparency Project finds dozens of apps that render nude images from photos; Google suspends multiple apps as investigations continue

By Caleb Monroe

Apple has taken down 28 mobile applications that use artificial intelligence to create nude images from user photos following a report by the Tech Transparency Project (TTP). TTP's January review identified dozens of apps on both Apple’s App Store and Google Play that can sexualize images of fully clothed women; Google has also suspended multiple apps while investigations proceed.

Key Points

  • TTP's January review found 47 apps on Apple's App Store and 55 on Google Play that can generate nude images from photos.
  • Apple removed 28 apps and warned developers about guideline violations; TTP's follow-up found 24 apps actually removed.
  • Google has suspended multiple apps for policy breaches while its investigation continues; affected sectors include technology platforms, app marketplaces, and AI services.

Apple removed 28 apps from its App Store after the Tech Transparency Project (TTP) disclosed a January review that uncovered applications designed to produce nude images from photographs. The watchdog said its probe located 47 such apps on Apple’s platform and 55 on Google Play by searching for terms such as "nudify" and "undress." TTP tested the apps using AI-generated images of fully clothed women to assess their output.

In response to the TTP findings, Apple reportedly took action on Monday, deleting 28 of the identified applications and notifying other developers that their apps could be removed if they did not remedy violations of App Store guidelines. TTP's subsequent check, however, found that only 24 of the flagged apps had actually been removed at that time.

"These were definitely designed for non-consensual sexualization of people." - Katie Paul, TTP Director

Google has also moved to suspend several apps for alleged policy breaches, though the company did not disclose the exact number of suspensions and said its investigation was ongoing. The watchdog classified the problematic software into two distinct types: apps that use AI to render images of women without clothing, and "face swap" apps that place women's faces onto nude bodies.

The report and ensuing removals come amid heightened scrutiny over potential misuse of AI tools. The debate intensified recently after criticism over the Grok AI tool from xAI, which produced sexualized images of women and children in response to user prompts.

TTP's findings and the subsequent platform enforcement illustrate the challenges app stores face in policing AI-driven image-manipulation tools. Apple appears to be enforcing its guidelines against applications that enable non-consensual sexualization, while Google has signaled ongoing review and suspension activity. The situation remains fluid, with watchdog follow-ups uncovering discrepancies between the number of apps identified and the number actually removed.


What happened

  • TTP's January review identified 47 apps on Apple and 55 on Google Play that can create nude images from photos.
  • Apple removed 28 apps and warned other developers about potential removal for guideline violations; a TTP follow-up found 24 apps removed.
  • Google suspended several apps for policy violations and said its investigation is ongoing.

Context

The watchdog tested the apps using AI-generated images of fully clothed women and searched for apps using terms like "nudify" and "undress." The report categorized the problematic apps as those rendering nudity via AI and face-swap tools placing faces on nude bodies. The episode coincides with broader concerns about AI misuse after controversy over the Grok AI tool's outputs.

Risks

  • Ongoing policy enforcement uncertainty - app stores may continue to identify and remove apps, affecting developers and the app marketplace sector.
  • Potential misuse of AI image tools - the ability to create non-consensual sexualized images raises reputational and regulatory risks for AI service providers and platform operators.
  • Incomplete removals and ongoing investigations - discrepancies between apps identified by watchdogs and apps actually removed create enforcement and compliance uncertainties for platform governance.
