Report Flags Dozens Of Nudify AI Apps On Apple And Google Stores
A new investigation has revealed that the Apple and Google app stores hosted dozens of "nudify" AI applications in apparent violation of both companies' policies. The investigation was conducted by the Tech Transparency Project (TTP), which is affiliated with the nonprofit watchdog Campaign for Accountability.
According to the report, TTP identified 55 nudify apps on the Google Play Store and 47 on the Apple App Store. These applications "can digitally remove the clothes from women and render them completely or partially naked or clad in a bikini or other minimal clothing."
TTP warned that many of these images can be created without the subject's consent, raising serious concerns about non-consensual deepfake abuse. The report claims the apps have been collectively downloaded more than 705 million times worldwide and generated an estimated $117 million in revenue, citing data from app analytics firm AppMagic.
TTP notes that Apple and Google benefited financially from the apps, as both take a commission on app revenue. Apple told CNBC it has removed 28 apps identified in the report, while Google said it suspended 31 apps and confirmed its investigation is ongoing.
Among the most popular apps tested by TTP was DreamFace, which promises to turn "photos, text, and voices into stunning HD videos and audio in seconds." According to AppMagic, it has been downloaded from the Google and Apple app stores more than 10 million times.
TTP concludes that both platforms need stronger enforcement measures to prevent non-consensual AI deepfake apps from reaching users. The report arrives amid broader scrutiny of AI tools creating sexualized non-consensual images, including recent concerns raised around xAI's Grok.