
A new report by the Tech Transparency Project (TTP) has revealed that dozens of apps capable of turning ordinary photos into sexually explicit images—known as “Nudify” apps—have been available on both the Apple App Store and the Google Play Store. These apps have collectively generated nearly ₹970 crore in revenue, a portion of which goes to the tech giants themselves through app store commissions.
Widespread Availability Raises Security Concerns
According to TTP, these AI-powered apps use machine learning to remove clothing from photos, posing serious risks to women’s privacy and safety. The presence of such apps on major platforms has sparked outrage, raising questions about how they bypassed Apple and Google’s security policies and amassed millions of downloads.
Downloads and Earnings
TTP’s investigation, reported by Mashable, found that 55 Nudify apps were available on Google Play and 47 on the Apple App Store. Data analytics firm AppMagic reports that these apps have been downloaded more than 70.5 million times worldwide, earning a total of around $117 million (₹970 crore). Apple and Google share in this revenue through app store commissions, meaning the profits of these unethical apps directly benefit the tech giants.
Steps Taken So Far
Following the TTP report, Apple told CNBC that it had removed 28 of these apps, while Google stated that it had suspended multiple apps and that investigations were ongoing. However, TTP describes these actions as insufficient, emphasizing that both companies continue to fail to prevent the misuse of AI deepfake technology, which can easily turn ordinary women’s photos into sexually explicit images.
The Global Deepfake Controversy
This issue is part of a broader global concern over AI-generated sexual content. Recently, Elon Musk’s company xAI faced backlash over its AI chatbot Grok, which allowed users to generate explicit images using only text prompts. Within just 11 days, over 3 million sexualized images were reportedly created via Grok. Governments worldwide, including India, issued ultimatums that eventually led to restrictions on the platform.
Experts’ Warning
Experts warn that governments must enact strict laws against non-consensual deepfake creation, which can turn anyone’s photo into explicit content without their consent. Until then, Nudify-style apps pose a serious and frightening threat to women’s privacy and safety worldwide.
