Explainers

AI-Driven Platforms Creating Explicit Images Of Women On The Rise: Report

In a recent report titled 'A Revealing Picture', researchers uncover a surge in the use of AI to create explicit images of women, contributing to a rise in non-consensual pornography.

By Hera Rizwan | 13 Dec 2023 11:33 AM GMT

The malicious use of artificial intelligence (AI) has been on the rise worldwide. In a recent report, researchers have found that apps and websites that use AI to create explicit images of women are becoming increasingly popular.

According to the report, titled 'A Revealing Picture', by social network analysis company Graphika, as many as 24 million people visited such sites in September alone. It highlighted that these AI-enabled sites and apps "undress" or "nudify" existing clothed pictures and videos of real individuals.

These applications contribute to a concerning rise in the creation and distribution of non-consensual pornography, driven by advances in artificial intelligence, specifically deepfake pornography. In India too, several deepfake videos of female celebrities have recently gone viral.

The report examined 34 such websites providing services related to what it termed non-consensual intimate imagery (NCII).

'Links advertising these apps increased more than 2,400% on X and Reddit'

Mainstream social media platforms reportedly function as key marketing channels for NCII providers, helping them showcase their capabilities and attract interested users. According to the report, the majority of these providers focus on directing potential customers to external spaces, such as their own websites, Telegram groups for service access, or app stores for downloads.

According to the report, referral link spam for these services has surged by over 2,400% on platforms like Reddit and X since the start of 2023. Additionally, as of September this year, 52 Telegram groups facilitating access to these intimate imagery services collectively had at least 1 million users.

The report read, "Some providers are overt in their activities, stating that they provide “undressing” services and posting photos of people they claim have been “undressed” as proof. Others are less explicit and present themselves as AI art services or web3 photo galleries."

Reportedly, one of the apps has even paid for sponsored content on Google's YouTube and appears first in search results for the word "nudify".

Apart from this, these services also use influencer marketing to promote their products. For instance, Graphika identified several content aggregation accounts on Instagram that incorporated referral links to synthetic NCII services in their posts and bios.

Freemium model and evasive tactics

The Graphika report found that these services operate on a freemium model, enticing users with a limited number of free features while locking enhanced features behind a paywall. They nudge users to buy additional "credits" or "tokens" for access to premium features, including higher resolution exports, customisation of "age" and "body traits", and an inpainting feature that lets the AI model replace parts of an image.

According to the report, these services commonly use mainstream payment platforms like PayPal and Stripe, along with cryptocurrency platforms such as Coinbase Commerce. Some, however, avoid mainstream payment providers altogether, offering "credits" through crowdfunding on platforms like Patreon or through subscriptions to adult websites, the report added.

To streamline the user experience, these services allow users to create and access explicit images within minutes of their first visit to a provider's website or Telegram group, without any upfront charges. This, according to the report, "drastically lowers the barrier to entry for these services, which would otherwise require users to find, download, and operate custom image diffusion models".

What have the companies said?

Empowered by these AI tools, providers of synthetic NCII have, as the report highlights, evolved into a fully-fledged online industry, "utilising similar marketing tactics and monetisation tools as established e-commerce companies".

Bloomberg reached out to the tech giants that are "facilitating" the advertisement of the apps and sites mentioned in the Graphika report. A Google spokesperson said the company does not allow ads "that contain sexually explicit content". "We've reviewed the ads in question and are removing those that violate our policies," the company said.

A Reddit spokesperson said the site prohibits any non-consensual sharing of faked sexually explicit material and had banned several domains as a result of the research, while X did not respond to a request for comment, Bloomberg reported.

TikTok and Meta have reportedly blocked the search word “undress” to tackle the problem. Google has also taken down some ads for undressing sites and apps, the media outlet said.