Decode

X Is Full Of Deepfake Porn Videos Of Actresses; You May Be Targeted Next

Once confined to the seedy corners of the internet, deepfake pornographic videos have spread to mainstream social media platforms, turbocharged by rapid advancements in AI technology.

By - Boom Staff | 6 Nov 2023 11:03 AM GMT

An X (formerly Twitter) handle called @crazyashfan describes itself as a ‘photo and video manipulation artist’. But what it does is no art. The user finds pornographic content and, using AI, manipulates it to show Indian actresses’ faces in place of the original adult performers’.

The X handle, which has 39 posts, includes morphed AI-generated videos of Alia Bhatt, Kiara Advani, Kajol, Deepika Padukone and many other Bollywood actresses performing explicit sexual acts.

The four accounts the handle follows on X are all similar in nature: they create deepfakes of Indian actresses.

With some digging, Decode found a website called Desifakes.com, which hosts a number of requests for ‘nude photos’. On one of its forums, ‘celebrities and personalities AI fakes’, multiple actresses’ real photos are shared alongside manipulated versions that show them without clothes.

Turns out, it’s a simple hack.

A website called clothoff, which describes itself as “a breakthrough in AI”, allows users to upload photos of their choice; the AI then does the work.

Just a few days ago, female students at a high school in the US found out that male students had made deepfakes of them using AI and shared them on group chats. While the investigation is still ongoing, the episode proved that all it takes is a phone and an AI tool to generate deepfakes. It is dangerous territory.

Hany Farid, a professor at the University of California, Berkeley who has researched digital forensics and image analysis, told Axios that while it once took hundreds or thousands of images to create a deepfake, it now takes only one photo. This is the case with the examples Decode found.

“After the boom of ChatGPT and AI softwares, human moderation is only at the start and then AI takes its course,” said Malavika Rajkumar, a lawyer who works on digital justice for IT for Change, an NGO based in Bengaluru. “Deepfakes are a violation of bodily privacy; the victim doesn’t know their rights are being violated,” Rajkumar added. The lawyer hopes that the Digital India Act will regulate AI and emerging technologies and make the internet a safer place.

According to a report by Deeptrace, an Amsterdam-based cybersecurity company, 96% of the deepfake videos on the internet are pornographic.

“Police have infrastructure to track the accounts, but what about the AI tools that generate them?” Rajkumar asked, pointing out a glaring loophole.

The accounts that Decode tracked on X also often put out their Telegram handles. A quick search on X reveals that the platform is just one avenue where these deepfake nude photos are posted. They are all over the internet, just a click away.

Why Is It Dangerous For You?

All it takes to create a deepfake is one photo. The tools used to create them are easily available.

“Some resources are being worked upon to give help to individuals, as the law has not kept up with it. Our police forces are not trained, nor are our judges or courts,” Mishi Choudhary, founder of SFLC, told Decode.

Among the resources, she said, is the Detect Fakes website created by Massachusetts Institute of Technology (MIT) to help people identify deepfakes.

“Deepfakes have been an increasing area of concern with the developments in AI. They are being used to spread misinformation and disinformation, to harass, intimidate, create pornographic images, and in several other ways to undermine people. More often than not, the research that is designed to help detect deepfakes just ends up helping make deepfake technology better,” Choudhary said.

The images and videos that Decode found are not from some murky corner of the dark web. They are all available on a mainstream social media platform: X. The X accounts have also put out their Telegram channels, asking people to DM them with personal requests.

“Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused,” wrote Rashmika Mandanna on X.

A video featuring Mandanna, an Indian actress, had gone viral on X. But like all deepfakes, it wasn’t really her in the video. It was generated using AI.

This is dangerous not just for actresses but for anyone who has photos on social media platforms. The bigger trouble is that even if these accounts get deplatformed, it will not stop: they will simply migrate elsewhere. It’s that easy.