      Decode

      X Is Full Of Deepfake Porn Videos Of Actresses; You May Be Targeted Next

      Once confined to the seedy corners of the internet, deepfake pornographic videos have spread across mainstream social media platforms, turbocharged by rapid advancements in AI technology.

      By Boom Staff | 6 Nov 2023 4:33 PM IST

      An X (formerly Twitter) handle called @crazyashfan describes himself as a ‘photo and video manipulation artist’. But what he does is no art. He finds pornographic content and, using AI, manipulates it to show Indian actresses’ faces instead of the original adult stars’.

      The X handle, which has 39 posts, includes morphed AI-generated videos of Alia Bhatt, Kiara Advani, Kajol, Deepika Padukone and many other Bollywood actresses performing explicit sexual acts.

      The four accounts that he follows on X are all similar in nature: they create deepfakes of Indian actresses.

      With some digging, Decode found a website called Desifakes.com, which hosts a number of requests for ‘nude photos’. On one of its forums, ‘celebrities and personalities AI fakes’, actresses’ real photos are shared alongside fabricated versions of them without clothes.

      Turns out, it’s a simple hack.

      A website called clothoff, which describes itself as “a breakthrough in AI”, allows users to upload photos of their choice, and the AI does the work.

      Also Read: Viral Deepfake Video Shows Bella Hadid Stating Support For Israel

      Just a few days ago, female students at a high school in the US found out that male students had made deepfakes of them using AI and shared them on group chats. While the investigation is still ongoing, the incident proved that all it takes to generate a deepfake is a phone and an AI tool. And that is dangerous territory.

      Hany Farid, a professor at the University of California, Berkeley, who has researched digital forensics and image analysis, told Axios that while it once took hundreds or thousands of images to create a deepfake, it now takes only one photo. This is the case with the examples Decode found.

      “After the boom of ChatGPT and AI software, human moderation is only at the start and then AI takes its course,” said Malavika Rajkumar, a lawyer who works on digital justice for IT for Change, an NGO based in Bengaluru. “Deepfakes are a violation of bodily privacy; the victim doesn’t know their rights are being violated,” Rajkumar added. The lawyer hopes that the Digital India Act will regulate AI and emerging technologies and make the Internet a safer place.

      According to a report by Deeptrace, an Amsterdam-based cybersecurity company, 96% of the deepfake videos on the Internet are pornographic.

      “Police have infrastructure to track the accounts, but what about the AI tools that generate them?” Rajkumar asked, pointing out a major loophole.

      Also Read: What To Do When Your Mobile Is Hacked? Experts Explain

      The accounts that Decode tracked on X also often put out their Telegram handles. A quick search reveals that X is just one avenue where these deepfake nude photos are posted; they are all over the Internet, just a click away.

      Why Is It Dangerous For You?

      All it takes to create a deepfake is one photo. The tools used to create them are easily available.

      “Some resources are being worked on to help individuals, as the law has not kept up. Our police forces are not trained, nor are our judges or courts,” Mishi Choudhary, founder of SFLC, told Decode.

      Among the resources, she said, is the Detect Fakes website created by the Massachusetts Institute of Technology (MIT) to help people identify deepfakes.

      “Deepfakes have been an increasing area of concern with the developments in AI. They are being used to spread misinformation and disinformation, harass, intimidate, create pornographic images, and undermine people in several other ways. More often than not, the research that is designed to help detect deepfakes just ends up helping make deepfake technology better,” Choudhary said.
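
      For readers curious what automated detection can look like, here is a minimal sketch of error level analysis (ELA), a classic image-forensics technique: recompress a JPEG, diff it against the original, and amplify the result so edited regions stand out. This is only an illustration, assuming Python with the Pillow library and a placeholder file name; it is not the method Detect Fakes uses, and on its own it will not reliably catch modern AI-generated imagery.

      ```python
      # Error level analysis (ELA) sketch using Pillow.
      # Illustrative only: NOT the method used by MIT's Detect Fakes, and
      # not sufficient by itself to detect modern AI-generated deepfakes.
      import io

      from PIL import Image, ImageChops

      def error_level_analysis(path: str, quality: int = 90, scale: int = 15) -> Image.Image:
          original = Image.open(path).convert("RGB")

          # Recompress the image as JPEG at a known quality level.
          buffer = io.BytesIO()
          original.save(buffer, "JPEG", quality=quality)
          buffer.seek(0)
          recompressed = Image.open(buffer)

          # Pixel-wise difference between the original and the recompressed
          # copy; spliced or regenerated regions often recompress differently.
          diff = ImageChops.difference(original, recompressed)

          # The raw difference is faint, so amplify it for visual inspection.
          return diff.point(lambda value: min(255, value * scale))

      if __name__ == "__main__":
          # "suspect.jpg" is a placeholder path for the image being examined.
          error_level_analysis("suspect.jpg").save("suspect_ela.png")
      ```

      Bright, blocky regions in the output image are a hint, not proof, that an area was pasted in or regenerated; untouched photos tend to produce a uniform, dim noise pattern.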

      The images and videos that Decode found are not from some murky corner of the dark web. They are all available on a mainstream social media platform: X. The accounts have also put out their Telegram channels, asking people to DM them with personal requests.

      “Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused,” wrote Rashmika Mandanna on X.

      A video featuring Mandanna, an Indian actress, had gone viral on X. But like all deepfakes, it wasn’t her video. It was generated using AI.

      This is dangerous for anyone who has photos on social media platforms, not just actresses. And the bigger trouble is that even if these accounts get deplatformed, it will not stop; they will simply migrate elsewhere. It’s that easy.

      Tags

      • Deepika Padukone
      • Deepfake
      • Facebook