      Fact Check

      2023 Review: Generative AI Amplifies Misinfo, Fraud, Deepfake Porn In India

This year, fact-checkers in India saw synthetic content tailored to local audiences and AI voice clones in Hindi.

By Anmol Alphonso
Published 28 Dec 2023, 2:07 PM IST

The year 2023 provided an unsettling preview of how generative artificial intelligence (AI) can be misused to commit fraud, create non-consensual imagery and generate misinformation in India.

      Widely accessible generative AI tools have enabled the creation of deepfakes in the form of images, audio and videos. The technology has largely been used to target already marginalised groups and communities.

While the buzz around deepfakes has existed since 2017, this year fact-checkers in India saw synthetic content tailored to local audiences and AI voice clones in Hindi, a phenomenon not seen before.

AI Image Generators Have Got Better And Better

In May this year, a photo of a few protesting Indian wrestlers smiling while detained inside a police van went viral on social media.


The wrestlers were protesting against the former head of the Wrestling Federation of India and Bharatiya Janata Party MP Brij Bhushan Singh, who is accused of sexual harassment. However, the image was doctored using an AI photo editing app called FaceApp, which artificially added smiles to the faces of the athletes in the photo.



The incident also showed how AI-based misinformation does not fall into a neat binary of AI-generated or not, and that current detection tools are not equipped to catch manipulations that fall somewhere along this wide spectrum.

Also Read: Photo Of Wrestler Vinesh Phogat Smiling In Police Van Is Morphed

Earlier in May, a fully AI-generated photo purporting to show an explosion at the Pentagon caused the US stock markets to dip briefly. The hoax was also carried by several Indian mainstream news outlets.



Also Read: Indian Media Fall For Hoax Post Claiming Explosion At Pentagon

The incident showed how AI-generated misinformation can also be used to spread rumours, sow panic and roil financial markets.

      Closer to home, one of the biggest news stories in India this year saw the use of an AI generated image.



Several mainstream Indian news outlets ran an AI-generated photo in their news articles on the Uttarakhand tunnel rescue operation, claiming it showed rescuers posing for group photos after the successful evacuation of 41 workers from the collapsed Silkyara tunnel.



Also Read: News Outlets Run AI-Generated Photo To Show Uttarakhand Tunnel Rescuers

Synthetic images have also been shared in the context of the ongoing Israel-Hamas war. An image purporting to show a man walking with his five children amidst buildings reduced to rubble turned out to be AI-generated.



Also Read: Image of Father Holding Children Amid Devastation Is AI-Generated

The use of such images has unintended consequences, as they have often been used to undermine the suffering of Gaza’s civilian population.

      AI Voice Clones Put Words In Someone's Mouth

      Along with AI images, AI voice clones have also been used to spread misinformation about the Israel-Hamas war.

BOOM found an Israeli sound designer and voice-over artist who tested the boundaries of social media content moderation policies with deepfakes targeting famous figures who spoke out against Israel.



Yishay Raziel created deepfakes of Queen Rania of Jordan, former adult film actress Mia Khalifa, musician Roger Waters and actor Angelina Jolie, among others, using AI voice cloning technology.

Also Read: Deepfake Video Creator Tests Social Media Platforms In Israel-Hamas Conflict

This year, we also debunked several deepfake videos that targeted US President Joe Biden, Ukraine President Volodymyr Zelenskyy, and Microsoft co-founder and philanthropist Bill Gates.

Also Read: Viral Video Claiming To Show Zelenskyy Belly Dancing Is A Deepfake
Also Read: Video Of Bill Gates Being Accused Of Profiting Off COVID Vaccines Is A Deepfake


      AI Voice Clones Can Speak Hindi Too

      AI voice clones are also being used to commit fraud. In India, con artists are using AI voice clones of celebrities to peddle fraudulent get-rich-quick schemes.

BOOM found that Facebook is filled with fraud ads using AI voice clones of popular Indian celebrities to peddle fraudulent investment schemes and fake products.

      We found fraud ads with AI voice clones of Shah Rukh Khan, Virat Kohli, Mukesh Ambani, Ratan Tata, Narayana Murthy, Akshay Kumar and Sadhguru that have been overlaid onto real videos of these individuals.

Similarly, we found AI voice clones of popular Hindi television news anchors such as Arnab Goswami, Ravish Kumar, Anjana Om Kashyap and Sudhir Chaudhary speaking in Hindi while promoting a fake diabetes drug.



Also Read: Fake Edited Videos Of TV Anchors Promoting Diabetes Medicine Viral

The rise of AI voice clones in Hindi is concerning, and fact-checkers fear the upcoming general election in May 2024 could see a flood of such AI-based political misinformation. India has already seen the use of deepfakes in a state election campaign in 2020.

      The Troubling Rise Of Deepfake Pornography

      Finally, generative AI’s most troubling use case has been its role in generating non-consensual imagery in the form of deepfake pornography.

BOOM found X user @crazyashfans had posted over thirty pornographic deepfake videos made with the faces of Indian actresses, showing them performing explicit sex acts. The account was deactivated after we published the story.



Also Read: X Is Full Of Deepfake Porn Videos Of Actresses; You May Be Targeted Next

However, several such accounts posting deepfake images and videos targeting Indian actresses exist on X and on other platforms such as Facebook, Instagram and YouTube.



Also Read: Video Purporting To Show Kajol Changing Outfit On Camera Is A Deepfake

In addition, there are websites and apps that allow users to synthetically ‘strip’ someone by uploading just one photo of that person.

The above instances show that AI image generation and voice cloning tools have been released hastily, and that their existing guardrails, if any, are easy to bypass.

The need of the hour is AI literacy, tweaking existing laws to catch up with the technology, and more accountability from social media platforms that are rushing to introduce AI features into their products without fully understanding the long-term impact.

Tags: Yearender 2023, Deepfake, AI Voice Clone, Facebook