      Decode

      Deepfake Instagram Influencer Steals Hearts, Reels And Morphs Faces Using AI

      Ira Sharma has over 260,000 followers on Instagram. But the handle's videos have been made by stealing content from other influencers and swapping their faces with an AI-generated face.

      By - Karen Rebelo | 28 March 2024 8:00 AM IST

      Ira Sharma describes herself on Instagram in three short sentences. ‘A normal girl’; ‘loves bgms* and songs’ and ‘Not real (alien)’.

      The last sentence in her bio is ironically accurate. That’s because Ira Sharma, who has over 260,000 followers on Instagram, is not real.

      The handle me_ira.sharma, created in November 2023, is a deepfake influencer made by stealing reels of young Indian women on the Meta-owned platform and morphing those videos using face swap technology.


      Deepfake influencers are a small but growing subset of Artificial Intelligence (AI) or virtual influencers, which have been around for the last few years.

      Deepfake influencers are intended to be money spinners for their creators, through paid brand promotions on platforms such as Instagram and YouTube.

      Longtime advertising industry observer Karthik Srinivasan told Decode that this type of deepfake influencer was a form of “inventive plagiarism”.

      “...here it's not just content being downloaded and passed on but the content is almost taken like a white label and repackaged with a new face. The only difference is that the new face doesn't exist. It’s an AI entity,” Srinivasan, who works as a communications strategy consultant, said.


      Forensic Analysis Confirms Faces In Videos Altered


      Decode traced at least three videos, plagiarised by the account and morphed with a generative-AI face, back to other women and teenage content creators on Instagram.

      "This really worries me a lot. It's more than just about my hard work being stolen. It's also about the risk of someone pretending to be me. And the fact that these videos could be used in harmful ways just adds to the stress," Pushti Shah, whose video was stolen and morphed, told Decode.

      The Vadodara-based influencer, who has over 24,200 followers on Instagram, shares photos of herself sporting different outfits.

      To test our hypothesis that we were dealing with deepfakes, we reached out to Professor Mayank Vatsa at the Indian Institute of Technology (IIT) Jodhpur to analyse the videos.

      Professor Vatsa’s team tested the videos using their detection tool named ‘itisaar’, a suite of algorithms for detecting digital manipulation in audio, video, or images, developed by the Image Analysis and Biometric Lab at IIT Jodhpur.

      The analysis determined that all the samples were manipulated or edited and that there were several inconsistencies in each of them.




      The detection network highlighted the following areas as altered regions. (See screenshot.)

      “The videos have been altered but detecting which tool/method is being used is non-trivial. However, this looks like face swap only. Moreover, such level of alterations in videos are feasible only with GenAI approaches,” Professor Mayank Vatsa, a Swarnajayanti Fellow at IIT Jodhpur, told Decode.

      Democratising Deepfakes

      While the spectre of deepfakes has loomed since 2017, rapid leaps in the development of generative AI over the past year have put deepfake technology within reach of anyone with a laptop and an internet connection.

      Decode found step-by-step YouTube tutorials in Hindi on how to create your own AI influencer to earn money off Instagram through brand promotions.

      These tutorials rely on sites like Tensor.Art, a free online image generator and AI model hosting site.

      Tensor.Art is used to create AI-generated faces using a text-to-image prompt.

      The AI-generated face can be assigned a unique identifier and saved, so the same synthetic face can be reused to generate new images.

      Using face-swap technology, you can then swap any face in a digital photo or video with the synthetic face.

      And it’s not just techies or amateur social media enthusiasts who are perpetrating this type of generative AI plagiarism but also small digital marketing companies.

      Decode also found nine other linked Instagram accounts using variations of the username Ira Sharma and posting the same deepfake content. The main handle me_ira.sharma was also tagging some of the fake accounts in its Instagram Stories.



      A tenth account, created in January 2021 and using a different fake alias, Ishita Choudhary, was also posting the same videos.



      AI Influencers And Authenticity

      Social media influencers, particularly those who skew younger, are coveted by companies globally to promote their products and services.

      Increasingly, they are being sought by political parties to get their message across to potential young voters.

      Like every other field, influencer marketing has also been disrupted by AI, with global brands such as Samsung, Nike and Calvin Klein counting AI influencers as part of their publicity roster.

      India followed the trend in 2022 with Kyra, the country’s first virtual influencer, followed by online fashion retailer Myntra’s Maya and, more recently, Naina.

      Deepfake influencer Ira Sharma’s follower count of 260,000, made up mostly of young Indian men crushing on her, trumps that of Kyra, Maya, and Naina individually.

      The difference is that Kyra, Maya, and Naina clearly disclose their synthetic or virtual nature, and their content is not plagiarised from someone else’s work.






      ‘It’s In Social Media Platforms’ Interest To Care’

      Pushti Shah, whose video was stolen and morphed, said platforms such as Instagram should implement technological solutions to prevent misuse of content.

      "It would be incredibly beneficial if Instagram could implement measures to safeguard creators from both unauthorised screen recording and the alarming rise of deepfake technology. Such proactive steps would not only protect our creative efforts but also preserve the integrity of our online presence," she added.

      Digital rights advocate and researcher Ramak Molavi Vasse’i said social media platforms play a dual role, both in facilitating the technology and in spreading the output.

      “Companies such as Meta, which owns Facebook and Instagram and provides large language models (LLMs), are prime examples. Their revenue model is driven by content that goes viral and evokes strong emotions. They do not sufficiently address and mitigate the harm they cause,” Molavi Vasse’i said.

      “As a result, the challenges posed by these technologies and their widespread adoption burden both society and individuals. AI providers have a responsibility to thoroughly assess and mitigate the potential risks associated with their products before they are launched,” she added.

      Communications strategy consultant Karthik Srinivasan echoed her sentiment.

      “In their own interest they should care if people complain that this is plagiarising content which is a much bigger problem because some creator has put in a lot of effort to create original content and that is being reused here,” Srinivasan said.

      “And if they find that this is widespread, it could really affect the credibility of Instagram as a platform. So they better care in the long run. Hopefully they should.”

      Decode reached out to Meta for a comment. The article will be updated upon receiving a response.



      With inputs from Sista Mukherjee and Srijit Das

      Tags

      Deepfake, AI influencers, Instagram, Meta, YouTube, Facebook