      Fact Check

      Ratan Tata Calls Out Deepfake Fraud Ad Made With AI Voice Cloning Tech

      Ratan Tata took to Instagram to call out a fake ad promoting an investment scheme misusing his name.

      By Hazel Gandhi | Published 7 Dec 2023, 2:33 PM IST

      Ratan Tata took to social media to call out a deepfake video created with artificial intelligence (AI) voice cloning technology, in which he supposedly appears promoting an investment opportunity, making him the latest high-profile name in India to warn about the menace of deepfakes.

      The fake ad on Facebook purports to show the legendary industrialist endorsing an investment opportunity by a woman supposedly named Sona Agarwal.

      However, Sona Agarwal is a fictitious persona. We also found fake ads on Facebook using the same modus operandi, in which Tata is seen endorsing one Laila Rao.

      BOOM has written several articles showing how Meta-owned Facebook is replete with fraud ads made with AI voice cloning tools that impersonate the voices of popular Indian celebrities. These scams involve creating fake identities such as Laila Rao and Suraj Sharma; Sona Agarwal is the latest fake alias added to this list. The visuals of the men and women used to build these fake identities are of real people whose photographs have been stolen from their social media accounts without their knowledge.

      The first video purportedly shows Tata, former chairman of the Tata Group, promoting the scheme by saying that his manager, Sona Agarwal, is handling it.

      "I get millions of messages every day asking for my help. It is physically impossible but I can't be indifferent. To address poverty in India, my manager Sona and I initiated a project. By following Sona's instructions, you can multiply 10,000 Rs, ensuring a solution to the critical poverty situation in India..."

      The second video shares a similar message, but this time, Tata can be heard endorsing the investment scheme by another individual, Laila Rao. "Attention: I announce the start of a new investment project for all residents of India. I personally support this project which guarantees that any resident of India can multiply their money without any risks. I appoint Laila Rao as the head of this project..."


      The two videos have been shared by the accounts of Sona Agarwal and Laila Rao and are circulating as sponsored posts or ads on Facebook. Below are the screenshots of the ads:






      FACT CHECK


      BOOM found that the videos are fake and have been created using AI voice cloning tools. Both the videos have been overlaid with a fake voiceover of Ratan Tata.

      A reverse image search of some key frames from the viral video of Sona Agarwal led us to a video from June 4, 2015 shared by the official account of HEC, a business school in Paris. The video was titled 'Ratan N. Tata receives Honoris Causa degree at HEC Paris' and carried visuals similar to the viral video. The clip showed Tata receiving the honoris causa degree at HEC, speaking about leadership and management, and interacting with the audience during a Q&A session.

      We watched the entire clip and did not find Tata speaking about any investment scheme by Sona Agarwal that would help people get rich quickly.
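
      Readers who want to attempt a similar check can start by sampling still frames from a suspect clip and running them through a reverse image search engine. Below is a minimal sketch of that first step using OpenCV; the file name "viral_ad.mp4" and the one-frame-per-second sampling rate are assumptions for illustration, not details of BOOM's workflow.

      # Minimal sketch: sample roughly one frame per second from a suspect video
      # so the frames can be fed to a reverse image search.
      # Assumption: the clip has been saved locally as "viral_ad.mp4".
      import cv2

      video = cv2.VideoCapture("viral_ad.mp4")
      fps = video.get(cv2.CAP_PROP_FPS) or 25  # fall back if FPS is unreported

      frame_index = 0
      saved = 0
      while True:
          ok, frame = video.read()
          if not ok:
              break
          if frame_index % int(fps) == 0:  # keep ~1 frame per second
              cv2.imwrite(f"keyframe_{saved:03d}.jpg", frame)
              saved += 1
          frame_index += 1

      video.release()
      print(f"Saved {saved} key frames for reverse image search")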



      Below is a comparison between the viral video and the original video.




      We also found that Ratan Tata had put out a clarification on his verified Instagram account regarding the Sona Agarwal video, where he called this ad fake.




      We also ran the audio from both ads through a tool called AI Voice Detector, which estimated that the voice in the Sona Agarwal video was 83% AI-generated and the voice in the Laila Rao video was 61% AI-generated.
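
      For those who want to run a similar check, the audio usually has to be separated from the video before it can be uploaded to a voice-detection tool. Below is a minimal sketch of that step using ffmpeg called from Python; the file names and the 16 kHz mono WAV settings are assumptions for illustration, and ffmpeg must be installed separately.

      # Minimal sketch: strip the audio track from a downloaded ad so it can be
      # checked with a synthetic-voice detection tool.
      # Assumptions: ffmpeg is installed, and the ad is saved as "viral_ad.mp4".
      import subprocess

      subprocess.run(
          [
              "ffmpeg",
              "-i", "viral_ad.mp4",   # input video
              "-vn",                  # drop the video stream
              "-ac", "1",             # mono audio
              "-ar", "16000",         # 16 kHz sample rate, common for speech tools
              "audio_sample.wav",
          ],
          check=True,
      )
      print("Extracted audio_sample.wav for AI-voice analysis")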



      We previously reported on the Laila Rao scam, which used the face of actor Smriti Khanna to lure women into false investment schemes and dupe them of thousands of rupees. BOOM has also debunked videos of Madhya Pradesh CM candidates Shivraj Singh Chouhan and Kamal Nath that were created using fake voiceovers, likely made with AI voice cloning tools. Around the same time, BOOM debunked videos from the popular game show Kaun Banega Crorepati that carried a fake voiceover of Amitabh Bachchan, likely created using AI.




      Tags: Ratan Tata, Laila Rao, Deepfake, Fact Check, Fake News, Facebook, AI Voice Clone
      Claim: Video shows Ratan Tata promoting investment schemes by Laila Rao and Sona Agarwal
      Claimed By: Facebook ads
      Fact Check: False