
Ratan Tata Calls Out Deepfake Fraud Ad Made With AI Voice Cloning Tech

Ratan Tata took to Instagram to call out a fake ad promoting an investment scheme misusing his name.

By - Hazel Gandhi | 7 Dec 2023 9:03 AM GMT

Ratan Tata took to social media to call out a deepfake video, created with artificial intelligence (AI) voice cloning technology, in which he is supposedly seen promoting an investment opportunity, making him the latest high-profile name in India to warn about the menace of deepfakes.

The fake ad on Facebook purports to show the legendary industrialist endorsing an investment scheme run by a woman supposedly named Sona Agarwal.

However, Sona Agarwal is a fictitious persona. We also found fake ads on Facebook using the same modus operandi, in which Tata is seen endorsing one Laila Rao.

BOOM has written several articles showing how Meta-owned Facebook is replete with fraudulent ads made with AI voice cloning tools that impersonate the voices of popular Indian celebrities. These scams involve creating fake identities such as Laila Rao and Suraj Sharma; Sona Agarwal is the latest fake alias added to this list. The visuals of the men and women used to build these fake identities are of real people whose photographs have been stolen from their social media accounts without their knowledge.

The first video purportedly shows Tata, former chairman of the Tata Group, promoting the scheme by saying that his manager, Sona Agarwal, is handling it.

"I get millions of messages every day asking for my help. It is physically impossible but I can't be indifferent. To address poverty in India, my manager Sona and I initiated a project. By following Sona's instructions, you can multiply 10,000 Rs, ensuring a solution to the critical poverty situation in India..."


The second video shares a similar message, but this time, Tata can be heard endorsing the investment scheme by another individual, Laila Rao. "Attention: I announce the start of a new investment project for all residents of India. I personally support this project which guarantees that any resident of India can multiply their money without any risks. I appoint Laila Rao as the head of this project..."



The two videos have been shared by the accounts of Sona Agarwal and Laila Rao and are circulating as sponsored posts or ads on Facebook. Below are the screenshots of the ads: 





FACT CHECK


BOOM found that the videos are fake and were created using AI voice cloning tools. Both videos have been overlaid with a fake voiceover mimicking Ratan Tata.

A reverse image search of some key frames from the viral video of Sona Agarwal led us to a video from June 4, 2015 shared by the official account of HEC, a business school in Paris. The video was titled 'Ratan N. Tata receives Honoris Causa degree at HEC Paris' and carried visuals similar to the viral video. The clip showed Tata receiving the honoris causa degree at HEC, speaking about leadership and management, and interacting with the audience during a Q&A session.

We watched the entire clip and did not find Tata speaking about any investment scheme by Sona Agarwal that would help people get rich quickly.
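
For readers curious about how this step works in practice, the sketch below shows one way to pull evenly spaced frames out of a downloaded copy of a video so they can be uploaded to a reverse image search engine. This is a minimal illustration, not BOOM's actual workflow: it assumes Python with the OpenCV library installed, and the file name viral_ad.mp4 and the helper extract_key_frames are hypothetical.

```python
# Minimal sketch: extract evenly spaced key frames from a video so they can be
# run through a reverse image search. Assumes OpenCV (cv2) is installed; the
# file name "viral_ad.mp4" is hypothetical.
import cv2


def extract_key_frames(video_path: str, num_frames: int = 5, prefix: str = "frame"):
    cap = cv2.VideoCapture(video_path)
    total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    if total == 0:
        raise ValueError(f"Could not read frames from {video_path}")
    # Pick evenly spaced frame indices across the clip.
    indices = [int(i * total / num_frames) for i in range(num_frames)]
    saved = []
    for n, idx in enumerate(indices):
        cap.set(cv2.CAP_PROP_POS_FRAMES, idx)
        ok, frame = cap.read()
        if ok:
            out_path = f"{prefix}_{n}.jpg"
            cv2.imwrite(out_path, frame)
            saved.append(out_path)
    cap.release()
    return saved


if __name__ == "__main__":
    # Each saved JPEG can then be uploaded to a reverse image search engine.
    print(extract_key_frames("viral_ad.mp4"))
```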




Below is a comparison between the viral video and the original video.




We also found that Ratan Tata had put out a clarification on his verified Instagram account regarding the Sona Agarwal video, calling the ad fake.




We also ran the audio from both ads through a tool called AI Voice Detector, which estimated the probability of the voice being AI-generated at 83% for the Sona Agarwal video and 61% for the Laila Rao video.
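
AI Voice Detector is a web-based tool, so nothing is assumed here about its API; the sketch below only illustrates the preparatory step of separating the audio track from a downloaded copy of an ad before uploading it for analysis. It assumes the ffmpeg command-line tool is installed and on the PATH, and the file names are hypothetical.

```python
# Minimal sketch: pull the audio track out of a downloaded ad video so it can
# be uploaded to a voice-detection tool. Assumes the ffmpeg binary is
# installed; "sona_agarwal_ad.mp4" is a hypothetical file name.
import subprocess


def extract_audio(video_path: str, audio_path: str = "voiceover.wav") -> str:
    subprocess.run(
        [
            "ffmpeg",
            "-y",                    # overwrite the output file if it exists
            "-i", video_path,        # input video
            "-vn",                   # drop the video stream
            "-acodec", "pcm_s16le",  # plain 16-bit WAV for broad tool compatibility
            "-ar", "16000",          # 16 kHz sample rate, common for speech analysis
            audio_path,
        ],
        check=True,
    )
    return audio_path


if __name__ == "__main__":
    # The resulting WAV file can then be uploaded to the detection tool.
    print(extract_audio("sona_agarwal_ad.mp4"))
```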

 



We previously reported on the Laila Rao scam, which used the face of actor Smriti Khanna to lure women into false investment schemes and dupe them of thousands of rupees. BOOM has also debunked videos of Madhya Pradesh chief ministerial candidates Shivraj Singh Chouhan and Kamal Nath that were created using fake voiceovers, likely made with AI voice cloning tools. Around the same time, BOOM debunked other videos that included a fake voiceover of Amitabh Bachchan, likely created using AI, from the popular game show Kaun Banega Crorepati.