A deepfake video of Prime Minister Narendra Modi is going viral on the internet, falsely showing him saying that the cow is the best animal for qurbani (Islamic sacrificial slaughter).
BOOM extracted the audio and ran it through several voice clone detectors, which indicated a strong likelihood that the audio is AI-generated. We also traced the original video and found no evidence of Modi making such a statement.
The Claim:
The video has been circulating on Facebook (archived here) over the past week, and allegedly shows Prime Minister Narendra Modi saying in Hindi that the “cow is the best animal for qurbani.”
What We Found:
The video has been overlaid with an AI-generated voice clone, and the visuals have also been manipulated with AI.
No record of such a statement: BOOM found no verified transcript, public speech, or official video of Modi making such a statement, whether in Parliament, during a campaign rally, or in media interviews.
Source video has no such statement: A reverse image search of keyframes led us to a speech by Modi during a National Democratic Alliance parliamentary meeting a year ago, following the results of the 2024 Lok Sabha elections. From the 1:03:16 mark onward in that video, we found visuals matching the viral video, albeit horizontally flipped. We listened to Modi's entire speech and found no mention of the controversial statement heard in the viral video.
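For readers who want to replicate this kind of check, the sketch below shows one way to pull keyframes out of a clip so they can be fed into a reverse image search engine. The file names are placeholders, and this is an illustrative workflow, not BOOM's exact process; since the viral clip was horizontally flipped, searching a mirrored copy of each frame can also help.

```python
# Sketch: save one frame every ~5 seconds (plus a mirrored copy) so the
# frames can be run through a reverse image search. File names are placeholders.
import cv2

cap = cv2.VideoCapture("viral_video.mp4")
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0   # fall back to 25 fps if metadata is missing
interval = int(fps * 5)                   # one keyframe every ~5 seconds

frame_idx, saved = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % interval == 0:
        cv2.imwrite(f"keyframe_{saved:03d}.jpg", frame)
        # Also save a horizontally flipped copy, useful when the viral
        # video has been mirrored to evade matching.
        cv2.imwrite(f"keyframe_{saved:03d}_flipped.jpg", cv2.flip(frame, 1))
        saved += 1
    frame_idx += 1

cap.release()
print(f"Saved {saved} keyframes for reverse image search")
```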
Lip-sync mismatch: Furthermore, we closely analysed the video and found a clear lack of synchronisation between lip movement and speech—a common giveaway in voice-cloning deepfakes.
Deepfake detection tools: Taking a cue from this, we extracted the audio and ran it through several voice clone detection models using the tool Deepfake-o-meter. Five out of six models indicated a very high likelihood of the audio being AI-generated, while Hive Moderator could not conclusively determine whether the audio was AI-generated.
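The audio extraction step itself can be reproduced with ffmpeg; a minimal sketch is below. The file names are placeholders, and the resulting WAV file is then uploaded manually to a detection service such as Deepfake-o-meter rather than through any API shown here.

```python
# Sketch: strip the audio track from the video as a mono 16 kHz WAV,
# a common input format for voice-clone detectors.
# Requires ffmpeg to be installed and on PATH; file names are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "viral_video.mp4",   # input video (placeholder name)
        "-vn",                     # drop the video stream
        "-ac", "1",                # downmix to mono
        "-ar", "16000",            # resample to 16 kHz
        "extracted_audio.wav",
    ],
    check=True,
)
```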
Experts confirm AI manipulation: BOOM consulted its partners at the Deepfakes Analysis Unit (DAU), who also found mixed results from various deepfake detection tools. The DAU then escalated the video to deepfake detection experts at Contrails AI for further analysis. The report by Contrails indicated the presence of "unnatural lip movements" and "black space between lips indicative of a lip-sync attack". This report further confirms the likelihood of AI manipulation in both the video and the audio.