Fact Check

Video Of DY Chief Of Army Staff Claiming India Lost S-400s Is A Deepfake

BOOM found that an AI-generated audio segment was inserted into the video to falsely claim that India lost military equipment during Operation Sindoor.

By Archis Chowdhury

21 July 2025 6:10 PM IST

A video is going viral on social media claiming to show Deputy Chief of Army Staff Lt Gen Rahul R Singh admitting that India lost two S-400 missile systems to Chinese missiles during the May 2025 military escalation with Pakistan that followed Operation Sindoor.

BOOM found that the video has been manipulated with an inserted 14-second audio snippet that is not present in the original footage.

The Claim

Multiple users on X and Facebook shared a video claiming that Lt Gen Rahul R Singh had admitted India lost two S-400 missile systems during "Operation Sindoor." The viral post shows him allegedly saying, “We are currently in negotiations with Russia to get two S-400 systems by the end of August, which we lost to Chinese missiles on 10th May.” (archive)

What We Found:

BOOM found that the video has been digitally altered with synthetic audio and mismatched visuals. BOOM's partners at the Deepfakes Analysis Unit (DAU) confirmed that a small AI-generated segment was inserted into a real speech to falsely suggest India’s military suffered equipment loss.

Original video contains no mention of S-400 losses: BOOM traced the original video to a recording of Singh’s speech at a FICCI event on July 4, 2025, and found no mention of the Russian-made S-400 systems used by India being damaged by Chinese missile strikes.


Voice and gesture mismatch: BOOM observed the video closely and found evidence of manipulation. Specifically, during the segment where Singh speaks about damage to the S-400s, his body language becomes stiff, and his head and arm movements become inconsistent with the rest of the speech. The background sound also shifts abruptly, suggesting the use of voice-cloning and lip-syncing algorithms.

AI detection tools confirm partial manipulation: ConTrails AI initially flagged the entire audio track, but its low-resolution model later isolated a roughly 15-second segment as manipulated. Hive AI’s video tool also flagged potential visual alterations, though its audio tool did not conclusively confirm AI generation.

Expert analysis confirms stitching and cloning: Experts from DAU’s partner institutions, including Kelly Wu and Saniat Sohrawardi from RIT’s DeFake Project, identified synthetic voice elements and minor facial syncing errors. Sohrawardi noted tonal shifts in the fake segment and described it as a “new type of fake” blending real and AI-generated audio. Wu noted that the nameplate in the real video shows Singh’s full name and rank, while in the viral version it displays random characters.


