A video claiming to show Deputy Chief of Army Staff Lt Gen Rahul R Singh admitting that India lost two S-400 missile systems to Chinese missiles during the May 2025 military escalation with Pakistan that followed Operation Sindoor is going viral on social media.
BOOM found that the video has been manipulated with an inserted 14-second audio snippet that is not present in the original footage.
The Claim
Multiple users on X and Facebook shared a video claiming that Lt Gen Rahul R Singh had admitted India lost two S-400 missile systems during "Operation Sindoor." In the viral clip, he appears to say, “We are currently in negotiations with Russia to get two S-400 systems by the end of August, which we lost to Chinese missiles on 10th May.” (archive)
What We Found
BOOM found that the video has been digitally altered with synthetic audio and mismatched visuals. BOOM's partners at the Deepfakes Analysis Unit (DAU) confirmed that a small AI-generated segment was inserted into a real speech to falsely suggest India’s military suffered equipment loss.
Original video contains no mention of S-400 losses: BOOM traced the original video to a recording of Singh’s speech at a FICCI event on July 4, 2025, and found no mention of any damage to India’s Russian-made S-400 systems from Chinese missile strikes.
Voice and gesture mismatch: BOOM closely observed the video and found evidence of manipulation. Specifically, during the segment where Singh appears to speak about damage to the S-400s, his body language becomes stiff, and his head and arm movements become inconsistent with the rest of the speech. The background sound also shifts abruptly, suggesting the use of voice-cloning and lip-syncing algorithms.
AI detection tools confirm partial manipulation: ConTrails AI initially flagged the entire audio track, but its low-resolution model later isolated a roughly 15-second segment as manipulated. Hive AI’s video tool also flagged potential visual alterations, though its audio tool did not conclusively confirm AI generation.
Expert analysis confirms stitching and cloning: Experts from DAU’s partner institutions, including Kelly Wu and Saniat Sohrawardi from RIT’s DeFake Project, identified synthetic voice elements and minor facial syncing errors. Sohrawardi noted tonal shifts in the fake segment and described it as a “new type of fake” blending real and AI-generated audio. Wu pointed out that in the real video, the nameplate shows Singh’s full name and rank, while in the viral version it displays random characters.