Fact Check

News18 Article On Doval’s ‘ISI Recruited More Hindus’ Remark Erroneous

BOOM found that the viral video of Ajit Doval is genuine, predating deepfake technology.

By Archis Chowdhury

17 Nov 2025 8:20 PM IST

News18 recently published a report claiming that a video of National Security Advisor Ajit Doval, in which he is heard saying that the ISI has recruited more Hindus than Muslims for intelligence tasks in India, is a deepfake. The outlet carried Doval’s on-record denial, incorrectly framed the clip as manipulated, and also erroneously wrote “ISIS” instead of “ISI.”

BOOM reviewed the source footage and found that News18’s assessment is incorrect. The video is authentic, dates back to 2014, and predates the availability of modern deepfake tools. The full recording shows no signs of tampering or synthetic audio, according to BOOM’s analysis and assessments shared by the Deepfakes Analysis Unit (DAU).

The clip has gone viral against the backdrop of the November 10, 2025 car explosion in Delhi that claimed 13 lives, which authorities are investigating as part of a larger terror module spanning several states.

The Claim

News18 published an article titled, “Ajit Doval Issues Denial After 'Hindus Attracted To ISIS' Video, Flags Deepfake Threat,” in which it carried Doval’s official denial.

The report stated: “Speaking to CNN-News18, Doval clarified that he has never made such a statement and warned that the clip appears to be a case of deepfake manipulation, designed to distort India’s national-security discourse.” (archive here)

The same was also reported by Moneycontrol (archive here).

What We Found

BOOM found that the viral clip is not a deepfake. The video originates from a publicly available recording of a 2014 event.

  • Video dates back to March 2014: BOOM traced the viral footage to a YouTube video uploaded on March 20, 2014 (archive here). The video shows Ajit Doval speaking at an Australia India Institute event titled “The Challenge of Global Terrorism.” A tweet from the institute on March 11, 2014 (archive here), and their event newsletter (archive here), confirm that Doval delivered a talk that month.
  • Viral statement appears in original 2014 footage: Between the 01:04:00 and 01:14:00 timestamps, Doval can be clearly heard saying: “The number of persons that ISI has recruited for intelligence tasks in India… there has been more Hindus than Muslims.” This matches the viral clip.
  • No signs of tampering or manipulation: BOOM reviewed the full-length footage and found no cuts, splices, audio anomalies, or visual inconsistencies in the segment containing the statement.
  • Context behind his statement: At the 58:38 mark, Doval was asked about India’s domestic security challenge, to which he responded that the state must work with Indian Muslims and make them partners in actions against terror. He added that nearly 90% of the casualties of Islamic terrorism are Muslims, and said, “we will carry the Muslims with us.” Doval also remarked that Muslims had fought for India’s independence, underscoring their nationalism. This is the broader context in which he later discusses ISI recruitment activities.
  • Footage predates widespread availability of deepfake tools: The video is from March 2014, which predates the public accessibility of deepfake creation tools. It was also recorded months before the publication of the first Generative Adversarial Networks paper by Ian Goodfellow et al. on June 10, 2014, the foundational technique behind modern deepfakes.
  • No mention of ISIS: BOOM found that News18's article erroneously referred to ISIS while describing Doval's comments. After reviewing the video, we found that at no point in the original footage does Doval mention “ISIS”; he refers specifically to the ISI.
  • Deepfake detection tools indicate the clip is genuine: BOOM escalated the footage to its partners at the Deepfakes Analysis Unit. Their tool assessments found:
     - Hive AI audio classifier: Entire audio track classified as “not AI-generated.”
     - ElevenLabs speech classifier: Rated “Very Unlikely” that the audio was generated using their platform.
     - Aurigin AI: Analysis conducted using its advanced audio deepfake detection engine found the audio to be genuine.
     - Hive AI video detector: No detection of AI in the frames featuring Ajit Doval.

    DAU added that because the source footage has low-quality audio and video, tool certainty can be reduced, and detection outcomes should be interpreted with that limitation in mind.

BOOM has also escalated the video to media forensics experts for further verification. This article will be updated upon receiving their assessments.
