Scamcheck

When What You See Isn’t Real: Beware of Deepfake Scams

Deepfake 'vishing' creates a false sense of familiarity and trust by mimicking a family member or friend.

By Titha Ghosh

22 Aug 2025 9:58 AM IST

AI is getting more sophisticated, and so are scammers. With just a short clip from a social media video or voicemail, criminals can now duplicate faces and clone voices, and then use them to trick you.

How is a deepfake different from voice cloning?

Deepfake video: AI uses images or clips of a person to create a fake video of them saying or doing things they never did.

Voice cloning: AI analyses speech patterns to generate new audio that sounds identical to the real person.

Scary Scenarios Scammers Create

“Family Emergency” Calls: A cloned voice of your child or relative begs for urgent money for bail, ransom, or medical bills. Unlike fake emails or links, this direct interaction over the phone creates emotional pressure, making victims feel greater urgency.

Executive Fraud: A fake voice or video of a company leader instructs staff to wire funds immediately.

Tech Support: Scammers contact victims while posing as representatives of trusted companies, claiming that their mobile device or computer has been compromised. The scammer then instructs the victim to install remote-access software, through which they can extract sensitive data.


How to Stay Safe

  • Use a family code word or phrase: Ask for it on any “emergency” call.
  • Hang up and call back using a known number.
  • Double-check with the family member or friend before making any transaction.
  • Enable 2FA on bank and email accounts.
  • Turn on phone notifications for all debit/credit transactions.
  • Report suspicious calls at cybercrime.gov.in or by calling 1930.
  • If you’ve received a suspicious call or video message, don’t panic. Send it to BOOM’s Tipline (7700906588) and we’ll verify it for you.
