AI is getting more sophisticated, and so are scammers. With just a short clip from a social media video or voicemail, criminals can now duplicate faces and clone voices, and then use them to trick you.
How is a deepfake different from voice cloning?
Deepfake video: AI uses images or clips of a person to create a fake video of them saying or doing things they never did.
Voice cloning: AI analyses speech patterns to generate new audio that sounds identical to the real person.
Scary Scenarios Scammers Create
“Family Emergency” Calls: A cloned voice of your child or relative begs for urgent money for bail, ransom, or medical bills. Unlike fake emails or links, a live phone call applies direct emotional pressure, leaving victims with a heightened sense of urgency.
Executive Fraud: A fake voice or video of a company leader instructs staff to wire funds immediately.
Tech Support: Scammers contact victims while posing as support staff from trusted companies, claiming that their mobile device or computer has been compromised. The scammer then instructs the victim to install remote access software, through which they can extract sensitive data.
How to Stay Safe
- Use a family code word or phrase: Ask for it on any “emergency” call.
- Hang up and call back using a known number.
- Double-check with the family member or friend before making any transactions.
- Enable 2FA on bank and email accounts.
- Turn on phone notifications for all debit/credit transactions.
- Report suspicious calls at cybercrime.gov.in or call the helpline 1930.
- If you’ve received a suspicious call or video message, don’t panic. Send it to BOOM’s Tipline (7700906588) and we’ll verify it for you.