      Scamcheck

      When What You See Isn’t Real: Beware of Deepfake Scams

      Deepfake 'vishing' creates a false sense of familiarity and trust by mimicking a family member or friend.

By Titha Ghosh | 22 Aug 2025 9:58 AM IST

      AI is getting more sophisticated, and so are scammers. With just a short clip from a social media video or voicemail, criminals can now duplicate faces and clone voices, and then use them to trick you.

How is a deepfake different from voice cloning?

      Deepfake video: AI uses images or clips of a person to create a fake video of them saying or doing things they never did.

      Voice cloning: AI analyses speech patterns to generate new audio that sounds identical to the real person.

      Scary Scenarios Scammers Create

      “Family Emergency” Calls: A cloned voice of your child or relative begs for urgent money for bail, ransom, or medical bills. Unlike fake emails or links, this direct interaction over the phone creates emotional pressure, making victims feel greater urgency.

      Executive Fraud: A fake voice or video of a company leader instructs staff to wire funds immediately.

Tech Support: Scammers contact victims while posing as representatives of trusted companies, claiming that their mobile device or computer has been compromised. The scammer then instructs the victim to install remote-access software, through which they can extract sensitive data.

Also Read: Don't Get Fooled: Your Guide To Spotting Deepfakes & AI Voice Clones


      How to Stay Safe

      • Use a family code word or phrase: Ask for it on any “emergency” call.
      • Hang up and call back using a known number.
      • Double-check with the family member or friend before making any transaction.
      • Enable 2FA on bank and email accounts.
      • Turn on phone notifications for all debit/credit transactions.
      • Report suspicious calls at cybercrime.gov.in or call the helpline at 1930.
      • If you’ve received a suspicious call or video message, don’t panic. Send it to BOOM’s Tipline (7700906588) and we’ll verify it for you.

      Tags

      Deepfake, AI Voice Clone, Family, Emergency, Scams, Fraud, Loss