BOOM

      Explainers

      83% Indians Fell Prey To AI Voice Scams: McAfee Report

      A McAfee report suggests that 69% of Indians are unable to distinguish between a genuine human voice and an AI-generated voice.

By Hera Rizwan | Published 3 May 2023 6:08 PM IST

India has the highest number of people who have fallen victim to AI-generated voice scams, a study titled 'Beware the Artificial Impostor' by McAfee has found. About 83% of Indians have lost money to such scams, the report said. Additionally, 69% of Indians are not confident they could distinguish an AI voice cloned for scamming from a real human voice.

Amid the increase in cybercrimes exploiting AI, attackers are now using AI-based voice technology to defraud people. The study, conducted by the global computer security company McAfee, found that fraudsters are using artificial intelligence to mimic the voices of distressed family members, and that a large number of Indians are falling victim to such frauds.

How does AI voice cloning work?

Cloning someone's voice is now a potent weapon in the hands of cybercriminals. According to the McAfee study, 53% of adults share their speech data online at least once a week through social media, voice notes and more; 49% do so up to 10 times a week. The practice is most common in India, where 86% of people make their voices available online at least once a week, followed by the UK at 56% and the US at 52%.

It might seem like a harmless activity meant to ease communication. But these voices leave behind a digital footprint that cybercriminals can misuse to target people. A small snippet of voice is enough for AI to create a believable clone that can be manipulated for fraudulent purposes, the study said.

The scammers invent scenarios to manipulate people into believing that someone close to them urgently needs money. According to the study, "some scenarios are more likely to be successful than others". Scenarios that work well include car trouble or an accident, being a victim of theft, losing a wallet, and needing help while travelling abroad.


      What are the key takeaways from the study?

- The study was conducted by McAfee with 7,054 people from seven countries, including India, between April 13 and April 19, 2023. According to the study, 47% of Indian adults have experienced an AI voice scam or know someone who has been a victim of one.

- According to McAfee, AI voice-cloning tools can replicate a person's voice with up to 95% accuracy. As per the study, 69% of Indians are not confident that they can distinguish a cloned voice from the real one.

- More than a third of the individuals who participated in the study lost over $1,000, while 7% were duped out of between $5,000 and $15,000. The share was highest in the US, where more than one in 10 victims lost between $5,000 and $15,000. The cost of falling for an AI voice scam is also significant in India, with 83% of Indian victims losing money and 48% of them losing up to Rs 50,000.


Tags: cyber fraud, Facebook