BOOM
      Explainers

      Deepfake Scams: Here's What To Do When You've Been Duped

      Deepfakes are convincingly realistic replicas of individuals in audio, video or image form. As they become increasingly accessible, scams involving this manipulated content are on the rise.

      By The Conversation | Published 27 Feb 2024, 4:47 PM IST

      Tero Vesalainen/Shutterstock

      Jeannie Marie Paterson, The University of Melbourne

      Earlier this month, a Hong Kong company lost HK$200 million (A$40 million) in a deepfake scam. An employee transferred funds following a video conference call with scammers who looked and sounded like senior company officials.

      Generative AI tools can create image, video and voice replicas of real people saying and doing things they never would have done. And these tools are becoming increasingly easy to access and use.

      This can perpetuate intimate image abuse (including things like “revenge porn”) and disrupt democratic processes. Currently, many jurisdictions are grappling with how to regulate AI deepfakes.

      But if you’ve been a victim of a deepfake scam, can you obtain compensation or redress for your losses? The legislation hasn’t caught up yet.


      Who is responsible?

      In most cases of deepfake fraud, scammers will avoid trying to fool banks and security systems, instead opting for so-called “push payment” frauds where victims are tricked into directing their bank to pay the fraudster.

      So, if you’re seeking a remedy, there are at least four possible targets:

      1. the fraudster (who will often have disappeared)

      2. the social media platform that hosted the fake

      3. any bank that paid out the money on the instructions of the victim of the fraud

      4. the provider of the AI tool that created the fake.

      The quick answer is that once the fraudster vanishes, it is currently unclear whether you have a right to a remedy from any of these other parties (though that may change in the future).

      Let’s see why.


      The social media platform

      In principle, you could seek damages from a social media platform if it hosted a deepfake used to defraud you. But there are hurdles to overcome.

      Platforms typically frame themselves as mere conduits of content – which means they are not legally responsible for the content. In the United States, platforms are explicitly shielded from this kind of liability. However, no such protection exists in most other common law countries, including Australia.

      The Australian Competition and Consumer Commission (ACCC) is taking Meta (Facebook’s parent company) to court. The case tests whether digital platforms can be held directly liable for deepfake crypto scams if they actively target the ads at possible victims.

      The ACCC is also arguing Meta should be liable as an accessory to the scam – for failing to remove the misleading ads promptly once notified of the problem.

      At the very least, platforms should be responsible for promptly removing deepfake content used for fraudulent purposes. They may already claim to be doing this, but it might soon become a legal obligation.

      The bank

      In Australia, it is not yet settled whether a bank is legally obliged to reimburse you in the case of a deepfake scam.

      This was recently considered by the United Kingdom’s Supreme Court, in a case likely to be influential in Australia. It suggests banks don’t have a duty to refuse a customer’s payment instructions where the recipient is suspected to be a (deepfake) fraudster, even if they have a general duty to act promptly once the scam is discovered.

      That said, the UK is introducing a mandatory scheme that requires banks to reimburse victims of push payment fraud, at least in certain circumstances.

      In Australia, the ACCC and others have presented proposals for a similar scheme, though none exists at this stage.


      The AI tool provider

      The providers of generative AI tools are currently not legally obliged to make their tools unusable for fraud or deception. In law, there is no duty of care to the world at large to prevent someone else’s fraud.

      However, providers of generative AI do have an opportunity to use technology to reduce the likelihood of deepfakes. Like banks and social media platforms, they may soon be required to do this, at least in some jurisdictions.

      The recently proposed EU AI Act would obligate the providers of generative AI tools to design their tools in a way that allows synthetic or fake content to be detected.

      Currently, it’s proposed this could work through digital watermarking, although its effectiveness is still being debated. Other measures include prompt limits, digital ID to verify a person’s identity, and further education about the signs of deepfakes.


      Can we stop deepfake fraud altogether?

      None of these legal or technical guardrails are likely to be entirely effective in stemming the tide of deepfake fraud, scams or deception – especially as generative AI technology keeps advancing.

      However, the response doesn’t need to be perfect: slowing down AI-generated fakes and frauds can still reduce harm. We also need to keep pressure on platforms, banks and tech providers to stay on top of the risks.

      So while you might never be able to completely prevent yourself from becoming the victim of a deepfake scam, with all these new legal and technical developments you might soon be able to seek compensation if things go wrong.

      With audio, video and image deepfakes only growing more realistic, we need multi-layered strategies of prevention, education and compensation.

      Jeannie Marie Paterson, Professor of Law, The University of Melbourne

      This article is republished from The Conversation under a Creative Commons license. Read the original article.



      Tags: Deepfake, Cyber-Crime