      Decode

      From Inception To Identification: What Is A Deepfake And How To Detect Them?

      Explore the alarming rise of deepfake technology: uncover the origins of deepfakes, learn how to identify them, and understand the legal and technological measures in place to combat this growing digital threat.

      By Hera Rizwan | 10 Nov 2023 3:38 PM IST

      Earlier this week, a viral video featuring actress Rashmika Mandanna circulated on social media, sparking a blend of shock, surprise, and horror among netizens. The original video was of a British Indian influencer named Zara Patel; it had been manipulated using deepfake technology.

      The actress took to social media to express her dismay and astonishment at the circulating video. In her post on X, Mandanna wrote, "Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused."

      Meanwhile, Decode found that X (formerly Twitter) is full of deepfake videos of Indian actresses.

      After the incident, the Ministry of Electronics and Information Technology promptly issued an advisory to social media platforms, mandating them to remove such content within 36 hours of receiving a report from either a user or a government authority. Citing Section 66D of the Information Technology Act, 2000, it noted that cheating by personation using a computer resource is punishable with imprisonment of up to three years and a fine of up to Rs 1 lakh.

      But how did it all begin? As this technology becomes more common and convincing, BOOM delves into the concerning surge of deepfake technology, exploring its inception, methods of detection, and the legal and technological safeguards in place to counteract this expanding digital menace.

      Also Read: X Is Full Of Deepfake Porn Videos Of Actresses; You May Be Targeted Next

      What is a deepfake?

      A deepfake is a form of synthetic media in which a person's likeness in an image or video is replaced with that of another individual.

      The term "deepfake" originated in late 2017, coined by a Reddit user with the same name. This user established a platform on the online news and aggregation site, sharing explicit videos utilising open-source face-swapping technology.

      In September 2019, the artificial intelligence company Deeptrace identified nearly 15,000 online deepfake videos, marking a nearly twofold increase over a nine-month period. An astonishing 96% of these videos had pornographic content, with 99% of them superimposing the faces of female celebrities onto adult film performers.

      While the term "deepfake" was coined in 2017, the technology itself has roots that extend further into the past. The creation of lifelike fake portraits was made possible by the development of generative adversarial networks (GANs) in 2014. These networks pit two AI agents against each other: one generates an image, while the other tries to identify the fake. Each time the detecting agent uncovers a forgery, the AI forger adjusts and enhances its capabilities.
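
      To make that forger-versus-detective loop concrete, here is a minimal GAN training sketch in Python using PyTorch. The layer sizes, image dimensions and hyperparameters are illustrative assumptions, not the architecture of any actual deepfake tool.

      ```python
      # Minimal GAN sketch: a generator ("forger") and a discriminator
      # ("detective") trained against each other. All sizes are illustrative.
      import torch
      import torch.nn as nn

      latent_dim, image_dim = 64, 28 * 28  # assumed: flattened 28x28 images

      # The forger: maps random noise to a synthetic image.
      generator = nn.Sequential(
          nn.Linear(latent_dim, 256), nn.ReLU(),
          nn.Linear(256, image_dim), nn.Tanh(),
      )

      # The detective: scores an image as real (1) or fake (0).
      discriminator = nn.Sequential(
          nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
          nn.Linear(256, 1), nn.Sigmoid(),
      )

      bce = nn.BCELoss()
      g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
      d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

      def train_step(real_images: torch.Tensor) -> None:
          batch = real_images.size(0)
          fake_images = generator(torch.randn(batch, latent_dim))

          # 1) Teach the detective to separate real from fake.
          d_opt.zero_grad()
          d_loss = (bce(discriminator(real_images), torch.ones(batch, 1)) +
                    bce(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
          d_loss.backward()
          d_opt.step()

          # 2) Teach the forger to fool the detective: every caught forgery
          # produces a gradient that tells the forger how to improve.
          g_opt.zero_grad()
          g_loss = bce(discriminator(fake_images), torch.ones(batch, 1))
          g_loss.backward()
          g_opt.step()
      ```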

      An X post by Ian Goodfellow, currently a research scientist at Google DeepMind, showed the development of the technology over the last few years. Back in 2014, he and his colleagues published the paper that introduced GANs for the first time.

      Subsequently, deepfakes began finding acceptance in a wider creative industry setting, used to weave believable stories set in the past, the future, or a modified present. In India too, the advertising industry has been leveraging the technology for the past few years.

      Also Read: Once A Student Politician, This Man Is Teaching How To Make Deepfakes

      How can we spot deepfakes?

      Speaking to BOOM, Jaspreet Bindra, managing director and founder of The Tech Whisperer, listed ways of spotting deepfakes, which he feels are becoming "incredibly realistic". "To detect deepfakes, look for inconsistencies in the imagery. Facial features that don't align properly, lighting that seems off, or irregular blinking patterns can be tell-tale signs," he said.

      Audio-visual mismatches are also red flags, where the tone and cadence of the voice may not match the person’s usual speech patterns, he added.
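
      Some of these cues can even be checked programmatically. The Python sketch below estimates a clip's blink rate from the eye aspect ratio (EAR), which drops sharply when the eye closes; an unnaturally low blink rate is one heuristic flag for synthetic footage. The six-point eye landmarks are assumed to come from an upstream face-landmark library, and the threshold is an illustrative assumption.

      ```python
      # Blink-rate heuristic from the eye aspect ratio (EAR). Landmark
      # extraction (six (x, y) points per eye, per frame) is assumed to
      # happen upstream with a face-landmark library; values are illustrative.
      import numpy as np

      def eye_aspect_ratio(eye: np.ndarray) -> float:
          """eye: array of shape (6, 2), landmark points around one eye."""
          vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
          horizontal = np.linalg.norm(eye[0] - eye[3])
          return vertical / (2.0 * horizontal)

      def blinks_per_minute(ear_per_frame: list[float], fps: float,
                            threshold: float = 0.21) -> float:
          """Count dips of EAR below the threshold as blinks."""
          blinks, in_blink = 0, False
          for ear in ear_per_frame:
              if ear < threshold and not in_blink:
                  blinks, in_blink = blinks + 1, True
              elif ear >= threshold:
                  in_blink = False
          minutes = len(ear_per_frame) / fps / 60.0
          return blinks / minutes if minutes else 0.0

      # Humans typically blink around 15-20 times a minute; a subject who
      # blinks far less often than that is worth a closer look.
      ```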

      According to Bindra, one should be wary if any video sounds too sensational or outlandish to be true. "One must question who put this out into the world – a reliable source or a notorious fake news factory?"

      The proliferation of deepfakes, as Bindra puts it, is a stark reminder of the dual-edged nature of technology. "It holds a mirror to society, reflecting our potential for creation and destruction," he said.

      Just as machine learning and AI are being used to create deepfakes, they are also being used to combat them. "Companies like Deeptrace are pioneering software to identify deepfakes by analysing shadows and reflections that are not congruent with physics. Additionally, platforms like Facebook and X are also implementing policies to flag, and sometimes remove, deceptive deepfake content," he said.

      On the technological front, Bindra says, embracing blockchain offers a solution: media can be verified and traced back to its origin.
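
      A minimal sketch of that provenance idea follows, assuming a publisher registers a cryptographic hash of each media file at release time. The ledger here is a plain Python dict standing in for a blockchain or any append-only registry; a real system would also sign and timestamp each entry.

      ```python
      # Toy media-provenance registry: a dict stands in for the blockchain.
      import hashlib

      ledger: dict[str, str] = {}  # content hash -> registered source

      def _sha256(path: str) -> str:
          with open(path, "rb") as f:
              return hashlib.sha256(f.read()).hexdigest()

      def register(path: str, source: str) -> str:
          """Publisher records the file's hash at release time."""
          digest = _sha256(path)
          ledger[digest] = source
          return digest

      def verify(path: str) -> str | None:
          """Return the registered source, or None if the file is unknown
          or altered -- changing even one byte yields a different hash."""
          return ledger.get(_sha256(path))
      ```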

      Also Read: Video Of Jordan's Queen Rania Supporting Israel Is A Deepfake

      How can we stay informed and vigilant about deepfake threats?

      Here are some tips we can follow to keep ourselves and others safe against the proliferation of this misinformation:

      • Double-check the source. Look for the same story across different media outlets to verify authenticity.
      • Avoid sharing unverified information.
      • Always approach content with a critical mind. If it seems off, there's a good chance it might be.
      • Tighten your online privacy settings. The less data you have out there, the harder it is for someone to create a deepfake of you.

      According to Bindra, apart from us being savvy consumers of media and questioning the authenticity of suspicious content, there also needs to be a robust legal framework that penalises the malicious creation and distribution of deepfakes.

      He said, "The IT Act does exist as a recourse for misinformation. However, we require a stronger Act in terms of a stronger penalty with an exemplary punishment to deter such actions. Deepfakes can be very harmful not only from a pornographic viewpoint but also from elections and events of war or communal events."

      Lastly, Bindra suggested that it should be mandated that anyone using an AI model to produce an image or information must disclose it. "People must be made aware of Classifiers – software which can detect AI-generated content – and widespread use of the same, much like antivirus," he added.
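
      In practice, such a classifier could be run much like an antivirus scan. The sketch below uses Hugging Face's transformers pipeline API; the model name is a hypothetical placeholder, since any image classifier fine-tuned on real-versus-generated data would slot in the same way.

      ```python
      # Running a (hypothetical) AI-image detector over a suspicious frame.
      from transformers import pipeline

      detector = pipeline(
          "image-classification",
          model="some-org/ai-image-detector",  # hypothetical model name
      )

      for result in detector("suspicious_frame.jpg"):
          print(f"{result['label']}: {result['score']:.2%}")
      ```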

      Also Read: Meta Limits Political Campaigns' Access To Its AI Ads Tools. What Are These Tools?


      Tags: Deepfake, Misinformation, Reddit