      Decode

      Why Using AI And Deepfake To Recreate History Is Problematic

      AI is making it easier than ever to reanimate the past. Will that change how we understand history and, as a result, ourselves?

      By The Conversation | 3 Nov 2021 4:48 PM IST
      To mark Israel's Memorial Day in 2021, the Israel Defense Forces musical ensembles collaborated with a company that specializes in synthetic videos, also known as "deepfake" technology, to bring photos from the 1948 Israeli-Arab war to life.


      They produced a video in which young singers clad in period uniforms and carrying period weapons sang "Hareut", an iconic song commemorating soldiers killed in combat. As they sing, the musicians stare at faded black-and-white photographs they hold. The young soldiers in the old pictures blink and smile back at them, thanks to artificial intelligence.

      The result is uncanny. The past comes to life, Harry Potter style.

      For the past few years, my colleagues and I at UMass Boston's Applied Ethics Center have been studying how everyday engagement with AI challenges the way people think about themselves and politics. We've found that AI has the potential to weaken people's capacity to make ordinary judgments. We've also found that it undermines the role of serendipity in their lives and can lead them to question what they know or believe about human rights.

      Now AI is making it easier than ever to reanimate the past. Will that change how we understand history and, as a result, ourselves?

      Musicians dressed as soldiers connect with soldiers in old photographs in a 2021 production by the Israel Defense Forces and the artificial intelligence company D-ID.

      Low financial risk, high moral cost

      The desire to bring the past back to life in vivid fashion is not new. Civil War or Revolutionary War re-enactments are commonplace. In 2018, Peter Jackson painstakingly restored and colorized World War I footage to create "They Shall Not Grow Old," a film that allowed 21st-century viewers to experience the Great War more immediately than ever before.

      Live re-enactments and carefully processed historical footage are expensive and time-consuming undertakings. Deepfake technology democratizes such efforts, offering a cheap and widely available tool for animating old photos or creating convincing fake videos from scratch.

      But as with all new technologies, alongside the exciting possibilities are serious moral questions. And the questions get even trickier when these new tools are used to enhance understanding of the past and reanimate historical episodes.

      The 18th-century writer and statesman Edmund Burke famously argued that society is a "partnership not only between those who are living, but between those who are living, those who are dead, and those who are to be born." Political identity, in his view, is not simply what people make of it. It is not merely a product of our own fabrication. Rather, to be part of a community is to be part of a compact between generations – part of a joint enterprise connecting the living, the dead and those who will live in the future.

      If Burke is right to understand political belonging this way, deepfake technology offers a powerful way to connect people to the past, to forge this intergenerational contract. By animating it in a convincing way, the technology enlivens the "dead" past and makes it feel vivid and present. If these images spur empathy and concern for ancestors, deepfakes can make the past matter a lot more.

      But this capability comes with risk. One obvious danger is the creation of fake historical episodes. Imagined, mythologized and fake events can precipitate wars: a storied 14th-century defeat in the Battle of Kosovo still inflames Serbian anti-Muslim sentiments, even though nobody knows if the Serbian coalition actually lost that battle to the Ottomans.

      Similarly, the second Gulf of Tonkin attack on American warships on Aug. 4, 1964, was used to escalate American involvement in Vietnam. It later turned out the attack never happened.

      An atrophying of the imagination

      It used to be difficult and expensive to stage fake events. Not any more.

      Imagine, for example, what strategically doctored deepfake footage from the January 6 events in the United States could do to inflame political tensions or what fake video from a Centers for Disease Control and Prevention meeting appearing to disparage COVID-19 vaccines would do to public health efforts.

      The upshot, of course, is that deepfakes may gradually destabilize the very idea of a historical "event". Perhaps over time, as this technology advances and becomes ubiquitous, people will automatically question whether what they are seeing is real.

      Whether this will lead to more political instability or, paradoxically, to more stability as a result of hesitancy to act on the basis of possibly fabricated occurrences, is an open question.

      But beyond anxieties about the wholesale fabrication of history, there are subtler consequences that worry me.

      Yes, deepfakes let us experience the past as more alive and, as a result, may increase our sense of commitment to history. But does this use of the technology carry the risk of atrophying our imagination – providing us with ready-made, limited images of the past that will serve as the standard associations for historical events? An exertion of the imagination can render the horrors of World War II, the 1906 San Francisco earthquake or the 1919 Paris Peace Conference in endless variations.

      But will people keep exerting their imagination in that way? Or will deepfakes, with their lifelike, moving depictions, become the practical stand-ins for history? I worry that animated versions of the past might give viewers the impression that they know exactly what happened – that the past is fully present to them – which will then obviate the need to learn more about the historical event.

      People tend to think that technology makes life easier. But they don't realize that their technological tools always remake the toolmakers – causing existing skills to deteriorate even as they open up unimaginable and exciting possibilities.

      The advent of smartphones meant photos could be posted online with ease. But it's also meant that some people don't experience breathtaking views as they used to, since they're so fixated on capturing an "instagrammable" moment. Nor is getting lost experienced the same way since the ubiquity of GPS. Similarly, AI-generated deepfakes are not just tools that will automatically enhance our understanding of the past.

      Nevertheless, this technology will soon revolutionize society's connection to history, for better and worse.

      People have always been better at inventing things than at thinking about what the things they invent do to them – "always adroiter with objects than lives," as the poet W.H. Auden put it. This incapacity to imagine the underside of technical achievements is not destiny. It is still possible to slow down and think about the best way to experience the past.

      Author: Nir Eisikovits, Associate Professor of Philosophy and Director, Applied Ethics Center, University of Massachusetts Boston

      This article is republished from The Conversation under a Creative Commons license. Read the original article.

      Tags: deepfake, Facebook