      Explainers

      What Does Biden's Executive Order On AI Mean?

      US President Joe Biden has issued a comprehensive executive order on artificial intelligence, covering a range of AI-related issues, from deepfake prevention to job security.

By The Conversation | Published: 31 Oct 2023, 7:02 PM IST

      Toby Walsh, UNSW Sydney

On Monday, US President Joe Biden released a wide-ranging and ambitious executive order on artificial intelligence (AI), catapulting the US to the front of conversations about regulating AI.

In doing so, the US is leapfrogging other states in the race to govern AI. Europe previously led the way with its AI Act, which was passed by the European Parliament in June 2023 but which won’t take full effect until 2025.

The presidential executive order is a grab bag of initiatives for regulating AI: some are good, while others seem rather half-baked. It aims to address harms ranging from the immediate, such as AI-generated deepfakes, through intermediate harms such as job losses, to longer-term harms such as the much-disputed existential threat AI may pose to humans.


      Biden’s ambitious plan

The US Congress has been slow to pass significant regulation of big tech companies. This presidential executive order is likely both an attempt to sidestep an often-deadlocked Congress and to kick-start action. For example, the order calls upon Congress to pass bipartisan data privacy legislation.

      Bipartisan support in the current climate? Good luck with that, Mr President.

      The executive order will reportedly be implemented over the next three months to one year. It covers eight areas:

      1. safety and security standards for AI
      2. privacy protections
      3. equity and civil rights
      4. consumer rights
      5. jobs
      6. innovation and competition
      7. international leadership
      8. AI governance.

      On one hand, the order covers many concerns raised by academics and the public. For example, one of its directives is to issue official guidance on how AI-generated content may be watermarked to reduce the risk from deepfakes.


      It also requires companies developing AI models to prove they are safe before they can be rolled out for wider use. President Biden said:

      that means companies must tell the government about the large scale AI systems they’re developing and share rigorous independent test results to prove they pose no national security or safety risk to the American people.

      AI’s potentially disastrous use in warfare

      At the same time, the order fails to address a number of pressing issues. For instance, it doesn’t directly address how to deal with killer AI robots, a vexing topic that was under discussion over the past two weeks at the General Assembly of the United Nations.

      This concern shouldn’t be ignored. The Pentagon is developing swarms of low-cost autonomous drones as part of its recently announced Replicator program. Similarly, Ukraine has developed homegrown AI-powered attack drones that can identify and attack Russian forces without human intervention.

      Could we end up in a world where machines decide who lives or dies? The executive order merely asks for the military to use AI ethically, but doesn’t stipulate what that means.

      And what about protecting elections from AI-powered weapons of mass persuasion? A number of outlets have reported on how the recent election in Slovakia may have been influenced by deepfakes. Many experts, myself included, are also concerned about the misuse of AI in the upcoming US presidential election.

Unless strict controls are implemented, we risk living in an age where nothing you see or hear online can be trusted. If this sounds like an exaggeration, consider that the US Republican Party has already released a campaign advert that appears to be entirely generated by AI.


      Missed opportunities

      Many of the initiatives in the executive order could and should be replicated elsewhere, including Australia. We too should, as the order requires, provide guidance to landlords, government programs and government contractors on how to ensure AI algorithms aren’t being used to discriminate against individuals.

We should also, as the order requires, address algorithmic discrimination in the criminal justice system, where AI is increasingly being used in high-stakes settings, including sentencing, parole and probation, pre-trial release and detention, risk assessments, surveillance and predictive policing, to name a few.

      AI has controversially been used for such applications in Australia, too, such as in the Suspect Targeting Management Plan used to monitor youths in New South Wales.

Perhaps the most controversial aspect of the executive order is that which addresses the potential harms of the most powerful so-called “frontier” AI models. Some experts believe these models – which are being developed by companies such as OpenAI, Google and Anthropic – pose an existential threat to humanity.

      Others, including myself, believe such concerns are overblown and might distract from more immediate harms, such as misinformation and inequity, that are already hurting society.


Biden’s order invokes extraordinary war powers (specifically the 1950 Defense Production Act, introduced during the Korean War) to require companies to notify the federal government when training such frontier models. It also requires that they share the results of “red-team” safety tests, wherein internal hackers use attacks to probe software for bugs and vulnerabilities.

      I would say it’s going to be difficult, and perhaps impossible, to police the development of frontier models. The above directives won’t stop companies developing such models overseas, where the US government has limited power. The open source community can also develop them in a distributed fashion – one which makes the tech world “borderless”.

The executive order will likely have the greatest impact on the government itself, and how it goes about using AI, rather than on businesses.

Nevertheless, it’s a welcome piece of action. UK Prime Minister Rishi Sunak’s AI Safety Summit, taking place over the next two days, now looks to be somewhat of a diplomatic talkfest in comparison.

It does make one envious of the presidential power to get things done.

      Toby Walsh, Professor of AI, Research Group Leader, UNSW Sydney

      This article is republished from The Conversation under a Creative Commons license. Read the original article.



Tags

#US President, #Deepfake, #Joe Biden, #Facebook