      Explainers

      Hallucination, Misinformation: Sundar Pichai Cautions Against AI's Potential Harms

      In an interview with Scott Pelley on CBS's 60 Minutes, Google CEO Sundar Pichai delved into the potential harms of AI and called for new regulations to govern it.

      By Hera Rizwan | Published 19 April 2023, 4:52 PM IST

      In an interview on CBS’s 60 Minutes, Google CEO Sundar Pichai warned that society must brace for the rapid development of AI. “We need to adapt as a society for it,” said Pichai, adding that the jobs most affected by AI would be those of “knowledge workers,” such as writers, accountants, architects and software engineers.

      Pichai voiced his concerns about the fast-growing artificial intelligence technology, which he said could be “very harmful” if deployed wrongly. These concerns, he said, keep him awake at night.

      Here are the key takeaways from Pichai's interview with Scott Pelley, which aired on Sunday.

      Barrage of disinformation-

      According to Pichai, AI could cause great harm because of its ability to produce disinformation. “It will be possible with AI to create a video easily. Where it could be Scott saying something, or me saying something, and we never said that. And it could look accurate. But on a societal scale, you know, it can cause a lot of harm.”

      AI can also create believable fake news, images, and videos that could harm society on a large scale.

      The problem of hallucinations with chatbots-

      Pichai talked at length about Google's AI chatbot Bard during the interview. Unlike Google Search, Bard does not look up answers online but generates responses based on its language model. Despite Bard's capabilities, the underlying AI technology poses several challenges. Hallucination refers to an AI model's tendency to confidently invent information. For instance, in an article it wrote about inflation, Bard suggested five books that do not exist.

      Pichai noted, “No one in the field has yet solved the hallucination problems. All models do have this as an issue.”
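      The kind of failure Pichai describes, a model confidently listing books that were never published, is why generated citations generally need to be verified against an external source rather than taken on trust. The sketch below is purely illustrative: the catalogue, titles and helper names are hypothetical and are not part of Bard or any Google API.

      # Minimal sketch: checking model-suggested book titles against a trusted catalogue.
      # Everything here is illustrative; a real check would query a bibliographic
      # source such as a library or ISBN database instead of a hard-coded set.
      KNOWN_TITLES = {
          "the great inflation",      # assumed to be a real, catalogued title
          "inflation targeting",      # assumed to be a real, catalogued title
      }

      def is_verified(title: str) -> bool:
          """Return True only if the title appears in the trusted catalogue."""
          return title.strip().lower() in KNOWN_TITLES

      def unverified_titles(suggestions: list[str]) -> list[str]:
          """Return the model-suggested titles that could not be verified."""
          return [t for t in suggestions if not is_verified(t)]

      # A model "confidently" suggests three titles, one of which is invented.
      model_output = ["The Great Inflation", "Inflation Targeting", "Five Easy Steps to End Inflation"]
      print("Could not verify:", unverified_titles(model_output))

      The point is not this particular check but that a model's confidence carries no information about whether its citations are real; verification has to come from outside the model.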

      Evolving workforce-

      Pichai said AI will have an impact across all industries and on every product produced by every company. Citing an example, Pichai said, "You could be a radiologist. If you think about five to 10 years from now, you’re gonna have an AI collaborator with you. Let’s say you have 100 things to go through. It may say, ‘These are the most serious cases you need to look at first.’ Or when you’re looking at something, it may pop up and say, ‘You may have missed something important.’”

      While new job categories may emerge and many existing jobs will change as a result of the incorporation of AI technology, other jobs may decline. "Knowledge workers" like authors, accountants, and software engineers may be impacted by the evolving AI technology, according to Pichai.

      Tapping into AI's potential-

      AI's "emergent properties" alludes to them demonstrating surprising abilities for which they are not trained yet. For instance, despite not having been taught Bengali, a Google AI programme was able to comprehend the language.

      Talking about it, Pichai said, "Of the AI issues we talked about, the most mysterious is called emergent properties. Some AI systems are teaching themselves skills that they weren’t expected to have. How this happens is not well understood. For example, one Google AI programme adapted, on its own, after it was prompted in the language of Bangladesh, which it was not trained to know.”

      The company aims to balance AI’s potential advantages and risks by rolling out the technology gradually, letting society adjust and offer feedback.

      Global regulatory frameworks-

      Pichai also called for a global regulatory framework for AI comparable to the treaties governing the use of nuclear weapons, warning that the race to develop technological improvements could cause safety issues to be ignored.

      According to the Google CEO, society must quickly adapt with regulation, laws to punish abuse, and treaties among nations to make AI safe for the world. The rules, he said, must "align with human values, including morality.”

      Also Read: 'CEO Scam' Returns On WhatsApp: What It Is And How To Protect Yourself








      Tags: Sundar Pichai