      Explainers

      YouTube Algorithm Recommends Videos That Violate Its Own Policies: Study

      According to Mozilla, the data collected reveals how YouTube's recommendation algorithm pushes harmful, inappropriate and misleading content online.

By Archis Chowdhury
Published 8 July 2021, 9:14 PM IST

A recent study by the non-profit Mozilla Foundation found that video hosting giant YouTube's recommendation algorithm promotes disturbing and hateful content that often violates the platform's own content policies.

According to the study, 71 per cent of the videos that the volunteers reported as regrettable were found to be actively recommended by YouTube's algorithm. It also found that almost 200 videos recommended by YouTube to the volunteers were later removed, including several that the platform itself deemed to violate its policies; before being taken down, these videos had cumulatively garnered over 160 million views.

Mozilla collected data from 37,380 YouTube users who volunteered to share reports of regrettable experiences arising from following the recommendations of YouTube's algorithm. The data was provided through RegretsReporter, a browser extension and crowdsourced research project through which users could report these experiences.
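To make the crowdsourcing approach concrete, the sketch below shows what a single "regret report" record from such a browser extension might look like and how records of that kind could be aggregated. This is a hypothetical illustration only; the field names, types and sample values are assumptions, not Mozilla's actual RegretsReporter schema.

```typescript
// Hypothetical shape of a crowdsourced "regret report"; field names are
// illustrative assumptions, not Mozilla's actual RegretsReporter schema.
interface RegretReport {
  videoId: string;            // ID of the video the volunteer regretted watching
  reachedVia: "recommendation" | "search" | "direct"; // how the volunteer arrived at it
  reporterCountry: string;    // ISO country code of the volunteer
  primaryLanguageEnglish: boolean;
  reportedAt: string;         // ISO 8601 timestamp
}

// Share of reported videos reached through a recommendation, analogous to the
// study's finding that 71% of regretted videos were actively recommended.
function shareFromRecommendations(reports: RegretReport[]): number {
  const recommended = reports.filter(r => r.reachedVia === "recommendation").length;
  return reports.length === 0 ? 0 : recommended / reports.length;
}

// Example with made-up data.
const sample: RegretReport[] = [
  { videoId: "a1", reachedVia: "recommendation", reporterCountry: "BR", primaryLanguageEnglish: false, reportedAt: "2021-03-01T10:00:00Z" },
  { videoId: "b2", reachedVia: "search", reporterCountry: "US", primaryLanguageEnglish: true, reportedAt: "2021-03-02T11:30:00Z" },
  { videoId: "c3", reachedVia: "recommendation", reporterCountry: "DE", primaryLanguageEnglish: false, reportedAt: "2021-03-03T09:15:00Z" },
];

console.log(shareFromRecommendations(sample)); // ≈ 0.67 for this made-up sample
```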

      "YouTube needs to admit their algorithm is designed in a way that harms and misinforms people," Brandi Geurkink, Mozilla's Senior Manager of Advocacy, said in a blogpost uploaded by the foundation.

      "Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies. We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube's out-of-control recommendation algorithm," he added.

      YouTube Recommendations: A Rabbit Hole

      According to Mozilla, the data collected reveals how YouTube's recommendation algorithm pushes harmful, inappropriate and misleading content online.

Recommended videos were found to be 40 per cent more likely to be reported than videos the volunteers had searched for themselves. Furthermore, reported videos performed well on the platform, drawing 70 per cent more views per day than other videos watched by the volunteers.
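As a rough illustration of what these relative figures mean, the snippet below computes the same kind of ratios from made-up counts. Every number in it is an assumption chosen only to mirror the reported 40 per cent and 70 per cent gaps, not data from the study.

```typescript
// Illustrative arithmetic only: the values below are assumptions chosen so
// that the resulting ratios mirror the study's reported 40% and 70% gaps.
const searchedReportRate = 0.010;     // assumed: 1.0% of searched videos reported
const recommendedReportRate = 0.014;  // assumed: 1.4% of recommended videos reported

const otherViewsPerDay = 10_000;      // assumed daily views of a typical watched video
const reportedViewsPerDay = 17_000;   // assumed daily views of a reported video

// "40 per cent more likely to be reported"
console.log((recommendedReportRate / searchedReportRate - 1) * 100); // ≈ 40

// "70 per cent more views per day"
console.log((reportedViewsPerDay / otherViewsPerDay - 1) * 100);     // ≈ 70
```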

One of the volunteers was recommended a misogynistic video titled "Man humiliates feminist in viral video" after watching a video about the United States military. Another volunteer was watching a video about software rights when the algorithm recommended a video about gun rights. Yet another volunteer was recommended a highly sensationalised conspiracy theory video about former US President Donald Trump while watching an Art Garfunkel music video.

The study also found that the algorithm treats people unequally based on location. Countries where English is not a primary language had a 60 per cent higher rate of reporting than English-speaking countries.

This was found to be especially true for pandemic-related reports. Among reported videos in English, only 14 per cent were pandemic-related; among reported non-English videos, the figure was 36 per cent.

The study made the following recommendations on how YouTube and other platforms could improve their recommendation algorithms:

      • Platforms should publish frequent and thorough transparency reports that include information about their recommendation algorithms
      • Platforms should provide people with the option to opt-out of personalized recommendations
      • Platforms should create risk management systems devoted to recommendation AI
      • Policymakers should enact laws that mandate AI system transparency and protect independent researchers

      An Industry-Wide Issue

      This is not the first time that concerns have been raised about the algorithm of a big-tech company.

      In August 2020, advocacy group Avaaz did a year-long study of health-related misinformation on Facebook and found a similar pattern.

      Also Read: Health Misinformation Racked Up Billions Of Views On Facebook: Report

The report, titled "Facebook's Algorithm: A Major Threat to Public Health", noted that the top 10 websites spreading health misinformation drew four times as many views as equivalent content on the websites of the top 10 leading health institutions, such as the World Health Organisation (WHO) and the Centers for Disease Control and Prevention (CDC).

It revealed that Facebook's efforts to minimise the spread of such health-related misinformation were outpaced by its own algorithm's amplification of that content.

Tags: Mozilla Foundation, YouTube Algorithm, Recommendation