YouTube on Wednesday banned the accounts of some of the most prominent anti-vaccine activists, as part of its efforts to rein in content making false claims about all approved vaccines.
This latest move by the Google-owned video sharing platform is an expansion of its earlier efforts to remove misinformation around COVID-19 vaccines, which has been rampant since last year. In March, the company said it had removed more than 30,000 videos that provided misleading information on COVID-19 vaccines.
In its latest step, the platform has banned the accounts of Robert Kennedy Jr, Joseph Mercola and Christiane Northrup, three highly popular anti-vaccine activists whose videos have gone viral among anti-vaccine groups.
Is YouTube A Little Late?
Last year, YouTube implemented a ban on videos that misled people on the COVID-19 vaccine, a step which has since led to the removal of 130,000 videos from the platform. Overall, the platform removed a total of 1 million videos for spreading general mis/disinformation around the pandemic.
"Today's policy update is an important step to address vaccine and health misinformation on our platform, and we'll continue to invest across the board in the policies and products that bring high quality information to our viewers and the entire YouTube community," the company said in a blog post.
While YouTube's actions have been welcomed by many, misinformation researchers believe the move may have come too late and that the damage has largely been done.
"You create this breeding ground and when you deplatform it doesn't go away, they just migrate," Hany Farid, a computer science professor and misinformation researcher at the University of California at Berkeley told the Washington Post. "This is not one that should have been complicated. We had 18 months to think about these issues, we knew the vaccine was coming, why was this not the policy from the very beginning?"
Matt Halprin, YouTube's vice president of global trust and safety, told the Post that the company had previously focused solely on removing misleading content targeting COVID-19 vaccines. "Developing robust policies takes time," Halprin said. "We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge."
"Fact Checkers Have A Role To Play"
Cristina Tardáguila, the founder of Brazilian fact-checking website Agência Lupa, and senior program director at the International Center for Journalists, sees both good and bad in YouTube's latest steps.
"The good thing is that it seems like the platform is facing this issue as an important thing, they are finally and globally communicating with each other inside the company to figure a way to combat mis/disinformation on its platform, which is a huge step," Tardáguila told BOOM.
However, she believes there is an issue with YouTube's approach to tackling the problem: it is too reliant on national authorities and has yet to take fact-checkers seriously. Tardáguila recently attended a global press conference organised by YouTube, where the issue of tackling anti-vaccine content on its platform was the central topic of discussion.
"In the conference they said they would rely on national health authorities, and in the World Health Organisation, to flag anti-vaccine falsehoods on YouTube. That is troublesome, at least in some parts of the planet," she said. "Because national health organisations, for example in Brazil, are mainly composed of anti-vaxxers. The health ministry in Brazil, for example, pushes the idea of taking ivermectin, and not using mask. So even though it is a national authority, it has a very complicated and controversial position," said Tardáguila.
During the conference, Tardáguila asked YouTube representatives whether fact-checkers would have a role to play. The company's representatives assured her that they would address her questions, but Tardáguila never received a response from the company throughout the week.
Meanwhile, Indian anti-vaccine activists have started using new methods to evade YouTube's anti-vaccine policing, such as coded words for terms like vaccine and coronavirus. In some viral anti-vaccine videos, the phrase paneer tikka is used to denote vaccines, while trending beemari denotes coronavirus.
YouTube would have to customise its search for such anti-vaccine content according to the local, and sometimes hyper-local, context used in such videos to mislead people.