YouTube to remove COVID-19 vaccine misinformation videos

YouTube has announced it will ban videos containing misinformation about COVID-19 vaccines from its platform, expanding its current rules against conspiracy theories about the pandemic.

The video platform said it will remove videos that spread vaccine misinformation and will now ban any content making claims about COVID-19 vaccines that contradict the consensus of local health authorities or the World Health Organization (WHO).

The new rules are a planned extension of YouTube’s existing COVID-19 Medical Misinformation Policy, which already prohibits videos that falsely suggest the coronavirus doesn’t exist or that it is not contagious. Such misinformation, along with unproven alternative remedies, can be dangerous.

YouTube noted that it already removes content that disputes the existence or transmission of COVID-19, endorses medically unsubstantiated methods of treatment, discourages people from seeking medical help, or openly contradicts health authorities’ guidance on self-isolation and social distancing.

YouTube says it has already removed more than 200,000 videos containing COVID-19 misinformation since early February.

Farshad Shadloo, a YouTube spokesman, said that with a COVID-19 vaccine in the pipeline and likely coming soon, the company wants the right policies in place to remove misinformation related to the vaccine. That includes false claims that vaccines cause infertility or that they implant microchips in people’s bodies, both of which are untrue.

Andy Pattison, manager of digital solutions at the World Health Organization, said that the WHO meets weekly with YouTube’s policy team to discuss content trends and potentially problematic videos. He added that the WHO was encouraged by YouTube’s announcement on coronavirus vaccine misinformation.

Facebook announced a similar crackdown on anti-vaccination content, saying it will no longer allow ads that discourage vaccination. “We don’t want these ads on our platform,” Facebook said in a statement. However, organic posts from anti-vaccine groups will still be permitted.

While researchers and drugmakers are working on various treatments, vaccines remain essential to the long-term fight against the new coronavirus, which has already caused more than a million deaths, infected about 38 million people across the globe, and crippled economies worldwide. Both YouTube’s and Facebook’s latest announcements indicate a move in the right direction.

The COVID-19 pandemic has not only exposed flaws in the way social media platforms handle misinformation but has also highlighted how difficult it is for medical institutions around the world to define misinformation, with organizations like the WHO revising various COVID-19 claims as study after study revealed different views on the virus. Due to this uncertainty, YouTube’s policy earlier in the pandemic called for removing videos with medically unsubstantiated claims.

YouTube said it would announce further steps in the coming weeks to promote reliable information about COVID-19 vaccines on the platform.