YouTube has taken strict action to remove videos from its platform that spread misinformation about the COVID-19 vaccine, including videos that presented false data about the epidemic. According to The Verge, the video-sharing platform announced the move on Wednesday, stating that any video contradicting information from health experts or the World Health Organization will be removed. So far, more than 30 million people worldwide have been infected with this dangerous virus, and more than 1 million have died.

Farshad Shadloo, a YouTube spokesperson, said: “A COVID-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a COVID-19 vaccine.”

According to YouTube’s official blog post, the banned videos claimed that people were being killed by the COVID vaccine, that the vaccine makes women infertile, and that a microchip is implanted in the bodies of those who receive it. YouTube said it has already removed 2 lakh (200,000) videos from the platform. All of these videos spread false information related to the corona infection, including promoting unproven and harmful treatments in place of genuine medical care.

According to the Indian Council of Medical Research (ICMR), 9,12,26,305 (over 9.12 crore) corona sample tests have been conducted in the country so far. In the last 24 hours, 11,36,183 people were tested.

Andy Pattison, manager of digital solutions at the World Health Organization, told Reuters that the WHO meets weekly with the policy team at YouTube to discuss content trends and potentially problematic videos. Pattison said the WHO was encouraged by YouTube’s announcement on coronavirus vaccine misinformation.

Facebook has also adopted a new policy to stop misleading COVID anti-vaccination messages. The company said:

“Our goal is to help messages about the safety and efficacy of vaccines reach a broad group of people, while prohibiting ads with misinformation that could harm public health efforts.”

Unsah Malik, a social media advisor at Facebook, said: “The volume of content defined as misinformation overrides the number of employees to oversee such things, or (the automated) functionalities the platforms have. We should probably have stronger consequences for those who publish misinformation – make it unlawful and fine people.”