According to YouTube Chief Product Officer Neal Mohan, YouTube has removed more than 1 million videos containing dangerous COVID-19 misinformation since February 2020.
Mohan shared the figure in a blog post about how the company combats misinformation on its platform. “Misinformation has moved from the margins to the mainstream,” he wrote. “No longer confined to the sealed-off worlds of Holocaust deniers or 9/11 truthers, it now stretches into every facet of society, sometimes tearing through communities with blistering speed.”
At the same time, YouTube executives have argued that “bad content” makes up only a small fraction of what’s on the platform. “Bad content is just a small part of the billions of videos on YouTube,” Mohan wrote, adding that YouTube removes about 10 million videos every quarter, “the majority of which do not even reach 10 views.”
Facebook has recently made a similar argument about the content on its platform. The social network published a report last week claiming that its most popular posts were memes and other lighthearted content. The company, which has been criticized for its handling of COVID-19 misinformation, has argued that vaccine misinformation isn’t representative of the content most users actually see.
Both Facebook and YouTube have come under particular scrutiny for their policies on health misinformation during the pandemic. Each platform has more than a billion users, which means even a small amount of misleading content can have a far-reaching effect. And so far, both have declined to share details about how widely vaccine and health misinformation spreads or how many users encounter it. Mohan also said that removing misinformation is just one part of the company’s approach; YouTube is also working to raise up information from reliable sources and reduce the spread of videos containing harmful misinformation.
Editor’s note: This post was originally published on Engadget.