The blog post by YouTube’s Chief Product Officer Neal Mohan revealed that the platform removes up to 10 million videos every quarter. YouTube deletes the bulk of violating videos before they receive 10 views. The blog post added that “bad content” makes up only a small portion of the content on YouTube. “About .16-.18% of total views turn out to be content that violates our policies,” it read. YouTube said that it relies on expert opinions from health organizations such as the CDC and the WHO.
Battling misinformation remains a core focus for YouTube and other social media platforms
“Speedy removals will always be important but we know they’re not nearly enough. Instead, it’s how we also treat all the content we’re leaving up on YouTube that gives us the best path forward,” YouTube said. Over the past few years, YouTube has implemented mechanisms to screen content that violates its community guidelines. However, the platform has faced accusations of keeping some controversial videos up for too long to boost revenue. Addressing these concerns, the company said, “Not only have we found that this type of content doesn’t perform well on YouTube, it also erodes trust with viewers and advertisers.”

Keeping a check on misinformation is an endless job on a platform like YouTube, and the timely removal of misinformation reduces the chances of such content spreading to other platforms. This is a challenge also faced by Facebook, Instagram, and other social media platforms.

Earlier this year, a report from Pew Research Center revealed that YouTube is the most used social media platform in the U.S., with platforms like Instagram and TikTok not far behind. Part of this growth can be attributed to the coronavirus restrictions across the U.S. in 2020, which forced millions to stay indoors, though the Pew report noted, “This represents a broader trend that extends beyond the past two years.”