In an effort to rid the platform of violating content, YouTube has removed millions of videos that went against its guidelines in just three months.
In an official blog post published on Monday, the video-sharing platform revealed that it has removed more than eight million videos that violated the company’s content policies.
“The majority of these eight million videos were mostly spam or people attempting to upload adult content,” the blog post read. The videos were pulled down between October and December last year.
The information was part of the company’s first quarterly report on how it enforces its community guidelines. Since the company believes that a YouTube user joins a community of people from around the globe, following certain guidelines will make ‘YouTube fun and enjoyable for everyone.’ Harassing, hateful, sexual, and violent content are among the types not allowed on YouTube.
The platform’s moderation report comes amid claims that YouTube has been unable to tackle abusive and extremist content. Earlier this year, the YouTube Kids app was widely criticized for hosting age-inappropriate content.
According to The Guardian, parent company Google has pledged to have about 10,000 people working on enforcing YouTube’s community guidelines by the end of this year.
In other news, YouTube just became a teenager: the very first video was published on April 23, 2005, the same year the company was founded.