YouTube, a unit of Alphabet Inc's Google, removed more than 58 million videos and 224 million comments during the third quarter for violations of its policies, Reuters reported.
YouTube, Facebook and other social media services are under pressure to quickly identify and remove extremist and hateful content.
The European Union has proposed that online services face steep fines unless they remove extremist material within one hour of a government order to do so.
YouTube said most of the removed content was spam.
Automated detection tools help YouTube identify spam, extremist content and nudity. During September, 90 percent of the nearly 10,400 videos removed for violent extremism, and of the 279,600 videos removed over child safety issues, received fewer than 10 views, according to YouTube.
For other policy areas, automated detection is newer and less effective, so YouTube relies on users to report potentially problematic videos or comments. Such content may therefore be viewed widely before it is removed.
Google added thousands of moderators this year, expanding its total to more than 10,000, in hopes of reviewing user reports faster. It has described pre-screening every video as infeasible.
YouTube also removed about 1.67 million channels, along with all 50.2 million videos that were available on them.
Nearly 80 percent of the channel takedowns related to spam uploads, YouTube said. About 13 percent concerned nudity, and 4.5 percent child safety.
YouTube said users post billions of comments each quarter. It declined to disclose the overall number of accounts that have uploaded videos, but said the removals represented a small fraction of the total.
In addition, about 7.8 million videos were removed individually for policy violations, in line with the previous quarter.