In a blog post, the company said a “large increase in hate speech removals” drove the purge of 500 million comments — double the number of comments removed during Q1.
Another 100,000 videos were deleted for hate speech violations, along with 17,000 channels terminated for breaking the same rules; both figures represented a “5x spike” from the prior quarter.
The sharp increase comes after YouTube broadened its hate speech policy in June to block videos “alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.”
The new rule covered videos that “glorify Nazi ideology, which is inherently discriminatory,” YouTube said, as well as videos denying events like the Holocaust or the Sandy Hook Elementary shooting.
The update expanded YouTube’s existing policy against hate speech, which banned content promoting violence or hatred against people based on a range of factors, including ethnicity, gender identity, immigration status and religion.
Last week, YouTube chief Susan Wojcicki said the site remained dedicated to “openness” — even for “offensive” content — but that it would continue to fight against content that violates its rules.