YouTube says that the changes it made in June to broaden its hate speech policies have resulted in a significant increase in the removal of problematic videos from its platform, Axios reports.
In June, YouTube said, “Over the past few years, we’ve been investing in the policies, resources and products needed to live up to our responsibility and protect the YouTube community from harmful content. This work has focused on four pillars: removing violative content, raising up authoritative content, reducing the spread of borderline content and rewarding trusted creators. Thanks to these investments, videos that violate our policies are removed faster than ever and users are seeing less borderline content and harmful misinformation. As we do this, we’re partnering closely with lawmakers and civil society around the globe to limit the spread of violent extremist content online.”
The company added, “We review our policies on an ongoing basis to make sure we are drawing the line in the right place: In 2018 alone, we made more than 30 policy updates. One of the most complex and constantly evolving areas we deal with is hate speech. We’ve been taking a close look at our approach towards hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights, and free speech. Based on those learnings, we are making several updates… Removing more hateful and supremacist content from YouTube.”