YouTube - Strict on Copyright Claims, Weak on Hate Speech
One of the biggest moderation challenges YouTube faces is copyright claims. With 300 hours of video uploaded every minute, the task of detecting and removing copyrighted content is colossal. Yet through an elaborate system of artificial intelligence, community flagging, and content moderators, the platform has built a functional copyright enforcement regime. While some argue YouTube is actually too eager to remove videos under dubious copyright circumstances, there can be no doubt that the platform takes copyright enforcement seriously.
So why, then, is YouTube so lenient when it comes to deadly hate speech of the kind that inspired the New Zealand mosque shooter? Even though its own terms of service explicitly forbid hate speech, YouTube has become a favorite platform for the alt-right, neo-Nazis, and other hate groups to spread their ideas unopposed. As the largest platform for video content on the internet, it is vital that YouTube take a stand against hate. Identifying and removing videos containing hate speech presents its own set of challenges, but it is something YouTube is fully capable of, and must prioritize if it is to join platforms like Facebook and Twitter in their recent efforts to promote healthy online communities.