What YouTube Can Do to Combat Hate Speech and Create a Healthier Online Community
It is clear that YouTube has a serious hate speech problem. Extremists are currently able to use the platform, without opposition, to spread ideas of the sort that inspired the recent New Zealand mosque shooting. Other social networks, such as Facebook, have recently made efforts to remove hate speech from their platforms. The time has come for YouTube to join them. The problem of hate on the world’s largest video sharing site is complex, but after reviewing studies on the subject and the methods used by other organizations, we have developed three actions YouTube can take to reduce the spread of hate speech on its platform:
Ban Content Which Glorifies White Nationalism
Joining Facebook in explicitly banning white nationalist content is an essential first step in removing hate speech from the platform.
Remove Content Already Identified as Hate Speech
YouTube recently implemented a policy of demonetizing videos deemed to contain hate speech. Once that determination is made, these videos should be removed from the platform entirely, not merely demonetized.
Make Race Relations Experts YouTube Moderators Focused on Removing Hate Speech from the Platform
The best way to eradicate the elaborate network of hate groups on YouTube is to give organizations such as the Southern Poverty Law Center moderator status so that they can remove hateful content and its creators from the platform. Giving these groups “trusted flagger” status alone has proven ineffective. Hate groups often spread violent and hateful messages while tiptoeing around community guidelines to avoid having their videos removed. Experts on race relations could provide the deeper level of analysis required to make moderation decisions about this type of content. They would focus not only on video creators, but also on the guests those creators host on their shows and on what those guests say.
We are confident that YouTube will, in the end, live up to the slogan of its parent company: “do the right thing.”