YouTube can be a mess sometimes. While it's great for watching vloggers, tutorials and the latest viral music video, it's also home to some pretty dark behavior.
In an attempt to curb the bullying, hate speech and sexually explicit content that often plague its user comments (basically anything that violates its Community Guidelines), the video streaming site created the YouTube Heroes program.
The initiative designates users as "heroes," who can moderate and flag inappropriate or abusive videos for the YouTube staff to review.
The program intends to draw in volunteers by gamifying the reporting process and rewarding participants. When users sign up, they start off at level 1 and have access to a "hero dashboard." As heroes flag videos, they earn more points. They can also add video subtitles or captions and post in the YouTube Help forum to level up and gain special perks.
These perks include moderation tools (like being able to mass-flag multiple videos at once); moderating content within the YouTube Heroes community; being able to "contact YouTube staff directly"; and being eligible to apply for the Heroes Summit (we're not exactly sure what this is yet).
Although the program encourages moderators to flag as much content as possible, YouTube still has the final say in what content should be removed, and does not automatically censor whatever the heroes deem inappropriate.
Regardless, many YouTube users are critical of the initiative and have taken to Twitter to voice their opinions.
We don't know if an online community of unpaid volunteers will help rid YouTube of some of its nastier trolls, but other sites also encourage users to self-police. Reddit employs a similar method, allowing some members to moderate forums and even ban other users if need be. At the very least, YouTube is acknowledging it has a problem with negative comments and inappropriate videos. Let's hope the program starts making a difference quickly.