I’ve been speaking to news organisations’ community editors about the lessons they’ve learned from their time in the job. In the second of the series, the Guardian’s Mark Fothergill:
1. Getting the right tools for the job is ultra-important, both front end and back end:
Too many sites knock together something that ‘will do’, and it always comes back to haunt them.
An oft-made mistake is spending lots of time on front-end, user-facing functionality and no time thinking about how to moderate it.
Additionally, once users have tools and functionality, good or bad, they grow accustomed to them. When you then attempt to ‘improve’ the offering at a later date, they inevitably don’t like it, and you can lose a sizeable portion of your community.
2. Define your role (and more specifically, the role of the moderation team):
If it’s not clear to other departments, particularly editorial, that the final decision on the moderation of any piece of user-generated content lies with you, it can cause numerous problems. Other departments should have a say in procedures and should take priority in 50/50 decisions, but they should respect the decisions of the moderation team, which are based on both experience and policy.
This is the only way to maintain consistency across your offering. Users won’t know if they’re coming or going if it appears there are a number of different moderation policies across a site that they see as one entity.
Slight differences between moderation on, say, Sport and Politics are to be expected, but not wholesale differences, especially when users are only asked to follow one set of community standards.
3. Deal with user complaints quickly:
If you’re not on top of user complaints within a reasonable time-frame, you’re fostering problems and problem areas. Dealing with a piece of content calling someone a “wanker” within 15 minutes, for instance, can prevent a flame war from ever getting off the ground. Deal with the same complaint after 2 hours and you’re likely to be mopping up for hours afterwards.
Quick response times help to protect you from a legal standpoint and, at the same time, help to protect your users, who are much happier knowing that a piece of reported content they deem offensive or inappropriate has been acted upon swiftly. Who wants a system where you report someone telling you to “F off” and, on a regular basis, the comment is still there 8 hours later?