Tag Archives: moderation

Guest post: Do we need moderation guidelines for dealing with mental health issues?

Last month the Press Complaints Commission made a judgement in a case involving discriminatory comments on a newspaper article. The case highlighted the issue of how mental health is treated by publishers alongside similar considerations such as sexuality, gender, religion and ethnicity. The complaint also led to a change in The Guardian’s moderation rules.

In a guest post for the Online Journalism Blog the person who brought that case, Beatrice Bray, writes about her experiences of comment abuse, and the role she feels publishers should take in dealing both with comments relating to mental health and with writers who have mental health issues.

Last April I wrote a rallying cry in the Guardian for all who have endured taunts about mental ill health. In my reply article, “Cartoonists should be careful how they portray mental health” (23/4/10), I reclaimed the word “psychotic”. Guardian cartoonist Martin Rowson had used the word to abuse Mrs Thatcher. I put him right.

I am a long-standing reader of the Guardian newspaper but I did not know the website audience. Being a proud campaigner I told Guardian readers that I had bipolar disorder and had experienced psychosis.

I expected a civil hearing. Newspaper readers did oblige but many online readers were foul.

The Guardian’s managing editor Chris Elliott did not warn me about the impending abuse. That was a mistake. I think Mr Elliott knew I would face hostility but I do not think he realised how badly I would be hurt.

Those insults made me physically sick. My head was sore for many weeks. This was all so pointless. If Mr Elliott had given me a chance to discuss the risks involved we both could have taken precautions. Instead there was a row.

Guardian staff gave me an apology but told me to grow a “thick skin”. That jibe spurred me into going to the Press Complaints Commission. It is free. It is also less adversarial and less costly than a disability tribunal.

I was not asking for anything unprecedented. The BBC has guidelines on working with vulnerable people. We need to extend this to new media.

Working with vulnerable people

For example, when dealing with discussion sites, moderators need to deal swiftly with abuse. They must also facilitate discussions so that they do not turn nasty.

Staff should appreciate the reasons for this action. This is not prima donna treatment. This action is necessary because the writer and many of the readers share a common disability. They all have mental health problems.

Section 2 of the PCC Editors’ Code promises fairness to complainants. I thought it only fair to ask for warning of abuse, but in my PCC ruling the Guardian and the PCC disagreed with me. The PCC did not say why.

However, I did score other points.

Before the PCC ruling, the Guardian did, at my request, add the word “disability” to its moderation rules.

The PCC and the Guardian did apologise with regard to the abuse.

Guardian online readers called me, amongst other things, a “nutter” and a “retard”. Unfortunately both the Guardian and the PCC refused to accept that this was discrimination as defined by the terms of section 12 of the PCC Editors’ Code.

This is not just semantics. To me the word “discrimination” is a word with power. It holds the abuser responsible but the PCC fights shy of doing that online.

I now know that you can only complain to the PCC if a staff member makes a discriminatory remark about you. Comments made by non-staff members do not fall within the PCC’s remit. My abusers were not Guardian staff.

It is a shame. By being discrimination deniers both the Guardian and the PCC cut themselves off from a store of knowledge on handling disability and mental health in particular.

‘UGC’ and journalism: the Giffords shooting and Facebook page moderation

[Image: screenshot of Sarah Palin’s Facebook page]

The Obama London blog has a post looking at the moderation of comments on Sarah Palin’s Facebook page (following the Giffords shooting) which raises a couple of key points for journalists dealing with user generated content.

Editorially selected, not UGC

The first point is that it can be easy to assume user generated content is an unadulterated reflection of one community’s point of view, but in many cases it is not. A political page like Palin’s is, in many ways, no different to any piece of campaigning literature, with quotes carefully selected to reflect well on the candidate.

Political blogs – where critical comments can also be removed – should be subject to the same scepticism (MP Nadine Dorries’ claim that 70% of her blog was fiction is a good example of blog-as-political-pamphlet).

Taking a virtual trip to a Facebook page, then, is not comparable to treading the streets – or even canvassing a particular politician’s campaign team – in search of ‘the feeling on the ground’.

Inaction can be newsworthy

The second point, however, is that this very moderation can generate stories itself.

The Obama London post notes that while even constructively critical comments were removed almost instantly, one comment was left to stand (shown in the image above). And it appeared to condone the killing of 9-year-old Christina Taylor Green:

“It’s ok. Christina Taylor Green was probably going to end up a left wing bleeding heart liberal anyway. Hey, as ‘they’ say, what would you do if you had the chance to kill Hitler as a kid? Exactly.”

Drawing on the campaign literature analogy again, you can see the newsworthiness of Palin staffers leaving this comment to stand (even when other commenters highlight its offensiveness).

Had Obama London been so inclined they could have led more strongly on something like: ‘Palin staff endorse comments condoning killing of 9-year-old’, or chased up a response from the team on why the comment was not removed.

But regardless of the nature of this individual example, you can see the broader point about comments on heavily moderated Facebook pages and blogs: they represent views that the politician’s camp is prepared to condone.

Comments

By the way, the extensive comment thread on that post is well worth exploring – it details how users can flag comments for moderation, removing them from their own view of the page but not that of others, as well as users’ experiences of being barred from Facebook groups for posting mildly critical comments.

Dylan Reeve in particular expresses my point more succinctly for moderators:

“The problem with the type of moderation policy that Sarah Palin (and others) utilise in places with user-contributed content is that they effectively appear to endorse any comments that do remain published.”

In the case of Facebook pages, admins are not named, but security lapses can lead to them being revealed and recorded, as is the case with Palin’s Facebook pages.

Oh, and on the more general thread of ‘analysis’ in the wake of the Giffords shooting, this post is well worth reading.

UPDATE: More discussion of the satirical nature of the comment on Reddit (thanks Mary Hamilton)

h/t Umair Haque

Lessons in community from community editors #2: Mark Fothergill, The Guardian

I’ve been speaking to news organisations’ community editors on the lessons they’ve learned from their time in the job. In the 2nd of the series, the Guardian’s Mark Fothergill:

1. Getting the tools right for the job is ultra-important, both front end and back end:

Too many sites knock together something that ‘will do’ and it always comes back to haunt.

An oft-made mistake is spending lots of time on front end, user-facing functionality and spending no time thinking about how to moderate it.

Additionally, once users have tools/functionality, good or bad, they grow accustomed to them – and when you then attempt to ‘improve’ the offering at a later date, they inevitably don’t like it and you can lose a sizeable portion of your community.

2. Define your role (and more specifically, the role of the moderation team):

If it’s not clear to other departments, particularly editorial, that the final decision on the moderation of any piece of user generated content lies with you, it can cause numerous problems. Other departments should have a say in procedures and should have a higher priority when it comes to 50/50 decisions, but they should respect the decisions of the moderation team, which are based on both experience and policy.

This is the only way to maintain consistency across your offering. Users won’t know if they’re coming or going if it appears there are a number of different moderation policies across a site that they see as being one entity.

Slight differences between moderation on, say, Sport and Politics are to be expected, but not wholesale differences, especially when users are only asked to follow one set of community standards.

3. Deal with user complaints quickly:

If you’re not on top of user complaints within a reasonable time-frame, you’re fostering problems and problem areas. Dealing with a piece of content calling someone a “wanker” within 15 minutes, for instance, can prevent a flame war from ever getting off the ground. Deal with the same complaint after two hours and you’re likely to be mopping up for hours afterwards.

Quick response times help to protect you from a legal standpoint and, at the same time, help to protect the users, who are much happier in the knowledge that a piece of reported content they deem offensive or inappropriate has been acted upon swiftly. Who wants a system where you report someone telling you to “F off” and, on a regular basis, the comment is still there eight hours later?