Test your online journalism law: 4 – nasty comments on your Facebook page

All this week I am publishing examples of legal dilemmas that a journalism student might face (read my previous post on students being publishers, and the responsibilities that come with that, for the background). I can’t promise a ‘right answer’ at the end of the week – but I hope you can comment on what a student publisher might do – and why. Here’s the fourth – probably the most complex of the lot:

Case 4: your Facebook page starts getting some nasty comments

You run a Facebook page for a university society, publishing news about what the group is doing, links to relevant events, and how-tos.

One week, while you are on holiday, a series of hateful comments appears on the page, all from different accounts.

  • One is a joke by Member A about Jews which many commenters think is sick.
  • In response, Member B says that all Muslims should be beaten up on sight.
  • A further comment by Member C adds “homosexuals” to the list for the same treatment.
  • And for good measure Member D says “Polacks” should be beaten up too – although you know the commenter personally and think the term was used in a tongue-in-cheek fashion (given the timestamp you suspect she was under the influence).

A few days later Member E messages you directly to alert you to the comments, and to ask that two of the commenters be kicked off the page.

To complicate things further, it isn’t the first time that Member E has asked you to kick people off the page – they have been arguing, both privately and publicly on the page, that a number of openly gay people are trying to ‘hijack’ the group, and that openly gay members should not be allowed to join it.

The questions

  1. What are the legal issues here – and what tests need to be met for them to be an issue?
  2. What defence could you mount?
  3. How likely is it that legal action would result?
  4. Would you publish – and why?

2 thoughts on “Test your online journalism law: 4 – nasty comments on your Facebook page”

  1. Anneleen Ophoff

Well, Facebook’s community standards say: “Facebook does not permit hate speech, but distinguishes between serious and humorous speech. While we encourage you to challenge ideas, institutions, events, and practices, we do not permit individuals or groups to attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition”.

You might also incorporate a warning about hate speech in your private community. Reply to the members’ comments, referring to both the Facebook community standards and your own community policy. You could kick members A, B and C off your page – no idea why member E would only want two of them kicked off it – and spare member D because of the humorous intent of her comment. Or be a little more tolerant and just warn all four of them one last time.

Also, make sure member E understands that openly gay people have the same rights on Facebook as openly heterosexual or any other users, as long as they don’t break any of the community standards.

  2. Paul Bradshaw (post author)

    The ‘answers’:
    1. The legal issues here are many: the Public Order Act 1986 (stirring up hatred based on nationality, colour, and ethnic origins); the Crime and Disorder Act 1998 (incitement to ethnic or racial hatred); the Racial and Religious Hatred Act 2006 (religious hatred); the Criminal Justice and Immigration Act 2008 (inciting hatred on the basis of sexual orientation); the Communications Act 2003 (section 127 covers “grossly offensive” messages); and the Serious Crime Act 2007 (encouraging a crime). See https://onlinejournalismblog.com/2012/11/22/7-laws-journalists-now-need-to-know-from-database-rights-to-hate-speech/ for more.
    2. There is a defence if you are not aware of the communications – but once notified you have a responsibility to act. Having terms and conditions would also help your defence.
    3. The likelihood of legal action depends on the users and how well you know them, but more important is probably how healthy a page you will have if you don’t address the issues.
    4. Each incident should be dealt with separately, under any terms and conditions that you’ve established, and that Facebook has. For example, hate speech can be flagged under Facebook’s own internal moderation procedures. However, you should also communicate clearly what is acceptable on the page, and point users to that when deciding to delete comments or eject users. This ensures consistency – and that you are held accountable too.
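
    A practical footnote: on a busy page, Facebook’s Graph API exposes the same moderation actions programmatically, and hiding a comment (rather than deleting it) keeps a record – useful if any of the legal questions above ever have to be answered. What follows is a minimal sketch in Python, not a definitive implementation: the API version, access token and IDs are all placeholders, and you would need a Page access token with moderation permissions for any of it to work.

        import requests  # third-party HTTP library

        GRAPH = "https://graph.facebook.com/v19.0"  # placeholder API version
        TOKEN = "YOUR_PAGE_ACCESS_TOKEN"            # placeholder Page token

        def page_comments(post_id):
            """List the comments on a page post."""
            r = requests.get(f"{GRAPH}/{post_id}/comments",
                             params={"access_token": TOKEN})
            r.raise_for_status()
            return r.json().get("data", [])

        def hide_comment(comment_id):
            """Hide a comment instead of deleting it, so evidence of
            what was said, and when, survives any later dispute."""
            r = requests.post(f"{GRAPH}/{comment_id}",
                              params={"is_hidden": "true",
                                      "access_token": TOKEN})
            r.raise_for_status()

    Hiding rather than deleting is a deliberate choice: a hidden comment stays visible to its author and their friends, but you retain a dated record to point to if a moderation decision is ever challenged.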
