Tag Archives: Jon Hickman

Free ebook on teaching collaborative journalism and peer-to-peer learning

I’ve just published a free ebook documenting a method of teaching collaborative journalism. Called ‘Stories and Streams’, the method, which was piloted last year, uses investigation teams and focuses on student-driven, peer-to-peer learning. Traditional lectures are not used.

You can download the free ebook from Leanpub.

You can read more about the background to the project here. A research report co-written with Jon Hickman and Jennifer Jones is published in ADM-HEA Networks Magazine. A fuller report will be included in a HEA publication on collaborative learning soon.

I’m also about to start a new class using the same method again, so if you have a class you’d like to get involved, let me know.

What made the crowdsourcing successful? A case study in crowdsourcing investigative journalism part 6

In the penultimate part of the serialisation of research underpinning a new Help Me Investigate project I explore the qualities that successful crowdsourcing investigations shared. Previous parts are linked below:

What made the crowdsourcing successful?

Clearly, a distinction should be made between what made the investigation successful as a series of outcomes, and what made crowdsourcing successful as a method for investigative reporting. This section concerns itself with the latter.

What made the community gather, and continue to return? One hypothesis was that the nature of the investigation provided a natural cue to interested parties – The London Weekly was published on Fridays and Saturdays and there was a build-up of expectation to see if a new issue would indeed appear.

The data, however, did not support this hypothesis. There was indeed a rhythm but it did not correlate to the date of publication. Wednesdays were the most popular day for people contributing to the investigation.

Upon further investigation a possible explanation was found: one of the investigation’s ‘alpha’ contributors – James Ball – had set himself a task to blog about the investigation every week. His blog posts appeared on a Wednesday.

That this turned out to be a significant factor in driving activity suggests one important lesson: talking publicly and regularly about the investigation’s progress is key to its activity and success.

This data was backed up by the interviews. One respondent mentioned the “weekly cue” explicitly. And Jon Hickman’s research also identified that investigation activity related to “events and interventions. Leadership, especially by staffers, and tasking appeared to be the main drivers of activity within the investigation.” (2010, p10)

He breaks down activity on the site into three ‘acts’, although their relationship to the success of the investigation is not explored further:

  • ‘Brainstorm’ (an initial flurry of activity, much of which is focused on scoping the investigation and recruiting)
  • ‘Consolidation’ (activity is driven by new information)
  • ‘Long tail’ (intermittent caretaker activity, such as supportive comments or occasional updates)

Networked utility

Hickman describes the site as a “centralised sub-network that suits a specific activity” (2010, p12). Importantly, this sub-network forms part of a larger ‘network of networks’ which involves spaces such as users’ blogs, Twitter, Facebook, email and other platforms and channels.

“And yet Help Me Investigate still provided a useful space for them to work within; investigators and staffers feel that the website facilitates investigation in a way that their other social media tools could not:

‘It adds the structure and the knowledge base; the challenges, integration with “what do they know”, ability to pose questions allows groups to structure an investigation logically and facilitates collaboration.’ (Interview with investigator)” (Hickman, 2010, p12)

In the London Weekly investigation the site also helped keep track of a number of discussions taking place around the web. The investigation had been born from a discussion on Twitter, and further conversations there resulted in more people signing up, along with comments threads and other online discussion. This fitted the way the site was designed culturally – to be part of a network rather than asking people to do everything on-site.

The presence of ‘alpha’ users like James and Judith was crucial in driving activity on the site – a pattern observed in other successful investigations. They picked up the threads contributed by others and not only wove them together into a coherent narrative that allowed others to enter more easily, but also set the new challenges that provided ways for people to contribute. The fact that they brought with them a strong social network presence is probably also a factor – but one that needs further research.

The site had been designed to emphasise the role of the user in driving investigations. The agenda is not owned by a central publisher, but by the person posing the question – and therefore the responsibility is theirs as well. This cultural hurdle – towards acknowledging personal power and responsibility – may be the biggest one that the site has to address, and the offer of “failure for free” (Shirky, 2008), allowing users to learn what works and what doesn’t, may support that.

It is worth noting why crowdsourcing worked well for this particular investigation: it could be broken down into separate parts and paths, most of which could be completed online: “Where does this claim come from?” “Can you find out about this person?” “What can you discover about this company?” One person, for example, used Google Street View to establish that the registered address of the company was a postbox. Other investigations that are less easily broken down may be less suitable for crowdsourcing – or require more effort to ensure success.

Momentum and direction

A regular supply of updates provided the investigation with momentum. The accumulation of discoveries provided valuable feedback to users, who then returned for more. In his book on Wikipedia, Andrew Lih (2009 p82) notes a similar pattern – ‘stigmergy’ – that is observed in the natural world: “The situation in which the product of previous work, rather than direct communication [induces and directs] additional labour”. An investigation without these ‘small pieces, loosely joined’ (Weinberger, 2002) might not suit crowdsourcing so well.

Hickman’s interviews with participants in the Birmingham council website investigation found a feeling of the investigation being communally owned and led:

“Certain members were good at driving the investigation forward, helping decide on what to do next, but it did not feel like anyone was in charge as such.”

“I’d say HMI had pivital [sic] role in keeping us together and focused but it felt owned by everyone.” (Hickman 2010, p10)

One problem, however, was that the number of diverging paths led to a range of potential avenues of enquiry. In the end, although the core questions were answered (was the publication a hoax, and what was the basis for its claims?), the investigation raised many more questions. These remained largely unanswered once most users felt that their own questions had been answered. As in a traditional investigation, there came a point at which those involved had to judge whether they wished to invest any more time in it.

Finally, the investigation benefited from a diverse group of contributors who contributed specialist knowledge or access. Some physically visited stations where the newspaper was claiming distribution to see how many copies were being handed out. Others used advanced search techniques to track down details on the people involved and the claims being made, or to make contact with people who had had previous experiences with those behind the newspaper. The visibility of the investigation online also led to more than one ‘whistleblower’ approach providing inside information, which was not published on the site but resulted in new challenges being set.

The final part of this series outlines some conclusions to be taken from the project, and where it plans to go next.

What are the characteristics of a crowdsourced investigation? A case study in crowdsourcing investigative journalism part 5

Continuing the serialisation of the research underpinning a new Help Me Investigate project, in this fifth part I explore the characteristics of crowdsourcing outlined in the literature. Previous parts are linked below:

What are the characteristics of a crowdsourced investigation?

Tapscott and Williams (2006, p269) explore a range of new models of collaboration facilitated by online networks across a range of industries. These include:

  • Peer producers creating “products made of bits – from operating systems to encyclopedias”
  • “Ideagoras … a global marketplace of ideas, innovations and uniquely qualified minds”
  • Prosumer – ‘professional consumer’ – communities which can produce value if given the right tools by companies
  • Collaborative science (“The New Alexandrians”)
  • Platforms for participation
  • “Global plant floors” – physical production lines split across countries
  • Wiki workplaces which cut across organisational hierarchies

Most of these innovations have not touched the news industry; some – such as platforms for participation – are used in publishing, but rarely in news production itself (an exception can be made for a few magazine communities, such as Reed Business Information’s Farmers Weekly).

Examples of explicitly crowdsourced journalism can be broadly classified into two types. The first – closest to the ‘Global plant floors’ described above – can be described as the ‘Mechanical Turk’ model (after the Amazon-owned web service that allows you to offer piecemeal payment for repetitive work). This approach tends to involve large numbers of individuals performing small, similar tasks. Examples from journalism would include The Guardian’s experiment with inviting users to classify MPs’ expenses in order to find possible stories, or the pet food bloggers inviting users to add details of affected pets to their database.

The second type – closest to the ‘peer producers’ model – can be described as the ‘Wisdom of Crowds’ approach (after James Surowiecki’s 2005 book of the same name). This approach tends to involve smaller numbers of users performing discrete tasks that rely on a particular expertise. It follows the creed of open source software development, often referred to as Linus’ Law, which states that: “Given enough eyeballs, all bugs are shallow” (Raymond, 1999). The Florida News Press example given above fits into this category, relying as it did on users with specific knowledge (such as engineering or accounting) or access. Another example – based explicitly on examples in Surowiecki’s book – is that of an experiment by The Guardian’s Charles Arthur to predict the specifications of Apple’s rumoured tablet (Arthur, 2010). Over 10,000 users voted on 13 questions, correctly predicting its name, screen size, colour, network and other specifications – but getting other specifications, such as its price, wrong.

Help Me Investigate fits into the ‘Wisdom of Crowds’ category: rather than requiring users to complete identical tasks, the technology splits investigations into different ‘challenges’. Users are invited to tag themselves so that it is easier to locate users with particular expertise (tagged ‘FOI’ or ‘lawyer’ for example) or in a particular location, and many investigations include a challenge to ‘invite an expert’ from a particular area that is not represented in the group of users.

Some elements of Tapscott and Williams’s list can also be related to Help Me Investigate’s processes: for example, the site itself was a ‘platform for participation’ which allowed users from different professions to collaborate without any organisational hierarchy. There was an ‘ideagora’ for suggesting ways of investigating, and the resulting stories were examples of peer production.

One of the first things the research analysed was whether the investigation data matched up to patterns observed elsewhere in crowdsourcing and online activity. An analysis of the number of actions by each user, for example, showed a clear ‘power law’ distribution, where a minority of users accounted for the majority of activity.

This power law, however, did not translate into a breakdown approaching the 90-9-1 ‘law of participation inequality’ observed by Jakob Nielsen (2006). Instead, the balance between those who made a couple of contributions (normally the 9% of the 90-9-1 split) and those who made none (the 90%) was roughly equal. This may have been because the design of the site meant it was not possible to ‘lurk’ without being a member of the site already, or being invited and signing up. Adding in data on those looking at the investigation page who were not members may shed further light on this.
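The two patterns described above – a power-law concentration of activity alongside a near-even split between light contributors and non-contributors – can be illustrated with a short sketch. The per-user action counts below are invented for illustration only, not the actual Help Me Investigate data:

```python
# Hypothetical per-user action counts for an investigation's members,
# shaped to mimic the patterns described: a few 'alpha' users dominate,
# while light contributors and zero-contributors split roughly evenly.
actions = [120, 85, 40, 12, 9, 5, 3, 2, 2, 1, 1, 1] + [0] * 12

total = sum(actions)

# Power-law-style concentration: share of all actions by the top 2 users.
top_share = sum(sorted(actions, reverse=True)[:2]) / total

# Share of members who joined but contributed nothing at all.
inactive_share = actions.count(0) / len(actions)

print(f"Top 2 users account for {top_share:.0%} of all actions")
print(f"{inactive_share:.0%} of members made no contribution")
```

With these invented figures, two users account for roughly three-quarters of all actions, while contributors and non-contributors split 50/50 – illustrating how a power law in activity need not imply the 90–9–1 ratio among registered members.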

In Jon Hickman’s ethnography of a different investigation (into the project to deliver a new website for Birmingham City Council) he found a similar pattern: of the 32 ‘investigators’, 13 did nothing more than join the investigation. Others provided “occasional or one-off contributions”, and a few were “prolific” (Hickman, 2010, p10). Rather than being an indication of absence, however, Hickman notes the literature on lurking that suggests it provides an opportunity for informal learning. He identifies support for this in his interviews with lurkers on the site:

“One lurker was a key technical member of the BCC DIY collective: the narrative within Help Me Investigate suggested a low level of engagement with the process and yet this investigator was actually quite prominent in terms of their activism; the lurker was producing pragmatic outcomes and responses to the investigation, although he produced no research for the project. On a similar note, several of the BCC DIY activists were neither active nor lurking within Help Me Investigate. For example, one activist’s account of BCC DIY shows awareness of, and engagement with, the connection between the activist activity and the investigation, even though he is not an active member of the investigation within Help Me Investigate.” (Hickman, 2010, p17)

In the next part I explore what qualities made for successful crowdsourcing in the specific instance of Help Me Investigate.

Universities without walls

@majohns Economist believes in future their distinguished and knowledgable audience is as important as their editors #smart_2011

This post forms part of the Carnival of Journalism, whose theme this month is universities’ roles in their local community.

In traditional journalism the concept of community is a broad one, typically used when the speaker really means ‘audience’, or ‘market’.

In a networked age, however, a community is an asset: it is a much more significant source of information than in other media; an active producer of content; and, perhaps most importantly, at the heart of any online distribution system.

You can see this at work in some of the most successful content startups of the internet era – Boing Boing, The Huffington Post, Slashdot – and even in mainstream outlets such as The Guardian, with, for example, its productive community around the Data Blog.

Any fledgling online journalism operation which is not based on a distinct community is, to my thinking, simply inefficient – and any journalism course that features an online element should be built on communities, linking in to those that surround it.

Teaching community-driven journalism

My own experience is that leaving the walls of academia behind and hosting classes wherever the community meets can make an enormous difference. In my MA in Online Journalism at Birmingham City University, for example, the very first week is not about newsgathering or blogging or anything to do with content: it’s about community, and identifying which one the students are going to serve.

To that end students spend their induction week attending the local Social Media Cafe, meeting local bloggers and understanding that particular community (one of whom this year suggested the idea that led to Birmingham Budget Cuts). We hold open classes in a city centre coffee shop so that people from Birmingham can drop in: when we talked about online journalism and the law, there were bloggers, former newspaper editors, and a photographer whose contributions turned the event into something unlike anything you’d see in a classroom.

And students are sent out to explore the community as part of learning about blogging, or encouraged to base themselves physically in the communities they serve. Andy Brightwell and Jon Hickman’s hyperlocal Grounds blog is a good example, run out of another city centre coffee shop in their patch.

In my online journalism classes at City University in London, meanwhile (which are sadly too big to fit in a coffee shop) I ask students to put together a community strategy as one of their two assignments. The idea is to get them to think about how they can produce better journalism – that is also more widely read – by thinking explicitly about how to involve a community in its production.

Community isn’t a postcode

But I’ve also come to believe that we should be as flexible as possible about what we mean by community. The traditional approach has been to assign students to geographical patches – a relic of the commercial imperatives behind print production. Some courses are adapting this to smaller, hyperlocal, patches for their online assessment to keep up with contemporary developments. This is great – but I think it risks missing something else.

One moment that brought this home to me was when – in that very first week – I asked the students what they thought made a community. The response that stuck in my mind most was Alex Gamela’s: “An enemy”. It illustrates how communities are created by many things other than location: you could also add “a cause”, “a shared experience”, “a profession”, “a hobby” and others, which are listed and explored in the Community part of the BASIC Principles of Online Journalism.

As journalism departments we are particularly weak in seeing community in those terms. One of the reasons Birmingham Budget Cuts is such a great example of community-driven journalism is that it addresses a community of various types: one of location, of profession, and of shared experience and – for the thousands facing redundancy – cause too. It is not your typical hyperlocal blog, but who would argue it does not have a strong proposition at its core?

There’s a further step, too, which requires particular boldness on the part of journalism schools, and innovativeness in assessment methods: we need to be prepared for students to create sites where they don’t create any journalism themselves at all. Instead, they facilitate its production, and host the platform that enables it to happen. In online journalism we might call this a community manager role – which will raise the inevitable questions of ‘Is It Journalism?’ But in traditional journalism, with the journalism being produced by reporters, a very similar role would simply be called being an editor.

PS: I spoke about this theme in Amsterdam last September as part of a presentation on ‘A Journalism Curriculum for the 21st Century’ at the PICNIC festival, organised by the European Journalism Centre.

Something for the Weekend #8: the easiest blogging platform in the world: Posterous

Assuming you want them to, how do you get people to blog? It’s a challenge facing most community editors, particularly as they seek to encourage a conversation with readers for whom WordPress or Blogger are still too fiddly.

Enter Posterous, a fantastically intuitive, quick and easy blogging platform. Scrapping the need for registration, or even the need to go onto the web, this has the potential to be a mass blogging tool – as well as a great tool for blogging on the move.