Crowdsourcing investigative journalism: a case study (part 2)

Continuing the serialisation of the research underpinning a new Help Me Investigate project, in this second part I describe the basis for the way that the original site was constructed – and the experiences of its first few months. (Part 1 is available here)

Building the site

By 2008 two members had joined the Help Me Investigate team – web developer Stef Lewandowski and community media specialist Nick Booth – and that year the project won funding from Channel 4’s 4iP fund and regional development agency Screen West Midlands.

Two part time members of ‘staff’ were recruited to work one day per week for the site throughout the 12 week funded ‘proof of concept’ period: a support journalist and a community manager.

Site construction began in April 2009 by expanding the four target user profiles in the bid document into 12 profiles of users who might be attracted to the site, identifying what each would want to do with the site and how the design might facilitate that – or prevent it (as in the case, for example, of users who might want to hijack or hoax the site).

This was followed by rapid site development, and testing for 6 weeks with a small private beta. The plan was to use ‘agile’ principles of web development – launching when the site was not ‘finished’ to gain an understanding of how users actually interacted with the technology, and saving the majority of the development budget for ‘iterations’ of the software in response to user demand.

The resulting site experience can be described as follows: a user coming across the site was presented with two choices: to join an existing investigation, or start their own. If they started an investigation they would be provided with suggestions for ways of breaking it down into smaller tasks and of building a community around the question being pursued. If they joined an existing investigation they would be presented with those tasks – called ‘challenges’ – that needed completing to take the investigation forward. They could then choose to accept a particular challenge and share the results of their progress underneath.

Development also took account of the concepts of Actor-Network Theory (Paterson and Domingo, 2008), which describes how the ‘inventors’ of a technology are not the only actors that shape its use: the technology itself (including its limitations, its relationship with other technologies, and institutional and funding factors) and those who use it would also be vital in determining what happened from there.

Reserving the majority of the development budget to account for the influence of these ‘actors’ on the development of the technology was a key part of the planning of the site. This proved to be a wise strategy, as user behaviour differed in some respects from the team’s expectations, and development was able to adapt accordingly.

For legal reasons, casual visitors to the site (and search engines) could only see investigation titles (which were pre-moderated) and, later, the Reports and KnowledgeBase sections of the site (which were written by site staff). Challenges and updates (the results of challenges) – which were only post-moderated – could only be seen by registered users of the site.

A person could only become a user of the site if they were invited by another user, although there was also a ‘request an invite’ section on the homepage. Non-UK requests were refused for legal reasons but most other requests were granted. At this stage the objective was not to build a huge user base but to develop a strong culture on the site that would then influence its healthy future development – a model based on the successful development of the constructive Seesmic video blogging community.

On July 1 the site went live with no promotion. The day after launch a single tweet linking to the site was published on Twitter. By the end of the week the site was investigating what would come to be one of the biggest stories of the summer in Birmingham – the overspend of £2.2m by the city council on a new website. It would go on to complete further investigations into parking tickets and the use of surveillance powers, as well as much smaller-scale questions such as how a complaint was handled, or why two bus companies were charging different prices on the same route.

In the next part I look at the strengths and limitations of the site’s model of working, and how people used the site in practice.

