Tag Archives: crowdsourcing

Ethics in data journalism: accuracy

The following is the first in a series of extracts from a draft book chapter on ethics in data journalism. This is a work in progress, so if you have examples of ethical dilemmas, best practice, or guidance, I’d be happy to include it with an acknowledgement.

Data journalism ethics: accuracy

Probably the most basic ethical consideration in data journalism is the need to be accurate and to provide proper context to the stories we tell. That consideration can influence how we analyse data, how we report data stories, and whether we publish the data itself.

In late 2012, for example, data journalist Nils Mulvad finally got his hands on veterinary prescriptions data that he had spent seven years fighting for. But he decided not to publish the data when he realised it was full of errors. Continue reading

Hyperlocal Voices: Rachel Howells, Port Talbot MagNet

The second in a new series of Hyperlocal Voices looks at the Port Talbot MagNet, a not-for-profit community co-operative which has been set up to provide a local news and information service to the people and communities of Port Talbot.

Board Member Rachel Howells took time out to reflect on developments since their launch in 2010 and to tell Damian Radcliffe about some plans for the future.

1. Who were the people behind the blog?

Port Talbot Magnet was started in 2010 by seven professional journalists from South Wales who had all been casualties of redundancy or cuts in freelance budgets at established magazines and newspapers. First and foremost, we are a workers’ co-operative, but we are also a social enterprise and so we are keen to ensure we are a force for good in the community. Two and a half years on, we still have seven directors, as well as around 20 co-op members and lots of volunteers.

2. What made you decide to set up the blog?

As NUJ members, we found ourselves sitting in so many meetings talking about cuts and closures and it felt sometimes like the local media industry was falling down around our ears. When redundancy hit most of our local Union branch committee we decided that we would do something proactive about the situation to try to ensure good quality journalism was still a viable, sustainable career.

As we were setting up the co-operative, we heard that the weekly newspaper in the town of Port Talbot was closing and it seemed an obvious gap for us to try to fill – here was a town of 35,000 people without a dedicated newspaper and here were seven out-of-work journalists who could supply news. Making the one pay for the other was, and in many ways still is, the problem.

3. When did you set up the blog and how did you go about it?

The blog came along much later. Our first ambition was to go into print and we spent about a year applying for funding and trying to get the project off the ground in some way. The funding applications weren’t successful unfortunately, and we had a crisis meeting where we decided to change tack and concentrate on what we did best – journalism. This turned out to be a good move, because we could show what we were capable of; people suddenly understood what we were trying to achieve.

In a more practical sense, we had no capital apart from donations from the directors and so we set up a WordPress blog, paying a modest amount for a theme, and we got in touch with local companies and the council and asked them to put us on their mailing lists for press releases. Then we spent lots of time learning the patch and making contacts. Facebook has been a particularly good way to reach the online community in Port Talbot (not many are using Twitter yet), and drives about half our website traffic.

4. What other blogs, bloggers or websites influenced you?

We set up our own crowdfunding model called Pitch-in! which was hugely influenced by Spot.Us, although we changed the idea a bit to suit a more hyperlocal audience. I love what Spot.Us has done to empower freelance journalists and as this was at the heart of our enterprise we have been really keen to offer this as a service to our members.

5. How did – and do – you see yourself in relation to a traditional news operation?

We would like to be more like one, I think, but we don’t have the resources at the moment. As we are so reliant on volunteers we don’t have the consistency that a traditional newsroom can offer – for example we can’t always cover local council meetings because our volunteers have other commitments as well. But I think we all believe in the principles behind traditional newsrooms and the power they have to be a force for good in the community as a watchdog or a voice.

Rightly or wrongly, journalists can ask the questions that perhaps get ignored when members of the public ask them, and even with our limitations we are able to perform this aspect of newsroom journalism.

In future we hope we will become more sustainable so we can pay journalists and operate a more professional service, but this will always be in co-operation with the local community. We always have a day every week when people can call in to the office and speak to us, which is what all local newsrooms used to do.

6. What have been the key moments in the blog’s development editorially?

Aside from launching the website in the first place, a successful system has been our ‘editor of the week’ rota, which has seen a team of five journalists taking it in turns to supervise the website, commission volunteers and respond to emails. This has meant there’s always been a clear point of contact every week and that things don’t get missed. Another big milestone has been paying journalists for their skills, which we have started to do in the last few months. So far we’ve only been able to pay for court reports but we plan to do more of this as finances allow.

7. What sort of traffic do you get and how has that changed over time?

We get a consistent 3,500 unique visitors every month now, which has more than trebled in a year. We have seen some great peaks around some of our coverage, too – notably stories about The Passion, a landscape theatre production which took place in Port Talbot in 2011 and starred locally-raised Hollywood star Michael Sheen. We have also had great responses to our coverage of protests and campaigns, crime and local elections.

8. What is / has been your biggest challenge to date?

The lack of funding and the lack of resources. Three of our seven directors have full time jobs, one has failing health and the other three have freelance or other commitments, and so progress can sometimes be frustratingly slow as we try to recruit and train volunteers, manage the website and finances, and keep our contacts live. But we are still here, and the project continues to chalk up successes.

9. What story, feature or series are you most proud of?

I think our coverage of The Passion was pretty impressive.

We had twelve volunteers covering the three days of live theatre and we produced a hugely comprehensive mix of written reporting, photography, video and audio – some of which we still haven’t had time to edit and upload to the website more than a year on.

It was a unique production that took place all over the town in both scheduled and unscheduled performances, and therefore a unique challenge to cover it all. I think our archive shows how daunting a task it was and how well we worked as a team to do it. I don’t think any other media outlet managed the comprehensive coverage we produced. I look back at it now and wonder how on earth we managed it.

10. What are your plans for the future?

There was an anniversary exhibition over Easter which commemorated The Passion and, in partnership with National Theatre Wales, we produced the official souvenir programme for it. This was our first foray into print, and we made a modest profit from advertising. It showed us that going into print would be an obvious move in the future, and so now we are developing ways we could make the website work alongside a printed news-sheet.

More generally, we would like to keep growing, pay journalists and establish a sustainable model that could benefit other communities who are facing similar ‘news black holes’ following the death of a local newspaper.

And we’d really like to persuade the local council to let us film their council meetings…

Creating dynamic visualisations using Google Forms and Google Gadgets

If you need to gather data on the ground – or want to crowdsource data through an online form – this is how you can visualise the results as they come in using 3 Google Docs tools. They are:

  • Google Forms
  • Google Docs spreadsheet
  • Google Gadgets

And here’s the process: Continue reading
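For readers who want to see the underlying pattern: a published Google Docs spreadsheet exposes its form responses as CSV, which any script can poll and summarise as new submissions arrive. Here is a minimal Python sketch of that idea – the column names and sample rows are invented for illustration, and in practice the text would be fetched from the spreadsheet’s published CSV feed:

```python
import csv
import io
from collections import Counter

def tally_responses(csv_text, column):
    """Count the answers in one column of a form-responses CSV export."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return Counter(row[column] for row in rows)

# In practice this text would be fetched from the spreadsheet's
# published CSV feed (URL hypothetical), e.g. with
# urllib.request.urlopen("https://spreadsheets.google.com/pub?key=KEY&output=csv")
sample = """Timestamp,Ward,Issue
01/06/2012,North,Potholes
02/06/2012,South,Lighting
02/06/2012,North,Potholes
"""

print(tally_responses(sample, "Issue"))
# Counter({'Potholes': 2, 'Lighting': 1})
```

Re-running the tally each time the feed is fetched gives a visualisation that updates as responses come in, which is the effect the Forms-spreadsheet-Gadget chain produces inside Google Docs.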

“All that is required is an issue about which others are passionate and feel unheard”

Here’s a must-read for anyone interested in sports journalism that goes beyond the weekend’s player ratings. As one of the biggest names in European football goes into administration, The Guardian carries a piece by the author of Rangerstaxcase.com, a blogger who “pulled down the facade at Rangers”, including a scathing commentary on the Scottish press’s complicity in the club’s downfall:

“The Triangle of Trade to which I have referred is essentially an arrangement where Rangers FC and their owner provide each journalist who is “inside the tent” with a sufficient supply of transfer “exclusives” and player trivia to ensure that the hack does not have to work hard. Any Scottish journalist wishing to have a long career learns quickly not to bite the hands that feed. The rule that “demographics dictate editorial” applied regardless of original footballing sympathies.

“[...] Super-casino developments worth £700m complete with hover-pitches were still being touted to Rangers fans even after the first news of the tax case broke. Along with “Ronaldo To Sign For Rangers” nonsense, it is little wonder that the majority of the club’s fans were in a state of stupefaction in recent years. They were misled by those who ran their club. They were deceived by a media pack that had to know that the stories it peddled were false.”

Over at Rangerstaxcase.com, the site expands on this in its criticism of STV for uncritical reporting: Continue reading

Announcing Help Me Investigate the Olympics

My crowdsourced investigative journalism project Help Me Investigate has launched a fourth specialist site: Help Me Investigate the Olympics.

The site is being run by a colleague of mine from Birmingham City University, Jennifer Jones, as part of a project we’re working on which sees students at BCU and other universities connecting to wider online networks in investigating Olympics-related questions.

Jennifer knows those networks particularly well as the coordinator for #media2012, web editor and staff editor for Culture @ the Olympics. She is also writing her PhD on Social Media, Activism and the Olympic Games at the University of the West of Scotland.

If you want to contribute to the site or related investigations, get in touch in the comments or via Olympics@helpmeinvestigate.com

A case study in crowdsourcing investigative journalism part 7: Conclusions

In the final part of the research underpinning a new Help Me Investigate project I explore the qualities that successful crowdsourcing investigations shared. Previous parts are linked below:

Conclusions

Looking at the reasons that users of the site as a whole gave for not contributing to an investigation, the majority cited ‘not having enough time’. Although at least one interviewee, in contrast, highlighted the simplicity and ease of contributing, contributing needs to be – or at least appear – as easy and simple as possible in order to lower the perceived effort and time required.

Notably, the second biggest reason for not contributing was a ‘lack of personal connection with an investigation’, demonstrating the importance of the individual and social dimension of crowdsourcing. Likewise, a ‘personal interest in the issue’ was the single largest factor in someone contributing. A ‘Why should I contribute?’ feature on crowdsourcing projects may be worth considering.

Others mentioned the social dimension of crowdsourcing – the “sense of being involved in something together” – what Jenkins (2006, p244) would refer to as “consumption as a networked practice”, a motivation also identified by Yochai Benkler in his work on networks (2006). Looking at non-financial motivations behind people contributing their time to online projects, he refers to “socio-psychological reward”. He also identifies the importance of “hedonic personal gratification”. In other words, fun.

Although positive feedback formed part of the design of the site, no consideration was paid to negative feedback: users being made aware of when they were not succeeding. This element also appears to be absent from game mechanics in other crowdsourcing experiments such as The Guardian’s MPs’ expenses app.

While it is easy to talk about “Failure for free”, more could be done to identify and support failing investigations. A monthly update feature that would remind users of recent activity and – more importantly – the lack of activity might help here. The investigators in a group might be asked whether they wish to terminate the investigation in those cases, emphasising their responsibility for its progress and helping ‘clean up’ the investigations listed on the first page of the site.

However, there is also a danger in interfering too much to reduce failure. Interfering is a natural instinct, and the establishment of a reasonable ‘success rate’ at the outset – based on the literature around crowdsourcing – helps to counter it. That was part of the design of Help Me Investigate: it was the 1-5% of questions that gained traction that would be the focus of the site. One analogy is a news conference where members throw out ideas – only a few are chosen for investment of time and energy; the rest ‘fail’.

It is the management of that tension between interfering to ensure everything succeeds (and so removing the incentive for users to be self-motivated) and not interfering at all (leaving users feeling unsupported and unmotivated) that is likely to be the key to a successful crowdsourcing project. More than a year into the project, this tension was still being negotiated.

In summing up the research into Help Me Investigate it is possible to identify five qualities which successful investigations shared: ‘Alpha users’ (highly active, who drove investigations forward); modularity (the ability to break down a large investigation into smaller discrete elements); public-ness (the ability for others to find out about an investigation); feedback (game mechanics and the pleasure of using the site); and diversity of users.

Relating these findings to other research into crowdsourcing more generally it is possible to make broader generalisations regarding how future projects might be best organised. Leadbeater (2008, p68), for example, identifies five key principles of successful collaborative projects, summed up as ‘Core’ (directly comparable to the need for alpha users identified in this research); ‘Contribute’ (large numbers, comparable to public-ness); ‘Connect’ (diversity); ‘Collaborate’ (self governance – relating indirectly to modularity); and ‘Create’ (creative pleasure – relating indirectly to feedback). Similar qualities are also identified by US investigative reporter and Knight fellow Wendy Norris in her experiments with crowdsourcing (Lavrusik, 2010).

The most notable connections here are the indirect ones. While the technology of Help Me Investigate allowed for modularity, for example, the community structure was rather flat. Leadbeater’s research (2008) and that of Lih (2009) into the development of Wikipedia and Tsui (2010, PDF) into Global Voices indicate that ‘modularity’ may be part of a wider need for ‘structure’. Conversely ‘feedback’ provides a specific, practical way for crowdsourcing projects to address users’ need for creative pleasure.

As Help Me Investigate reached its 18th month a number of changes were made to test these ideas: the code was released as open source, effectively crowdsourcing the technology itself, and a strategy was adopted to recruit niche community managers who could build expertise in particular fields, along with an advisory board that was similarly diverse. The Help Me Investigate design was replicated in a plugin which would allow anyone running a self-hosted WordPress blog to manage their own version of the site.

This separation of technology from community was a key learning outcome of the project. While the site had solved some of the technical challenges of crowdsourcing and identified the qualities of successful crowdsourced investigation, it was clear that the biggest challenge lay in connecting the increasingly networked communities that wanted to investigate public interest issues – and in a way that was both sustainable and scalable beyond the level of individual investigations.

 

References

  1. Arthur, Charles. Forecasting is a notoriously imprecise science – ask any meteorologist, January 29 2010, The Guardian, http://www.guardian.co.uk/technology/2010/jan/29/apple-ipad-crowdsource accessed 14/3/2011
  2. Beckett, Charlie (2008) SuperMedia, Oxford: Blackwell
  3. Belam, Martin. Whatever Paul Waugh thinks, The Guardian’s MPs Expenses crowd-sourcing experiment was no “total failure”, Currybetdotnet, March 10 2010 http://www.currybet.net/cbet_blog/2010/03/whatever-paul-waugh-thinks-the.php accessed 14/3/2011
  4. Belam, Martin. Abort? Retry? Fail? – Judging the success of the Guardian’s MP’s expenses app, Currybetdotnet, March 7 2011, http://www.currybet.net/cbet_blog/2011/03/guardian-mps-expenses-success.php accessed 14/3/2011
  5. Belam, Martin. The Guardian’s Paul Lewis on crowd-sourcing investigative journalism with Twitter, Currybetdotnet, March 10 2011, http://www.currybet.net/cbet_blog/2011/03/paul-lewis-investigative-journalism-twitter.php accessed 14/3/2011
  6. Benkler, Yochai (2006) The Wealth of Networks, New Haven: Yale University Press
  7. Bonomolo, Alessandra. Repubblica.it’s experiment with “Investigative reporting on demand”, Online Journalism Blog, March 21 2011, http://paulbradshaw.wpengine.com/2011/03/21/repubblica-its-experiment-with-investigative-reporting-on-demand/ accessed 23/3/2011
  8. Bradshaw, Paul. Wiki Journalism: Are wikis the new blogs? Paper presented to The Future of Journalism conference, Cardiff University, September 2007, http://onlinejournalismblog.files.wordpress.com/2007/09/wiki_journalism.pdf
  9. Bradshaw, Paul. The Guardian’s tool to crowdsource MPs’ expenses data: time to play, Online Journalism Blog, June 19 2009 http://paulbradshaw.wpengine.com/2009/06/19/the-guardian-build-a-platform-to-crowdsource-mps-expenses-data/ accessed 14/3/2011
  10. Brogan, C., & Smith, J. (2009) Trust Agents: Using the Web to Build Influence, Improve Reputation, and Earn Trust (1st ed.), New Jersey: Wiley
  11. Bruns, Axel (2005) Gatewatching, New York: Peter Lang
  12. Bruns, Axel (2008) Blogs, Wikipedia, Second Life and Beyond, New York: Peter Lang
  13. De Burgh, Hugo (2008) Investigative Journalism, London: Routledge
  14. Dondlinger, Mary Jo. Educational Video Game Design: A Review of the Literature, Journal of Applied Educational Technology Volume 4, Number 1, Spring/Summer 2007, http://www.eduquery.com/jaet/JAET4-1_Dondlinger.pdf
  15. Ellis, Justin. A perpetual motion machine for investigative reporting: CPI and PRI partner on state corruption project, Nieman Journalism Lab, March 8 2011, http://www.niemanlab.org/2011/03/a-perpetual-motion-machine-for-investigative-reporting-cpi-and-pri-partner-on-state-corruption-project/ accessed 21/3/2011
  16. Graham, John. Feedback in Game Design, Wolfire Blog, April 21 2010, http://blog.wolfire.com/2010/04/Feedback-In-Game-Design accessed 14/3/2011
  17. Grey, Stephen (2006) Ghost Plane, London: C Hurst & Co
  18. Hickman, Jon. Help Me Investigate: the social practices of investigative journalism, Paper presented to the Media Production Analysis Working Group, IAMCR, Braga, 2010, http://theplan.co.uk/help-me-investigate-the-social-practices-of-i
  19. Howe, Jeff. Gannett to Crowdsource News, Wired, November 3 2006, http://www.wired.com/software/webservices/news/2006/11/72067 accessed 14/3/2011
  20. Jenkins, Henry (2006) Convergence Culture, New York: New York University Press
  21. Lavrusik, Vadim. How Investigative Journalism Is Prospering in the Age of Social Media, Mashable, November 24 2010, http://mashable.com/2010/11/24/investigative-journalism-social-web/ accessed 14/3/2011
  22. Leadbeater, Charles (2008) We-Think, London: Profile Books
  23. Leigh, David. Help us solve the mystery of Blair’s money, The Guardian, December 1 2009, http://www.guardian.co.uk/politics/2009/dec/01/help-us-solve-blair-mystery accessed 14/3/2011
  24. Lih, Andrew (2009) The Wikipedia Revolution, London: Aurum Press
  25. Marshall, Sarah. Snow map developer creates ‘Cutsmap’ for Channel 4’s budget coverage, Journalism.co.uk, March 22 2011, http://www.journalism.co.uk/news/snow-map-developer-creates-cutsmap-for-channel-4-s-budget-coverage/s2/a543335/ accessed 22/3/2011
  26. Morozov, Evgeny (2011) The Net Delusion, London: Allen Lane
  27. Nielsen, Jakob. Participation Inequality: Encouraging More Users to Contribute, Jakob Nielsen’s Alertbox, October 9 2006, http://www.useit.com/alertbox/participation_inequality.html accessed 14/3/2011
  28. Paterson and Domingo (2008) Making Online News: The Ethnography of New Media Production, New York: Peter Lang
  29. Porter, Joshua (2008) Designing for the Social Web, Berkeley: New Riders
  30. Raymond, Eric S. (1999) The Cathedral and the Bazaar, New York: O’Reilly
  31. Scotney, Tom. Help Me Investigate: How working collaboratively can benefit journalists, Journalism.co.uk, August 14 2009, http://www.journalism.co.uk/news-features/help-me-investigate-how-working-collaboratively-can-benefit-journalists/s5/a535469/ accessed 21/3/2011
  32. Shirky, Clay (2008) Here Comes Everybody, London: Allen Lane
  33. Snyder, Chris. Spot.Us Launches Crowd-Funded Journalism Project, Wired, November 10 2008, http://www.wired.com/epicenter/2008/11/spotus-launches/ accessed 21/3/2011
  34. Surowiecki, James (2005) The Wisdom of Crowds, London: Abacus
  35. Tapscott, Don & Williams, Anthony (2006) Wikinomics, London: Atlantic Books
  36. Tsui, Lokman. A Journalism of Hospitality, unpublished thesis, Presented to the Faculties of the University of Pennsylvania, 2010, http://dl.dropbox.com/u/22048/Tsui-Dissertation-Deposit-Final.pdf accessed 14/3/2011
  37. Weinberger, David (2002) Small Pieces, Loosely Joined, New York: Basic Books

What made the crowdsourcing successful? A case study in crowdsourcing investigative journalism part 6

In the penultimate part of the serialisation of research underpinning a new Help Me Investigate project I explore the qualities that successful crowdsourcing investigations shared. Previous parts are linked below:

What made the crowdsourcing successful?

Clearly, a distinction should be made between what made the investigation successful as a series of outcomes, and what made crowdsourcing successful as a method for investigative reporting. This section concerns itself with the latter.

What made the community gather, and continue to return? One hypothesis was that the nature of the investigation provided a natural cue to interested parties: The London Weekly was published on Fridays and Saturdays, and there was a build-up of expectation to see if a new issue would indeed appear.

The data, however, did not support this hypothesis. There was indeed a rhythm, but it did not correlate with the date of publication: Wednesdays were the most popular day for people contributing to the investigation.
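The weekday analysis described here is simple to reproduce with a few lines of code. A sketch in Python, with hypothetical dates standing in for the site’s activity logs:

```python
from collections import Counter
from datetime import date

def busiest_weekday(timestamps):
    """Return the weekday name on which most contributions were made."""
    counts = Counter(d.strftime("%A") for d in timestamps)
    return counts.most_common(1)[0][0]

# Hypothetical contribution dates standing in for the real activity log
contributions = [
    date(2010, 2, 3), date(2010, 2, 3), date(2010, 2, 10),  # Wednesdays
    date(2010, 2, 5),                                        # a Friday
]
print(busiest_weekday(contributions))  # Wednesday
```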

Upon further investigation a possible explanation was found: one of the investigation’s ‘alpha’ contributors – James Ball – had set himself a task to blog about the investigation every week. His blog posts appeared on a Wednesday.

That this turned out to be a significant factor in driving activity suggests one important lesson: talking publicly and regularly about the investigation’s progress is key to its activity and success.

This data was backed up by the interviews. One respondent mentioned the “weekly cue” explicitly. And Jon Hickman’s research also identified that investigation activity related to “events and interventions. Leadership, especially by staffers, and tasking appeared to be the main drivers of activity within the investigation.” (2010, p10)

He breaks down activity on the site into three ‘acts’, although their relationship to the success of the investigation is not explored further:

  • ‘Brainstorm’ (an initial flurry of activity, much of which is focused on scoping the investigation and recruiting)
  • ‘Consolidation’ (activity is driven by new information)
  • ‘Long tail’ (intermittent caretaker activity, such as supportive comments or occasional updates)

Networked utility

Hickman describes the site as a “centralised sub-network that suits a specific activity” (2010, p12). Importantly, this sub-network forms part of a larger ‘network of networks’ which involves spaces such as users’ blogs, Twitter, Facebook, email and other platforms and channels.

“And yet Help Me Investigate still provided a useful space for them to work within; investigators and staffers feel that the website facilitates investigation in a way that their other social media tools could not:

““It adds the structure and the knowledge base; the challenges, integration with ‘what do they know’ ability to pose questions allows groups to structure an investigation logically and facilitates collaboration.” (Interview with investigator)” (Hickman, 2010, p12)

In the London Weekly investigation the site also helped keep track of a number of discussions taking place around the web. Having been born from a discussion on Twitter, further conversations on Twitter resulted in further people signing up, along with comments threads and other online discussion. This fit the way the site was designed culturally – to be part of a network rather than asking people to do everything on-site.

The presence of ‘alpha’ users like James and Judith was crucial in driving activity on the site – a pattern observed in other successful investigations. They picked up the threads contributed by others and not only wove them together into a coherent narrative that allowed others to enter more easily, but also set the new challenges that provided ways for people to contribute. The fact that they brought with them a strong social network presence is probably also a factor – but one that needs further research.

The site had been designed to emphasise the role of the user in driving investigations. The agenda is not owned by a central publisher, but by the person posing the question – and therefore the responsibility is theirs as well. This cultural hurdle – towards acknowledging personal power and responsibility – may be the biggest one that the site has to address, and the offer of “failure for free” (Shirky, 2008), allowing users to learn what works and what doesn’t, may support that.

It is worth noting why crowdsourcing worked well for this investigation: it could be broken down into separate parts and paths, most of which could be completed online: “Where does this claim come from?” “Can you find out about this person?” “What can you discover about this company?”. One person, for example, used Google Streetview to establish that the company’s registered address was a postbox. Investigations that are less easily broken down may be less suitable for crowdsourcing – or require more effort to ensure success.
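That modularity can be pictured as a simple data structure: an investigation holding a list of discrete challenges, each completable on its own. A minimal sketch (the class and field names are illustrative, not Help Me Investigate’s actual code):

```python
from dataclasses import dataclass, field

@dataclass
class Challenge:
    """One discrete, independently completable task within an investigation."""
    question: str
    done: bool = False

@dataclass
class Investigation:
    title: str
    challenges: list = field(default_factory=list)

    def progress(self):
        """Fraction of challenges completed so far."""
        if not self.challenges:
            return 0.0
        return sum(c.done for c in self.challenges) / len(self.challenges)

inv = Investigation("Is The London Weekly a hoax?")
inv.challenges = [
    Challenge("Where does this claim come from?"),
    Challenge("What can you discover about this company?", done=True),
]
print(inv.progress())  # 0.5
```

Because each challenge stands alone, any contributor can pick one up without needing the whole picture – which is precisely what made the investigation crowdsourcable.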

Momentum and direction

A regular supply of updates provided the investigation with momentum. The accumulation of discoveries provided valuable feedback to users, who then returned for more. In his book on Wikipedia, Andrew Lih (2009 p82) notes a similar pattern – ‘stigmergy’ – that is observed in the natural world: “The situation in which the product of previous work, rather than direct communication [induces and directs] additional labour”. An investigation without these ‘small pieces, loosely joined’ (Weinberger, 2002) might not suit crowdsourcing so well.

Hickman’s interviews with participants in the Birmingham council website investigation found a feeling of the investigation being communally owned and led:

“Certain members were good at driving the investigation forward, helping decide on what to do next, but it did not feel like anyone was in charge as such.”

“I’d say HMI had pivital role in keeping us together and focused but it felt owned by everyone.” (Hickman 2010, p10)

One problem, however, was that the number of diverging paths led to a range of potential avenues of enquiry. In the end, although the core questions were answered (was the publication a hoax, and what was the basis for its claims?), the investigation raised many more questions. These remained largely unanswered once the majority of users felt that their own questions had been answered. As in a traditional investigation, there came a point at which those involved had to judge whether they wished to invest any more time in it.

Finally, the investigation benefited from a diverse group of contributors who brought specialist knowledge or access. Some physically visited stations where the newspaper was claiming distribution to see how many copies were being handed out. Others used advanced search techniques to track down details on the people involved and the claims being made, or to make contact with people who had had previous experiences with those behind the newspaper. The visibility of the investigation online also led to more than one ‘whistleblower’ approach providing inside information, which was not published on the site but resulted in new challenges being set.

The final part of this series outlines some conclusions to be taken from the project, and where it plans to go next.

What are the characteristics of a crowdsourced investigation? A case study in crowdsourcing investigative journalism part 5

Continuing the serialisation of the research underpinning a new Help Me Investigate project, in this fifth part I explore the characteristics of crowdsourcing outlined in the literature. Previous parts are linked below:

What are the characteristics of a crowdsourced investigation?

Tapscott and Williams (2006, p269) explore a range of new models of collaboration facilitated by online networks across a range of industries. These include:

  • Peer producers creating “products made of bits – from operating systems to encyclopedias”
  • “Ideagoras … a global marketplace of ideas, innovations and uniquely qualified minds”
  • Prosumer – ‘professional consumer’ – communities which can produce value if given the right tools by companies
  • Collaborative science (“The New Alexandrians”)
  • Platforms for participation
  • “Global plant floors” – physical production lines split across countries
  • Wiki workplaces which cut across organisational hierarchies

Most of these innovations have not touched the news industry, and some – such as platforms for participation – are used in publishing, but rarely in news production itself (an exception here can be made for a few magazine communities, such as Reed Business Information’s Farmer’s Weekly).

Examples of explicitly crowdsourced journalism can be broadly classified into two types. The first – closest to the ‘Global plant floors’ described above – can be described as the ‘Mechanical Turk’ model (after the Amazon-owned web service that allows you to offer piecemeal payment for repetitive work). This approach tends to involve large numbers of individuals performing small, similar tasks. Examples from journalism would include The Guardian’s experiment with inviting users to classify MPs’ expenses in order to find possible stories, or the pet food bloggers inviting users to add details of affected pets to their database.

The second type – closest to the 'peer producers' model – can be described as the 'Wisdom of Crowds' approach (after James Surowiecki's 2005 book of the same name). This approach tends to involve smaller numbers of users performing discrete tasks that rely on particular expertise. It follows the creed of open source software development, often referred to as Linus' Law: "Given enough eyeballs, all bugs are shallow" (Raymond, 1999). The Florida News Press example given above fits into this category, relying as it did on users with specific knowledge (such as engineering or accounting) or access. Another example – based explicitly on examples in Surowiecki's book – is an experiment by The Guardian's Charles Arthur to predict the specifications of Apple's rumoured tablet (Arthur, 2010). Over 10,000 users voted on 13 questions, correctly predicting its name, screen size, colour and network – but getting other details, such as its price, wrong.
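The aggregation behind Arthur's experiment can be illustrated with a toy example: for each question, take the answer chosen by the most voters. The questions and votes below are invented, not Arthur's actual data:

```python
from collections import Counter

# Toy wisdom-of-crowds aggregation: the modal answer to each
# question stands as the crowd's prediction. Invented data.
votes = {
    "name": ["iPad", "iSlate", "iPad", "iTablet", "iPad"],
    "screen size": ['10"', '10"', '7"', '10"', '10"'],
}

def crowd_prediction(votes):
    """Return the most common answer for each question."""
    return {q: Counter(answers).most_common(1)[0][0]
            for q, answers in votes.items()}

print(crowd_prediction(votes))  # → {'name': 'iPad', 'screen size': '10"'}
```

As Surowiecki's book argues, this only works when votes are reasonably independent and the crowd includes genuinely informed members – which is also why the crowd got some answers (like price) wrong.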

Help Me Investigate fits into the ‘Wisdom of Crowds’ category: rather than requiring users to complete identical tasks, the technology splits investigations into different ‘challenges’. Users are invited to tag themselves so that it is easier to locate users with particular expertise (tagged ‘FOI’ or ‘lawyer’ for example) or in a particular location, and many investigations include a challenge to ‘invite an expert’ from a particular area that is not represented in the group of users.
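The self-tagging mechanism described above is, in effect, a lookup from expertise tags to members. A minimal sketch of that idea, assuming a simple in-memory structure (the names and tags are invented examples, not Help Me Investigate's actual schema):

```python
# Hypothetical members mapped to their self-applied tags.
users = {
    "alice": {"FOI", "birmingham"},
    "bob": {"lawyer"},
    "carol": {"FOI", "accounting"},
}

def users_with_tag(users, tag):
    """Return the members who have tagged themselves with `tag`,
    e.g. to find everyone with FOI experience for a challenge."""
    return sorted(name for name, tags in users.items() if tag in tags)

print(users_with_tag(users, "FOI"))  # → ['alice', 'carol']
```

An 'invite an expert' challenge is then simply the case where such a lookup comes back empty: no member carries the needed tag, so the gap is filled by recruitment rather than search.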

Some elements of Tapscott and Williams’s list can also be related to Help Me Investigate’s processes: for example, the site itself was a ‘platform for participation’ which allowed users from different professions to collaborate without any organisational hierarchy. There was an ‘ideagora’ for suggesting ways of investigating, and the resulting stories were examples of peer production.

One of the first things the research analysed was whether the investigation data matched up to patterns observed elsewhere in crowdsourcing and online activity. An analysis of the number of actions by each user, for example, showed a clear ‘power law’ distribution, where a minority of users accounted for the majority of activity.

This power law, however, did not translate into a breakdown approaching the 90-9-1 'law of participation inequality' observed by Jakob Nielsen (2006). Instead, the balance between those who made a couple of contributions (normally the 9% of the 90-9-1 split) and those who made none (the 90%) was roughly equal. This may have been because the design of the site meant it was not possible to 'lurk' without already being a member, or being invited and signing up. Adding data on non-members who viewed the investigation pages may shed further light on this.
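The comparison with Nielsen's 90-9-1 split can be sketched as a simple breakdown of per-user action counts. The counts and thresholds below are invented for illustration, not the actual Help Me Investigate data:

```python
# Hypothetical per-user action counts (user -> number of actions).
actions = {"u1": 120, "u2": 45, "u3": 9, "u4": 3, "u5": 2,
           "u6": 1, "u7": 0, "u8": 0, "u9": 0, "u10": 0}

def participation_breakdown(actions):
    """Split members into lurkers (0 actions), occasional
    contributors (1-9 actions) and heavy contributors (10+),
    returning each group as a percentage of all members."""
    lurkers = sum(1 for n in actions.values() if n == 0)
    occasional = sum(1 for n in actions.values() if 1 <= n < 10)
    heavy = sum(1 for n in actions.values() if n >= 10)
    total = len(actions)
    return {group: round(100 * count / total)
            for group, count in [("lurkers", lurkers),
                                 ("occasional", occasional),
                                 ("heavy", heavy)]}

print(participation_breakdown(actions))  # → {'lurkers': 40, 'occasional': 40, 'heavy': 20}
```

In this invented sample the lurker and occasional-contributor groups come out roughly equal, mirroring the departure from 90-9-1 described above; the actual percentages in the research would of course come from the real membership data.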

In Jon Hickman's ethnography of a different investigation (into the project to deliver a new website for Birmingham City Council) he found a similar pattern: of the 32 'investigators', 13 did nothing more than join the investigation. Others provided "occasional or one-off contributions", and a few were "prolific" (Hickman, 2010, p10). Rather than treating this as an indication of absence, however, Hickman points to literature on lurking which suggests it provides an opportunity for informal learning. He identifies support for this in his interviews with lurkers on the site:

“One lurker was a key technical member of the BCC DIY collective: the narrative within Help Me Investigate suggested a low level of engagement with the process and yet this investigator was actually quite prominent in terms of their activism; the lurker was producing pragmatic outcomes and responses to the investigation, although he produced no research for the project. On a similar note, several of the BCC DIY activists were neither active nor lurking within Help Me Investigate. For example, one activist’s account of BCC DIY shows awareness of, and engagement with, the connection between the activist activity and the investigation, even though he is not an active member of the investigation within Help Me Investigate.” (Hickman, 2010, p17)

In the next part I explore what qualities made for successful crowdsourcing in the specific instance of Help Me Investigate.