Tag Archives: research

Daily Mail users think it’s less impartial than Twitter/Facebook

Daily Mail impartiality compared against BBC, Twitter, Facebook and others

Is the Daily Mail less impartial than social media? That’s the takeaway from one of the charts (shown above) in Ofcom’s latest Communications Market Report.

The report asked website and app users to rate 7 news websites against 5 criteria. The Daily Mail comes out with the lowest proportion of respondents rating it highly for ‘impartiality and unbiased’, ‘Offers range of opinions’, and ‘Importance’.

This is particularly surprising given that two of the other websites are social networks. 28% rated Facebook and Twitter highly on impartiality, compared to 26% for the Daily Mail.

How Wikileaks collaborations failed to create objective ‘global’ journalism – research

Truth, they say, is the first casualty of war. Or, as an academic might put it:

“Professional journalism takes the nation as its unit of analysis, which [means] journalists employ ‘closed’ language with respect to international issues when the nation is perceived as threatened, encouraging the citizen to read world events and issues from ‘our’ point of view.”

This is the scene set at the start of Robert L. Handley’s research into collaborative cross-border journalism. Handley wants to tackle the question of whether “global journalism” can result in the more objective outlook that its proponents hope for.

The partnerships that sprang up around Wikileaks’s war logs and cables provide an ideal way to explore that question.

Europe vs the US

The overriding finding of Handley’s research is a difference in how European and US newspapers handled the Wikileaks material. European papers, he argues, “behaved as loyal to the nation-as-citizen and, more broadly, to citizens-wherever”, but the reporting of US partner The New York Times “demonstrated a loyalty to nation-as-official”.

Two pieces of information

Two pieces of information that came to my attention today:

Firstly, from a piece of research on aspiring journalists in France:

“Students from the least privileged social sectors are more socially committed and more aware of their civic responsibility: These students want “to reveal cases of corruption, show realities that are unknown to the general public, and to do investigative journalism”.

“The students belonging to disadvantaged social classes value the profession of journalism the most, and have a culture of effort and selflessness, which has been inherited from their families. The force lifting the social elevator to access an intellectual profession like journalism is their constant effort. They consider journalism to be a “useful and noble” profession. They have a more romantic and social view of the profession: they want to be a real communication channel for the village people, the forgotten, and the voiceless … However, these students practice self-censorship by not working in recognised and prestigious media, unlike the students from more privileged social classes who do so because they have greater social capital and contacts in the profession of journalism thanks to their families.”

Secondly, from a number of sources on Twitter:

“Independent.co.uk is offering a rare opportunity to an aspiring young journalist. We’re looking for an exceptionally motivated, intelligent and organised undergraduate with a passion for our brand, the world of news, and student life, to come and gain work experience within our Digital team for three months this summer 2013.

“You must be able to work from Monday 17 June through to 30 August 2013. This is work experience, so it is not a paid opportunity, but your travel and lunch expenses will be covered. You will need to provide a letter from your university, confirming that this work experience placement is beneficial and supports your course.”

Over to you.

Motion graphic video workflow – a video tutorial

Motion graphics has become an increasingly popular way to present data in a compelling visual form. In a series of videos, guest contributor Sihlangu Tshuma outlines his workflow for managing a motion graphics video project; the results are shown at the end. All 13 videos are also available in this playlist.

1: Motion graphics introduction

2: Researching the project

3: Motion graphics treatments

Stories and Streams: teaching collaborative journalism with peer to peer learning

In January 2012 I was facing an old problem: as I prepared to teach a new undergraduate online journalism class, I wanted to find a way to encourage students to connect with wider networks in the area they were reporting on.

Networks have always been important to journalists, but in a networked age they are more important than ever. The days of starting your contacts book with names and numbers from formal organisations listed in the local phonebook are gone. Now those are instantly available online – but more importantly, there are informal groups and expert individuals accessible too. And they’re publishing for each other.

Because of this, and because of reduced resources, the news industry is increasingly working with these networks to pursue, produce and distribute stories, from Paul Lewis’s investigative work at The Guardian to Neal Mann’s field reporting for Sky, the Farmers’ Weekly team’s coverage of foot and mouth, and Andy Carvin’s coverage of the Arab Spring at NPR.

How could I get students to do this? By rewriting the class entirely.


Research: disengaging from the news and hyperlocal engagement

People who live in areas branded as ‘problem communities’ by the media feel disengaged from the news – but hyperlocal citizen journalism offers an opportunity to re-engage them. These are the findings of a piece of research from the Netherlands called ‘When News Hurts’, which measured mainstream coverage of ‘problem communities’ and then followed a hyperlocal project involving local people.

The findings won’t be a big surprise to those running hyperlocal blogs, which often focus on practical steps to improving their area and building civic participation rather than merely telling the stories of failure. But they do offer some lessons for traditional publishers, not just on what they could do better, but on what they’re doing badly in their current coverage – especially the regional publishers who would be expected to provide more ground-level reporting on local issues:

“Remarkably, in spite of being located close to these areas, the regional press hardly differed in their coverage from their national (quality) counterparts […] National newspapers quoted residents in 23 per cent of their larger reports on Kanaleneiland and 35 per cent of their reports on Overvecht. The regional newspaper quoted residents in only 26 per cent of its larger reports on Kanaleneiland and in 24 per cent of its reports on Overvecht. Unexpectedly, 55 per cent of all news items about a nearby elite neighbourhood (Wittevrouwen) used a resident as source.”

Reduced Relevance – the downside of social, mobile news

Facebook Activity Plugin

News moves so quickly that your Facebook ‘friends’ just can’t keep up.

In a guest post for OJB, Neil Thurman highlights a new research report that suggests the increased availability of news on mobile platforms, and its harnessing of social networks—like Facebook—to power recommendations, comes at a price: stories that are less relevant to readers’ interests than those recommended by editors and found on news providers’ traditional websites.


A case study in crowdsourcing investigative journalism part 7: Conclusions

In this final part of the research underpinning a new Help Me Investigate project I draw together the conclusions, including the qualities that successful crowdsourced investigations shared. Previous parts are linked below:


Looking at the reasons that users of the site as a whole gave for not contributing to an investigation, the majority cited ‘not having enough time’. Although at least one interviewee, in contrast, highlighted the simplicity and ease of contributing, contributing needs to be – or at least appear to be – as easy and simple as possible, in order to lower the perceived effort and time required.

Notably, the second biggest reason for not contributing was a ‘lack of personal connection with an investigation’, demonstrating the importance of the individual and social dimension of crowdsourcing. Likewise, a ‘personal interest in the issue’ was the single largest factor in someone contributing. A ‘Why should I contribute?’ feature on crowdsourcing projects may be worth considering.

Others mentioned the social dimension of crowdsourcing – the “sense of being involved in something together” – what Jenkins (2006, p244) would refer to as “consumption as a networked practice”, a motivation also identified by Yochai Benkler in his work on networks (2006). Looking at non-financial motivations behind people contributing their time to online projects, he refers to “socio-psychological reward”. He also identifies the importance of “hedonic personal gratification”. In other words, fun.

Although positive feedback formed part of the design of the site, no consideration was given to negative feedback: making users aware when they were not succeeding. This element also appears to be absent from the game mechanics of other crowdsourcing experiments, such as The Guardian’s MPs’ expenses app.

While it is easy to talk about “Failure for free”, more could be done to identify and support failing investigations. A monthly update feature that would remind users of recent activity and – more importantly – the lack of activity might help here. The investigators in a group might be asked whether they wish to terminate the investigation in those cases, emphasising their responsibility for its progress and helping ‘clean up’ the investigations listed on the first page of the site.

However, there is also a danger in interfering too much to reduce failure. The instinct to intervene is natural, and establishing a reasonable ‘success rate’ at the outset – based on the literature around crowdsourcing – helps to counter it. That was part of the design of Help Me Investigate: it was the 1-5% of questions that gained traction that would be the focus of the site. One analogy is a news conference where members throw out ideas – only a few are chosen for investment of time and energy; the rest ‘fail’.

It is the management of that tension between interfering to ensure everything succeeds (and so removing the incentive for users to be self-motivated) and not interfering at all (leaving users feeling unsupported and unmotivated) that is likely to be the key to a successful crowdsourcing project. More than a year into the project, this tension was still being negotiated.

In summing up the research into Help Me Investigate it is possible to identify five qualities which successful investigations shared: ‘Alpha users’ (highly active, who drove investigations forward); modularity (the ability to break down a large investigation into smaller discrete elements); public-ness (the ability for others to find out about an investigation); feedback (game mechanics and the pleasure of using the site); and diversity of users.

Relating these findings to other research into crowdsourcing more generally it is possible to make broader generalisations regarding how future projects might be best organised. Leadbeater (2008, p68), for example, identifies five key principles of successful collaborative projects, summed up as ‘Core’ (directly comparable to the need for alpha users identified in this research); ‘Contribute’ (large numbers, comparable to public-ness); ‘Connect’ (diversity); ‘Collaborate’ (self governance – relating indirectly to modularity); and ‘Create’ (creative pleasure – relating indirectly to feedback). Similar qualities are also identified by US investigative reporter and Knight fellow Wendy Norris in her experiments with crowdsourcing (Lavrusik, 2010).

The most notable connections here are the indirect ones. While the technology of Help Me Investigate allowed for modularity, for example, the community structure was rather flat. Leadbeater’s research (2008) and that of Lih (2009) into the development of Wikipedia and Tsui (2010, PDF) into Global Voices indicate that ‘modularity’ may be part of a wider need for ‘structure’. Conversely ‘feedback’ provides a specific, practical way for crowdsourcing projects to address users’ need for creative pleasure.

As Help Me Investigate reached its 18th month a number of changes were made to test these ideas: the code was released as open source, effectively crowdsourcing the technology itself, and a strategy was adopted to recruit niche community managers who could build expertise in particular fields, along with an advisory board that was similarly diverse. The Help Me Investigate design was replicated in a plugin which would allow anyone running a self-hosted WordPress blog to manage their own version of the site.

This separation of technology from community was a key learning outcome of the project. While the site had solved some of the technical challenges of crowdsourcing and identified the qualities of successful crowdsourced investigation, it was clear that the biggest challenge lay in connecting the increasingly networked communities that wanted to investigate public interest issues – and in a way that was both sustainable and scalable beyond the level of individual investigations.



References

  1. Arthur, Charles. Forecasting is a notoriously imprecise science – ask any meteorologist, The Guardian, January 29 2010, http://www.guardian.co.uk/technology/2010/jan/29/apple-ipad-crowdsource accessed 14/3/2011
  2. Beckett, Charlie (2008) SuperMedia, Oxford: Blackwell
  3. Belam, Martin. Whatever Paul Waugh thinks, The Guardian’s MPs Expenses crowd-sourcing experiment was no “total failure”, Currybetdotnet, March 10 2010, http://www.currybet.net/cbet_blog/2010/03/whatever-paul-waugh-thinks-the.php accessed 14/3/2011
  4. Belam, Martin. Abort? Retry? Fail? – Judging the success of the Guardian’s MP’s expenses app, Currybetdotnet, March 7 2011, http://www.currybet.net/cbet_blog/2011/03/guardian-mps-expenses-success.php accessed 14/3/2011
  5. Belam, Martin. The Guardian’s Paul Lewis on crowd-sourcing investigative journalism with Twitter, Currybetdotnet, March 10 2011, http://www.currybet.net/cbet_blog/2011/03/paul-lewis-investigative-journalism-twitter.php accessed 14/3/2011
  6. Benkler, Yochai (2006) The Wealth of Networks, New Haven: Yale University Press
  7. Bonomolo, Alessandra. Repubblica.it’s experiment with “Investigative reporting on demand”, Online Journalism Blog, March 21 2011, https://onlinejournalismblog.com/2011/03/21/repubblica-its-experiment-with-investigative-reporting-on-demand/ accessed 23/3/2011
  8. Bradshaw, Paul. Wiki Journalism: Are wikis the new blogs? Paper presented to The Future of Journalism conference, Cardiff University, September 2007, https://onlinejournalismblog.files.wordpress.com/2007/09/wiki_journalism.pdf
  9. Bradshaw, Paul. The Guardian’s tool to crowdsource MPs’ expenses data: time to play, Online Journalism Blog, June 19 2009, https://onlinejournalismblog.com/2009/06/19/the-guardian-build-a-platform-to-crowdsource-mps-expenses-data/ accessed 14/3/2011
  10. Brogan, C. & Smith, J. (2009) Trust Agents: Using the Web to Build Influence, Improve Reputation, and Earn Trust (1st ed.), New Jersey: Wiley
  11. Bruns, Axel (2005) Gatewatching, New York: Peter Lang
  12. Bruns, Axel (2008) Blogs, Wikipedia, Second Life and Beyond, New York: Peter Lang
  13. De Burgh, Hugo (2008) Investigative Journalism, London: Routledge
  14. Dondlinger, Mary Jo. Educational Video Game Design: A Review of the Literature, Journal of Applied Educational Technology, Volume 4, Number 1, Spring/Summer 2007, http://www.eduquery.com/jaet/JAET4-1_Dondlinger.pdf
  15. Ellis, Justin. A perpetual motion machine for investigative reporting: CPI and PRI partner on state corruption project, Nieman Journalism Lab, March 8 2011, http://www.niemanlab.org/2011/03/a-perpetual-motion-machine-for-investigative-reporting-cpi-and-pri-partner-on-state-corruption-project/ accessed 21/3/2011
  16. Graham, John. Feedback in Game Design, Wolfire Blog, April 21 2010, http://blog.wolfire.com/2010/04/Feedback-In-Game-Design accessed 14/3/2011
  17. Grey, Stephen (2006) Ghost Plane, London: C Hurst & Co
  18. Hickman, Jon. Help Me Investigate: the social practices of investigative journalism, Paper presented to the Media Production Analysis Working Group, IAMCR, Braga, 2010, http://theplan.co.uk/help-me-investigate-the-social-practices-of-i
  19. Howe, Jeff. Gannett to Crowdsource News, Wired, November 3 2006, http://www.wired.com/software/webservices/news/2006/11/72067 accessed 14/3/2011
  20. Jenkins, Henry (2006) Convergence Culture, New York: New York University Press
  21. Lavrusik, Vadim. How Investigative Journalism Is Prospering in the Age of Social Media, Mashable, November 24 2010, http://mashable.com/2010/11/24/investigative-journalism-social-web/ accessed 14/3/2011
  22. Leadbeater, Charles (2008) We-Think, London: Profile Books
  23. Leigh, David. Help us solve the mystery of Blair’s money, The Guardian, December 1 2009, http://www.guardian.co.uk/politics/2009/dec/01/help-us-solve-blair-mystery accessed 14/3/2011
  24. Lih, Andrew (2009) The Wikipedia Revolution, London: Aurum Press
  25. Marshall, Sarah. Snow map developer creates ‘Cutsmap’ for Channel 4’s budget coverage, Journalism.co.uk, March 22 2011, http://www.journalism.co.uk/news/snow-map-developer-creates-cutsmap-for-channel-4-s-budget-coverage/s2/a543335/ accessed 22/3/2011
  26. Morozov, Evgeny (2011) The Net Delusion, London: Allen Lane
  27. Nielsen, Jakob. Participation Inequality: Encouraging More Users to Contribute, Jakob Nielsen’s Alertbox, October 9 2006, http://www.useit.com/alertbox/participation_inequality.html accessed 14/3/2011
  28. Paterson, Chris & Domingo, David (eds) (2008) Making Online News: The Ethnography of New Media Production, New York: Peter Lang
  29. Porter, Joshua (2008) Designing for the Social Web, Berkeley: New Riders
  30. Raymond, Eric S. (1999) The Cathedral and the Bazaar, New York: O’Reilly
  31. Scotney, Tom. Help Me Investigate: How working collaboratively can benefit journalists, Journalism.co.uk, August 14 2009, http://www.journalism.co.uk/news-features/help-me-investigate-how-working-collaboratively-can-benefit-journalists/s5/a535469/ accessed 21/3/2011
  32. Shirky, Clay (2008) Here Comes Everybody, London: Allen Lane
  33. Snyder, Chris. Spot.Us Launches Crowd-Funded Journalism Project, Wired, November 10 2008, http://www.wired.com/epicenter/2008/11/spotus-launches/ accessed 21/3/2011
  34. Surowiecki, James (2005) The Wisdom of Crowds, London: Abacus
  35. Tapscott, Don & Williams, Anthony (2006) Wikinomics, London: Atlantic Books
  36. Tsui, Lokman. A Journalism of Hospitality, unpublished thesis, University of Pennsylvania, 2010, http://dl.dropbox.com/u/22048/Tsui-Dissertation-Deposit-Final.pdf accessed 14/3/2011
  37. Weinberger, David (2002) Small Pieces, Loosely Joined, New York: Basic Books

What made the crowdsourcing successful? A case study in crowdsourcing investigative journalism part 6

In the penultimate part of the serialisation of research underpinning a new Help Me Investigate project I explore the qualities that successful crowdsourcing investigations shared. Previous parts are linked below:

What made the crowdsourcing successful?

Clearly, a distinction should be made between what made the investigation successful as a series of outcomes, and what made crowdsourcing successful as a method for investigative reporting. This section concerns itself with the latter.

What made the community gather, and continue to return? One hypothesis was that the nature of the investigation provided a natural cue to interested parties: The London Weekly was published on Fridays and Saturdays, and there was a build-up of expectation to see whether a new issue would indeed appear.

The data, however, did not support this hypothesis. There was indeed a rhythm, but it did not correlate with the dates of publication: Wednesdays were the most popular day for people contributing to the investigation.
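A weekday rhythm like this can be surfaced with a simple frequency count of contribution dates. A minimal sketch in Python – the dates below are invented for illustration, not the investigation’s actual analytics:

```python
from collections import Counter
from datetime import date

# Hypothetical contribution dates for illustration only -
# the real figures came from the site's analytics.
contributions = [
    date(2010, 2, 3), date(2010, 2, 3), date(2010, 2, 5),
    date(2010, 2, 10), date(2010, 2, 10), date(2010, 2, 10),
    date(2010, 2, 13), date(2010, 2, 17),
]

# Tally contributions by weekday name to reveal any weekly rhythm
weekday_counts = Counter(d.strftime("%A") for d in contributions)

# The most common weekday marks the peak of the rhythm
peak_day, peak_count = weekday_counts.most_common(1)[0]
print(peak_day, peak_count)  # in this sample: Wednesday 6
```

Plotting the same counts against the publication dates is what showed the peak fell midweek rather than on publication days.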

Upon further investigation a possible explanation was found: one of the investigation’s ‘alpha’ contributors – James Ball – had set himself a task to blog about the investigation every week. His blog posts appeared on a Wednesday.

That this turned out to be a significant factor in driving activity suggests one important lesson: talking publicly and regularly about the investigation’s progress is key to its activity and success.

The interviews backed up this data. One respondent mentioned the “weekly cue” explicitly, and Jon Hickman’s research also identified that investigation activity related to “events and interventions. Leadership, especially by staffers, and tasking appeared to be the main drivers of activity within the investigation.” (2010, p10)

He breaks down activity on the site into three ‘acts’, although their relationship to the success of the investigation is not explored further:

  • ‘Brainstorm’ (an initial flurry of activity, much of which is focused on scoping the investigation and recruiting)
  • ‘Consolidation’ (activity is driven by new information)
  • ‘Long tail’ (intermittent caretaker activity, such as supportive comments or occasional updates)

Networked utility

Hickman describes the site as a “centralised sub-network that suits a specific activity” (2010, p12). Importantly, this sub-network forms part of a larger ‘network of networks’ which involves spaces such as users’ blogs, Twitter, Facebook, email and other platforms and channels.

“And yet Help Me Investigate still provided a useful space for them to work within; investigators and staffers feel that the website facilitates investigation in a way that their other social media tools could not:

“It adds the structure and the knowledge base; the challenges, integration with ‘what do they know’, ability to pose questions allows groups to structure an investigation logically and facilitates collaboration.” (Interview with investigator) (Hickman, 2010, p12)

In the London Weekly investigation the site also helped keep track of a number of discussions taking place around the web. The investigation had been born from a discussion on Twitter, and further conversations there resulted in more people signing up, along with comment threads and other online discussion. This fitted the way the site was designed culturally – to be part of a network rather than asking people to do everything on-site.

The presence of ‘alpha’ users like James and Judith was crucial in driving activity on the site – a pattern observed in other successful investigations. They picked up the threads contributed by others and not only wove them together into a coherent narrative that allowed others to enter more easily, but also set the new challenges that provided ways for people to contribute. The fact that they brought with them a strong social network presence is probably also a factor – but one that needs further research.

The site had been designed to emphasise the role of the user in driving investigations. The agenda is not owned by a central publisher, but by the person posing the question – and therefore the responsibility is theirs as well. This cultural hurdle – towards acknowledging personal power and responsibility – may be the biggest one that the site has to address, and the offer of “failure for free” (Shirky, 2008), allowing users to learn what works and what doesn’t, may support that.

The fact that crowdsourcing worked well for this investigation is worth noting, as the investigation could be broken down into separate parts and paths – most of which could be completed online: “Where does this claim come from?” “Can you find out about this person?” “What can you discover about this company?” One person, for example, used Google Streetview to establish that the registered address of the company was a postbox. Investigations that are less easily broken down may be less suitable for crowdsourcing – or require more effort to ensure success.

Momentum and direction

A regular supply of updates provided the investigation with momentum. The accumulation of discoveries provided valuable feedback to users, who then returned for more. In his book on Wikipedia, Andrew Lih (2009 p82) notes a similar pattern – ‘stigmergy’ – that is observed in the natural world: “The situation in which the product of previous work, rather than direct communication [induces and directs] additional labour”. An investigation without these ‘small pieces, loosely joined’ (Weinberger, 2002) might not suit crowdsourcing so well.

Hickman’s interviews with participants in the Birmingham council website investigation found a feeling of the investigation being communally owned and led:

“Certain members were good at driving the investigation forward, helping decide on what to do next, but it did not feel like anyone was in charge as such.”

“I’d say HMI had pivital role in keeping us together and focused but it felt owned by everyone.” (Hickman 2010, p10)

One problem, however, was that the diverging paths opened up a wide range of potential avenues of enquiry. In the end, although the core questions were answered (was the publication a hoax, and what were the bases for its claims?), the investigation raised many more questions. These remained largely unanswered once the majority of users felt that their own questions had been answered. As in a traditional investigation, there came a point at which those involved had to judge whether they wished to invest any more time in it.

Finally, the investigation benefited from a diverse group of contributors who contributed specialist knowledge or access. Some physically visited stations where the newspaper was claiming distribution to see how many copies were being handed out. Others used advanced search techniques to track down details on the people involved and the claims being made, or to make contact with people who had had previous experiences with those behind the newspaper. The visibility of the investigation online also led to more than one ‘whistleblower’ approach providing inside information, which was not published on the site but resulted in new challenges being set.

The final part of this series outlines some conclusions to be taken from the project, and where it plans to go next.

A case study in crowdsourcing investigative journalism (part 4): The London Weekly

Continuing the serialisation of the research underpinning a new Help Me Investigate project, in this fourth part I describe how one particular investigation took shape. Previous parts are linked below:

Case study: the London Weekly investigation

In early 2010 Andy Brightwell and I conducted some research into one particular successful investigation on the site. The objective was to identify what had made the investigation successful – and how (or if) those conditions might be replicated for other investigations both on the site and elsewhere online.

The investigation chosen for the case study was ‘What do you know about The London Weekly?’ – an investigation into a free newspaper that its owners claimed was about to launch in London (part of the investigation was to establish whether that claim was a hoax).

The people behind The London Weekly had made a number of claims about planned circulation, staffing and investment which went unchallenged in specialist media. Journalists Martin Stabe, James Ball and Judith Townend, however, wanted to dig deeper. So, after an exchange on Twitter, Judith logged onto Help Me Investigate and started an investigation.

A month later members of the investigation (most of whom were non-journalists) had unearthed a wealth of detail about the people behind The London Weekly and the facts behind their claims. Some of the information was reported in MediaWeek and The Guardian podcast Media Talk; some formed the basis for posts on James Ball’s blog, Journalism.co.uk and the Online Journalism Blog. Some has, for legal reasons, remained unpublished.


Andrew Brightwell conducted a number of semi-structured interviews with contributors to the investigation. The sample was randomly selected but representative of the mix of contributors, who were categorised as ‘alpha’ contributors (over 6 contributions), ‘active’ (2-6 contributions) or ‘lurkers’ (whose only contribution was to join the investigation). These interviews formed the qualitative basis for the research.
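The contribution thresholds behind that categorisation can be expressed as a simple function. A sketch in Python – the usernames and counts are invented for illustration, not the investigation’s actual figures:

```python
# Thresholds follow the categories used in the research:
# 'alpha' (over 6 contributions), 'active' (2-6 contributions),
# 'lurker' (whose only contribution was to join the investigation).
def categorise(contribution_count):
    if contribution_count > 6:
        return "alpha"
    if contribution_count >= 2:
        return "active"
    return "lurker"

# Invented example data - not the investigation's real contributors
users = {"user_a": 14, "user_b": 9, "user_c": 3, "user_d": 1}
categories = {name: categorise(n) for name, n in users.items()}
print(categories)
```

Making the boundaries explicit like this also makes it easy to check that every contributor falls into exactly one category.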

Complementing this data was quantitative information about users of the site as a whole, taken from two user surveys – one conducted when the site was three months old and another at 12 months – and from analysis of the investigation’s analytics (such as numbers and types of actions, frequency, etc.).

In the next part I explore some of the characteristics of a crowdsourced investigation and how these relate to the wider literature around crowdsourcing in general.