Tag Archives: research

Crowdsourcing investigative journalism: a case study (part 3)

Continuing the serialisation of the research underpinning a new Help Me Investigate project, in this third part I describe how the focus of the site was shaped by the interests of its users and staff, and how site functionality was changed to react to user needs. I also identify some areas where the site could have been further developed and improved. (Part 1 is available here; Part 2 is here)

Reflections on the proof of concept phase

By the end of the 12-week proof of concept phase the site had also completed a number of investigations that were not ‘headline-makers’ but fulfilled the objective of informing users: in particular ‘Why is a new bus company allowed on an existing route with same number, but higher prices?’; ‘What is the tracking process for petitions handed in to Birmingham City Council?’; and ‘The DVLA and misrepresented number plates’.

The site had also unearthed some promising information that could provide the basis for more stories, such as Birmingham City Council receiving over £160,000 in payments for vehicle removals; and ‘Which councils in the UK (that use Civil Enforcement) make the most from parking tickets?’ (as a byproduct, this also unearthed how well different councils responded to Freedom of Information requests).

A number of news organisations expressed an interest in working with the site, but practical contributions took place largely at an individual rather than organisational level. Journalist Tom Scotney, who was involved in one of the investigations, commented: “Get it right and you’re becoming part of an investigative team that’s bigger, more diverse and more skilled than any newsroom could ever be” (Scotney, 2009, n.p.) – but it was becoming clear that most journalists were not culturally prepared – or did not have the time – to engage with the site unless there was a story ‘ready made’ for them to use. Once there were stories to be had, however, they played a valuable role in writing those stories up, obtaining official reactions, and raising the site’s visibility.

After 12 weeks the site had around 275 users (whose backgrounds ranged from journalism and web development to locally active citizens) and 71 investigations, exceeding project targets. It is difficult to measure ‘success’ or ‘failure’ but at least eight investigations had resulted in coherent stories, representing a success rate of at least 11%: the target figure before launch had been 1-5%. That figure rose to around 21% if other promising investigations were included, and the sample included recently initiated investigations which were yet to get off the ground.

‘Success’ is an interesting metric which deserves further elaboration. In his reflection on The Guardian’s crowdsourcing experiment, for example, developer Martin Belam (2011a, n.p.) noted a tendency to evaluate success “not purely editorially, but with a technology mindset in terms of the ‘100% – Achievement unlocked!’ games mechanic.” In other words, success might be measured in terms of degrees of ‘completion’ rather than results.

In contrast, the newspaper’s journalist Paul Lewis saw success in terms of something other than pure percentages: getting 27,000 people to look at expense claims was, he felt, a successful outcome, regardless of the percentage of claims that those represented. And BBC Special Reports Editor Bella Hurrell – who oversaw a similar but less ambitious crowdsourcing project on the same subject on the broadcaster’s website – felt that they had also succeeded in genuine ‘public service journalism’ in the process (personal interview).

A third measure of success is noted by Belam – that of implementation and iteration (being able to improve the service based on how it is used):

“It demonstrated that as a team our tech guys could, in the space of around a week, get an application deployed into the cloud but appear integrated into our site, using a technology stack that was not our regular infrastructure.

“Secondly, it showed that as a business we could bring people together from editorial, design, technology and QA to deliver a rapid turnaround project in a multi-disciplinary way, based on a topical news story.

“And thirdly, we learned from and improved upon it.” (Belam, 2010, n.p.)

A percentage ‘success’ rate for Help Me Investigate, then, represents a similarly ‘game-oriented’ perspective on the site, and it is important to draw on other frameworks to measure its success.

For example, it was clear that the site did very well in producing raw material for ‘journalism’, but it was less successful in generating more general civic information such as how to find out who owned a piece of land. Returning to the ideas of Actor-Network Theory outlined above, the behaviour of two principal actors – and one investigation – had a particular influence on this, and on how the site more generally developed over time. Site user Neil Houston was an early adopter of the site and one of its heaviest contributors. His interest in interrogating data helped shape the path of many of the site’s most active investigations, which in turn set the editorial ‘tone’ of the site. This attracted users with similar interests to Neil, but may have discouraged others who did not share them – further research would be needed to establish this.

Likewise, while Birmingham City Council staff contributed to the site in its earliest days, when the council became the subject of an investigation, staff involvement was actively discouraged (personal interview with contributor). This left the site short of particular expertise in answering civic questions.

At least one user commented that the site was very ‘FOI [Freedom Of Information request]-heavy’ and risked excluding users interested in different types of investigations, or who saw Freedom of Information requests as too difficult for them. This could be traced directly to the appointment of Heather Brooke as the site’s support journalist. Heather is a leading Freedom of Information activist and user of FOI requests: this was an enormous strength in supporting relevant investigations, but it should also be recognised that this served to set the editorial tone of the site.

This narrowing of tone was addressed by bringing in a second support journalist with a consumer background: Colin Meek. There was also a strategic shift in community management, towards actively involving users in other investigations. As more users came onto the site, its investigations broadened into consumer, property and legal areas.

However, a further ‘actor’ then came into play: the legal and insurance systems. When the proof of concept funding – and the legal insurance that came with it – ended, the team had to close investigations unrelated to the public sector, as these left the site most vulnerable legally.

A final example of Actor-Network Theory in action was the difference between the intentions of the site’s designers and its users. The founders wanted Help Me Investigate to be a place for consensus, not discussion, but it quickly became apparent that users did not want to have to go elsewhere to hold their discussions. Users needed to – and did – have conversations around the updates that they posted.

The initial challenge-and-result model (breaking investigations down into challenges with entry fields for the subsequent results, which were required to include a link to the source of their information) was therefore changed very early on to challenge-and-update: people could now update without a link, simply to make a point about a previous result, or to explain their efforts in failing to obtain a result.
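The shift from results to updates can be sketched as a simple data model. This is a hypothetical illustration only – the field and class names below are invented for the sketch, not taken from the site’s actual codebase:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Update:
    """One contribution to a challenge under the revised model."""
    challenge_id: int
    author: str
    text: str
    # Under the original challenge-and-result model, this link to the
    # source of the information was mandatory; making it optional is
    # what turned 'results' into free-form 'updates'.
    source_link: Optional[str] = None

# An update can now simply make a point or report a failed attempt:
no_luck = Update(7, "neil", "The council has not replied to my FOI request yet")
print(no_luck.source_link)  # None - allowed under challenge-and-update
```

The design change is small in data terms (one field becomes optional) but significant in behaviour: it lowered the bar for contribution from ‘verified finding’ to ‘conversation’.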

One of the challenges least likely to be accepted by users was to ‘Write the story up’. It seemed that those who knew the investigation had no need to write it up: the story existed in their heads. Instead it was either site staff or professional journalists who would normally write up the results. Similarly, when an investigation was complete, it fell to site staff to update the investigation description to include a link to any write-up. There was no evidence of a desire from users to ‘be a journalist’; the overriding objective appeared instead to be to ‘be a citizen’.

In contrast, a challenge to write ‘the story so far’ seemed more appealing in investigations that had gathered data but not yet reached a resolution. The site founders had underestimated the need for narrative in designing a site that allowed users to join investigations while they were in progress.

As was to be expected with a ‘proof of concept’ site (one testing whether an idea could work), there were a number of frustrations with the limitations of the site – and areas of opportunity identified. When looking to crowdfund small amounts for an investigation, for example, there were no third party tools available that would allow this without going through a nonprofit organisation. And when an investigation involved a large crowdsourcing operation, the connection to activity conducted on other platforms needed to be stronger so users could more easily see what needed doing (e.g. a live feed of changes to a Google spreadsheet, or documents bookmarked using Delicious).

Finally, investigations often evolved into new questions, but had to stay with an old title or risk losing the team and resources that had been built up. The option to ‘export’ an investigation’s team and resources into a fresh question or investigation was one possible future solution.

‘Failure for free’ was part of the design of the site in order to allow investigations to succeed on the efforts of its members rather than as a result of any top-down editorial agenda – although naturally journalist users would concentrate their efforts on the most newsworthy investigations. In practice it was hard to ‘let failure happen’, especially when almost all investigations had some public interest value.

Although the failure itself was not an issue (and indeed the failure rate was lower than expected), a ‘safety net’ was needed that would more proactively suggest ways investigators could make their investigation a success, including features such as investigation ‘mentors’ who could pass on their experience; ‘expiry dates’ on challenges, with reminders; an improved ability to find other investigators with relevant skills or experience; a ‘sandbox’ investigation for new users to find their feet; and a metric to identify successful and failing investigations.

Communication was central to successful investigations, and two areas required more attention: staff time spent communicating with users; and technical infrastructure to automate and facilitate communication (such as alerts to new updates, or the ability to mail all investigation members).

The much-feared legal threats to the site largely failed to materialise. Out of over 70 investigations in the first 12 weeks, only four needed rephrasing to avoid being potentially libellous. Two involved minor tweaks; the other two were more significant, partly because of a related need for clarity in the question.

Individual updates within investigations, which were post-moderated, presented even less of a legal problem. Only two updates were referred for legal advice, and only one of those was rephrased. One further update was flagged and removed because it was ‘flamey’ and did not contribute to the investigation.

There was a lack of involvement by users across investigations. Users tended to stick to their own investigation and the idea of ‘helping another so they help you’ did not take root. Further research is needed to see if there was a power law distribution at work here – often seen on the internet – of a few people being involved in lots of investigations, most being involved in one, and a steep upward curve between.
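A power law of participation – a handful of heavy contributors, a long tail of users active in a single investigation – could be tested with a very simple summary of per-user activity. The sketch below uses invented counts purely for illustration; the figures are not drawn from the site’s data:

```python
from collections import Counter

# Hypothetical data: how many investigations each user joined. In a
# power-law ("long tail") pattern, most users join one investigation
# and a small minority join many.
investigations_per_user = [1] * 200 + [2] * 40 + [3] * 15 + [5] * 5 + [12, 20]

distribution = Counter(investigations_per_user)

# Share of users who joined only a single investigation
single = distribution[1] / len(investigations_per_user)
print(f"{single:.0%} of users joined just one investigation")

# Share of total participation accounted for by the most active ~2% of users
top_n = max(1, len(investigations_per_user) // 50)
most_active = sorted(investigations_per_user, reverse=True)[:top_n]
share = sum(most_active) / sum(investigations_per_user)
print(f"The most active {top_n} users account for {share:.0%} of all participation")
```

Even this toy example shows the shape the research question implies: a large majority of single-investigation users, with a small core of users contributing far more than their share.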

In the next part I look at one particular investigation in an attempt to identify the qualities that made it successful.

If you want to get involved in the latest Help Me Investigate project, get in touch on paul@helpmeinvestigate.com

Hyperlocal research: “Can Big Media do ‘Big Society’?”

A research paper I’ve contributed to, with Jean-Christophe Pascal and Neil Thurman, on a regional publisher’s experiment with hyperlocal publishing, has now been published on City University’s website. You can download the full PDF from here.

Hold The Front Page (which is part-owned by Northcliffe, the subject of the research) reported on the research here; the report includes a response from Northcliffe.

Help Me Investigate – anatomy of an investigation

Earlier this year Andy Brightwell and I conducted some research into one of the successful investigations on my crowdsourcing platform Help Me Investigate. I wanted to know what had made the investigation successful – and how (or whether) we might replicate those conditions for other investigations.

I presented the findings (presentation embedded above) at the Journalism’s Next Top Model conference in June. This post sums up those findings.

The investigation in question was ‘What do you know about The London Weekly?’ – an investigation into a free newspaper that was (they claimed – part of the investigation was to establish whether it was a hoax) about to launch in London.

The people behind the paper had made a number of claims about planned circulation, staffing and investment that most of the media reported uncritically. Martin Stabe, James Ball and Judith Townend, however, wanted to dig deeper. So, after an exchange on Twitter, Judith logged onto Help Me Investigate and started an investigation.

A month later members of the investigation had unearthed a wealth of detail about the people behind The London Weekly and the facts behind their claims. Some of the information was reported in MediaWeek and The Media Guardian podcast Media Talk; some formed the basis for posts on James Ball’s blog, Journalism.co.uk and the Online Journalism Blog. Some has, for legal reasons, remained unpublished. Continue reading

Online News Survey – suggestions wanted

Global news provider Small World News Service and online research company OnePoll are looking to undertake a large study which will research how the public access and use news online.

After discussing possible angles to take with the survey, it was decided that it would be good to work with the Online Journalism Blog to crowdsource possible avenues to take with the research.

The goal is to produce a number of studies that can help news professionals, journalists and anyone else with an interest understand the attitudes and behaviours of online news consumers.

Our method will be to conduct a survey with a large representative sample of UK internet users.

After the study has been completed we will publish both the report and the data on the OnePoll website and make it freely available.

So if you have any suggestions for questions or possible angles then I would be delighted to hear about them.

You can contact me on Twitter @oliconner or email oli2706@gmail dot com

Research: the limits of social networks for organising the social

Ulises Mejias has written a wonderful paper (subscription required) on how social networks don’t just enable participation – they also limit it. Or as he asks: “Whether social network services engender publics (where opinion can be expressed freely) or masses (where opinion can be expressed freely but is not realised in action)”.

It’s a fascinating counterpoint to the ‘revolutionary’ rhetoric (think Twitter and the ‘Iran revolution’) that surrounds so much writing on social networks.

If you’re able to get hold of a copy, I recommend reading the paper in full, as there’s far too much of interest to summarise here. But if you can’t, here are some of the points that Mejias makes: Continue reading

Using news stories on Facebook: what the BBC found

Great post by Claire Wardle and Matthew Eltringham on some research they conducted into how social network users use news. Here are the highlights. Firstly, news as a social object:

“They all saw comment and discussion as a key component of enjoying news on Facebook. They shared and posted stories they were interested in, sure, but also so they could make a point or start a conversation. But the vast majority really only wanted to have that conversation within their own group of friends, partly because that was where they felt comfortable.”

And secondly, it’s all about the niche: Continue reading

Online journalism and the promises of new technology PART 3: Hypertext

This post is cross-published from my new journalism/new media-blog. Previous posts in this series:

In the third part of this series I will take a closer look at the research on hypertext in online journalism and at the degree to which this asset of new technology has been, and is being, utilized in online journalism. The general assumption of researchers interested in hypertextual online journalism is that hypertext, used innovatively, would provide a range of advantages over print journalism: Continue reading

Online journalism and the promises of new technology PART 2: The assets

This post is cross-published from my new journalism/new media-blog.

In the first post in this series I argued that technology may not play such an important role to the development of journalism in new media as people seem to believe. In this post I will look at the three assets of new technology that are generally portrayed as the most significant for journalism in new media: multimedia, interactivity and hypertext (see for instance this article by Mark Deuze for arguments on why these three assets have been considered the most important for online journalism).

The general assumption of the “techno-researchers” has been that an innovative approach to online journalism implies utilizing these three assets of new technology. There are, of course, lots of other technological assets and/or concepts related to technology that keep popping up in the discourse on online journalism: Continue reading

Why do people read online news? (Research summary)

Ioana Epure summarises “Harnessing the potential of online news: Suggestions from a study on the relationship between online news advantages and its post-adoption consequences”, a study by An Nguyen (University of Stirling)

In the last decade journalism has entered a stage in which news organisations are less reluctant to invest in online operations, but An Nguyen’s study starts from the premise that they do so driven not by the desire to innovate and fully exploit the potential of online news, but by the fear that the internet will replace traditional media in the news market.

As a consequence, they haven’t actually tried to understand what users want from online news and how what they want will affect their behaviour after receiving it.

Surprisingly, the results of Nguyen’s study show that the traditional press still has a fighting chance, provided that practitioners understand why people have turned to online news and try to offer them something similar. Continue reading

Must user-generated-content threaten quality journalism?

The BBC’s User Generated Content (UGC) Hub does not further meaningful civil participation in the news, and the routine inclusion of UGC does not significantly alter news selection criteria or editorial values. So concludes Jackie Harrison’s study on audience contributions and gatekeeping practices at the BBC.

The study found that many of the previous barriers to news selection have been removed or are not applicable to UGC.

“User generated content has been absorbed into BBC newsroom practices and is now routinely considered as an aspect of, or dimension to, many stories. In this sense the traditional barriers which formed the gatekeeping criteria of the 1990s have been altered forever.”

Harrison sees the changes to selection criteria as a real and worrying threat to quality and standards at the public broadcaster. Her study raises interesting questions about the value of UGC and how it should be measured. She fears the growing tendency to utilise audience content, often for convenience, risks an increase in “soft news” at the expense of quality journalism, and worse, the degradation of public knowledge.

Harrison does not see the hub as progressing civil debate or public engagement on a meaningful level, and she anticipates future use of UGC may grow more opportunistic. This is obviously at odds with the active debate and participation the hub set out to foster, and which has dominated previous ideals of audience participation.

Selection and moderation

In an earlier study, Harrison looked at what caused some stories to be used by the BBC and others to be rejected. Here she reinvestigates these reasons in the context of UGC, finding that in many cases UGC can, if not make these previous concerns irrelevant, make the case for automatic rejection less compelling.

While the hub is subject to resource-intensive moderation and methodical processes to ascertain UGC authenticity and quality, it is, like all news organisations, still learning how to most effectively utilise audience participation.

There are growing and unresolved tensions for journalists in balancing the BBC’s traditional journalistic standards while fostering open communication, promoting free speech, and at the same time protecting the site and the audience against possible offence.

Inevitably, this gives rise to judgement calls which are necessarily subjective.

Harrison suggests two questions then arise from this:

  • Does UGC reflect public opinion, or is it simply generating noise of little value?
  • Is it a public service broadcaster’s job to provide a platform for all sorts of views, including unpalatable or unpleasant “non-majoritarian” comment – and, if it is not, why not?

BBC journalists told Harrison: “The difficulty with opening up the floodgates to participation is that ‘the full spectrum’ of opinions must be considered to further the aims of the ‘global conversation’.”

Should we be concerned, as Harrison seems to be, that material gathered at the hub is not always deemed of particular quality? Or does the value, as Stuart Purvis suggests, lie in the telling, the fact that new and possibly previously unheard voices are given a platform?

We are right to expect quality content from the public broadcaster, but opinions on what that means differ widely.

This can be seen in the debate between Paul Bradshaw and his students, and the BBC staff regarding UGC content and external links. It seems while hub head Matthew Eltringham spoke about the relevance of content, what he was really talking about was quality content. If the BBC opened up linking to contributors’ sites, would it have to do it for all contributors, and what kinds of complications would this pose?

The future of UGC

Perhaps we should not be viewing the growing tendency for “soft journalism” through UGC as a degradation in quality, but part of the evolution of the BBC. Unless of course, it does come at the cost of investigative, serious journalism, which clearly the BBC has a mandate to invest in.

Harrison rightly points out the hub is only one part of the newsroom, but a part that is increasingly relied upon as an additional source of information, shared between departments at the BBC.

What the study doesn’t address is how successful the UGC hub has been in engaging people who have previously not interacted with the BBC, or who have not taken part in public debate in general. I suspect it is unlikely to have encouraged society’s voiceless. We must assume, at the least, that people taking part have access to technology – which is, of course, one of the major difficulties with the idea of a new electronic, egalitarian public sphere.

The hub does represent a deliberate and conscious effort to seek audience interaction and better serve the public interest, though what this will mean for the BBC, and for the public, in the long-term is still unclear.

It will be interesting to see how the hub develops and where UGC can go. Is Harrison right in predicting it will grow more meaningless or, more drastically, has meaningful civil engagement in the news already met its untimely death, as Steve Borris declared?