Is data journalism ‘time consuming’ or ‘resource intensive’? The excuse – and I think it is an excuse – seems to come up at an increasing number of events whenever data journalism is discussed. “It’s OK for the New York Times/Guardian/BBC,” goes the argument. “But how can our small team justify the resources – especially in a time of cutbacks?”
The idea that data journalism inherently requires extra resources is flawed – but understandable. Spectacular interactives, large-scale datasets and investigative projects are the headliners of data journalism’s recent history. We have oohed and aahed over what has been achieved by programmer-journalists and data sleuths…
But that’s not all there is.
Data journalism is a broader church than computer assisted reporting – it takes in processes and techniques from visualisation and automation to interactivity and spreadsheet work. And just as journalism as a whole can be investigative or observational, simple or complex, data journalism can also be used in a variety of ways.
When integrated into newsroom processes, data journalism techniques can save time, they can save resources – indeed, they can bring extra resources into the newsroom.
As the statement above was being tweeted, I was training a local newspaper reporter. Alongside the techniques for doing deeper journalism, he also saw opportunities for automating existing manual processes. He later emailed to say that he had been able to “create two spreadsheets which will help my colleagues create useful articles for our readers in half the time.”
The previous week I had been working with a team from the Press Association, who were pleased to discover much quicker and more accurate ways of dealing with official statistics and FOI responses.
UPDATE: Here data journalist Esa Makinen from Finland’s Helsingin Sanomat talks about how they use templates to produce quick visualisations. Visualisations are “content that cannot be copied”, he points out.
Mapping Olympic torchbearers from a newspaper’s patch might have taken up significant amounts of both reporters’ and designers’ time a decade ago – now I can do that alone, with free tools, in less than an hour. And how about reporting elections? One of my students writes here about how he used data journalism skills – and preparation – to turn around results more quickly than would otherwise have been possible.
Data journalism doesn’t have to be spectacular to be useful.
A false economy
But it’s not just about saving time – it’s also about drawing on the energy and expertise of your users. Look at how The Guardian benefits from the design and development work of the thousands of developers and designers who use their API or contribute to their Flickr group.
And data journalism contributes to the bottom line in other ways too: Data Blog editor Simon Rogers said recently that, while they had expected the blog to have a “very niche audience of developers and techies”:
“In fact it has become one of the most popular on guardian.co.uk and has developed a very mainstream audience. They find that for almost any set of data they produce, there are people passionately invested in that topic prepared to argue the toss over specific data points and the issues they illustrate below the line in the comments at great length.”
Likewise, at the BBC’s Data Journalism Day yesterday – where the ‘limited resources’ argument was raised again – the New York Times’ Aron Pilhofer said that his team’s interactive features attract huge sponsorship deals – advertisers like “new, shiny, unique” things they can attach their name to. And their interactives produce great engagement figures – which advertisers also like.
But while interactives are useful commercially, we still run the risk of seeing data journalism techniques as something separate from everyday journalism.
As Charles Arthur said on this blog previously: if you find yourself doing something over and over, get a computer to do it. That’s what Adrian Holovaty did with crime reports and other civic information on ChicagoCrime.org and Everyblock. It’s what Chris Taggart has done with council spending on OpenlyLocal. They save time for the work that only humans can do: double-checking, fleshing out, and communicating.
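To make the “get a computer to do it” point concrete, here is a minimal sketch of the kind of repetitive task that scripts absorb well – totalling payments by supplier from a council spending release, in the spirit of the OpenlyLocal example. The column names (`Supplier`, `Amount`) and the sample data are my own assumptions, not any council’s actual format:

```python
import csv
import io
from collections import defaultdict

def total_by_supplier(csv_text):
    """Sum payment amounts per supplier from a spending CSV.

    Assumes hypothetical column names 'Supplier' and 'Amount' --
    adjust these to match whatever the real data release uses.
    """
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["Supplier"].strip()] += float(row["Amount"])
    return dict(totals)

# Retyping each monthly release into one running table by hand is the
# repetitive job; the script reduces it to one call per file.
sample = """Supplier,Amount
Acme Ltd,1200.50
Acme Ltd,300.00
Beta Services,99.99
"""
print(total_by_supplier(sample))
```

Once the script exists, each new monthly release takes seconds rather than an afternoon – which is exactly the time that gets freed up for double-checking, fleshing out and communicating.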
So don’t tell me that data journalism is “very time consuming”. In-depth journalism is often very time consuming. Data journalism is just a technique – and, if your heart’s in the right place, it can be used to make time for the deeper stuff.
*From one trainee since: “Using Google Spreadsheets to compile a table from a Word document of turbine applications helped save the journalist probably four or five hours of manual cutting and pasting. [Here's the story that resulted]”
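The trainee’s Word-document job above can also be scripted end to end. Below is a sketch of turning pasted free-text lines into spreadsheet-ready CSV; the line format (“site name (status)”) is entirely hypothetical, since the real document’s layout will differ, and the pattern would need adapting to it:

```python
import csv
import io
import re

# Hypothetical line format from a pasted Word document of turbine
# applications -- an assumption for illustration, not the real layout.
LINE = re.compile(r"(?P<site>.+?)\s*\((?P<status>approved|refused|pending)\)")

def lines_to_csv(text):
    """Turn pasted free-text lines into spreadsheet-ready CSV,
    skipping any lines that don't match the expected pattern."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["Site", "Status"])
    for line in text.splitlines():
        m = LINE.search(line)
        if m:
            writer.writerow([m.group("site"), m.group("status")])
    return out.getvalue()

sample = "Hill Farm (approved)\nMoor Top (refused)\nNotes: see appendix\n"
print(lines_to_csv(sample))
```

The output pastes straight into a spreadsheet – replacing the hours of manual cutting and pasting the trainee describes.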