The latest in my series of FAQ posts comes from the National University of Sciences & Technology (NUST) in Pakistan. As always, I’m publishing my answers to their questions here in case they’re of use to anyone else.
Q. What would you say to convince journalists — especially journalists working in developing countries where even the acquisition of public records is often a tedious task — about the importance of data journalism?
If you believe that journalism has a duty to be factual, accurate, and to engage an audience in subjects which have a clear public and civic importance, then data journalism is going to be very important to your work. Continue reading →
The Bureau and the BBC: 2 networked models for supporting data journalism
2017 saw the launch of two projects with a remit to generate and stimulate data journalism at a local level: the Bureau of Investigative Journalism’s Bureau Local project, and the BBC’s Shared Data Unit. Continue reading →
Law, Regulation and Institutions (including security); and
Specialist Journalism, Investigations and Coding
The modules develop a broad understanding of a range of data journalism techniques before you choose to develop some of them in greater depth on a specialist project.
The course is designed for those working in industry who wish to gain accredited skills in data journalism, but who cannot take time out to study full time or may not want a full Masters degree (a PGCert is 60 credits towards the 180 credits needed for a full MA).
In a special guest post Anders Eriksen from the #bord4 editorial development and data journalism team at Norwegian news website Bergens Tidende talks about how they manage large data projects.
Do you really know how you ended up with those results after analyzing the data from Public Source?
Well, often we did not. This is what we knew:
We had downloaded some data in Excel format.
We did some magic cleaning of the data in Excel.
We did some manual alterations of wrong or wrongly formatted data.
We sorted, grouped, pivoted, and eureka! We had a story!
Then we got a new, updated batch of the same data. Or the editor wanted to check how we had ended up with those numbers, that story.
…And so the problems start to appear.
How could we do the exact same analysis over and over again on different batches of data?
And how could we explain to curious readers and editors exactly how we ended up with those numbers or that graph?
We needed a way to structure our data analysis and make it traceable, reusable and documented. This post will show you how. We will not teach you how to code, but maybe inspire you to learn that in the process. Continue reading →
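The steps described above — download, clean, fix bad values, then group and pivot — can be captured in a short script so the same analysis runs identically on every new batch of data. A minimal sketch in Python, using only the standard library; the column names and sample rows here are hypothetical stand-ins, not the team's actual data:

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of the kind of data described in the post.
# (The original arrived as Excel files; assume it has been exported to CSV.)
RAW = """region,year,amount
Hordaland,2016, 1200
Hordaland,2017,1350
Rogaland,2016,980
Rogaland,2017,
"""

def clean(rows):
    """Scripted cleaning replaces the 'magic' manual edits in Excel:
    strip stray whitespace, drop rows with a missing amount."""
    cleaned = []
    for row in rows:
        row = {key: value.strip() for key, value in row.items()}
        if row["amount"]:
            row["amount"] = int(row["amount"])
            cleaned.append(row)
    return cleaned

def totals_by_region(rows):
    """The group-and-sum 'pivot' step, now repeatable and documented."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

rows = clean(csv.DictReader(io.StringIO(RAW)))
print(totals_by_region(rows))  # {'Hordaland': 2550, 'Rogaland': 980}
```

Because every transformation lives in code rather than in untracked spreadsheet edits, rerunning the analysis on an updated batch is a one-line change, and an editor can read exactly how each number was produced.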
Using a combination of contact-led information and FOI requests, they uncovered the extent of ambitions to dig deep into Scottish soil.
It was part of a steady flow of fracking stories from the Ferret team, ensuring those involved in making decisions were in no doubt of their responsibilities and recognised that every step would be scrutinised. Continue reading →