In the history chapter of the Online Journalism Handbook you will find a timeline of key events in web journalism. While working on the forthcoming second edition I recently revisited and updated the timeline. Below are the 41 key events I have settled on — but have I missed any? Let me know what you think. Continue reading
Spanish citizens are now a step closer to understanding how power operates in the country, and how decisions affect them, thanks to the work of organisations like Civio fighting for transparency and access to public data. In October their work was recognised with the Gabriel García Márquez Award for innovative journalism for their investigation Medicamentalia. In a guest post for OJB, Nuria Riquelme Palazón spoke with Javier de la Vega, one of the members of Civio.
Access to public information, accountability and participatory democracy may have been a reality in many countries for some time — but in Spain they sounded like a utopia. Entrepreneur Jacobo Elosua and computer technician David Cabo decided that this had to change.
The pair used their savings to build an organisation with the intention of serving those active citizens who, like them, believed in transparency: Civio Foundation.
I’d wondered for a while why no-one who had talked about scraping at conferences had actually demonstrated the procedure. It seemed to me to be one of the most sought-after skills for any investigative journalist.
Then I tried to do so myself in an impromptu session at the first Data Journalism Conference in Birmingham (#DJUK16) and found out why: it’s not as easy as it’s supposed to look.
To anyone new to data journalism, a scraper is as close to magic as you get with a spreadsheet and no wand. Continue reading
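To demystify the "magic" a little: at its core a scraper just downloads a page and pulls structured data out of the markup. Below is a minimal sketch in Python using only the standard library — the council-spending table is an invented example (in practice you would fetch a real page first, e.g. with `urllib.request`), so treat this as an illustration of the idea rather than a ready-made tool.

```python
from html.parser import HTMLParser

# Invented sample page -- a real scraper would download this HTML
# from a live URL before parsing it.
SAMPLE_HTML = """
<table>
  <tr><th>Council</th><th>Spend</th></tr>
  <tr><td>Birmingham</td><td>1200</td></tr>
  <tr><td>Coventry</td><td>800</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects each table row as a list of cell text."""
    def __init__(self):
        super().__init__()
        self.rows = []        # finished rows
        self._row = None      # row currently being built
        self._in_cell = False # are we inside a <td>/<th>?

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        # Only keep text that sits inside a table cell.
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE_HTML)
print(scraper.rows)
# [['Council', 'Spend'], ['Birmingham', '1200'], ['Coventry', '800']]
```

From there the rows can be written to a CSV and opened in a spreadsheet — which is why scraping sits so naturally alongside spreadsheet skills in a data journalist's toolkit.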
Data scientist David Robinson was behind one of the most striking data stories of this US election season, when his analysis of Donald Trump tweets appeared to confirm that Trump was posting the angriest comments on that account (jointly managed by his campaign staff). Barbara Maseda spoke to Robinson about the story behind that text analysis and what comes next.
It was August 9 when David Robinson published his analysis of Trump tweets on his blog. Robinson had used a series of libraries in the programming language R to collect, clean, process and visualise the data. The process took just 12 hours, from Saturday night through Sunday morning.
In the following days, the piece would be re-posted and cited by multiple websites, including The Washington Post and Mashable. The original piece alone had hundreds of thousands of views in just a few days.
The result wasn’t just one election story, but one of the biggest indications yet of the potential of text analysis for journalists, with three takeaways in particular: Continue reading
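The core of that kind of analysis is simpler than it sounds: split tweets into words, then compare how often words from a sentiment lexicon appear in tweets from each source. Robinson's original work was done in R with text-mining libraries; the toy Python sketch below illustrates the same idea, with invented tweets and an invented mini word list — none of this is real data.

```python
from collections import Counter

# A tiny, made-up "negative word" lexicon. Real analyses use
# established sentiment lexicons with thousands of scored words.
NEGATIVE_WORDS = {"crooked", "failing", "bad", "disaster", "weak"}

# Invented tweets, each tagged with the device it was posted from.
tweets = [
    ("Android", "The failing media is a disaster, so bad"),
    ("Android", "Crooked politicians are weak"),
    ("iPhone",  "Thank you Ohio, see you at the rally"),
    ("iPhone",  "Join us tomorrow in Florida"),
]

# Tokenise each tweet and count negative words per source device.
neg_counts = Counter()
word_counts = Counter()
for source, text in tweets:
    words = [w.strip(",.").lower() for w in text.split()]
    word_counts[source] += len(words)
    neg_counts[source] += sum(w in NEGATIVE_WORDS for w in words)

for source in neg_counts:
    rate = neg_counts[source] / word_counts[source]
    print(f"{source}: {neg_counts[source]} negative words ({rate:.0%} of words)")
```

A real version would also handle retweets, stop words and statistical significance — but even this skeleton shows why the technique scales so well: once the pipeline exists, re-running it on thousands of tweets costs nothing.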
Data Journalism UK 2016, in Birmingham on November 22, will focus on the latest wave of regional data journalism projects: from the data journalists at Trinity Mirror and BBC Scotland, to startups like Northern Ireland’s The Detail, to winners of Google Digital News Initiative funding such as Talk About Local’s News Engine and the Bureau of Investigative Journalism.
I’m particularly pleased to have one of the most experienced data journalists in the country, Claire Miller, speaking too.
The event will mix industry speakers and experts with practical sessions: there’ll be drop-in sessions on getting started with data journalism, an information security ‘surgery’, and some speakers have been asked to focus on practical skills too.
On top of all that, attendees will have the opportunity to nominate skills they want to learn – we’ll put on workshops for the most popular topics!
The event is being jointly sponsored by the University of Stirling and Birmingham City University.
After winning two prestigious data journalism awards since launching in 2015, the Peruvian outlet Convoca has launched its first crowdsourcing campaign to build a global community around its investigations. Nuria Riquelme spoke to founder Aramis Castro about the project.
Convoca has become a reference point for data journalism in South America. With a team of around ten people including system engineers, computer technicians and journalists, led by Milagros Salazar, a professional with over 15 years’ journalistic experience, they have pioneered data journalism in Peru. Continue reading
Peter Yeung has a good point: why is it so difficult to get editors to pay for data journalism?
In a series of tweets we tried to find some answers.
Firstly, commissioning isn’t set up for data journalism. Editors instead try to fit it into established structures for commissioning text-based news and features, with the result that:
a) The pricing doesn’t reflect the work involved; and
b) Any interactivity and visuals become incidental to the process instead of integral.
And yet the value of data journalism has been repeatedly proven, and organisations are spending money on it: just not on commissioning. As Yeung added:
“I find it strange publications invest in data editors and journalists, but not data budgets”
The FT’s Martin Stabe suspected it wasn’t just a data journalism problem:
“This probably extends to lots of digital-only content, not just data journalism.”
A related problem is the lack of standardisation in data journalism: there is no equivalent of the payment-by-wordcount model that print journalists have so long worked by.
Instead, organisations ‘insource’ data journalism work to internal teams, either data teams or ad hoc teams formed from existing personnel (think of the MPs’ expenses or Wikileaks investigations)…
…Or they ‘outsource’ data journalism work to external agencies etc.
This is a problem also highlighted by Alfred Hermida in his research into Canadian data journalism, ‘Finding the Data Unicorn’: only one job title showed up four times “and that was the general reporter/journalist category.”
That’s our take. What about yours? Why isn’t data journalism properly commissioned? And how do freelance data journalists get work?