Monthly Archives: May 2010

The full LibDem-Conservative coalition agreement

The coalition has today published the full document defining its Programme for Government. It covers policy areas not included in the initial document, although many policies from the initial document are not repeated here and will simply be “read through”.

These are the sections which mention the media:

Section 7: Culture, Olympics, Media and Sport

  1. We will maintain the independence of the BBC, and give the National Audit Office full access to the BBC’s accounts to ensure transparency.
  2. We will enable partnerships between local newspapers, radio and television stations to promote a strong and diverse local media industry.
  3. We will cut red tape to encourage the performance of more live music.
  4. We will introduce measures to ensure the rapid roll-out of superfast broadband across the country. We will ensure that BT and other infrastructure providers allow the use of their assets to deliver such broadband, and we will seek to introduce superfast broadband in remote areas at the same time as in more populated areas. If necessary, we will consider using the part of the TV licence fee that is supporting the digital switchover to fund broadband in areas that the market alone will not reach.

And the media overseas (Section 18: International Development):

  1. We will use the aid budget to support the development of local democratic institutions, civil society groups, the media and enterprise; and support efforts to tackle corruption.

Here is the full document as a PDF.

Lib Con Coalition Programme for Government

Wiki journalism: the experiences of WikiCity Guides

I asked Pat Lazure, co-founder of the wiki journalism project WikiCity Guides, to tell me more about his experiences with the project. This is what he said:

Key Factors Driving Citizen Journalism

There has been a lot written about citizen and crowd-sourced journalism, and to that end several entrepreneurs and creative folks have aggressively explored the widening opportunities within this space. I could write a chapter on why this is happening but, boiling it all down, there are two key factors driving these opportunities.

A journalistic tour of the Argentinian Bicentenary

On May 25th we celebrate the Argentinian Bicentenary. And while the big media aren’t showing any really interesting initiatives, we have Tu Bicentenario, an independent and experimental journalistic project that aims to provide real-time coverage of the main events of the celebrations using social tools and user collaboration.

With a highly customizable website that integrates movable modules for Facebook, Twitter, YouTube, Vimeo, Google Maps and mobile streaming, they are trying to make it easy not only for the creators but also for the audience to create and publish content.

The most interesting content to come out of the project so far, in my opinion, is the survey of historic sites in pictures and videos, contrasted with old images to show how the cities have changed. This material is being geolocated on Google Maps.

Some Argentinian Google Maps users have also uploaded 3D models of the most important sights, so you can take a virtual tour of the country.

Dear Peter Preston: universities shun the NCTJ too

I have an enormous amount of respect for Peter Preston, and much of what he says in Sunday’s Observer piece about careers in journalism is spot-on. But this line strikes me as just wrong:

“If you want to be a journalist, try to get on one of the 68 National Council for the Training of Journalists accredited courses, along with 1,800 or so other hopefuls, but in general beware courses the NCTJ shuns (of which there are far too many).”

There are so many assumptions underlying this sentence that it’s a challenge to unpick them, but here are the main two:

  1. That ‘being a journalist’ is limited to newspapers – regional newspapers, to be specific. Most other employers of journalists – national, broadcast, magazines and online – rarely ask for NCTJ qualifications. Even regional newspapers – the heartland of the NCTJ – do not recruit a majority of trainees with an existing NCTJ qualification.
  2. That courses not accredited by the NCTJ have been ‘shunned’. I teach on a journalism degree which chose, a decade ago, not to pay for NCTJ accreditation. The decision was taken by the then-head of journalism, the redoubtable and wonderful Sharon Wheeler, for reasons both financial (the money that would be paid to the NCTJ for a shiny badge would be better spent elsewhere) and educational (the NCTJ strictures make it hard to be flexible in a changing media environment). That decision was restated by our current head of journalism, Sue Heseltine, and I agreed with it: I didn’t see what we would gain for the money we would pay to the NCTJ other than a marketing tool that we do not need (we receive around 10 applicants for every place).

That decision was also informed by the problems universities have had with the NCTJ, which I’ve written about elsewhere (the comments to which are particularly interesting). I’ve also written about the assumption that journalism degrees are comparable to training courses.

I don’t have a problem with NCTJ training in particular – indeed, I wish more journalists had the sort of understanding of local government and law that their courses teach – but I do have a problem when it is seen as the only, or best, route into journalism (an image perpetuated by the NCTJ’s own marketing materials). The same is true of university courses, which vary wildly in quality and scope (the latter is not such a bad thing; a one-size-fits-all approach cannot be good for any creative industry).

The only good advice I can think of for aspiring journalists is to simply go out there and do journalism – because there’s no longer anything stopping you – me or Peter Preston included.

5 data visualisation tips from David McCandless

Here’s another snippet from my data journalism book chapter (now published). As part of my research, David McCandless, author of the very lovely book and website Information is Beautiful, gave his 5 tips for visualising data:

  1. Double-source data wherever possible – even the UN and the World Bank can make mistakes
  2. Take information out – there’s a long tradition among statistical journalists of showing everything. All data points. The whole range. Every column and row. But stories are about clear threads, with extraneous information fuzzed out. And journalism is about telling stories. You can only truly do that when you mask out the irrelevant or minor data. The same applies to design, which is about reducing something to its functional essence.
  3. Avoid standard abstract units – tons of carbon, billions of dollars – these kinds of units are over-used and impossible to imagine or relate to. Try to rework or process units down to ‘everyday’ measures. Try to give meaningful context for huge figures whenever possible.
  4. Self-sufficiency – all graphs, charts and infographics should be self-sufficient. That is, you shouldn’t require any other information to understand them. They’re like interfaces. So each should have a clear title, legend, source, labels etc. And credit yourself. I’ve seen too many great visuals with no credit or name at the bottom.
  5. Show your workings – transparency seems like a new front for journalists. Google Docs makes it incredibly easy to share your data and thought processes with readers, who can then participate.
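The self-sufficiency tip can be sketched in code. Here is a minimal matplotlib example (my own choice of library, with invented readership figures, not anything from McCandless) showing the elements he lists: a clear title, axis labels, a legend, and a source line with a credit:

```python
# A sketch of a "self-sufficient" chart: title, axis labels, legend,
# source and credit. The data is invented purely for illustration.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

years = [2006, 2007, 2008, 2009, 2010]
online = [12, 18, 25, 33, 41]   # hypothetical figures
print_readers = [55, 52, 48, 43, 39]

fig, ax = plt.subplots()
ax.plot(years, online, label="Online readership (%)")
ax.plot(years, print_readers, label="Print readership (%)")

# Everything a reader needs to understand the graphic without the article:
ax.set_title("UK news readership by platform, 2006-2010 (illustrative data)")
ax.set_xlabel("Year")
ax.set_ylabel("Share of adults (%)")
ax.legend()
fig.text(0.01, 0.01, "Source: illustrative figures | Chart: Your Name")

fig.savefig("readership.png")
```

Strip any of those elements away and the saved image stops making sense on its own, which is exactly the failure mode tip 4 warns against.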

Dealing with live data and sentiment analysis: Q&A with The Guardian's Martyn Inglis

As part of the research for my book on online journalism, I interviewed Martyn Inglis about The Guardian’s Blairometer, which measured a live stream of data from Twitter as Tony Blair appeared before the Chilcot inquiry. I’m reproducing it in full here, with permission:

How did you prepare for dealing with live data and sentiment analysis?

I think it was important to be aware of our limitations. We can process a limited amount of data – due to Twitter quotas and so on. This is not a definitive sample. Once we accept that (a) we are not going to rank every tweet and (b) this is therefore going to be a limited exercise it frees us to make concessions that provide an easier technology solution.

Sentiment analysis is hard to do programmatically, but given the short time span of the event we could do it manually. We had an interface view onto incoming tweets which we had pulled from a Twitter search. This allowed us to be really accurate in our assessment. It does not work over a long period of time – the Chilcot inquiry is one thing, but you couldn’t do it for an event lasting a week or so.
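The workflow Inglis describes (pull tweets from a quota-limited search, rate each one by hand through an interface, keep a running score) can be sketched in a few lines of Python. Everything below, from the fetch function to the hand-tagged lookup and the scoring formula, is a stand-in of my own, not The Guardian’s actual code:

```python
# A sketch of the Blairometer workflow: tweets arrive from a
# (simulated) Twitter search, a human tags each one, and a net
# sentiment score is kept. All names and data here are stand-ins.
from collections import Counter

def fetch_tweets():
    """Stand-in for a poll of the quota-limited Twitter search API."""
    return [
        "Blair looks rattled at Chilcot #iraqinquiry",
        "Strong answers from Blair this morning",
        "Still watching the inquiry, no opinion yet",
    ]

def manual_tag(tweet):
    """Stand-in for the human rating interface: a person, not code,
    decides the sentiment of each incoming tweet."""
    lookup = {
        "Blair looks rattled at Chilcot #iraqinquiry": "negative",
        "Strong answers from Blair this morning": "positive",
        "Still watching the inquiry, no opinion yet": "neutral",
    }
    return lookup[tweet]

def blairometer(tweets):
    """Net sentiment in [-1, 1]: (positive - negative) / rated tweets."""
    tags = Counter(manual_tag(t) for t in tweets)
    rated = tags["positive"] + tags["negative"]
    return (tags["positive"] - tags["negative"]) / rated if rated else 0.0

score = blairometer(fetch_tweets())
print(score)  # prints 0.0: one positive and one negative tweet cancel out
```

The point Inglis makes holds here too: the human tagging step is what makes the assessment accurate, and it is also what stops the approach scaling beyond a short, bounded event.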