Monthly Archives: December 2009

FAQ: has the role of the journalist changed? (and other questions)

Here’s another set of questions from a student, which I’m publishing as part of my FAQ series:

Do you think the role of the journalist has changed in modern media and if so why? What further changes do you envisage over the next few years?

Both questions are tackled in detail in the 6th part of my model for the 21st century newsroom. I think we have new types of information, which are changing the role of the journalist, and that post sketches out how that might pan out.

Do you think investigative journalism, the old journalism, is dead?

No. And I don’t think investigative journalism is ‘the old journalism’. If you read any history of journalism you’ll find that what we consider ‘investigative journalism’ has been very much the exception rather than the rule – in fact you could argue it is exceptional by definition. Is it dead? From a global perspective, if you look at the number of investigative journalism organisations being established, it actually looks very healthy. From a UK perspective, it’s mainly moving out of the newsroom and into the freelance world, into the world of activist organisations, and onto the web. It is by no means dead.

Is the modern journalist simply someone who collects information that already exists and puts it all together to form “news”, rather than discovering things for themselves? Is this sort of thing journalism?

This is a curious question, and I’ll try not to take it too literally because the language is unclear. I think you’re talking about the reprocessing of easily available content rather than the ‘unearthing’ of it, and I think the answer is twofold. Firstly, even after all the layoffs we have more journalists than we had at any point until around 10-20 years ago, and that expansion of the media in recent times has seen the employment of a lot of ‘processors’ – which doesn’t mean we have fewer journalists who ‘dig’. Secondly, the availability of information has changed, as I outline in part 6 of the 21st century newsroom model linked above. Your question also betrays a discourse of ‘discovering things for themselves’ which needs to be critically addressed: if information that previously had to be ‘discovered’ is now more publicly available because of the web, is that information less valid?

Let me give a concrete example: does the fact that a journalist can use Google to find a government document rather than go to the library change anything about journalism?

The idea of ‘journalism’ is a complex one that covers everything from live reporting to opinion, analysis, interviewing, document analysis, editing and plenty else besides. And I’m not sure how important the question of ‘what is journalism’ is unless we’re trying to pretend it’s something amazing which, really, it isn’t.

With the advance of community journalism, Twitter and the like, do we still need journalists?

I’m assuming you mean professionally paid ones? Probably, yes. If we look at why the job arose in the first place, it was because of a commercial and political need for information. Even with information overload that need still exists – for a filter of all of that information, for someone who gathers the information that isn’t being gathered, or for someone who compiles it and presents context.

Here comes the iTunes of news? News Corp, Time Inc et al plan ‘store’

The WSJ reports on News Corp. joining “a consortium of magazine companies that are working on creating a digital store and common technology and advertising standards to sell their titles on electronic readers, mobile devices and other digital devices.

“The new venture is likely to be announced next week, according to people familiar with its plans, though it will be longer before the project is up and running. It will be owned jointly by the five participating companies, which in addition to News Corp. are Time Warner Inc.’s Time Inc., Conde Nast Publications Inc., Hearst Corp. and Meredith Corp.”

I’ve written before on why such an ‘iTunes model’ isn’t as easy a route as it may appear, but it is a step up from the basic paywall model. If they can make it convenient enough or include features worth paying for – rather than focusing blindly on the value of their ‘content’ – then there may be something in it.

UPDATE: PaidContent has more detail on the project.

National newspaper Twitter account growth gets ever slower …

UK national newspaper Twitter accounts are continuing to grow – but the rate is getting slower and slower, according to the latest figures for the 129 accounts I’m tracking:

The detail

These accounts had 1,801,044 followers on November 2nd (ignoring one FT account that has been shut). On December 2nd they had 1,919,770 followers in total.

Of the 118,726 increase, 76,812, or 65%, was for the @guardiantech account (which benefits from being on Twitter’s suggested user list).
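For transparency, here’s a minimal sketch of the arithmetic behind those figures (the numbers are the ones reported above; the variable names are my own):

```python
# Follower totals across the 129 tracked accounts (figures from the post above)
followers_nov = 1_801_044   # total on November 2nd
followers_dec = 1_919_770   # total on December 2nd
guardiantech_gain = 76_812  # @guardiantech's share of the growth

increase = followers_dec - followers_nov  # 118,726 new followers overall
share = guardiantech_gain / increase      # proportion attributable to @guardiantech

print(f"Total increase: {increase:,}")      # Total increase: 118,726
print(f"@guardiantech share: {share:.0%}")  # @guardiantech share: 65%
```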

As ever, you can see the figures for each account here. (And yes, sorry about no Scottish ones. I’ll redo the list soon, honest).

-LIVE BLOG- Matt Brittin (Google UK) and OJB’s Paul Bradshaw on The Future of Local and Regional Media

On Thursday 3rd December at 10:30am, the House of Commons Culture, Media and Sport Committee will hear from Matt Brittin (Managing Director, Google UK) and Paul Bradshaw (Lecturer in Journalism, Birmingham City University / Online Journalism Blog).

This is the sixth oral evidence session on The future for local and regional media.

For a live blog of this session, click here

Why offering free wifi could be one way for publishers to save journalism

The recent announcement that Swindon will be the first UK town to offer free wifi to all its citizens has piqued my curiosity on a number of levels. MA Online Journalism student Andrew Brightwell first got me thinking when he pointed out that the ability of the local council (which owns a 35% stake) to sell advertising represented a new threat to the local paper.

But think beyond the immediate threat and you have an enormous opportunity here, because offering universal wifi could allow publishers to recapture some of the qualities that made their print products so successful. Continue reading

Ditching the template: the rise of the ‘blogazine’

Take a look at this:

[Image: blogazine screengrab]

These are blog posts, tackled with the attitude of a magazine designer. There’s a whole lot more in this post at Smashing Magazine, which looks at the rise of the ‘blogazine’, and interviews four of its leading exponents. Stunning stuff – well worth a read. Now, is there a plugin that makes this as easy to do as magazine layout?

Does news aggregation benefit consumers? Does it harm journalists? (another response to govt)

Here’s a second question I’m expecting on Thursday when I give evidence to the Department for Culture, Media and Sport committee‘s sixth evidence session on The future for local and regional media. As Google are on before me (some act to follow) and aggregators are being waved around as the Big Baddie of traditional journalism, the question’s going to be asked: are aggregators really bad? And for whom?

What is aggregation?

The first point I’ll need to pick apart is what is meant by aggregation. The biggest news aggregators are local broadcasters and national newspapers, who habitually lift stories from local newspapers to fill their newshole. ‘But we add value!’ they might cry. Yes, and so do Google News and most of the other aggregators out there: by displaying stories alongside each other for context, and by using algorithms to identify which ones came first or are most linked to.

Of course the biggest value that aggregators add is by driving traffic back to the original material. Given that a) around a third of traffic to a typical news site comes from search engines and aggregators and b) most news sites have visitor numbers far in excess of their print readerships, it’s fair to say that aggregators are not “parasites” eating into news website traffic. A more accurate description would be symbiotes, using content for mutual benefit.

Many of the objections to aggregators appear to me to come down to control and monopoly: Google is making a lot of money out of advertising, and newspapers are not. That’s simply the result of a competitor doing something better than you – in this case, selling advertising.

They do not sell that advertising against newspaper content; they sell it against their index and their functionality. A good analogy is map-makers: the owners of Tussaud’s Waxwork Museum don’t sue mapmakers for making money from featuring their attraction in their maps, because visitors still have to go into the museum to enjoy its content. (And yes, any website publisher can instantly take itself off Google with a simple robots.txt file anyway.)
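To illustrate that last point: a publisher that wanted out of Google’s index entirely could serve a two-line robots.txt file from its site root – a minimal sketch using the standard robots exclusion protocol:

```
User-agent: Googlebot
Disallow: /
```

That blocks Google’s crawler from the whole site, and its pages then drop out of Google’s index. Very few publishers choose to do so.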

A second objection, it feels to me, comes from the fact that Google actually makes it much easier for you to bypass the parasites of the news industry who pass off other people’s hard work as their own, and go straight to the source. So if a local paper has the story, you can read that instead of the national newspaper’s rewrite; if a scientific organisation did the research, you can read that instead of the sensationalist write-up; and the politician’s statement can be read in full, not out of context.

Is it good for consumers? Bad for journalists?

For the reasons given above, I don’t feel that aggregators are bad for consumers, although it would be over-simplistic to suggest they are therefore purely good. Aggregators require a different media literacy and are subject to the oddities of a selection process based on mathematical formulae. In short, they are not worse, or better, just different. But as a potential avenue to more information around a story, as well as a way of highlighting just how much news organisations reproduce the same material, I welcome them.

As for local journalists, aggregators don’t make things worse. In terms of their work processes, journalists benefit from aggregation by being able to find information more quickly and efficiently. The downside, of course, is that so can their readers, and so rewriting copy from elsewhere becomes less appropriate. Journalists have to add value, which seems to me a better and more rewarding use of their skills. On a basic level that might be through becoming expert aggregators themselves – better than the algorithms – or it may be by adding extra information, context or analysis that they can see is missing elsewhere. Either way, I can’t see how that is a bad thing.

But these are just my thoughts on the question of aggregators and their influence on the news industry. I’d welcome other perspectives as I prepare my responses.

What quality guarantees do blogs have? (response to government)

On Thursday I’ll be giving evidence to the Department for Culture, Media and Sport committee‘s sixth evidence session on The future for local and regional media. Based on the series of responses to their consultation earlier this year, I expect to be asked questions around particular themes. One of these revolves around the quality of blogs and how you guarantee that.

The quality issue is an interesting one that I expect to rear its head increasingly as hyperlocal startups are taken more seriously, lobby for equal treatment, and compete with established players for funding and advertising. We’ve already seen it, in fact, in some of the statements by ITN and PA around the bidding for local news consortia, with their talk of experience and reliability. The implication, of course, is that you can’t expect that from these ‘Johnny-come-latelies’.

When you look at it, the mainstream media can actually lay claim to guarantees of quality (regardless of whether that quality exists) through a number of avenues: firstly, from being answerable to the market and to regulators; secondly, through professional codes of conduct, training and internal procedures; and finally, through membership of professional organisations like the NUJ.

Bloggers, by contrast, can’t call on any of those same guarantees to ‘quality’. Many come from journalistic backgrounds and so have the same standards, but they don’t generally adhere to a formal code. Any time a ‘Bloggers’ Code of Conduct’ has been mooted it’s been greeted with derision because of the sheer diversity of practitioners. Still, I do think having individual codes that express your values and how people can obtain redress could count for a lot here.

What guarantees the quality of blogs?

Bloggers’ guarantees of quality, it appears to me, are enshrined in two key generic practices: the right of reply (comments) and transparency (linking). And a key overarching guarantee: accountability.

I’m not sure how to conceptualise this accountability, but it’s something native to the web that needs exploration. You might call it ‘Google Juice’ or PageRank or simply reputation – what I’m trying to express is that the medium itself makes it harder to get away with Bad Journalism than it was in less conversational media.

There’s also another guarantee of quality: the lack of pressure from production deadlines, sales, proprietors and the need to fill space. I’m not sure how long these advantages will last, and in many cases they don’t apply (e.g. blogs that churn content for hits), but broadly they still deserve mention. Bloggers can pursue a story on its own merits, and indeed, when the collaboration of users is a major factor, they are reliant on serving those users’ interests rather than those of advertisers or owners. I guess that’s another aspect of accountability.

Production versus Post-Publication

Looking at those claims, you’ll notice that there’s a clear divide between Old and New Media. Almost all of Old Media’s guarantees of quality relate to the production phase of journalism: once a piece is published, there is very little ‘guarantee’ of quality at all. If it’s wrong, it’s wrong, and there’s little chance of that being changed.

New Media’s guarantees are more about post-publication – bloggers can’t guarantee that a post will be balanced, but they can guarantee that it will be fixed quickly if something is not quite correct, or missing, or has happened since.

Once again it’s the divide between the filter-then-publish and the publish-then-filter models.

And this brings us to the fact that the whole question rests on what you assume is ‘quality’. I can guess that MPs will assume that ‘quality’ means, for example, ‘objectivity’ and ‘balance’. I’m not saying that those are not good qualities to have, but we should be careful of assuming they are the only qualities, or that they carry the same importance in a world of universal publishing as they did in a world where you could count the number of publishers on two hands.

In short, the importance of traditional values of news quality is changing and that needs to be recognised.

Equally, then, there are the qualities of being ‘accurate’, ‘up to date’, ‘comprehensive’ and ‘correctable’. The quality of being ‘up to date’, for example, had little meaning beyond the production deadline in a pre-web world; it matters far more now that content is always accessible. ‘Accuracy’ was a quality subject to the limitations of time, sources and newsroom knowledge, but now it’s possible for experts and eyewitnesses to contribute. I could go on.

But for now let me throw this question open and, in the spirit of its subject, invite you to improve the quality of this blog post by answering the question: what guarantees can blogs draw on for their quality? What exactly is quality in a networked age? And how do we articulate that to those from a different era?