Tag Archives: statistics

Why the “Cost to the economy” of strike action could be misleading

It’s become a modern catchphrase. When planes are grounded, when cars crash, when computers are hacked, and when the earth shakes. There is, it seems, always a “cost to the economy”.

Today, with a mass strike over pensions in the UK, the cliche is brought forth again:

“The Treasury could save £30m from the pay forfeited by the striking teachers today but business leaders warned that this was hugely outbalanced by the wider cost to the economy of hundreds of thousands of parents having to take the day off.

“The British Chambers of Commerce said disruption will lead to many parents having to take the day off work to look after their children, losing them pay and hitting productivity.”

Statements like these (by David Frost, the director general, it turns out) pass unquestioned (also here, here and elsewhere), but in this case (and I wonder how many others), I think a little statistical literacy is needed.

Beyond the churnalism of ‘he said-she said’ reporting, when costs and figures are mentioned journalists should be asking to see the evidence.

Here’s the thing. In reality, most parents will have taken annual leave today to look after their children. That’s annual leave that they would have taken anyway, so is it really costing the economy any more to take that leave on this day in particular? And specifically, enough to “hugely outbalance” £30m?

Stretching credulity further is the reference to parents losing pay. All UK workers have a statutory right to 5.6 weeks of annual leave paid at their normal rate of pay. If they’ve used all that up halfway into the year (or 3 months into the financial year) – before the start of the school holidays no less – and have to take unpaid leave, then they’re stupid enough to be a cost to the economy without any extra help.

And this isn’t just a fuss about statistics: it’s a central element of one of the narratives around the strikes: that the Government are “deliberately trying to provoke the unions into industrial action so they could blame them for the failure of the Government’s economic strategy.”

If they do, it’ll be a good story. Will journalists let the facts get in the way of it?

UPDATE: An inverse – and equally dubious – claim could be made about the ‘boost’ to the economy from strike action: additional travel and food spending by those attending rallies, and childcare spending by parents who cannot take time off work. It’s like the royal wedding all over again… (thanks to Dan Thornton in the comments for starting this chain of thought)

One ambassador’s embarrassment is a tragedy, 15,000 civilian deaths is a statistic

Few things illustrate the challenges facing journalism in the age of ‘Big Data’ better than Cable Gate – and specifically, how you engage people with stories that involve large sets of data.

The Cable Gate leaks have been of a different order to the Afghanistan and Iraq war logs. Not in number (there were 90,000 documents in the Afghanistan war logs and over 390,000 in the Iraq logs; the Cable Gate documents number around 250,000) – but in subject matter.

Why is it that the 15,000 extra civilian deaths estimated to have been revealed by the Iraq war logs did not move the US authorities to shut down Wikileaks’ hosting and PayPal accounts? Why did it not dominate the news agenda in quite the same way?

Tragedy or statistic?

I once heard a journalist trying to put the number ‘£13 billion’ into context by saying: “imagine 13 million people paying £1,000 more per year” – as if imagining 13 million people was somehow easier than imagining £13bn. Comparing numbers to the size of Wales or the prime minister’s salary is hardly any better.

Generally misattributed to Stalin, the quote “The death of one man is a tragedy, the death of millions is a statistic” illustrates the problem particularly well: when you move beyond scales we can deal with on a human level, you struggle to engage people in the issue you are covering.

Research suggests this is a problem that not only affects journalism, but justice as well. In October Ben Goldacre wrote about a study that suggested “People who harm larger numbers of people get significantly lower punitive damages than people who harm a smaller number. Courts punish people less harshly when they harm more people.”

“Out of a maximum sentence of 10 years, people who read the three-victim story recommended an average prison term one year longer than the 30-victim readers. Another study, in which a food processing company knowingly poisoned customers to avoid bankruptcy, gave similar results.”

In the US “scoreboard reporting” on gun crime – “represented by numbing headlines like ‘82 shot, 14 fatally’” – has been criticised for similar reasons:

“‘As long as we have reporting that gives the impression to everyone that poor, black folks in these communities don’t value life, it just adds to their sense of isolation,’ says Stephen Franklin, the community media project director at the McCormick Foundation-funded Community Media Workshop, where he led the ‘We Are Not Alone’ campaign to promote stories about solution-based anti-violence efforts.

“‘What do we want people to know? Are we just trying to tell them to avoid the neighborhoods with many homicides?’ asks Natalie Moore, the South Side Bureau reporter for Chicago Public Radio. ‘I’m personally struggling with it. I don’t know what the purpose is.’”

Salience

This is where journalists play a particularly important role. Kevin Marsh, writing about Wikileaks on Sunday, argues that

“Whistleblowing that lacks salience does nothing to serve the public interest – if we mean capturing the public’s attention to nurture its discourse in a way that has the potential to change something material.”

He is right. But Charlie Beckett, in the comments to that post, points out that Wikileaks is not operating in isolation:

“Wikileaks is now part of a networked journalism where they are in effect, a kind of news-wire for traditional newsrooms like the New York Times, Guardian and El Pais. I think that delivers a high degree of what you call salience.”

This is because last year Wikileaks realised that they would have much more impact working in partnership with news organisations than releasing leaked documents to the world en masse. It was a massive move for Wikileaks, because it meant re-assessing a core principle of openness to all, and taking on a more editorial role. But it was an intelligent move – and undoubtedly effective. The Guardian, Der Spiegel, New York Times and now El Pais and Le Monde have all added salience to the leaks. But could they have done more?

Visualisation through personalisation and humanisation

In my series of posts on data journalism I identified visualisation as one of four interrelated stages in its production. I think that this concept needs to be broadened to include visualisation through case studies: or humanisation, to put it more succinctly.

There are dangers here, of course. Firstly, that humanising a story makes it appear to be an exception (one person’s tragedy) rather than the rule (thousands suffering) – or simply emotive rather than also informative; and secondly, that your selection of case studies does not reflect the more complex reality.

Ben Goldacre – again – explores this issue particularly well:

“Avastin extends survival from 19.9 months to 21.3 months, which is about 6 weeks. Some people might benefit more, some less. For some, Avastin might even shorten their life, and they would have been better off without it (and without its additional side effects, on top of their other chemotherapy). But overall, on average, when added to all the other treatments, Avastin extends survival from 19.9 months to 21.3 months.

“The Daily Mail, the Express, Sky News, the Press Association and the Guardian all described these figures, and then illustrated their stories about Avastin with an anecdote: the case of Barbara Moss. She was diagnosed with bowel cancer in 2006, had all the normal treatment, but also paid out of her own pocket to have Avastin on top of that. She is alive today, four years later.

“Barbara Moss is very lucky indeed, but her anecdote is in no sense whatsoever representative of what happens when you take Avastin, nor is it informative. She is useful journalistically, in the sense that people help to tell stories, but her anecdotal experience is actively misleading, because it doesn’t tell the story of what happens to people on Avastin: instead, it tells a completely different story, and arguably a more memorable one – now embedded in the minds of millions of people – that Roche’s £21,000 product Avastin makes you survive for half a decade.”

Broadcast journalism – with its regulatory requirement for impartiality, often interpreted in practical terms as ‘balance’ – is particularly vulnerable to this. Here’s one example of how the homeopathy debate is given over to one person’s experience for the sake of balance:

Journalism on an industrial scale

The Wikileaks stories are journalism on an industrial scale. The closest equivalent I can think of was the MPs’ expenses story which dominated the news agenda for 6 weeks. Cable Gate is already on Day 9 and the wealth of stories has even justified a live blog.

With this scale comes a further problem: cynicism and passivity; Cable Gate fatigue. In this context online journalism has a unique role to play which was barely possible previously: empowerment.

3 years ago I wrote about 5 Ws and a H that should come after every news story. The ‘How’ and ‘Why’ of that are possibilities that many news organisations have still barely explored. ‘Why should I care?’ is about a further dimension of visualisation: personalisation – relating information directly to me. The Guardian moves closer to this with its searchable database, but I wonder at what point processing power, tools, and user data will allow us to do this sort of thing more effectively.

‘How can I make a difference?’ is about pointing users to tools – or creating them ourselves – where they can move the story on by communicating with others, campaigning, voting, and so on. This is a role many journalists may be uncomfortable with because it raises advocacy issues, but then choosing to report on these stories, and how to report them, raises the same issues; linking to a range of online tools need not be any different. These are issues we should be exploring, ethically.

All the above in one sentence

Somehow I’ve ended up writing over a thousand words on this issue, so it’s worth summing it all up in a sentence.

Industrial scale journalism using ‘big data’ in a networked age raises new problems and new opportunities: we need to humanise and personalise big datasets in a way that does not detract from the complexity or scale of the issues being addressed; and we need to think about what happens after someone reads a story online and whether online publishers have a role in that.

CCTV spending by councils/how many police officers would that pay? – statistics in context

News organisations across the country will today be running stories based on a report by Big Brother Watch into the amount spent on CCTV surveillance by local authorities (PDF). The treatment of this report is a lesson in how journalists approach figures, and why context is more important than raw figures.

BBC Radio WM, for example, led this morning on the fact that Birmingham topped the table of spending on CCTV. But Birmingham is the biggest local authority in the UK by some distance, so this fact alone is not particularly newsworthy – unless, of course, you omit that context or fail to allow anyone from the council to point it out (ahem).

Much more interesting was the fact that the second biggest spender was Sandwell – also in the Radio WM region. Sandwell spent half as much as Birmingham – but its population is less than a third the size of its neighbour. Put another way, Sandwell spent 80% more per head of population than Birmingham on CCTV (£18 compared to Birmingham’s £10 per head).

Being on a deadline wasn’t an issue here: that information took me only a few minutes to find and work out.
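
For what it’s worth, here’s a minimal sketch (in Python) of that back-of-envelope check. The spend and population figures below are rounded, illustrative assumptions rather than the exact numbers in the report or the ONS estimates:

```python
# Rough per-head CCTV spend check. The figures are rounded/illustrative
# assumptions, not the exact numbers from the Big Brother Watch report.
cctv_spend = {
    "Birmingham": 10_500_000,  # approx. total spend since 2007 (assumed)
    "Sandwell": 5_500_000,     # approx. total spend since 2007 (assumed)
}
population = {
    "Birmingham": 1_030_000,   # approx. mid-year population estimate (assumed)
    "Sandwell": 309_000,       # approx. mid-year population estimate (assumed)
}

for council, spend in cctv_spend.items():
    per_head = spend / population[council]
    print(f"{council}: £{per_head:.2f} per head")
```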

The Press Association’s release on the story focused on the Birmingham angle too – taking the Big Brother Watch statements and fleshing them out with old quotes from those involved in the last big Birmingham surveillance story – the Project Champion scheme – before ending with a top ten list of CCTV spenders.

The Daily Mail, which followed a similar line, at least managed to mention that some smaller authorities (Woking and Breckland) had spent rather a lot of money considering their small populations.

There’s a spreadsheet of populations by local authority here.

How many police officers would that pay for?

A few outlets also repeated the assertions on how many nurses or police officers the money spent on surveillance would have paid for.

The Daily Mail quoted the report as saying that “The price of providing street CCTV since 2007 would have paid for more than 13,500 police constables on starting salaries of just over £23,000”. The Birmingham Mail, among others, noted that it would have paid the salaries of more than 15,000 nurses.

And here we hit a second problem.

The £314m spent on CCTV since 2007 would indeed pay for 13,500 police officers on £23,000 – but only for one year. On an ongoing basis, it would have paid the wages of 4,500 police officers (it should also be pointed out that the £314m figure only covered 336 local authorities – the CCTV spend of those who failed to respond would increase this number).
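
If you want to sanity-check that arithmetic yourself, it only takes a few lines. A quick sketch, assuming the £314m covers roughly three financial years:

```python
# Dividing the CCTV spend by a constable's salary: once for a single year,
# and once spread over the period the spend actually covers.
total_cctv_spend = 314_000_000   # reported spend since 2007
constable_salary = 23_000        # starting salary cited in the coverage
years_covered = 3                # roughly 2007/08 to 2009/10 (an assumption)

one_year_only = total_cctv_spend / constable_salary
ongoing = total_cctv_spend / (constable_salary * years_covered)

print(f"Officers paid for one year only: {one_year_only:,.0f}")   # ~13,650
print(f"Officers paid across the whole period: {ongoing:,.0f}")   # ~4,550
```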

Secondly, wages are not the only cost of employment, just as installation is not the only cost of CCTV. The FOI request submitted by Big Brother Watch is a good example of this: not only do they ask for installation costs, but operation and maintenance costs, and staffing costs – including pension liabilities and benefits.

There’s a great ‘Employee True Cost Calculator’ on the IT Centa website which illustrates this neatly: you have to factor in national insurance, pension contributions, overheads and other costs to get a truer picture.
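
As a rough illustration of the principle – not the IT Centa calculator’s own formula, and with made-up rates for national insurance, pension contributions and overheads – the calculation looks something like this:

```python
# Generic "true cost of an employee" sketch. All rates and thresholds
# below are illustrative assumptions, not real HMRC or calculator figures.
def true_annual_cost(salary,
                     employer_ni_rate=0.138,  # assumed employer NI rate
                     ni_threshold=7_000,      # assumed NI threshold
                     pension_rate=0.05,       # assumed employer pension contribution
                     overheads=4_000):        # assumed desk, kit, training etc.
    employer_ni = max(salary - ni_threshold, 0) * employer_ni_rate
    pension = salary * pension_rate
    return salary + employer_ni + pension + overheads

salary = 23_000
print(f"Salary £{salary:,} -> true annual cost roughly £{true_annual_cost(salary):,.0f}")
```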

Don’t blame Big Brother Watch

Big Brother Watch’s report is a much more illuminating, and statistically aware, read than the media coverage. Indeed, there’s a lot more information about Sandwell Council’s history in this area which would have made for a better lead story on Radio WM, juiced up the Birmingham Mail report, or just made for a decent story in the Express and Star (which instead simply ran the PA release – UPDATE: they led the print edition with a more in-depth story, which was then published online later; see comments).

There’s also more about spending per head, comparisons between councils of different sizes, comparisons with spending on other things*, and details of spending on maintenance, staffing (where Sandwell comes top) and new cameras – but it seems most reporters didn’t look beyond the first page, and the first name on the leaderboard.

It’s frustrating to see news organisations pass over important stories such as that in Sandwell for the sake of filling column inches and broadcast time with the easiest possible story to write. The result is a homogenous and superficial product: a perfect example of commodified news.

I bet the people at Big Brother Watch are banging their heads on their desks to see their digging reported with so little depth. And I think they could learn something from Wikileaks on why that might be: they gave it to all the media at the same time.

Wikileaks learned a year ago that this free-to-all approach reduced the value of the story, and consequently the depth with which it was reported. But by partnering with one news organisation in each country Wikileaks not only had stories treated more seriously, but other news organisations chasing new angles jealously.

*While we’re at it, the report also points out that the UK spends more on CCTV per head than 38 countries do on defence, and 5 times more in total than Uganda spends on health. “UK spends more on CCTV than Bangladesh does on defence” has a nice ring to me. That said, those defence spending figures turn out to be from 2004 and earlier, and so are not exactly ideal (Wolfram Alpha is a good place to get quick stats like this – and suggests a much higher per capita spend)

Statistics and data journalism: seasonal adjustment for journalists

seasonal adjustment image from Junk Charts

When you start to base journalism around data it’s easy to overlook basic weaknesses in that data – from the type of average that is being used, to distribution, sample size and statistical significance. Last week I wrote about inflation and average wages. A similar factor to consider when looking at any figures is seasonal adjustment.

Kaiser Fung recently wrote a wonderful post on the subject:

“What you see [in the image above] is that almost every line is an inverted U. This means that no matter what year, and what region, housing starts peak during the summer and ebb during the winter.

“So if you compare the June starts with the October starts, it is a given that the October number will be lower than June. So reporting a drop from June to October is meaningless. What is meaningful is whether this year’s drop is unusually large or unusually small; to assess that, we have to know the average historical drop between October and June.

“Statisticians are looking for explanations for why housing starts vary from month to month. Some of the change is due to the persistent seasonal pattern. Some of the change is due to economic factors or other factors. The reason for seasonal adjustments is to get rid of the persistent seasonal pattern, or put differently, to focus attention on other factors deemed more interesting.

“The bottom row of charts above contains the seasonally adjusted data (I have used the monthly rather than annual rates to make it directly comparable to the unadjusted numbers.)  Notice that the inverted U shape has pretty much disappeared everywhere.”

The first point is not to think you’ve got a story because house sales are falling this winter – they might fall every winter. In fact, for all you know they may be falling less dramatically than in previous years.

The second point is to be aware of whether the figures you are looking at have been seasonally adjusted or not.

The final – and hardest – point is to know how to seasonally adjust data if you need to.

For that last point you’ll need to go elsewhere on the web. This page on analysing time series takes you through the steps in Excel nicely. And Catherine Hood’s tipsheet on doing seasonal adjustment on a short time series in Excel (PDF) covers a number of different types of seasonal variation. For more on how and where seasonal adjustment is used in UK government figures check out the results of this search (adapt it for your own country’s government domain).
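
If Excel isn’t your thing, the same crude idea – strip out each calendar month’s average deviation from the overall mean – can be sketched in a few lines of Python. The housing-starts figures below are invented purely to show the mechanics; proper seasonal adjustment (X-12-ARIMA and the like) is considerably more sophisticated:

```python
# Crude seasonal adjustment: subtract each month's average deviation from
# the overall mean. The housing-starts numbers are made up for illustration.
from collections import defaultdict
from statistics import mean

raw = {(2008, m): v for m, v in zip(range(1, 13),
       [60, 65, 80, 95, 110, 120, 118, 112, 100, 85, 70, 62])}
raw.update({(2009, m): v for m, v in zip(range(1, 13),
            [55, 58, 72, 88, 100, 108, 105, 100, 90, 76, 63, 56])})

overall = mean(raw.values())

# Average seasonal effect for each calendar month
by_month = defaultdict(list)
for (year, month), value in raw.items():
    by_month[month].append(value)
seasonal_effect = {m: mean(vals) - overall for m, vals in by_month.items()}

# Adjusted series: the persistent seasonal pattern removed
adjusted = {(y, m): v - seasonal_effect[m] for (y, m), v in raw.items()}

print(f"June 2009: raw {raw[(2009, 6)]}, adjusted {adjusted[(2009, 6)]:.1f}")
print(f"Oct 2009:  raw {raw[(2009, 10)]}, adjusted {adjusted[(2009, 10)]:.1f}")
```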

What inflation has to do with the price of fish

Inflation image by Gregor Rohrig – click to see source

One of the forms of data that journalists frequently have to deal with is prices. And while it’s one thing to say that things are getting more expensive, making a meaningful comparison between what things cost now and what things cost then is a different kettle of fish altogether.

Factoring in inflation can make all the difference between arbitrary comparisons that provide no insight whatsoever, and genuinely meaningful reporting.

Thanks to computing power it’s actually quite easy for journalists to factor inflation into their reporting – by using an inflation calculator. It’s also easier to find historical price data with data-driven search engines like Wolfram Alpha.

But inflation is only half of the calculation you need. The other is earnings.

Professor Ian Stewart illustrates this perfectly in this article in The Telegraph:

“[A] 1991 pint cost around £1.40, which is £1.80 in today’s money. The current price is around £2.80, so beer really is more expensive. On the other hand, the average salary in 1991 was £19,000, and today it is £38,000. Relative to what we earn, a pint costs exactly the same as it did 19 years ago.

“Our house? That would be £125,000 today, so it has gone up by 84 per cent. Relative to average earnings, however, the increase is only 10 per cent.

“The Guardian knows about inflation, and said that the pub pint has increased by 68 per cent in real terms. But this compares the real increase in new money with the original price in old money. If I did the calculation like that for my house it would have gone up by 850 per cent. Calculated sensibly, the rise in the price of beer is about 55 per cent relative to inflation, and zero per cent relative to earnings.”
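
To make the arithmetic behind that explicit, here’s a short sketch using the figures quoted above (the inflation multiplier is simply the one implied by the £1.40-to-£1.80 conversion, not an official index):

```python
# Reworking the pint example: price change measured against inflation
# and against average earnings, using the figures from the quote above.
price_1991, price_now = 1.40, 2.80
wage_1991, wage_now = 19_000, 38_000
inflation_multiplier = 1.80 / 1.40   # implied by the article's £1.40 -> £1.80

rise_vs_inflation = price_now / (price_1991 * inflation_multiplier) - 1
rise_vs_earnings = (price_now / price_1991) / (wage_now / wage_1991) - 1

print(f"Rise relative to inflation: {rise_vs_inflation:.0%}")  # roughly 55-56%
print(f"Rise relative to earnings:  {rise_vs_earnings:.0%}")   # 0%
```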

Of course the danger in averages is that they only illustrate aggregate change, and if you’re talking about a purchase that a particular section of the population makes – or you’re only talking to a particular region – then a national average may not be as meaningful a comparison to make.

If the poor are getting poorer and the rich richer then a pint of beer really is more expensive for some – and cheaper for others – than it used to be. Likewise, particular parts of the country might be suffering more from house price increases than others because of local average wages and local house prices.

It’s also worth pointing out that, when talking about financial data, a median is a much more useful measure to take than a mean.
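
A toy example with made-up salaries shows why: one very high earner drags the mean well above what a typical person earns, while the median barely moves.

```python
# Made-up salaries: a single very high earner inflates the mean,
# but the median stays close to the "typical" figure.
from statistics import mean, median

salaries = [18_000, 21_000, 23_000, 25_000, 27_000, 30_000, 250_000]
print(f"Mean:   £{mean(salaries):,.0f}")    # about £56,300
print(f"Median: £{median(salaries):,.0f}")  # £25,000
```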

Finally, aside from the statistical considerations it’s worth coming back to some of the basics of pricing. Ian again:

“There are two things to remember about prices. One is basic economics: if something gets too expensive for people to buy it, they don’t. So prices and wages have to stay in step, broadly speaking – though with big fluctuations in some commodities, such as housing. The other is inflation. We all know it exists, but we forget that when we start comparing prices. ‘My God! A Ford Anglia cost only £295 in 1940!’ True, but the average salary then was £370. The equivalent price today is £30,000, which will buy you a Jaguar XF.”

Statistical analysis as journalism – Benford’s law

 

drug-related murder map

I’m always on the lookout for practical applications of statistical analysis for doing journalism, so this piece of work by Diego Valle-Jones, on drug-related murders, made me very happy.

I’ve heard of the first-digit law (also known as Benford’s law) before – it’s a way of spotting dodgy data.

What Diego Valle-Jones has done is use the method to highlight discrepancies in information on drug-related murders in Mexico. Or, as Pete Warden explains:

“With the help of just Benford’s law and data sets to compare he’s able to demonstrate how the police are systematically hiding over a thousand murders a year in a single state, and that’s just in one small part of the article.”

Diego takes up the story:

“The police records and the vital statistics records are collected using different methodologies: vital statistics from the INEGI [the statistical agency of the Mexican government] are collected from death certificates and the police records from the SNSP are the number of police reports (“averiguaciones previas”) for the crime of murder—not the number of victims. For example, if there happened to occur a particular heinous crime in which 15 teens were massacred, but only one police report were filed, all the murders would be recorded in the database as one. But even taking this into account, the difference is too high.

“You could also argue that the data are provisional—at least for 2008—but missing over a thousand murders in Chihuahua makes the data useless at the state level. I could understand it if it was an undercount by 10%–15%, or if they had added a disclaimer saying the data for Chihuahua was from July, but none of that happened and it just looks like a clumsy way to lie. It’s a pity several media outlets and the UN homicide statistics used this data to report the homicide rate in Mexico is lower than it really is.”

But what brings the data alive is Diego’s knowledge of the issue. In one passage he checks against large massacres since 1994 to see if they were recorded in the database. One of them – the Acteal Massacre (“45 dead, December 22, 1997”) – is not there. This, he says, was “committed by paramilitary units with government backing against 45 Tzotzil Indians … According to the INEGI there were only 2 deaths during December 1997 in the municipality of Chenalho, where the massacre occurred. What a silly way to avoid recording homicides! Now it is just a question of which data is less corrupt.”

The post as a whole is well worth reading in full, both as a fascinating piece of journalism, and a fascinating use of a range of statistical methods. As Pete says, it is a wonder this guy doesn’t get more publicity for his work.
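
If you want to try a first-digit check on a dataset of your own, it only takes a few lines of Python. The counts below are invented – the point is just to show how observed leading-digit frequencies are compared with Benford’s expected share, log10(1 + 1/d):

```python
# First-digit (Benford's law) check: compare the leading-digit frequencies
# of a set of counts against Benford's expected distribution.
import math
from collections import Counter

counts = [12, 132, 9, 27, 310, 45, 18, 160, 23, 71, 110, 14, 19, 250, 36]  # invented

leading = [int(str(abs(n))[0]) for n in counts if n != 0]
observed = Counter(leading)
total = len(leading)

for d in range(1, 10):
    expected = math.log10(1 + 1 / d)         # Benford's expected share
    actual = observed.get(d, 0) / total
    print(f"digit {d}: expected {expected:.1%}, observed {actual:.1%}")
```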

Internet use in the UK – implications from Ofcom's research for publishers

Apart from photo sharing and social networking, most internet users have little interest in UGC

UPDATE: The Office for National Statistics has also released some data on internet access which paints a more positive picture. Their data puts the proportion who haven’t been online at 18%, and finds that 45% had accessed the web on the move.

I’ve just been scanning through the internet section of Ofcom’s latest report on The Communications Market 2010. As always, it’s an essential read, and this year the body has done a beautiful job of publishing it online with unique URLs for each passage of the document, and downloadable CSV and PDF files for each piece of data.

Here are what I think are the key points for those specifically interested in online journalism and publishing:

Review: Heather Brooke – The Silent State

The Silent State

In the week that a general election is called, Heather Brooke’s latest book couldn’t have been better timed. The Silent State is a staggeringly ambitious piece of work that pierces through the fog of the UK’s bureaucracies of power to show how they work, what is being hidden, and the inconsistencies underlying the way public money is spent.

Like her previous book, Your Right To Know, Brooke structures the book into chapters looking at different parts of the power system in the UK – making it a particularly usable reference work when you want to get your head around a particular aspect of our political systems.

Chapter by chapter

Chapter 1 lists the various databases that have been created to maintain information on citizens – with a particular focus on the little-publicised rack of databases holding subjective data on children. The story of how an old unpopular policy was rebranded to ride into existence on the back of the Victoria Climbie bandwagon is particularly illustrative of government’s hunger for data for data’s sake.

Picking up that thread further, Chapter 2 explores how much public money is spent on PR and how public servants are increasingly prevented from speaking directly to the media. It’s this trend which made The Times’ outing of police blogger Nightjack particularly loathsome and why we need to ensure we fight hard to protect those who provide an insight into their work on the ground.

Chapter 3 looks at how the misuse of statistics led to the independence of the head of the Office for National Statistics – but not the staff that he manages – and how the statistics given to the media can differ quite significantly from those provided when requested by a Select Committee (the lesson being that these can be useful sources to check). It’s a key chapter for anyone interested in the future of public data and data journalism.

Bureaucracy itself is the subject of the fourth chapter. Most of this is a plea for good bureaucracy and the end of unnamed sources, but there is still space for illustrative and useful anecdotes about acquiring information from the Ministry of Defence.

And in Chapter 5 we get a potted history of MySociety’s struggle to make politicians accountable for their votes, and an overview of how data gathered with public money – from The Royal Mail’s postcodes to Ordnance Survey – is sold back to the public at a monopolistic premium.

The justice system and the police are scrutinised in the 6th and 7th chapters – from the twisted logic that decreed audio recordings are more unreliable than written records to the criminalisation of complaint.

Then finally we end with a personal story in Chapter 8: a reflection on the MPs’ expenses saga that Brooke is best known for. You can understand the publishers – and indeed, many readers – wanting to read the story first-hand, but it’s also the least informative of all the chapters for journalists (which is a credit to all that Brooke has achieved on that front in wider society).

With a final ‘manifesto’ section Brooke summarises the main demands running across the book and leaves you ready to storm every institution in this country demanding change. It’s an experience reminiscent of finishing Franz Kafka’s The Trial – we have just been taken on a tour through the faceless, logic-deprived halls of power. And it’s a disconcerting, disorientating feeling.

Journalism 2.0

But this is not fiction. It is great journalism. And the victims caught in expensive paper trails and logical dead ends are real people.

Because although the book is designed to be dipped into as a reference work, it is also written as an eminently readable page-turner – indeed, the page-turning gets faster as the reader gets angrier. Throughout, Brooke illustrates her findings with anecdotes that not only put a human face on the victims of bureaucracy, but also pass on the valuable experience of those who have managed to get results.

For that reason, the book is not a pessimistic or sensationalist piece of writing. There is hope – and the likes of Brooke, and MySociety, and others in this book are testament to the fact that this can be changed.

The Silent State is journalism 2.0 at its best – not just exposing injustice and waste, but providing a platform for others to hold power to account. It’s not content for content’s sake, but a tool. I strongly recommend not just buying it – but using it. Because there’s some serious work to be done.

Summary of "Magazines and their websites" – Columbia Journalism Review study by Victor Navasky and Evan Lerner

The first study (PDF) of magazines and their various approaches to websites, undertaken by Columbia Journalism Review, found publishers are still trying to work out how best to utilise the online medium.

There is no general standard or guidelines for magazine websites and little discussion between industry leaders as to how they should most effectively be approached.

Based on the responses to the multiple-choice questionnaire and the following open-ended questions –

  • What do you consider to be the mission of your website, and does this differ from the mission of your print magazine?
  • What do you consider to be the best feature or aspect of your website?
  • What feature of your website do you think most needs improvement or is not living up to its potential?

– the researchers called for a collective, informed and contemporary approach to magazine websites with professional body support.

The findings were separated into the following 6 categories: