Here’s a second question I’m expecting on Thursday when I give evidence to the Department for Culture, Media and Sport committee’s sixth evidence session on ‘The future for local and regional media’. As Google are on before me (some act to follow) and aggregators are being waved around as the Big Baddie of traditional journalism, the question’s going to be asked: are aggregators really bad? And for whom?
What is aggregation?
The first point I’ll need to pick apart is what is meant by aggregation. The biggest news aggregators are local broadcasters and national newspapers, who habitually lift stories from local newspapers to fill their newshole. ‘But we add value!’ they might cry. Yes, and so do Google News and most of the other aggregators out there: by displaying stories alongside each other for context, and by using algorithms to identify which ones came first or are most linked to.
Of course the biggest value that aggregators add is by driving traffic back to the original material. Given that a) around a third of traffic to a typical news site comes from search engines and aggregators, and b) most news sites have visitor numbers far in excess of their print readerships, it’s fair to say that aggregators are not “parasites” eating into news website traffic. A more accurate description would be symbiotes, using content for mutual benefit.
Many of the objections to aggregators appear to me to come down to control and monopoly: Google is making a lot out of advertising, and newspapers are not. That’s simply the result of a competitor doing something better than you – in this case, selling advertising.
They do not sell that advertising against newspaper content; they sell it against their index and their functionality. A good analogy is map-makers: the owners of Tussaud’s Waxwork Museum don’t sue mapmakers for making money from featuring their attraction on their map, because visitors still have to go into the museum to enjoy its content. (And yes, any website publisher can take itself off Google with a few lines in a robots.txt file anyway.)
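To illustrate that opt-out: the mechanism is the standard Robots Exclusion Protocol, a plain-text file served at the root of a site (the file name and crawler names below are the documented conventions, not anything from this post):

```
# robots.txt, served at http://example.com/robots.txt
# Blocks Google's main web crawler from the entire site:
User-agent: Googlebot
Disallow: /

# Or, to stay in web search but leave Google News only,
# using Google's separate News crawler token:
User-agent: Googlebot-News
Disallow: /
```

Removal isn’t literally instant – crawlers pick the file up on their next visit – but the point stands: no negotiation with Google is required to leave its index.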
A second objection, it feels to me, comes from the fact that Google actually makes it much easier for you to bypass the parasites of the news industry who pass off other people’s hard work as their own, and go straight to the source. So if a local paper has a story, you can read that instead of the national newspaper’s rewrite; if a scientific organisation did the research, you can read that instead of the sensationalist write-up; and the politician’s statement can be read in full, not out of context.
Is it good for consumers? Bad for journalists?
For the reasons given above, I don’t feel that aggregators are bad for consumers, although it would be over-simplistic to suggest they are therefore purely good. Aggregators require a different media literacy and are subject to the oddities of a selection process based on mathematical formulae. In short, it’s not worse, or better, just different. But as a potential avenue to more information around a story, as well as a way of highlighting just how much news organisations reproduce the same material, I welcome them.
As for local journalists, aggregators don’t make things worse. In terms of their work processes, journalists benefit from aggregation by being able to find out information more quickly and efficiently. The downside, of course, is that so can their readers and so rewriting copy from elsewhere becomes less appropriate. Journalists have to add value, which seems to me a better and more rewarding use of their skills. On a basic level that might be through becoming expert aggregators themselves – better than the algorithms – or it may be by adding extra information, context or analysis that they can see is missing elsewhere. Either way, I can’t see how that is a bad thing.
But these are just my thoughts on the question of aggregation and its influence on the news industry. I’d welcome other perspectives as I prepare my responses.
Couple of points:
To extend your metaphor, it is common for mapmakers to charge attractions heavily to be on the map. Google already does that for mainstream content, yet few realise that the first three pages of a popular search are paid-fors. If Murdoch were paying top dollar for his content to be top of Google, we would be complaining about that too. (BTW I still think NI will use the BSkyB sms to charge.)
You touch on, but perhaps don’t make enough of, the fact that the trad media has always hunted in packs, redisplaying each other’s content on a large scale through the art of ‘rewriting’. So it is rich for them to complain when someone else does it to the industry in aggregate. I have yet to get my local Johnston Press paper to give me a link when they rewrite one of my King’s Cross stories (they are polite enough to ask first, but won’t give me a link nor make a donation to a local kids’ charity).
Depending on the site, I may or may not welcome aggregation in one form or another.
For example, for a straight news site that is very local (such as our newspaper), I do not seek out aggregation, especially from Drudge, Fark, Digg, etc. They offer nothing to me but cheap page views. They are like third-party circulation (bad for local advertisers) or empty calories (bad for arteries). I am not particularly concerned about SEO, either. I am more interested in slowly building a local, high-quality audience, and I am willing to do this one reader at a time, if I can convince that reader to come back over and over by providing relevant, high-quality content.
Now, for a niche site that is not necessarily local, I do not mind being stronger in search engines to try to capture an audience interested in specialized content. But again, my job is to go out and find that audience, then provide content where the audience is, in the hope of converting it into regulars on my Web site. I would much rather have strong, steady – even slow – growth that is high in audience quality vs. cheap, unrepeatable traffic.