Category Archives: citizen journalism

What’s good for TwitPic may be bad for photojournalists

Yesterday Mashable ran an interesting story about how the iPhone will soon become the top camera for images uploaded to Flickr. That spot currently belongs to the Canon EOS Digital Rebel XT, which is basically a beginner’s DSLR.

With each production cycle, mobile phone cameras are getting more sophisticated, and it’s incredibly easy to upload a just-taken photo from one of those phones to the web. I recently upgraded my BlackBerry to the 8900, which has a 3.2MP auto-focus camera. Not a lot of megapixels, but the autofocus is what makes it a great camera phone. Taking a photo and uploading it to TwitPic takes less than a minute. The quality of the photos is pretty good, too.

The proliferation of iPhones, BlackBerrys and other camera phones means more people are photographing the things they do and putting them up on the web. For small and mid-size papers, getting art for a story could be as easy as doing a TwitPic search by keyword and seeing what turns up. If a user-taken photo of an event surfaces, you could contact the photographer, ask for permission and post it. At worst they’d ask for a small fee, which would still be a money saver compared with sending a photojournalist to the event.

The same could be said for videos. If a video of an event is uploaded to YouTube or any of the other video hosting sites, a news organisation could contact the person who shot it and ask permission to use it.

As the line between reporter and reader blurs further, technological advances and the will of the people may mean that photojournalists end up employed primarily by news organisations that feel they can both print the photos and sell the originals at a nice profit.

If the public is providing printable photos either for free or at a fraction of the cost of employing a photojournalist, that won’t be a terribly difficult decision for any executive editor to make.

Opportunities for local news blogs: Trends in Blogging

In the last year or so a number of new blog/news sites have sprung up which provide commentary for a geographically defined area, covering politics but also giving a more rounded view of life in that area.

The site which has drawn my attention recently is The Lichfield Blog, which I mention on the Wardman Wire or on Twitter (follow me to keep up to date) from time to time. There are examples of sites with a similar ethos established for some time, including some personal blogs, and I’d mention Londonist and Dave Hill’s Clapton Pond Blog (Hackney), but also sites such as Created in Birmingham (Birmingham Arts, mainly) and Curley’s Corner Shop (South Tyneside).

Some areas have a range of local blogs. The tiny Isle of Thanet, for example, has Bignews Margate, Thanet Life and Thanet Online, in addition to the more idiosyncratic Thanet Coast Life, Eastcliff Richard and even Naked in Thanet. It’s worth noting that – once again – this set of blogs is edited entirely by men.

And if you think that Thanet is small to have all those local blogs, try the Plight of Pleasley Hill, an ultra-local blog specifically created to foster community in an area of three or four streets in the Nottinghamshire village of Pleasley Hill, near Mansfield. I did a podcast interview with Mark Jones, who triggered the project, for the Politalks podcast. One interesting point is how the creation of a website has helped “institutionalise” a small group internally, but also how it can help externally in persuading large bureaucracies (e.g. the local council) to engage with the group.

Some of those sites have political stances, and some don’t. The common factor is that they provide coverage of local life and grounded politics, and don’t pay unnecessary attention to the Westminster Punch and Judy show.

Occasionally “ultra-local” has been used to refer to areas the size of a London borough or a provincial city. I’d suggest that we need to think in *much* smaller areas. I wonder whether we will end up with something like the one-horse-town newspaper of settlers’ America, written by local people for themselves, alongside sites covering larger communities, areas and specialist themes which are able to draw an audience.

I’d suggest that there is also a new opportunity opening up for these independent commentary and reporting sites due to a pair of current trends:

  • The drive by national media sites to find new ways of persuading their readers to pay for parts of their web content – pay-walls, charges for special services and anything else they can dream up. As the editor of an independent “politics and life” commentary site with a number of excellent contributors, I can’t wait for the age of “Pay 4 Polly” to arrive.
  • The continuing liquidation of our local newspapers and regional media.

Locally focused blogs with a more rounded coverage may provide an answer to consistent criticisms made of “the political blogosphere”:

  • Political bloggers only do partisan politics (which is wrong, but it can sometimes look as if it is true).
  • There is too much coverage of the Westminster Village (which is right, but someone has to do it, and it is the place where many decisions are made).

I think group blogs with varied teams of contributors may be best placed to provide a decent level of coverage and draw a good readership, while competing effectively with other media outlets. That is a trend we have seen in the political blog niche over several years – the sites which have established themselves and maintain a position as key sites have developed progressively larger teams of editors, and provided a wider range of commentary and services.

A team of contributors allows a site to benefit from the presence of real enthusiasts in each area of reporting, from the minutiae of the Council Meetings to Arts Events at the local galleries.

I’m developing a list of sites aiming to provide rounded coverage of a defined local area, town or community. If you run a good one, or know of one, please drop me a line via the Contact Form on the Wardman Wire.


(Note: if you want to know more about local news blogs in general rather than what I think can be done with them, the go-to place is Talk About Local.)

Gatewatching for local news

Among the many good things about Internet news consumption is the fact that audiences can seek any sort of information to suit their interests and inclinations. No longer stifled by editorial, corporate or advertiser monopoly, readers browse everything from obscure blogs to mainstream news sites to get the information they want.

Ever since Internet media started going mainstream, however, many have raised the question of whether this vast and tolerant space is causing people to replace news that informs and educates with that which merely entertains. One has only to look at the slew of sensational Internet videos that go viral, or the latest online reiteration of Jessica Simpson’s gaffe to accept that this is a legitimate concern. In addition, people have more options than ever before to confine themselves to fragmented communities and echo chambers to get the news they want in lieu of what they need.

As Charlie Beckett points out in Supermedia, while the diversity provided by the Internet with regard to information dissemination is important, it also tends to further the divide between those looking for real, relevant information and those who merely want instant gratification through the latest celebrity gossip.

Of course, blaming new media for its endless possibilities would be sort of like blaming that decadent chocolate cake for existing. Just because it is there, doesn’t mean you need to seek it.

This is an even bigger concern with regard to local news. Citizens might tend to focus on the latest iPhone application released by Apple at the expense of important news happening at home – information that would be vital to them as contributors to a democracy.

But while lack of reader interest is a problem, it is often spurred on by a scarcity of engaging content from news organizations – if all a local paper can provide is a string of wire service accounts and press releases, how does it expect to keep readers motivated? This was hard enough to accept in an age when the newspaper or the evening news broadcast was the only source of information. It is simply untenable in the Web 2.0 world, where readers can get actual eyewitness accounts from their Twitter followers and view firsthand pictures through Flickr groups. In other words, in this age of social media and online networks, local journalists seem almost out of touch with the communities they live in.

The question then is, can residents of a community do well as their own gatewatchers?

The New York-based site NYC.is, which functions as a “Digg” for the city and its surrounding areas, is trying to do just that. “Our goal is to connect bloggers, independent reporters and activists in different parts of the five boroughs, rewarding the best work by sending it traffic and increasing potential for impact,” reads the mission statement.

I got a chance to talk to Susannah Vila, a graduate student at Columbia University, who launched the site. “The inspiration behind the concept is [it provides] ways of democratizing the Web. This was part of what excited me about making the site,” she says.

Readers themselves direct attention to local news that they deem important, while also channeling traffic to independent bloggers, regional Web sites and mainstream sites. Anything from New York City mayor Michael Bloomberg’s job approval ratings to rising prices of a pizza slice in Brooklyn can turn up on the front page.  “The point is, it is not just one type of story that gets popular. There is a lot of range,” says Vila. The common thread is relevance to people of the community. In true Digg fashion, the top contributors get a mention on the home page, as do the most popular stories.
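To make those “Digg fashion” mechanics a little more concrete, here is a minimal sketch of how a site like NYC.is might rank reader-submitted stories – votes weighted by recency so fresh local items can surface. The scoring formula and field names are purely illustrative assumptions, not how NYC.is is actually built.

```python
import time
from dataclasses import dataclass

@dataclass
class Submission:
    title: str
    url: str
    submitted_by: str
    submitted_at: float          # Unix timestamp
    votes: int = 1               # the submitter counts as the first vote

def hotness(item: Submission, gravity: float = 1.8) -> float:
    """Votes divided by an age penalty, so newer local stories can
    overtake older ones even with fewer votes (Digg/HN-style decay)."""
    age_hours = (time.time() - item.submitted_at) / 3600
    return item.votes / ((age_hours + 2) ** gravity)

def front_page(items, n=10):
    """The n 'most popular' stories for the home page."""
    return sorted(items, key=hotness, reverse=True)[:n]

def top_contributors(items, n=5):
    """Count submissions per user - the people credited on the home page."""
    counts = {}
    for item in items:
        counts[item.submitted_by] = counts.get(item.submitted_by, 0) + 1
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:n]

if __name__ == "__main__":
    now = time.time()
    items = [
        Submission("Bloomberg approval ratings dip", "http://example.com/a", "ana", now - 6 * 3600, votes=40),
        Submission("Brooklyn pizza slice hits $3", "http://example.com/b", "raj", now - 1 * 3600, votes=15),
    ]
    for s in front_page(items):
        print(f"{hotness(s):.3f}  {s.title}")
    print(top_contributors(items))
```

The only editorial judgement baked in is the decay factor: a higher “gravity” pushes older items off the front page faster, which is one way a community site keeps attention on what is currently relevant locally.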

Can this go one step further, and actually motivate people to do original reporting or garner data for a new story? “Once I get more of a community on the site with more engaged readers there is definitely a possibility to prompt them to investigate certain things or to [urge them] to go to community board meetings,” Vila says. “It would also be cool to let people vote on ideas for stories.”

A gatewatching site at a local community level may not be sufficient to provide all the information residents need, but it certainly allows a comprehensive look at what readers are looking for, and what is important to them as residents, and as citizens: it can sometimes be an aspiring young band, or the New York Mets’ dismal season, but more often than not, it is about hard issues, such as the annual decline in household incomes, grassroots candidates for City Council, and governmental oversight of local schools.

Add context to news online with a wiki feature

In journalism school you’re told to find the way that best relates a story to your readers. Make it easy to read and understand. But don’t just give the plain facts, also find the context of the story to help the reader fully understand what has happened and what that means.

What better way to do that than having a Wikipedia-like feature on your newspaper’s web site? Since the web is the greatest causer of serendipity, says Telegraph Communities Editor Shane Richmond, reading a story online will often send a reader elsewhere in search of more context wherever they can find it.

Why can’t that search start and end on your web site?

What happens today

Instead of writing this out, I’ll try to explain this with a situation:

While scanning the news on your newspaper’s web site, one story catches your eye. You click through and begin to read. It’s about a new shop opening downtown.

As you read, you begin to remember things about what once stood where the new shop now is. You’re half-way through the story and decide you need to know what was there, so you turn to your search engine of choice and begin hunting for clues.

By now you’ve closed the window of the story you were reading and are instead hunting for context. You don’t return to the newspaper’s site because, by the time you find the information you were looking for, you have landed on a different story on a different news web site.

Here’s what the newspaper loses in that scenario: site stickiness, page views, uniques (the reader could have forwarded the story on to a friend) and reader interaction through potential story comments. Monetarily, all of this translates into lower ad rates. That’s where it hurts the most.

How it could be

Now here’s how it could be if a newspaper web site had a wiki-like feature:

The story about the new shop opening downtown intrigues you because, if memory serves, something else used to be there years ago. On the story there’s a link to another page (additional page views!) that shows all of the information about that site that is available in public records.

You find the approximate year you’re looking for, click on it, and you see that before the new shop appeared downtown, many years ago it was a restaurant you visited as a child.

It was owned by a friend of your father’s and it opened when you were six years old. Since you’re still on the newspaper web site (better site stickiness!), you decide to leave a comment on the story about what was once there and why it was relevant to you (reader interaction!). Then you remember that a friend often went there with you, so you email it to them (more uniques!) to see if they too will remember.
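A very small sketch of what the data behind such a page could look like: each story carries a location, and the site keeps a history of what stood at that address, drawn from public records and reader contributions. The structure, addresses and field names below are hypothetical – just one way a newsroom CMS might model it, not a description of any existing system.

```python
from dataclasses import dataclass

@dataclass
class LocationEntry:
    year_from: int
    year_to: int
    occupant: str
    source: str          # e.g. a public record, or "reader contribution"

# A hypothetical "wiki" of downtown addresses and their history.
location_history = {
    "123 Main St": [
        LocationEntry(1974, 1991, "Romano's Family Restaurant", "city business licences"),
        LocationEntry(1992, 2008, "Main Street Hardware", "city business licences"),
        LocationEntry(2009, 2009, "New shop (current story)", "staff reporting"),
    ],
}

def history_page(address: str) -> str:
    """Render the 'what used to be here' page linked from the story."""
    lines = [f"History of {address}:"]
    for entry in location_history.get(address, []):
        lines.append(f"  {entry.year_from}-{entry.year_to}: {entry.occupant} ({entry.source})")
    return "\n".join(lines)

print(history_page("123 Main St"))
```

The point of the sketch is simply that the story page and the history page share a key (the address), so every story about that location can link to the same evolving record – and every reader comment or correction enriches it.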

Why it matters to readers

For consumers, news is the pursuit of truth and context, and both the news organization and the journalists it employs are obligated to provide them. The hardest part of this is disseminating public records and putting them online.

Crowd-sourcing that work, much as Wikipedia does with its entries, could work out well. But even the act of putting public records online in a way that makes them contextually relevant would be a big step forward. It’s time-consuming, but the rewards are great.

Newspapers on Twitter – how the Guardian, FT and Times are winning

National newspapers have a total of 1,068,898 followers across their 120 official Twitter accounts – with the Guardian, Times and FT the only three papers in the top 10. That’s according to a massive count of newspapers’ Twitter accounts I’ve done (there’s a table of all 120 at that link).

The Guardian’s the clear winner, as its place on the Twitter Suggested User List means that its @GuardianTech account has 831,935 followers – 78% of the total …

@GuardianNews is 2nd with 25,992 followers, @TimesFashion is 3rd with 24,762 and @FinancialTimes 4th with 19,923.

[Screenshot of the data]

Other findings

  • Glorified RSS: Out of 120 accounts, just 16 do anything other than run as a glorified RSS feed. The other 104 do no retweeting, no replying to other tweets and so on (you can see which are which on the full table).
  • No following: These newspaper accounts don’t do much following either. Leaving GuardianTech out of it, the accounts have 236,963 followers between them but follow just 59,797. They’re mostly pumping RSS feeds straight to Twitter and see no reason to engage with the community.
  • Rapid drop-off: Only six accounts have more than 10,000 followers. I suspect many of these accounts are invisible to most people because the newspapers aren’t engaging much – with no retweeting of other people’s tweets, those people have no obvious way to realise the newspaper accounts exist.
  • Sun and Mirror are laggards: The Sun and Mirror have work to do – they don’t seem to have got the hang of Twitter yet and have few accounts with any followers. The Mail only seems to have one account, though it is the 20th largest in terms of followers.

The full spreadsheet of data is here (and I’ll keep it up to date with any accounts the papers forgot to mention on their own sites)… It’s based on official Twitter accounts – not individual journalists’. I’ve rounded up some other Twitter statistics if you’re interested.
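For anyone wanting to reproduce totals like these, here is a rough sketch of the tallying involved, assuming the accounts and their counts have been exported to a CSV. The filename and column names are my own assumptions; the real spreadsheet may be laid out differently.

```python
import csv

# Assumed columns: account, followers, following  (hypothetical export)
with open("newspaper_twitter_accounts.csv", newline="") as f:
    rows = list(csv.DictReader(f))

total_followers = sum(int(r["followers"]) for r in rows)
print(f"{len(rows)} accounts, {total_followers:,} followers in total")

# Top 10 accounts by followers
for r in sorted(rows, key=lambda r: int(r["followers"]), reverse=True)[:10]:
    print(f"{r['account']:<20} {int(r['followers']):>8,}")

# Followers vs following, leaving the outsized @GuardianTech out of it
rest = [r for r in rows if r["account"].lower() != "guardiantech"]
print("Excluding GuardianTech:",
      f"{sum(int(r['followers']) for r in rest):,} followers,",
      f"{sum(int(r['following']) for r in rest):,} followed")
```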

8% of Telegraph.co.uk traffic from social sites

Telegraph.co.uk gets an amazing 8% of its visitors from social sites like Digg, Delicious, Reddit and StumbleUpon, Julian Sambles, Head of Audience Development, has revealed.

The figure goes some way to explaining why the Telegraph is now the most popular UK newspaper site.

75,000 visitors a day

The Telegraph had about 28 million unique visitors in March, which means social sites are sending it almost 75,000 unique visitors a day.

Search engines are responsible for about a third of the Telegraph’s traffic, Julian also revealed – or about 300,000 unique visitors a day.

This means the Telegraph gets 1 social visitor for every 4 search ones – an astonishingly high ratio.
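The back-of-an-envelope arithmetic behind those figures, for anyone who wants to check it (monthly uniques are simply divided by 31 days here; the real daily pattern will obviously vary):

```python
monthly_uniques = 28_000_000        # ~28m unique visitors in March
daily_uniques = monthly_uniques / 31

social_share = 0.08                 # 8% of visits from social sites
search_share = 1 / 3                # roughly a third from search engines

social_per_day = daily_uniques * social_share     # ~72,000 ("almost 75,000")
search_per_day = daily_uniques * search_share     # ~300,000

print(f"Social: ~{social_per_day:,.0f} a day")
print(f"Search: ~{search_per_day:,.0f} a day")
print(f"Ratio:  1 social visitor per {search_per_day / social_per_day:.1f} search visitors")
```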

You can read more of what Julian said about the Telegraph’s social media strategy here. The statistics were originally given for an article on social sites on FUMSI.

Do blogs make reporting restrictions pointless?

The leaked DNA test on 13-year-old alleged dad Alfie Patten has revealed a big problem with court-ordered reporting restrictions in the internet age. (NB This is a cut down version of a much longer original post on blogging and reporting restrictions that was featured on the Guardian).

Court orders forbidding publication of certain facts apply only to people or companies who have been sent them. But this means there is nothing to stop bloggers publishing material that mainstream news organisations would risk fines and prison for publishing.

Even if a blogger knows that there is an order, and so could be considered bound by it, an absurd catch-22 means they can’t find out the details of the order – and so they risk contempt of court and prison.

Despite the obvious problem, the Ministry of Justice have told me they have no plans to address the issue.

3 weeks in: launching a Midlands environmental news site

3 weeks ago my class of online journalism students were introduced to the website they were going to be working on: BirminghamRecycled.co.uk – environmental news for Birmingham and the West Midlands.

The site has been built by final year journalism degree student Kasper Sorensen, who studied the online journalism module last year.

In building and running the service Kasper has done a number of clever, networked things I thought I should highlight. They include:

  • Creating a Delicious network for the site – every journalist in the team has a Delicious account; this gathers together all of the useful webpages that journalists are bookmarking
  • Tweetgrid of all journalists’ tweets – again, every journalist has a Twitter account. This pulls them all together.
  • Twitter account @bhamrecycled
  • Kasper sent the whole team an OPML file of subscriptions to RSS feeds of searches for every Midlands area and environment-related keyword. In other words, journalists could import this into their Google Reader and at a stroke be monitoring any mention of certain keywords (e.g. ‘pollution’, ‘recycling’) in Birmingham areas (a sketch of how such a file might be put together follows this list).
  • He also shared a Google calendar of relevant events
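As a rough illustration of the OPML idea mentioned above, here is a sketch that builds such a file from a list of areas and keywords. The feed URL pattern is an assumption – a Google News search RSS URL is used as a stand-in for whichever search feeds Kasper actually subscribed the team to – and the area and keyword lists are illustrative.

```python
from urllib.parse import quote_plus
from xml.sax.saxutils import escape

AREAS = ["Birmingham", "Wolverhampton", "Solihull", "Dudley"]       # illustrative
KEYWORDS = ["pollution", "recycling", "landfill", "allotments"]     # illustrative

def search_feed_url(query: str) -> str:
    # Assumed feed pattern; swap in whatever search engine's RSS feeds you use.
    return f"https://news.google.com/rss/search?q={quote_plus(query)}"

outlines = []
for area in AREAS:
    for kw in KEYWORDS:
        query = f"{kw} {area}"
        outlines.append(
            f'    <outline type="rss" text="{escape(query)}" '
            f'xmlUrl="{escape(search_feed_url(query))}" />'
        )

opml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<opml version="1.0">\n'
    "  <head><title>Midlands environment search feeds</title></head>\n"
    "  <body>\n" + "\n".join(outlines) + "\n  </body>\n</opml>\n"
)

with open("midlands_environment_feeds.opml", "w", encoding="utf-8") as f:
    f.write(opml)
```

Importing the resulting file into Google Reader (or any feed reader) subscribes a journalist to every keyword/area combination in one go.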

The site itself is clever too.

  • The About page has a list of all contributing journalists with individual RSS feeds.
  • In addition, each author has a link to their own profile page, which not only displays their articles but also pulls in their Twitter updates, Delicious bookmarks and blog posts (see the sketch below).
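A sketch of how a profile page like that might pull an author’s activity together, using the feedparser library to merge their Twitter, Delicious and blog feeds into one reverse-chronological list. The feed URLs below are placeholders, not the real Birmingham Recycled ones, and the real site may do this quite differently.

```python
import feedparser   # pip install feedparser

# Hypothetical feeds for one contributor; the real profile pages will differ.
AUTHOR_FEEDS = {
    "Twitter":   "https://example.com/author/twitter.rss",
    "Delicious": "https://example.com/author/delicious.rss",
    "Blog":      "https://example.com/author/blog.rss",
}

def author_activity(feeds: dict, limit: int = 15) -> list:
    """Merge entries from several feeds and sort them newest-first."""
    items = []
    for source, url in feeds.items():
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            items.append({
                "source": source,
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "published": entry.get("published_parsed"),
            })
    items.sort(key=lambda i: i["published"] or (), reverse=True)
    return items[:limit]

for item in author_activity(AUTHOR_FEEDS):
    print(f"[{item['source']}] {item['title']} - {item['link']}")
```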

Kasper wanted to explicitly follow a Mashable-style model rather than a traditional news service: he felt an overly formal appearance would undermine his attempts to build a community around the site.

And community is key. When unveiling the site to the journalists Kasper made the following presentation – a wonderful distillation of how journalists need to approach news in a networked world:

External links: the 8 stages of linking out denial

Do you have a link problem? You can handle linking. It’s just one post/article/page without a link. You can link whenever you want to. Or can you?

Where are you on this scale …? (Originally posted here.)

1 Don’t link to anyone

Link to other sites? But people will leave my site. They won’t click on my ads. They won’t read other pages. I’ll leak PageRank. No way.

2 Add URLs but don’t make them hyperlinks

OK, that’s a bit ridiculous. If I’m talking about other organisations, I can’t pretend they don’t have a website. I know, I’ll put web addresses in. But I won’t make them hyperlinks. Brilliant, yes?

3 Add an ‘external links’ box

Even I’m finding that no-hyperlinks thing annoying when I go back to an old page and have to copy and paste the damn things.

I suppose I should have some links on my page. I’ll put them in a box. Over there (down a bit …). I’m going to use some sort of internal redirect or annoying JavaScript, though, to make sure I don’t pass any PageRank. Mwah, hah hah.

4 Put some links in the copy

I don’t seem to be getting many inbound links. I guess I’m not playing fair. I know, I’ll sort out my workflow so that it’s possible to add links easily inside the actual copy. But I’m still not passing any PageRank – I’m going to put “rel=nofollow” on every link.
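For the curious, this is roughly what stage 4 looks like in practice – a minimal sketch using BeautifulSoup to stamp rel="nofollow" on every link in a piece of copy (and what you would delete again at stage 6). The sample HTML is made up.

```python
from bs4 import BeautifulSoup   # pip install beautifulsoup4

copy = '<p>Figures from <a href="http://example.com/report">this report</a> show a rise.</p>'

soup = BeautifulSoup(copy, "html.parser")
for link in soup.find_all("a"):
    link["rel"] = "nofollow"    # stage 4: readers get links, search engines get nothing

print(soup)
# Stage 6 is simply: for link in soup.find_all("a"): del link["rel"]
```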

5 Give my users some google juice

Commenters seem thin on the ground. Maybe I’ll let them link to their own sites. I’ll use some annoying JavaScript to hide the links from Google, though. Most of my commenters are probably spammers, and I can’t trust them to police their own community, after all.

6 Link when I have to. And remove nofollow and any other annoying tricks

That seemed to make everyone happier. There are a few proper links on my pages. And people seem to want to link to me now that I’m playing fair with my links.

7 Acknowledge my sources

Oops. Spoke too soon. Been outed as pinching someone else’s idea and not attributing it. From now on, I’m going to make sure I always link to everyone I should.

8 Enlightenment: Make linking part & parcel of what I do

Internet. Inter as in inter-connected. Net as in network.

I get it now. I’m going to become a trusted source for information and advice, AND for what else people should read elsewhere on the internet. Blimey, more and more people are visiting my site and linking to it.

Could moderators collect potential leads from comments?

Guardian community moderator Todd Nash* makes an interesting suggestion on his blog about the difficulties journalists face in wading through comments on their stories:

“there is potential for news stories to come out of user activity on newspaper websites. Yet, as far as I know, it is not a particularly well-utilised area. Time is clearly an issue here. How many journalists have time to scroll through all of their comments to search for something that could well resemble a needle in a haystack? It was commented that, ironically, freelancers may make better use of this resource as their need for that next story is greater than their staff member counterparts.

“The moderation team at guardian.co.uk now has a Twitter feed @GuardianVoices which highlights good individual comments and interesting debate. Could they be used as a tool to collect potential leads? After all, moderators will already be reading the majority of content of the publication they work for. However, it would require a rather different mindset to look out for story leads compared to the more usual role of finding and removing offensive content.”

It’s an idea worth considering – and, as Todd himself concludes:

“Increased interactivity with users builds trust, which in turn produces a higher class of debate and, with it, more opportunities for follow-up articles. Perhaps it is now time for the journalists to take inspiration from their communities as well.”

So, could this work in practice? Could moderators be tasked with identifying leads?
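As a thought experiment, a first lead-spotting pass over comments could be as crude as flagging anything containing tip-like phrases for a moderator to eyeball. The keyword list and comment structure below are entirely hypothetical – a sketch of the idea, not a description of how the Guardian’s moderation tools work.

```python
# Phrases that might signal a potential story lead in a reader comment (illustrative).
LEAD_SIGNALS = [
    "i work at", "i used to work", "what they aren't telling you",
    "freedom of information", "my council", "leaked", "i was there",
]

def flag_potential_leads(comments):
    """Return comments containing any lead-signal phrase, for human review."""
    flagged = []
    for comment in comments:
        text = comment["body"].lower()
        hits = [p for p in LEAD_SIGNALS if p in text]
        if hits:
            flagged.append({"comment": comment, "signals": hits})
    return flagged

comments = [
    {"user": "reader1", "body": "Great article, thanks."},
    {"user": "reader2", "body": "I work at the depot and the leaked memo says routes are being cut."},
]
for item in flag_potential_leads(comments):
    print(item["comment"]["user"], "->", item["signals"])
```

A filter like this would only narrow the haystack; the judgement about whether something is actually a lead would still sit with the moderator or journalist reading the flagged comments.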

*Disclosure: he’s also a former student of mine