You may remember the ‘investigation’ by the Hull Daily Mail into HU17.net, a hyperlocal publisher that was operating on its patch, back in March, and the resulting backlash against the newspaper from observers who saw it as a commercially motivated hatchet job. Now the Press Complaints Commission has upheld a complaint on the basis “that readers would have been misled as to the scale of the complainant’s involvement in adult websites. The result was a breach of Clause 1 of the Editors’ Code.”
Last week I spent a thoroughly fascinating day at a hackday for journalists and web developers organised by Scraperwiki. It’s an experience that every journalist should have, for reasons I’ll explore below but which can be summed up thus: it will challenge the way you approach information as a journalist.
Disappointingly, the mainstream press and broadcast media were represented by only one professional journalist. This may be due to the time of year, but that didn’t prevent journalists from attending last week’s Liverpool event in droves. Senior buy-in is clearly key here – and I fear the Birmingham media are going to be left behind if this continues.
Because on the more positive side there was a strong attendance from local bloggers such as Michael Grimes, Andy Brightwell (Podnosh), Clare White (Talk About Local) and Nicola Hughes (Your Local Scientist) – plus Martin Moore from the Media Standards Trust and some journalism students.
How it worked
After some brief scene-setting presentations, and individual expressions of areas of interest, the attendees split into 5 topic-based groups. They were:
- The data behind the cancellation of Building for Schools projects
- Leisure facilities in the Midlands
- Issues around cervical smear testing
- Political donations
- And our group, which decided to take on the broad topic of ‘health’, within the particular context of plans to give spending power to GP consortia.
By the end of the day all groups had results – which I’ll outline at the end of the post – but more interesting to me was the process of developers and journalists working together, and how it changed both camps’ working practices.
Facebook has launched a Media page offering “best practices for journalists”. It’s a rather breathless creation, filled with ad-speak, but if you can put up with that it’s a pretty useful resource, ranging from a basic introduction to how journalists can use the site through to tips and case studies for those who already use Facebook.
Although the page promotes Facebook’s own ‘Posts by everyone’ search facility that allows you to track the buzz around a particular topic,
Openbook is better [UPDATE Oct 2012: Openbook is no longer active. Social Buzz – a “real-time search engine for Facebook, Twitter and Google+” according to its CEO – may be another alternative].
For more links on Facebook, see my Delicious bookmarks under that tag.
Computerworld reports on plans by Wikileaks to allow “newspapers, human rights organizations, criminal investigators and others to embed an “upload a disclosure to me via Wikileaks” form onto their Web sites”.
“We will take the burden of protecting the source and the legal risks associated with publishing the document,” said Julian Assange, an advisory board member at Wikileaks, in an interview at the Hack In The Box security conference in Kuala Lumpur, Malaysia.
It’s a first class idea that addresses two major problems with investigative journalism: the risk of legal costs in pursuing investigations; and the need to build relationships between potential whistleblowers and Wikileaks’ technology.
In a nutshell, it’s a networked solution that piggybacks on the trust, relationships and audience built by publishers, NGOs and bloggers, and distributes the technology of Wikileaks so that users aren’t expected to come to them.
Sue Llewellyn asks if there’s a way to filter out Foursquare tweets. There is.
The first thing to do is work out something that all the tweets share. Well, every Foursquare tweet includes a link that begins http://4sq.com – so that’s our filter term.
If you’re using Tweetdeck this is how you do it. At the bottom of every column in Tweetdeck are 6 buttons. The second one in – a downward-pointing arrow – is the ‘Filter this column’ button. Click this. A new row appears where you can filter the tweets. Select ‘Text’ then ‘-‘ and type ‘http://4sq.com’ in the third box. You should see tweets automatically filtered accordingly.
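The same exclusion filter can be sketched programmatically if you’re pulling tweets yourself. A minimal Python example, in which the sample tweets are made up for illustration:

```python
# Every Foursquare check-in tweet contains a link beginning with
# http://4sq.com, so excluding any tweet with that substring
# filters them all out - the same logic as Tweetdeck's '-' text filter.
FOURSQUARE_MARKER = "http://4sq.com"

def without_foursquare(tweets):
    """Return only the tweets that do not contain a Foursquare link."""
    return [t for t in tweets if FOURSQUARE_MARKER not in t]

# Hypothetical sample data for illustration
tweets = [
    "Great panel on data journalism today",
    "I'm at the Coffee Lounge http://4sq.com/abc123",
    "New post on scraping council spending data",
]

print(without_foursquare(tweets))
```

The substring check is deliberately crude – it mirrors what the desktop clients’ text filters do, rather than parsing the links properly.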
Seesmic Desktop has a similar filtering function.
And on iPhone a few Twitter clients have filtering options, including Twittelator.
Let me know if you know of any others.
Journalism.co.uk have a list of this year’s “leading innovators in journalism and media”. I have some additions. You may too.
I brought Nick in to work with me on Help Me Investigate, a project for which he doesn’t get nearly enough credit. It’s his understanding of and connections with local communities that lie behind most of the successful investigations on the site. In addition, Nick helped spread the idea of the social media surgery, where social media savvy citizens help others find their online voice. The idea has spread as far as Australia and Africa.
Matt Buck and Alex Hughes
Matt and Alex have been busily reinventing news cartoons for a digital age with a number of projects, including Drawnalism (event drawing), animated illustrations, and socially networked characters such as Tobias Grubbe.
A forthright post over at Boing Boing accuses Ofcom of copping out of their responsibility to sort out just where the burden of proof would fall in the Digital Economy Act’s proposals to disconnect people accused of breaking copyright laws. It’s based on an analysis by the Open Rights Group of Ofcom’s draft code.
“Ofcom’s proposal denies us the ability to check whether the methods of collecting of the evidence are trustworthy. Instead, copyright holders and Internet Service Providers will just self-certify that everything’s ok. If they get it wrong, there’s no penalty.
“The Act requires the evidential standards to be defined – but Ofcom are leaving this up to the rights holders and ISPs to decide in the future. We ask, how is anyone meant to trust this code if we can’t see how the evidence is gathered or checked?”