ScraperWiki has rediscovered its old free scraping tool – and is now calling it QuickCode

A screenshot from before the 2013 relaunch of ScraperWiki

Seven years ago, ScraperWiki launched with a plan to make scraping accessible to a wider public. It did this by creating an online space where people could easily write and run scrapers, and by making it possible to read and adapt scrapers written by other users (the ‘wiki’ part).

I loved it. The platform inspired me to learn Python and write Scraping for Journalists, and it has been part of my journalism workflow ever since.

But ScraperWiki changed. Over time, as the company tried to find a sustainable business model, the ‘wiki’ part of the plan was dropped. Over 2012 and into 2013 they relaunched. Old scrapers were archived (you can still browse them here) and the emphasis shifted to selling “professional services”.

Individuals who wanted to use the scraping tool themselves could pay a monthly fee. It became harder for aspiring data journalists to get started with the tool, but the platform was also gaining new features such as visualisation.

The second iteration of ScraperWiki

But the great news was that the company survived: seven years for a web company with an ambition to provide a public service is an enormous achievement. They were right to change tack, and their work had laid a track for others to follow: in 2014 the Open Australia Foundation launched Morph.io to fill the ‘wiki’ gap for scraping, while Kimono Labs (now closed) and Import.io had already launched more entry-level products.

Now, ScraperWiki is changing again. The company has been renamed The Sensible Code Company, and the ScraperWiki product is now QuickCode, pitched as an internal programming tool rather than something specific to scraping.

But it still scrapes. And journalists can still use it.

CEO Aine Maguire says they’re still happy to set up journalists with accounts on the service.

“We’re supporting journalism as before,” she says. “Nothing has changed. If anything, I am hoping it will get a boost if we can manage to add functionality which will benefit all.”

“Investigative journalists have always held a special place for us and in truth managing the signups is not a chore. It is the only community we continue to support.”

You can request an account by emailing hello@scraperwiki.com with the subject “Journalist”. More details here.

