“Spatiotemporal storytelling” at Le Parisien: how one newspaper is aggregating data to provide a public service

LiveCity is a new data-driven project from Le Parisien that aims to bring together a range of public data sources to serve audiences across its webpages and apps. In a guest post for OJB, Project Director Stanislas de Livonnière spoke to Steve Carufel about the challenge of aggregating one city’s dispersed open and live data feeds into a single set of dashboards and widgets that can be integrated into the outlet’s website and apps.

It’s early Monday morning, and you’re weighing up commuting options, but you need information. Will you look at Google Maps, the Uber app, your local transportation authority’s website, the city bike sharing app, or listen to the radio?

Or it’s Friday evening, and you want to go out with friends — will you browse options through Yelp, TripAdvisor, Google Reviews, Facebook pages, your local events & lifestyle mag, or all of the above?

In either situation you’ll have to navigate through numerous apps and websites to get all the information — or simply stick to one or two places with incomplete information.

This was the problem that a team at Le Parisien set out to solve. Bringing together two developers, one data scientist and a chief of project under the codename LiveCity, the project has been under development for more than a year now.

Its Project Director, Stanislas de Livonnière, dubs it “spatiotemporal storytelling”.

Plugging in all the APIs you can find

The aim of codename LiveCity is for users to see different widgets, stories and information depending on the time of day and their location, shifting from mobile mode to office mode or home mode.

That might be transportation updates, suggested restaurants nearby, air quality updates, major sporting events updates, upcoming events in your surroundings, weather alerts, important news, or something else entirely.

The team in Paris look at any source of live public data they can find — on anything of informational value in the city.

“Whenever there’s an API or feed, we plug it in!” says de Livonnière.

“Our app is called LiveCity because we also concentrate our work beyond strictly editorial content, on all urban data. It will be useful for people, real-time and geolocated — that is the purpose we are focusing on.”

That means finding and plugging in all of a city’s available open and live data feeds and APIs into a single coherent ecosystem online.

Most big cities are likely to have a whole range of data feeds on transportation, news, events, job offers, accommodation, the environment, and culture — but every single one of those categories translates into multiple websites and apps that need individual browsing.

Those feeds may not be from public bodies — national and international platforms like TripAdvisor, Airbnb, Google Maps and Uber, as well as apps and websites that exist and are useful only within a given area, all provide potential opportunities.

A mess of data feeds with different standards

The range of these services means a nightmarish mess of differently-designed platforms and data feeds to plug in. No two APIs work identically, and no two datasets have been imagined or conceptualised in the same way.

This may be why, although the idea sounds like a no-brainer, it hasn’t been routinely done in major cities.

“I think that [this idea of a city data aggregation app and platform] hasn’t been done yet because there is a huge forest of data out there. It’s a mess. Before even starting to plug in data feeds we created an enormous machine to collect and harmonise them. This takes ages and is tedious, but once done, it’s convenient.”
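The harmonisation step de Livonnière describes can be illustrated with a minimal sketch: two hypothetical feed records — a bike-share station and an events listing, with different field names, coordinate formats and timestamp conventions — are mapped onto one shared schema. The field names and values here are invented for illustration, not taken from LiveCity.

```python
from datetime import datetime, timezone

# Hypothetical raw records from two differently-designed feeds.
bike_feed = {"station_name": "République", "lat": "48.867", "lon": "2.363",
             "free_bikes": 4, "timestamp": 1718000000}
event_feed = {"title": "Open-air concert",
              "location": {"latitude": 48.853, "longitude": 2.369},
              "starts_at": "2024-06-10T19:30:00+02:00"}

def normalise_bike(record):
    """Map a bike-share record onto the shared schema."""
    return {"kind": "transport",
            "label": record["station_name"],
            "lat": float(record["lat"]),          # strings -> floats
            "lon": float(record["lon"]),
            "when": datetime.fromtimestamp(record["timestamp"], tz=timezone.utc)}

def normalise_event(record):
    """Map an events-listing record onto the same schema."""
    return {"kind": "event",
            "label": record["title"],
            "lat": record["location"]["latitude"],
            "lon": record["location"]["longitude"],
            "when": datetime.fromisoformat(record["starts_at"])}

# One unified, geolocated, time-stamped list that widgets can query.
unified = [normalise_bike(bike_feed), normalise_event(event_feed)]
```

In a real pipeline there would be one such adapter per source, which is exactly why the team describes the collection-and-harmonisation machine as tedious to build but convenient once done.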

An old practice in a new context

Stanislas de Livonnière points out that this concern for presenting information and live data feeds that are relevant to your audience draws inspiration from the earliest days of modern newspapers.

“In a way, this isn’t so unrelated. Le Parisien was born in the 1940s, and when you look at the prints of that decade, and of the 1950s and the 1960s, you can see that the practice of presenting services information is quite old.

“Historically, our LiveData team is part of the newsroom. A prerogative for us since 2014 has been to detect stories through data, or reporters come to us and ask us for data analysis, and then we send back the results to them.”

“Like throwing food away” — opportunities for sharing stories

One obvious area of news that would benefit from this approach is transportation and commuting.

“For instance, during a strike at SNCF [France’s national public railway authority], we showed that it had to reimburse a number of commuters. SNCF has a provision stating that whenever transit services can’t reach 30% of their goal, it has to offer refunds.”

The agency was publishing figures about its efficacy, but at the national level only — when the team looked at data at the station level, they found figures as low as 5%.
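The analysis amounts to comparing each station’s service rate against the 30% threshold cited above. A minimal sketch, with invented station names and rates (the article reports only that some figures were as low as 5%):

```python
# Hypothetical station-level service rates: share of the service goal actually run.
station_rates = {"Gare du Nord": 0.05,      # as low as the 5% the team found
                 "Saint-Lazare": 0.28,
                 "Montparnasse": 0.45}

REFUND_THRESHOLD = 0.30  # below this, refunds are due under the provision cited above

# Stations where commuters were owed refunds.
due_refunds = [station for station, rate in station_rates.items()
               if rate < REFUND_THRESHOLD]
```

The journalistic point is that a national average can clear the threshold while individual stations fall far below it, which only station-level data reveals.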

“And when the speed limit of some of our highways went down to 80 kilometres per hour, we were able to look up the road accidents data and see whether the now-slower roads were the ones where the deadliest accidents occurred. Data was broken down by département and shared across all regional outlets.

“Sometimes we have investigation results at the national level that we will only exploit at the regional level, but there’s no reason we shouldn’t communicate the results with other regions’ outlets. It would be like throwing food away.”

The team has also been able to show that traffic- and pollution-reduction measures did work where they were implemented — but simply increased congestion everywhere else.

“We now dub it ‘live’ or ‘real-time data’, but back in the day newspapers were already printing convenient tables — public transport schedules, for instance — and numerous other things.

“In the end, for the local and informational aspect — not necessarily journalistic — what we are doing is only in line with an old state of affairs.”

For instance, the team tested implementing widgets of article-related data. There were numerous stories about transportation strikes where they tried adding live data in the article, and people were able to enter the name of their train station for the latest information on departures.

“So it is about both services and editorial, it’s complementary.”

Another example revolves around the real estate market and the automatic detection of significant changes in data feeds.

With their other media partners, the team has been looking at changes in regional average property prices. When there is any significant change, this generates an article that can be offered to outlets covering the affected area: copy that can then be modified or enhanced.
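This kind of change detection can be sketched simply: compare the latest regional averages against the previous period, and draft a templated sentence for any region whose price moved by more than some cutoff. The regions, prices and 5% threshold below are illustrative assumptions, not Le Parisien’s actual figures or logic.

```python
# Hypothetical average property prices (€/m²) for two periods.
previous = {"Île-de-France": 10_500, "Normandie": 2_100}
current = {"Île-de-France": 10_450, "Normandie": 2_320}

THRESHOLD = 0.05  # flag moves of 5% or more (illustrative cutoff)

def draft_alerts(prev, curr, threshold):
    """Return templated article stubs for regions with significant price moves."""
    alerts = []
    for region, new_price in curr.items():
        old_price = prev[region]
        change = (new_price - old_price) / old_price
        if abs(change) >= threshold:
            direction = "risen" if change > 0 else "fallen"
            alerts.append(f"Average property prices in {region} have {direction} "
                          f"by {abs(change):.1%}, to {new_price} €/m².")
    return alerts

alerts = draft_alerts(previous, current, THRESHOLD)
```

Each stub would then be offered to the outlet covering that region as copy to modify or enhance, as the article describes.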

The initiative is similar to what Urbs Media is doing in the United Kingdom in offering automated content to local outlets.

Automation: “mechanise the mechanics and humanise the human beings”

When it comes to robot journalism, Stanislas is confident that his team’s experiments are showing promise — and he largely dismisses fears that automated reporting will come at the expense of human jobs in the news industry.

“If there is any sector that is the least threatened by automation, it’s journalism,” he says.

“All that [automation] can replace is the writing of weather, sports or stock exchange bulletins. These are not pieces of analysis, they are simply basic repetitive bulletins. In short, anything that reporters hate to write.

“Because it’s already automatic, it’s always the same thing, so this contributes to mechanising the mechanics and humanising the human beings.

“Reporters can then focus on anything that adds value. A robot won’t reveal an affair, it won’t carry out an interview in a Syrian refugee camp.”

The money question

When it comes to revenue, de Livonnière says the focus for now is on developing the application. “But in the long run, if everything goes well, we do not rule out monetising data feed integration.

“Elsewhere we talk about branded content, here it could be branded data. But the sine qua non condition [for any partnership] is having cultivated an audience and hence being relevant to each other.

“And more indirectly, there are B2B [business-to-business] prospects. These will help us build the case for companies to come and partner with us.

“Potential income in the future might come from subscriptions, syndication, affiliation, branded data and then, potentially, business-to-business software sales.”

Steve Carufel is a former student of the MA in Multiplatform and Mobile Journalism at Birmingham City University.
