21 Nov 2018 | Daniel Nüst
This article reports on a project integrating Stencila and Binder, which started at the eLife Innovation Sprint 2018. It has been cross-posted on multiple blogs (eLife Labs, Stencila, Jupyter). We welcome comments and feedback on any of them!
eLife, an open science journal published by the non-profit organisation eLife Sciences Publications from the UK, hosted the first eLife Innovation Sprint 2018 as part of their Innovation Initiative in Cambridge, UK:
“[..] a two-day gathering of 62 researchers, designers, developers, technologists, science communicators and more, with the goal of developing prototypes of innovations that bring cutting-edge technology to open research communication.”
One of the 13 projects at the excellently organised event was an integration of Binder and Stencila…
14 Aug 2018 | Daniel Nüst
We’ve been working on demonstrating our reference implementation during spring and managed to create a number of example workspaces.
We now decided to publish these workspaces on our demo server.
Screenshot 1: o2r reference implementation listing of published Executable Research Compendia. The right-hand side shows a metadata summary including original authors.
The papers were originally published in…
13 Jul 2018 | Daniel Nüst
Today a new journal article led by o2r team member Daniel was published in the journal PeerJ:
Reproducible research and GIScience: an evaluation using AGILE conference papers by Daniel Nüst, Carlos Granell, Barbara Hofer, Markus Konkol, Frank O. Ostermann, Rusne Sileryte, Valentina Cerutti
PeerJ, 2018. doi: 10.7717/peerj.5072
The article is an outcome of a collaboration around the AGILE conference, see https://o2r.info/reproducible-agile/ for more information.
Please retweet and spread the word!
Your questions & feedback are most welcome.
Here is Daniel’s attempt at a non-specialist summary:
More and more research uses data and algorithms to answer a question.
That makes it harder for researchers to understand a scientific publication, because more than just the text is needed to understand what is really going on.
You need the software and the data to be able to tell if everything is done correctly, and to be able to re-use new and exciting methods.
We took a look at the existing guides for such research and created our own criteria for research in sciences using environmental observations and maps.
We used the criteria to test how reproducible a set of papers from the AGILE conference actually are.
The conference is quite established and the papers are of high quality because they were all suggested for the “best paper” awards at the conference.
The results are quite bad!
We could not re-create any of the analyses.
Then we asked the authors of the papers we evaluated if they had considered that someone else might want to re-do their work.
While they all think the idea is great, many said they do not have the time for it.
For researchers to invest the time and resources to work in a way that is transparent to others and openly reusable, they must either be convinced of its importance or be required to do so.
We came up with a list of suggestions to publishers and scientific conference organisers to create enough reasons for researchers to publish science in a re-creatable way.