**From the introduction chapter of *Implementing Reproducible Research*:**

Conducting reproducible research requires more than the existence of good tools. Ensuring reproducibility requires the integration of useful tools into a larger workflow that is rigorous in keeping track of research activities. One metaphor is that of the lab notebook, now extended to computational experiments. Jarrod Millman and Fernando Perez raise important points about how computational scientists should be trained, noting that many are not formally trained in computing but rather pick up skills "on the go." They detail skills and tools that may be useful to computational scientists and describe a web-based notebook system developed in IPython that can be used to combine text, mathematics, computation, and results into a reproducible analysis. Titus Brown discusses tools that are useful in bioinformatics, as well as good programming practices that apply to a broad range of areas.

Holger Hoefling and Anthony Rossini present a case study in producing reproducible research in a commercial environment, for a large-scale data analysis involving teams of investigators, analysts, and stakeholders/clients. All scientific practice, whether in academia or industry, can be informed by their experience and their discussion of the tools they used to organize their work.

Closely coupled with the idea of reproducibility is the notion of "open science," whereby published results are made available to the widest audience possible through journal publications or other means. Luis Ibanez and colleagues offer some thoughts on open science and reproducibility, and on trends that either encourage or discourage it. Bill Howe discusses the role of cloud computing in reproducible research. He describes how virtual machines can be used to replicate a researcher's entire software environment and to transfer that environment easily to a large number of people. Other researchers can then copy this environment and conduct their own research without the difficult task of reconstructing the environment from scratch.

Brian Nosek, representing the Open Science Collaboration, outlines the need for reproducibility in all of science and details why scientific findings are rarely reproduced. Reasons include a lack of incentives for journals and investigators to publish reproductions or null findings. He describes the Reproducibility Project, whose goal is to estimate the reproducibility of scientific findings in psychology. This massive undertaking represents a collaboration of over 100 scientists to reproduce a sample of findings from the psychology literature. By spreading the effort across many people, the Project overcomes some of the disincentives to reproducing previous work.

[Return to Table of Contents](https://openscienceframework.org/project/s9tya/wiki/home/)

[View Practices and Guidelines chapters for download](https://openscienceframework.org/project/pk46g/files/)