Open Science Collaboration
Abstract: Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects (Mr = .197, SD = .257) were half the magnitude of original effects (Mr = .403, SD = .188), representing a substantial decline. Ninety-seven percent of original studies had significant results (p < .05). Thirty-six percent of replications had significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and, if no bias in original results is assumed, combining original and replication results left 68% with significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
Citation: Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. doi: 10.1126/science.aac4716
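One replication criterion reported in the abstract is whether the original effect size falls within the 95% confidence interval of the replication effect size. For a pair of correlations, this check can be sketched using the standard Fisher z transformation; the function name and the example numbers below are illustrative, not drawn from the project's dataset.

```python
import math

def original_in_replication_ci(r_orig, r_rep, n_rep, crit=1.96):
    """Check whether an original correlation r_orig lies inside the
    95% CI of a replication correlation r_rep with sample size n_rep,
    using the Fisher z transformation (a standard approach; the
    project's full analysis scripts are linked under Guide to Analyses)."""
    z_rep = math.atanh(r_rep)           # Fisher z of the replication effect
    se = 1.0 / math.sqrt(n_rep - 3)     # standard error of z
    lo, hi = z_rep - crit * se, z_rep + crit * se
    # transform the CI bounds back to the correlation scale
    return math.tanh(lo) <= r_orig <= math.tanh(hi)

# hypothetical values: an original r of .40 versus a replication
# r of .20 with n = 120 participants
print(original_in_replication_ci(0.40, 0.20, 120))   # False
print(original_in_replication_ci(0.25, 0.20, 120))   # True
```

Applied to all 100 study pairs, a check of this kind yields the 47% coverage figure reported in the abstract.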
Summary Report: Read the Science article and supplementary material summarizing the results of the Reproducibility Project: Psychology. Or, read the Green OA version with supplementary information in the same file.
Supplement only. Supplementary materials to "Estimating the Reproducibility of Psychological Science." Includes additional graphs and details on analyses.
Replicated Studies: Explore the preregistrations, materials, data, and result reports of the individual replication projects.
Guide to Analyses: Reproduce the analyses of the individual projects and the aggregate results.
RPP Process: Learn more about the design, management, and operation of this large-scale crowdsourced project.
Presentations: Find articles, slides, notes, and videos of presentations of the Reproducibility Project: Psychology and related efforts.
Comments: Read comments on the publication and responses made by members of the Open Science Collaboration.
Center for Open Science: Learn more about the organization that facilitated the project and its initiatives to increase the transparency and reproducibility of research.
Open Science Framework: Learn more and get started using the free, open-source Open Science Framework for your own project management, archiving, manuscript sharing, and research registration.
TOP Guidelines: The Transparency and Openness Promotion Guidelines are a collective effort to improve transparency and reproducibility across disciplines.
The Reproducibility Project: Psychology began in November 2011, finished primary data collection in December 2014, and published a summary of the results in August 2015. The project was coordinated by the Center for Open Science. Replication teams followed a research protocol and received logistical assistance as they collected materials, identified the key finding for replication, ran their experiment, conducted analyses, and reported their findings.
As stated in an initial report from 2012, "The Reproducibility Project uses an open methodology to test the reproducibility of psychological science. It also models procedures designed to simplify and improve reproducibility" (Open Science Collaboration, 2012). To that end, all project materials, data, and findings are posted on the Open Science Framework, a free service of the Center for Open Science. Moreover, the project models reproducibility by making it easy to reproduce the analyses of each individual project, and the results of the aggregate report.
As the first in-depth exploration of its kind, the project results provide insight into reproducibility and its correlates. With a large, open dataset, many additional research questions can be investigated.
The project was designed to be a collaborative endeavor. Ultimately over 270 contributors earned authorship on the summary report and 86 others provided volunteer support. Replication teams designed, ran, and reported their replication studies. Brian Nosek, Johanna Cohoon, and Mallory Kidwell provided project coordination. Marcel van Assen, Chris Hartgerink, and Robbie van Aert led the analysis of results; Fred Hasselman generated the figures; and Sacha Epskamp led the analysis audit. Scores of additional volunteers assisted with coding of articles, analyses, and administrative tasks.
Since the project's inception, similar initiatives have begun in other scientific domains. The Center for Open Science coordinates one such effort, the Reproducibility Project: Cancer Biology.
Questions about the project can be directed to email@example.com.