[RP:P Home][1]

## Reports about the Reproducibility Project: Psychology

The Reproducibility Project: Psychology was a collaborative, crowdsourced effort of 270 authors and 86 additional volunteers. Across multiple criteria, we successfully reproduced fewer than half of the 100 original findings investigated. Multiple analysis methods were used to evaluate the success of each replication, examining p-values, effect sizes, a meta-analysis combining original and replication effects, and subjective assessments. Stronger original effects were correlated with stronger replication effects, and other correlates of reproducibility were investigated to examine possible influences on replication success. These results suggest that reproducibility is difficult to achieve and that changes in some research practices could increase the reproducibility of published research.

***Science* publication summarizing aggregate results:**

[Article and Supplement.][2] "Estimating the Reproducibility of Psychological Science"

[Supplement only.][3] Supplementary materials to "Estimating the Reproducibility of Psychological Science," including additional graphs and details on analyses.

[Figure 1.][4] Density plots of p-values and effect sizes from original and replication studies.

[Figure 2.][5] Scatterplot of p-values from original and replication studies.

[Figure 3.][6] Scatterplot comparing effect sizes from original and replication studies.

[Table 1.][7] Summary of reproducibility rates and effect sizes for original and replication studies, overall and by journal/discipline.

[Table 2.][8] Spearman's rank-order correlations of reproducibility indicators with summary original and replication study characteristics.

[Appendices.][9] Descriptive text of analysis scripts. Details also available in the [guide to analyses][10].

**Book chapter describing the project methodology:**

Open Science Collaboration. (2014). [The Reproducibility Project: A Model of Large-Scale Collaboration for Empirical Research on Reproducibility][11]. In V. Stodden, F. Leisch, & R. Peng (Eds.), *Implementing Reproducible Computational Research (A Volume in The R Series)* (pp. 299-323). New York, NY: Taylor & Francis.

***Perspectives on Psychological Science* publication introducing the project:**

Open Science Collaboration. (2012). [An open, large-scale, collaborative effort to estimate the reproducibility of psychological science][12]. *Perspectives on Psychological Science, 7*, 657-660. DOI: 10.1177/1745691612462588

**Helpful Links:**

[Replicated Studies][13]

[Guide to Analyses][14]

[1]: https://osf.io/ezcuj/wiki/home/
[2]: https://osf.io/phtye/
[3]: https://osf.io/k9rnd/
[4]: https://osf.io/7js8c/
[5]: https://osf.io/47zks/
[6]: https://osf.io/a9djv/
[7]: https://osf.io/jq7v6/
[8]: https://osf.io/c9b8v/
[9]: https://osf.io/z7aux/
[10]: https://osf.io/ytpuq/wiki/home//
[11]: https://osf.io/9h47z/
[12]: http://pps.sagepub.com/content/7/6/657.abstract
[13]: https://osf.io/ezcuj/wiki/Replicated%20Studies/
[14]: https://osf.io/ytpuq/wiki/home/
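Two of the quantitative criteria mentioned above can be illustrated in a few lines: the significance criterion (does the replication reach p < .05?) and Spearman's rank-order correlation, the statistic reported in Table 2. The sketch below is not the project's actual analysis code (see the guide to analyses for that); the function names and all numeric values are illustrative assumptions.

```python
# Illustrative sketch only -- not the project's analysis scripts.
# All effect-size values below are made up for demonstration.

def replicated_by_significance(replication_p, alpha=0.05):
    """Significance criterion: count the replication as successful if p < alpha."""
    return replication_p < alpha

def spearman_rho(xs, ys):
    """Spearman's rank-order correlation: Pearson correlation computed on ranks.

    Assumes no tied values (ties would need mid-ranks in a fuller version).
    """
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical effect sizes (correlation coefficients r) for five study pairs.
original_r    = [0.50, 0.35, 0.60, 0.20, 0.45]
replication_r = [0.30, 0.10, 0.55, 0.15, 0.05]

print(spearman_rho(original_r, replication_r))  # prints 0.6
```

A rank-based statistic is a natural choice here because effect sizes across very different study designs are only comparable ordinally; Spearman's rho asks whether studies with stronger original effects also tended to replicate more strongly, without assuming a linear relationship.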