## [Rewarding Transparent and Reproducible Scholarship][1] ##

----------

### Materials and background information for the Center for Open Science at the [8th Conference on Open Access Scholarly Publishing][2] ###

----------

Contact: David Mellor, [@EvoMellor][3]

----------

#### Background ####

Though scientists value transparency and reproducibility ([Anderson, Martinson, & De Vries, 2007][4]), we are rewarded for novelty and for presenting clean results. These rewards come from publications, grants, hiring, and promotion decisions, which together present a classic collective action problem. The incentives lead to behaviors that make scientific findings less reproducible than expected. This crisis in reproducibility can be addressed if scientists are rewarded for rigorous methods and best practices instead of for presenting the most surprising and tidy results. **Increasing transparency improves rigor by allowing expert evaluation of every part of the process of science.**

----------

#### Major Initiatives ####

Our poster covers three initiatives that the Open Access community can adopt to improve the transparency and rigor of the work it publishes.

1. The Transparency and Openness Promotion ([TOP][5]) Guidelines are a set of eight standards that publishers can adopt to reward best practices in transparent research. Journals can adopt each standard at one of three tiers of increasing rigor, which removes barriers to adoption while still guiding future improvement.
2. [Badges][6] are a simple and effective way of rewarding best practices. When authors have the option of receiving a visual indicator on their work for sharing data and research materials, the rates of these behaviors increase ([Kidwell et al., 2016][7]). ![badges][8]
3. [Registered Reports][9] are a publishing format in which peer review occurs before results are known. This focuses expert evaluation on the research questions and the proposed methods to answer them. Unlike traditional peer review, the findings are published regardless of outcome, so the incentive is to address pressing questions as rigorously as possible.

----------

#### Supporting Materials ####

- [Draft poster][10]
- Badges: [project page][11] and [handout][12]
- TOP Guidelines: [website][13]
- Prereg Challenge: [website][14] and [handout][15]

----------

#### About COS ####

The [Center for Open Science][16] was founded in 2013 with a mission to increase the openness, integrity, and reproducibility of scientific research. We do this in the following ways:

- We work with the scientific community to align incentives with scientific values ([summary handout][17]).
- We build tools for researchers that enable the actions we promote (e.g., the [Open Science Framework][18]).
- We measure the extent of the problem (e.g., the Reproducibility Project: Psychology, [OSC, 2015][19]; and the [Reproducibility Project: Cancer Biology][20]) and the effectiveness of our initiatives (e.g., [Kidwell et al., 2016][21]).

[1]:
[2]:
[3]:
[4]:
[5]:
[6]:
[7]:
[8]:
[9]:
[10]:
[11]:
[12]:
[13]:
[14]:
[15]:
[16]:
[17]:
[18]:
[19]:
[20]:
[21]: