**Project Goal**

Replication is vital for scientific progress, but scientists have limited resources for conducting research, and replicating every study is neither feasible nor a productive use of those resources. It is more important to replicate findings that are based on a small amount of evidence than those that are well established and have been demonstrated many times, and more important to replicate findings that are having a substantial impact on later research than those having little influence. The goal of this project is to define a statistic or statistics that can support researchers', journal editors', and funders' decision-making about which findings are important to replicate.

**Project Phases**

This project is a crowdsourced collaboration open to all interested contributors. The product of this project will be a paper introducing one or more statistics defining Replication Value: the value of conducting replications of individual findings. The project community will work individually and collectively to identify and refine candidate formulas for Replication Value. The accumulated candidate formulas will be subjected to logical and statistical critique and compared with simulated and sample data. The community will work collaboratively to define the strengths and weaknesses of the candidate algorithms and then report the surviving approaches and their rationale in a summary article. The project is divided into several phases, each completed in a series of two-week sessions.

**Phase 1 - Generating Candidate Formulas** (starting January 2016)

In the first phase, we will collect candidate formulas for Replication Value. Replication Value may be lower for findings that are based on strong evidence, but higher for those that have a substantial impact on later research. Individuals or teams will independently create and submit their own formula.
Each submission should address:

- The formula and how to calculate Replication Value using it
- The unit of analysis for the formula (individual studies? entire papers?)
- Whether the output of the formula is an absolute or a relative value
- How the output should be interpreted

A very simple example would be:

Replication Value = number of citations / total number of participants

This formula would take entire papers as its unit of analysis and would be interpreted relative to other papers.

Candidate formulas can be submitted prior to February 5, 2016. Formulas should be sent to Courtney Soderberg (courtney@cos.io) and Charlie Ebersole (cebersole@virginia.edu).

**Phase 2 - Feedback on Candidate Formulas**

The project coordinators will organize all submissions and present a summary to the project community. All contributors will then be free to provide comments on and critiques of the candidate formulas.

**Phase 3 - Revision**

During this period, teams will make any revisions to their formulas that they see fit based upon the feedback received. Teams are encouraged to work with one another on these revisions; formulas may be combined, and redundant formulas may be removed.

At this point, the community will review the pool of remaining formulas and decide either to move forward to testing or to hold another round of submission, feedback, and revision.

**Phase 4 - Testing the Formulas**

We (Courtney and Charlie) will provide 10 research articles for testing. Each contributing team will calculate the Replication Value for each article using their own formula and one other formula (randomly assigned) and share the results with the community. The community will then evaluate each formula by its results and provide feedback.

Formulas will be evaluated on the following criteria:

- Do the formula's results mirror the expectations and judgments of the community?
- How long did it take to calculate the Replication Value for all of the articles?
- Ratings of ease of use
- Generalizability (could the formula be applied to different types of tests, different types of papers [single-study vs. multi-study], and different domains?)
- Consistency with other formulas' results
- Clarity and informativeness of the results the formula produces

If formula authors use any tools (code, Excel spreadsheets, etc.) to help with calculation or information gathering for their formulas, these should be shared with testers for the testing phase. Additionally, if a formula is chosen for publication, these same materials will be published with the article so that the formula is equally feasible to use for both testers and readers.

**Phase 5 - Formula Selection**

The community will select the formula(s) to be included in the final report. Formulas will be judged by the criteria of the testing phase. At this point there may be only one or a few formulas remaining (or simply consensus on which formula(s) to include), making a selection process unnecessary. However, if several candidate formulas remain, we will create a selection process for picking the formulas to include in the project report. This will likely involve all members of the community voting on the formula(s) they think should be included.

**Phase 6 - Report Write-up**

The community will write up a final report on the project to submit for publication. This report will describe the rationale, the generation and testing strategy, the evaluation output, the selection process, and how to use the formula(s). All individuals who contributed at any stage of the project will be invited to be authors on the final report.

**How to Join the Project**

Send your name (or your team members' names) and preferred email addresses to Courtney Soderberg (courtney@cos.io) and Charlie Ebersole (cebersole@virginia.edu).
We will add you as an editing contributor to all documents and OSF projects. You can join the community before submitting your or your team's Replication Value formula. Please email us any questions as well.
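The very simple example formula above (number of citations divided by total number of participants) can be sketched in a few lines of Python. This is only an illustration of what one candidate formula might look like, not an endorsed approach; the function name and sample numbers are invented for the example.

```python
def replication_value(citation_count: int, total_participants: int) -> float:
    """Toy Replication Value for a paper: citations per participant.

    A heavily cited paper that rests on few participants scores high,
    flagging it as a relatively strong candidate for replication.
    """
    if total_participants <= 0:
        raise ValueError("total_participants must be positive")
    return citation_count / total_participants

# Hypothetical paper: cited 120 times, with 80 participants across all studies.
print(replication_value(120, 80))  # 1.5
```

Because this formula's output is relative, papers would be compared by ranking their scores rather than by reading any single value in isolation.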