Getting Started - A step-by-step guide to developing a replication project


Conducting a replication study is a great way to contribute to the field while also learning about the nuts and bolts of how studies are designed and conducted. As you might expect, a good replication takes careful planning. This page provides a step-by-step guide to planning, conducting, and reporting a replication study.

Preliminaries: Identifying a Study and Obtaining Materials
----------

The first step is to identify the study you will replicate. This requires careful thought and judgement--you need to find a study you are interested in that is **feasible** to replicate given the time, expertise, and resources you have available. [This page][1] provides extensive resources for identifying a suitable study.

To conduct a replication study you will need to obtain or re-develop the materials from the original study (informed consent, scales/measures, scripts, etc.). [This page][2] offers resources for finding studies with materials posted online for replication. Other studies may have an appendix with original materials or may have materials that are very straightforward to reproduce. In many cases, though, you may need to contact the corresponding author to ask (politely) if the original materials can be shared with you.

Develop an Initial Plan for the Replication
----------

Before beginning to pilot your replication you will need:

* A complete set of materials (acquired or re-developed). This would typically include an informed consent form, script, materials, debriefing form, and a plan for how the study will be described during recruitment.
* A sampling plan specifying your target sample size, how you selected that target, and your **stopping rule**--the rule you will use to decide when the study is complete. Your sampling plan should also explain how participants will be recruited and how (if at all) they will be compensated.
* Exclusion rules--a specific set of rules stating the circumstances under which a participant's responses will be excluded from analysis. For example, you might exclude participants who guess the hypothesis when prompted during debriefing to guess the purpose of the study. If so, you should be very clear about what counts as an accurate guess. In general, you should use the same exclusions as the original study, though you may also need to add more (e.g. to exclude participants who have read about the original study since it was published).
* A transformation plan specifying how (if at all) data will be transformed. For example, you may have a 7-item scale that will be averaged to form an overall score for each participant on that measure. In general, you should use the same transformations as the original study.
* An analysis plan specifying the specific hypotheses you will test and how (exactly) you will test them.

This may seem quite daunting! Fortunately, the Open Science Framework has developed a pre-registration template that walks you through this entire process. You can download this template [here][3]. If you'd like to see what a completed template looks like, here is an [example][4] that plans a replication of a study from [Ottati et al. (2015)][5].

Of course, you will also need to obtain IRB approval for your study. Fortunately, the IRB application will ask for some of the same information you pulled together for your pre-registration plan--feel free to copy/paste across applications.

Consider Some Important Best Practices
----------

* Consider requesting permission to post anonymized data. You should request such permission in your informed consent form and should describe in your IRB application how you will ensure the data are fully anonymized. This can be some extra work, but it will make it possible to share your data, a key Open Science practice.
* Consider adding a positive control to your study to help indicate the quality of the data you collected. More information is [here][6].
* Consider measuring the naivete of your participants to the research design (check to see if they are already familiar with this type of study and/or the study materials). This is especially important if you are conducting the study using online workers (e.g. MTurk), as such participants may regularly encounter the types of materials used in your study--this is known to alter the way participants respond. Think carefully about how best to measure naivete and what exclusion rules you will use.
* Consider adding a manipulation check if none was used in the original study.
* Consider potential moderators that you might be able to explore in your replication attempt. Of course, a key for science is to "Keep it Simple", but if you have the resources it might be worth adding an extra condition to test a potential moderator. At the very least, consider the ways in which your replication effort might vary from the original and whether any of these differences might be strong moderators (e.g. differences in ethnicity or age of participants). Be sure potential strong moderators are measured, if possible. Be careful, though, in your analysis plan, not to mine through many different potential moderators, as you can easily end up chasing noise.

Refine Your Plan for the Replication
----------

Before you rush headlong into conducting the replication, it is a good idea to refine your replication plan. Here are some steps to consider:

* Pilot test! You need to pilot to become proficient at conducting the protocol. In addition, piloting can help you identify problems with the materials, additional exclusion rules you might need, etc.
* Inspect the range of responses in your pilot data. Even when materials match, your participant pool may give a very different range of responses. For example, the original study might have measured skill at darts. Your participant pool, though, may play darts more regularly, leading to scores in the control group that are already much higher than in the original study. This could require adjusting the measure to try to obtain the same range of scores within the control groups.
* Make a protocol video that demonstrates how the study will be run. [Here's an example][7] of a protocol video.
* Send your protocol video and pre-registration plan to the original author; invite them to review and comment if they have the time and inclination.
* Consider submitting your study to the OSF Pre-registration Challenge (information about this is [here][8]). Once submitted, your pre-registration plan will be reviewed by an expert and any missing or vague details will be flagged. What a great service! You will also become eligible for a $1000 stipend if your study is properly pre-registered and then accepted for publication in a participating journal (while funding lasts).

Be sure that any substantive changes to your protocol are submitted as amendments to your IRB.

Pre-Register Your Replication!
----------

With lots of planning you are *almost* ready to start collecting data. There is one more crucial step, though--pre-registering your study with the Open Science Framework. Fortunately, with the groundwork you have laid, this will be easy.

* If you don't already have one, create an account on the [Open Science Framework][9].
* Once logged in, go to the "My Projects" tab and click *"Create New Project"*. Give your project a descriptive title that names the effect of interest, notes that it is a replication, and identifies the study being replicated (e.g. Replication of Gervais & Norenzayan (2012) - Analytic Thinking Promotes Religious Disbelief).
* If you are working in a group, be sure to invite your team members to collaborate on the project. Be sure to make these invitations using the email address each collaborator used to register with the OSF.
* On the new project page that appears, use the Wiki page to give a brief description of the original study, note that you will be attempting to replicate it, and state the current status of your replication project (e.g. data collection about to begin). Be sure to include a reference (and hopefully a link) to the original study.
* In the files section, you can upload materials for your study. Be sure, though, that you have permission to share these, or else be sure that you don't mark the folder containing the materials public. If you received materials from an author of the original study, be sure they consent to having the materials posted.
* If desired, make the project public so that others may view it.
* **Upload your pre-registration document to the files section**--this will serve as the permanent record of your sample and analysis plans prior to data collection.
* Click on the "Registration" tab and click the green button labelled "New Registration".
* Fill in the forms that appear, then confirm the email you are sent to lock in your registration.

Ta da! You may want to navigate to your OSF project page and click "Registrations" to be sure that your registration has gone through and is complete (is your pre-registration file included?).

Conduct the Replication - Some Tips
----------

With all that planning completed you are finally ready to conduct your replication! Here are some tips to keep in mind as you collect the data:

* Be alert for potential problems. Pay attention during debriefing for any confusion reported by participants. If you are collecting written responses, try entering/coding a handful of these to make sure the coding process can proceed as expected.
* Even though it is important to monitor the quality of the data coming in, it is essential that you not analyze the data to make decisions about the study. Specifically, do **not** use the data collected to make decisions about extending the study, finishing the study early, excluding a participant, etc.
* Try to stick to your pre-registration plan, but also keep a good notebook so that any deviations are carefully recorded and can be reported in your final manuscript.

Analyze the Data
----------

When you reach your stopping rule you are ready to analyze the data. Follow your pre-registered plan carefully--apply the exclusions and transformations specified and then conduct the analyses to test your research hypothesis. Of course, in conducting the research you may have discovered issues that require alterations to your pre-registered plans. That's ok, but be sure to keep good notes on all deviations. If a change in your exclusions, transformations, and/or analysis is warranted, it is usually best to first proceed as planned, then to conduct a secondary analysis with the adjustments you feel are necessary. Be sure to report both sets of analyses and to flag the one developed ad hoc as exploratory.

Share Your Results
----------

It is an ethical obligation for scientists to share their work publicly. This helps make good on our promise to participants to seek benefit from the time and effort they have invested in being part of the research. It also helps ensure that publication bias does not lead to distortions of our scientific understanding. Psychologists typically share their results by writing an APA-styled manuscript and seeking peer-reviewed publication. That's a great route to explore, including the possibility of submitting to an undergraduate research journal (e.g. the Psi Chi journal). If a peer-reviewed publication is not feasible given the time/resources you have available, be sure to at least post summary results on your Open Science Framework page so that others can learn from and benefit from your work. If you develop a manuscript for a class, post that as well!
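The analyze-the-data workflow described above -- apply pre-registered exclusions, apply the transformations, then compute the planned test -- can be sketched as a short script. This is only an illustrative outline: the two-condition design, the hypothetical 7-item scale, the exclusion fields (`guessed_hypothesis`, `knew_study`), and the choice of Cohen's d as the effect size are all assumptions, not a prescription; your own pre-registration document dictates the actual steps.

```python
from statistics import mean, stdev

ITEMS = [f"item{i}" for i in range(1, 8)]  # hypothetical 7-item scale


def include(p):
    """Pre-registered exclusions (illustrative): drop anyone who guessed
    the hypothesis or reported prior familiarity with the original study."""
    return not p["guessed_hypothesis"] and not p["knew_study"]


def scale_score(p):
    """Pre-registered transformation: average the 7 items into one score."""
    return mean(p[item] for item in ITEMS)


def cohens_d(group_a, group_b):
    """Cohen's d using the pooled standard deviation of the two groups."""
    na, nb = len(group_a), len(group_b)
    pooled_sd = (((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd


def analyze(participants):
    kept = [p for p in participants if include(p)]       # 1. exclusions
    scores = {"treatment": [], "control": []}
    for p in kept:                                       # 2. transformation
        scores[p["condition"]].append(scale_score(p))
    # 3. planned statistic (pair this with the original study's test)
    return cohens_d(scores["treatment"], scores["control"])
```

The point of writing the pipeline this way is that every function mirrors one line of the pre-registration document, so any deviation you make during analysis shows up as a visible code change you can report.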
Finally, you can **consider** posting the raw data from your study. Before you do, though, be sure:

* That sharing of the data is explicitly permitted in your IRB protocol
* That the data you will post are fully anonymized and that the information provided is not detailed enough for others to reconstruct the identities of your participants (e.g. if you attend a small school with only 2 art history majors, including majors in your data set may well be enough for others to work out some participant identities).

Continue Replicating!
----------

In psychology, one study is rarely definitive--it is the pattern of results across multiple studies that helps reveal the truth (hopefully!). The same is true with replication--any one replication could be misleading, so it is often important to plan and conduct a series of replications to fully explore a putative effect. So keep thinking, keep planning, and keep replicating!

[1]:
[2]:
[3]:
[4]:
[5]:
[6]:
[7]:
[8]:
[9]:
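The re-identification risk described above (e.g. the two art history majors) can be screened for mechanically before you post data. Below is a minimal sketch: the column name and the threshold `k` are illustrative assumptions, and note that passing this single-column check is necessary but not sufficient for anonymity -- rare *combinations* of columns can also identify someone.

```python
from collections import Counter


def rare_values(records, column, k=5):
    """Return values of `column` shared by fewer than k participants.
    Any value returned is a re-identification risk; coarsen it
    (e.g. "art history" -> "humanities") or drop the column."""
    counts = Counter(r[column] for r in records)
    return {value for value, n in counts.items() if n < k}


# Hypothetical data set: 40 psychology majors and 2 art history majors.
data = ([{"major": "psychology"} for _ in range(40)] +
        [{"major": "art history"} for _ in range(2)])
risky = rare_values(data, "major")  # -> {"art history"}
```

Running a check like this over every demographic column, and over pairs of columns, is a cheap last step before marking a data file public.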