Stage 2 – Replication Study
---------------------------

**Problem statement**

Recent reports on the lack of reproducibility of important psychological findings, together with growing evidence of a systematic positive bias in published research reports, are often interpreted as a ‘confidence crisis’ in psychological science. One of the factors suggested to lie behind this calamity is a collection of ‘questionable research practices’ (QRPs; Steneck, 2006). Much-debated research areas, such as parapsychology, suffer the burden of the confidence crisis even more, because QRPs offer a convenient and parsimonious explanation for anomalous findings, especially when those findings do not fit the status quo theories.

Recently, several publications in high-profile psychology journals reported positive results in support of ‘psi’ phenomena (Bem, 2011; Storm, Tressoldi, & Di Risio, 2010). Nevertheless, these reports met with a poor reception. The interpretation of the results as evidence for extrasensory perception (ESP) was criticized on several accounts (Fiedler & Krueger, 2013; Rouder & Morey, 2011; Schwarzkopf, 2014), and those who offered a counter-explanation for the positive findings usually invoked some type of QRP, or problems with the execution of the studies (Wagenmakers, Wetzels, Borsboom, Kievit, & van der Maas, 2015; Wagenmakers, Wetzels, Borsboom, & van der Maas, 2011).

Such a low level of confidence in the quality of studies can lead to the unwarranted dismissal of good research findings. This problem is prominent in ESP research, but it generalizes easily to other controversial research areas and to science in general. One detrimental effect of low credibility is that researchers must spend valuable resources on verifying published findings instead of being able to accept them as objective observations, or they may choose to disregard reports altogether because their credibility cannot be verified, making the original study a waste of resources. Therefore, there is a critical need for methodological approaches that can increase the credibility and acceptability of research, and we need to set up clear criteria for credibility that can be adhered to in future scientific ventures.

One possible way to improve the credibility of research findings is to eliminate opportunities for QRPs. Recently, there has been a rise in transparency initiatives that could help reduce the prevalence of QRPs, such as the encouragement of pre-registration of experiments by professional and governmental agencies, calls to publish research reports irrespective of their outcome, making reports available to a wider audience through self-archiving and open-access publication, open publishing platforms, data repositories, and initiatives for large-scale multi-lab replications. Furthermore, [best practice guidelines][1] have been set up to further improve the credibility and integrity of research. However, inconsistencies in carrying out a study, result-driven exclusion or imputation of data, and fraud are unaffected by these interventions and guidelines, because the study and the data collection itself are still performed ‘in the dark’. To eliminate this last safe haven for QRPs, another innovation is in order.

**Solution**

We plan to make the data collection and study process itself fully transparent. Specifically, transparency of the data collection process can be achieved in computerized psychology experiments through implementing a **real-time data publication pipeline**, which makes data publicly accessible immediately after it is collected. This real-time data publication pipeline would increase the credibility of research by making data integrity fully verifiable. Additionally, the involvement of **independent auditors** can further improve confidence that study integrity is maintained throughout the research process.
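To make the idea concrete, below is a minimal sketch in Python of what such a pipeline could look like inside a computerized experiment: each trial record is uploaded to a public archive the moment it is recorded, rather than after the study ends. The endpoint URL, the record fields, and the `publish_trial` helper are all hypothetical illustrations of the principle, not the project's actual implementation (which is described in the research plan linked below).

```python
import json
import time
import urllib.request

# Hypothetical endpoint for the public data archive; this wiki does not
# specify an implementation, so this URL is an assumption for illustration.
PUBLIC_ARCHIVE_URL = "https://example.org/api/trials"


def publish_trial(session_id: str, trial_index: int, guess: str, target: str) -> None:
    """Upload a single trial record immediately after it is collected.

    Publishing trial by trial, instead of releasing the full dataset at the
    end of the study, is what makes data collection verifiable in real time:
    records cannot be silently altered or dropped later.
    """
    record = {
        "session_id": session_id,
        "trial_index": trial_index,
        "guess": guess,          # the participant's response
        "target": target,        # the randomly selected target
        "hit": guess == target,  # derived field, stored for convenience
        "timestamp": time.time(),
    }
    request = urllib.request.Request(
        PUBLIC_ARCHIVE_URL,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # urlopen raises on network or HTTP errors, so a completed call means
    # the record has reached the public archive.
    with urllib.request.urlopen(request) as response:
        response.read()


# Example: inside the experiment's trial loop, publish right after the
# response is collected and before the next trial begins.
# publish_trial("session-001", 1, guess="left", target="right")
```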
**Specific Aim**

In Stage 2 of our research project, our primary aim is to carry out a replication of Bem’s (2011) Experiment 1 in a manner that eliminates all possibility of questionable research practices (QRPs). [See our detailed research plan here][2].

**Funding**

The research program is funded by the Bial Foundation.

![Bial Foundation logo][3]

**References**

Bem, D. J. (2011). Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect. Journal of Personality and Social Psychology, 100(3), 407–425. doi:10.1037/a0021524

Fiedler, K., & Krueger, J. I. (2013). Afterthoughts on precognition: No cogent evidence for anomalous influences of consequent events on preceding cognition. Theory & Psychology, 23(3), 323–333. doi:10.1177/0959354313485504

Rouder, J. N., & Morey, R. D. (2011). A Bayes factor meta-analysis of Bem’s ESP claim. Psychonomic Bulletin & Review, 18(4), 682–689. doi:10.3758/s13423-011-0088-7

Schwarzkopf, D. S. (2014). We should have seen this coming. Frontiers in Human Neuroscience, 8(332). doi:10.3389/fnhum.2014.00332

Steneck, N. H. (2006). Fostering integrity in research: Definitions, current knowledge, and future directions. Science and Engineering Ethics, 12(1), 53–74.

Storm, L., Tressoldi, P. E., & Di Risio, L. (2010). Meta-analysis of free-response studies, 1992–2008: Assessing the noise reduction model in parapsychology. Psychological Bulletin, 136(4), 471–485. doi:10.1037/a0019457

Wagenmakers, E.-J., Wetzels, R., Borsboom, D., Kievit, R., & van der Maas, H. L. J. (2015). A skeptical eye on psi. In E. May & S. B. Marwaha (Eds.), Extrasensory Perception: Support, Skepticism, and Science (pp. 153–176). ABC-CLIO.

Wagenmakers, E.-J., Wetzels, R., Borsboom, D., & van der Maas, H. L. J. (2011). Why psychologists must change the way they analyze their data: The case of psi: Comment on Bem (2011). Journal of Personality and Social Psychology, 100(3), 426–432. doi:10.1037/a0022790

[1]: https://osf.io/jk2zf/wiki/Best%20practice%20guidelines%20to%20improve%20research%20integrity/
[2]: https://osf.io/rh5hb/wiki/Initial%20research%20plan/
[3]: https://static1.squarespace.com/static/533155cae4b08b6d16d34ec4/t/555c9b76e4b0e543f2e387b2/1432132471283/?format=300w