
Category: Project

Description: This paper tackles the Garden of Forking Paths (Gelman & Loken, 2013) as a challenge to the replicability and reproducibility of psychology studies. We applied a multiverse analysis to a single dataset, sampling 14 of the 864 possible pre-processing and analysis pipelines; these pipelines were selected to cover the full range of variability found in the literature using a systematic review approach (Šoškić et al., 2020). In this large reproducibility project, an ERP N400 dataset donated by an independent research team was processed with each of the 14 selected pipelines to compare study outcomes, descriptive statistics, effect sizes, data quality, and statistical power. Of the steps that were varied, high-pass filter cut-off, artifact removal method, baseline duration, reference, measurement latency and locations, and amplitude measure (peak vs. mean) all affected the study outcome parameters; low-pass filtering was the only step that did not notably influence any of them. In short, this study shows that even seemingly minor procedural deviations can influence the conclusions of an ERP study, and it demonstrates the power of the multiverse analysis approach both in identifying the most reliable effects and in providing concrete insight into the consequences of a given methodological decision.
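For readers unfamiliar with how a pipeline multiverse of this size arises, the sketch below enumerates the Cartesian product of a handful of step-level choices and draws a subset of pipelines. The step names, option values, and resulting counts are hypothetical placeholders for illustration only; they do not reproduce this project's actual 864-pipeline specification or its selection of 14 pipelines.

```python
# Minimal sketch of enumerating a multiverse of ERP pre-processing pipelines.
# All option values below are hypothetical placeholders, not the choices
# varied in this project.
from itertools import product

# Each key is a processing step; each list holds illustrative alternatives.
steps = {
    "high_pass_hz": [0.01, 0.1, 0.5],
    "low_pass_hz": [30, 40],
    "artifact_removal": ["rejection", "ICA"],
    "baseline_ms": [100, 200],
    "reference": ["average", "linked_mastoids"],
    "latency_window_ms": ["300-500", "350-450"],
    "amplitude": ["mean", "peak"],
}

# The full multiverse is the Cartesian product of all step-level options.
pipelines = [dict(zip(steps, combo)) for combo in product(*steps.values())]
print(f"{len(pipelines)} possible pipelines")  # 3*2*2*2*2*2*2 = 192 here

# A multiverse analysis applies a (possibly sampled) subset of these
# pipelines to the same dataset and compares the resulting outcomes.
subset = pipelines[::len(pipelines) // 14][:14]
for p in subset[:3]:
    print(p)
```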

License: CC-By Attribution 4.0 International

Files


Citation

Components

Tags

Recent Activity

