*Note*: Please download Word documents to view them correctly, as the OSF viewer will often show them one line per page.

*Abstract of submitted article*: In behavioral, cognitive, and social sciences, reaction time measures are an important source of information. However, analyses of reaction time data are affected by researchers' analytical choices and the order in which these choices are applied. The results of a systematic literature review, presented in this paper, revealed that the justification for and order in which analytical choices are conducted are rarely reported, leading to difficulty in reproducing results and interpreting mixed findings. To address this methodological shortcoming, we created a checklist on reporting reaction time pre-processing to make these decisions more explicit, improve transparency, and thus promote best practices within the field. The importance of the pre-processing checklist was additionally supported by an expert consensus survey and a multiverse analysis. Consequently, we appeal for maximal transparency on all methods applied and offer a checklist to improve the replicability and reproducibility of studies that use reaction time measures.

***File overview***

*Analysis*
- Multiverse: analysis code as Rmd and html files, output as csv files, data with the codebook "Key to dataheaders.pdf"
  - Note: the data used in the multiverse analysis are open data retrieved from https://osf.io/x8pha and accordingly cited in the manuscript as Zwaan, R. A., Pecher, D., Paolacci, G., Bouwmeester, S., Verkoeijen, P., Dijkstra, K., & Zeelenberg, R. (2018). Participant nonnaiveté and the reproducibility of cognitive psychology. Psychonomic Bulletin & Review, 25(5), 1968–1972. https://doi.org/10.3758/s13423-017-1348-y
- Analysis code of the RT literature search as Rmd and html files
- Analysis code of the expert consensus survey as Rmd and html files

*Checklist development*
- Checklist_v1.pdf = initial checklist version sent out in the expert consensus survey for review
- Checklist_v2.pdf = checklist changed based on feedback from the expert consensus survey; included in the first manuscript submission
- Checklist_v3.pdf = checklist changed based on peer review and feedback at the SIPS 2023 conference; included in the revised manuscript version

*Checklist use*
- Checklist_standalone_long.pdf = fillable PDF of the checklist with a short instruction on how to use it, wording examples, checkmarks, and respective page numbers; Supplementary Material to the submitted manuscript
- Checklist_standalone_short.pdf = fillable PDF of the checklist with a short instruction on how to use it, checkmarks, and respective page numbers; Supplementary Material to the submitted manuscript

*Data*
- Experts_feedback.pdf = complete free-text responses from the expert consensus survey
- Literature coding sheet with the respective codebook and the literature screening by two teams of coders, as csv files
- survey_data.RData = data from the expert consensus survey
- survey_data_codebook.csv = codebook with information about the variables in survey_data.RData

*Figures*
- All figures appearing in the submitted manuscript

*Materials*
- Expert consensus survey administered on Qualtrics

*No folder*
- Pre-print of the submitted manuscript
- Approval of the ethics committee for the expert consensus survey
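The multiverse analysis referenced above rests on the idea that different defensible pre-processing decisions (outlier cutoffs, exclusion criteria) yield different results, so every combination of choices should be computed and reported. As a rough illustration of that idea only (the project's actual analysis is in the Rmd files; the data, cutoff values, and SD criterion below are hypothetical, and exclusions are applied here in one fixed order, whereas the paper notes that the order of choices itself also matters):

```python
# Minimal sketch of a multiverse over reaction time pre-processing choices:
# apply every combination of (hypothetical) analytical decisions and record
# the summary statistic each path produces.
from itertools import product
from statistics import mean, stdev

# Synthetic reaction times in ms (illustrative only, not project data)
rts = [312, 455, 298, 1870, 402, 389, 2950, 344, 510, 475, 121, 398]

lower_cutoffs = [100, 200]          # drop anticipatory responses below cutoff
upper_cutoffs = [1500, 2000, None]  # drop slow outliers (None = keep all)
sd_criteria = [None, 2.5]           # optional exclusion beyond k SDs of the mean

results = []
for lo, hi, k in product(lower_cutoffs, upper_cutoffs, sd_criteria):
    # Fixed order here: range cutoffs first, then the SD criterion.
    sample = [rt for rt in rts if rt >= lo and (hi is None or rt <= hi)]
    if k is not None and len(sample) > 1:
        m, s = mean(sample), stdev(sample)
        sample = [rt for rt in sample if abs(rt - m) <= k * s]
    results.append({"lower": lo, "upper": hi, "sd": k,
                    "n": len(sample), "mean_rt": round(mean(sample), 1)})

for row in results:
    print(row)
```

Even on this toy data, the retained sample size and mean RT differ across the twelve pre-processing paths, which is exactly why the checklist asks authors to report which choices were made and in what order.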