
Contributors:
  1. Stéphanie Mathey


Category: Analysis

Description: In September 2016, I received an automatic email from PubPeer (Nuijten, M. B., Hartgerink, C. H. J., van Assen, M. A. L. M., Epskamp, S., & Wicherts, J. M. (2015). The prevalence of statistical reporting errors in psychology (1985-2013). Behavior Research Methods. http://dx.doi.org/10.3758/s13428-015-0664-2) informing me that there were some statistical errors in the article. The putatively wrong p values were related to a post-hoc analysis in the general discussion (meaning that there was no error in the main analyses of the two experiments). However, I decided to redo all the analyses in order to understand the errors. After a bit of archaeological work, I found the initial DMDX files and re-ran all the analyses.

The good news is that the inferential analyses led to the same conclusions as described in the paper (including the supposedly wrong tests). Phew... The bad news is that the descriptive statistics are sometimes very different from what is described in the paper. I attribute that to the use of different statistical software and to manual errors when preprocessing the data (I used to process DMDX files manually with Excel) and analyzing it (I used to run the analyses with STATISTICA). I'm therefore all the happier to use R now for both preprocessing and statistical analyses (though it is obviously still possible to make errors). The script below presents the analyses I redid, with highlights regarding conclusions.
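As an aside, the kind of check behind that PubPeer email can be sketched in a few lines: recompute the p value implied by a reported test statistic and compare it with the reported p value. The sketch below is a hypothetical, stdlib-only Python illustration restricted to two-sided z tests; the actual tool (statcheck, the R package from Nuijten et al., 2015) extracts results from PDFs and covers t, F, r, chi-square, and z tests, and the numbers used here are made up.

```python
# Minimal sketch of a statcheck-style consistency check (illustration only):
# recompute the p value from a reported test statistic and compare it with
# the reported p value. Restricted to two-sided z tests for simplicity.
import math

def two_sided_p_from_z(z: float) -> float:
    """Two-sided p value for a standard-normal (z) test statistic."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def check_reported_p(z: float, reported_p: float, decimals: int = 3) -> bool:
    """True if the reported p matches the recomputed p after rounding."""
    return round(two_sided_p_from_z(z), decimals) == round(reported_p, decimals)

# Hypothetical reported results:
print(check_reported_p(2.51, 0.012))  # consistent: recomputed p ≈ .012
print(check_reported_p(2.51, 0.030))  # inconsistent with the statistic
```

A real checker would also have to allow for the rounding of the reported test statistic itself, which is one reason automated flags like these are best treated as prompts to re-run the analysis, as described above, rather than as verdicts.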
