Abstract: In a recent (not yet submitted or published) research project, my co-researchers and I investigated whether, and if so to what extent, the use of questionable research practices - namely p-hacking - is prevalent and evident in the field of I/O psychology. We analyzed 234 anonymized studies submitted for publication to the Journal of Personnel Psychology between 2014 and 2019, across their various stages (desk-reject, reviewed-reject, revised-reject, revised-accept), for indications of p-hacking using p-curve, TIVA, and R-Index. Interestingly, accepted studies showed significantly stronger indications of p-hacking than rejected studies under all three detection methods. Manuscript status (from desk-reject to revised-accept) was significantly and positively correlated with the amount of identified p-hacking.

I want to use this session for several goals:
- briefly present the design and results of the study and discuss any shortcomings we may have overlooked
- ideally find collaborators who are journal editors, or at least contacts at journals who can provide us with more data and extend the scope of this project to other disciplines (such as social psychology, personality psychology, etc.)
- discuss the experiences of the plenum and collect anecdotal evidence: did the attending colleagues ever feel that the reviewing process pushed them towards questionable research practices such as p-hacking?
- brainstorm ways to minimize this pressure

Unconference Landing Page: https://docs.google.com/document/d/10t4yNFIvgYwxuJnOMulhiPxJg-epyav-kG309oxgotI/edit

Slides are uploaded and available under Files. Recordings are available upon request: just send me an e-mail at Jane.Hergert@uni-rostock.de or (if the first one doesn't work) email@example.com