Correcting for bias in psychology: A comparison of meta-analytic methods


Category: Project

Description: Publication bias and questionable research practices in primary research can lead to severely overestimated effects in meta-analysis. Methodologists have proposed a variety of statistical approaches to correct for such overestimation. However, much of this work has not been tailored specifically to psychology, so it is not clear which methods work best for the data typically seen in our field. Here, we present a comprehensive simulation study examining how some of the most promising meta-analytic methods perform on data typical of psychological research. We mimicked realistic scenarios by simulating several levels of questionable research practices, publication bias, and heterogeneity, using study sample sizes empirically derived from the literature. Our results indicate that one method – the three-parameter selection model (Iyengar & Greenhouse, 1988; McShane, Böckenholt, & Hansen, 2016) – generally performs better than trim-and-fill, p-curve, p-uniform, PET, PEESE, or PET-PEESE, and that some of these other methods should typically not be used at all. However, it is unknown whether the success of the three-parameter selection model is due to the match between its assumptions and our modeling strategy, so future work is needed to test its robustness further. Despite this caveat, we generally recommend that meta-analysts in psychology use the three-parameter selection model. Moreover, we strongly recommend that researchers in psychology continue their efforts to improve the primary literature and to conduct large-scale, pre-registered replications.
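
To make the recommendation concrete, here is a minimal sketch (in Python, not drawn from the project files or the authors' simulation code) of how a three-parameter selection model of this kind can be fit by maximum likelihood. The one-tailed p = .025 cutoff, the unconstrained parameterization, and the simulated example at the end are illustrative assumptions, not details of the study reported above.

```python
# Minimal sketch of a three-parameter selection model (Iyengar & Greenhouse, 1988):
# observed effects y_i ~ N(mu, tau2 + v_i), and studies with a one-tailed p >= .025
# are published with relative probability delta (0 < delta <= 1).
import numpy as np
from scipy import stats, optimize

def neg_log_lik(params, y, v, alpha=0.025):
    """Negative log-likelihood of the 3-parameter selection model.
    params = (mu, log_tau2, logit_delta) -- unconstrained parameterization.
    """
    mu, log_tau2, logit_delta = params
    tau2 = np.exp(log_tau2)
    delta = 1.0 / (1.0 + np.exp(-logit_delta))   # relative weight of non-significant studies
    z_crit = stats.norm.ppf(1 - alpha)           # one-tailed significance cutoff
    sd = np.sqrt(tau2 + v)

    # Weight of each observed study: 1 if one-tailed p < alpha, else delta
    significant = y / np.sqrt(v) > z_crit
    w = np.where(significant, 1.0, delta)

    # Normalizing constant A_i = P(sig) * 1 + P(non-sig) * delta under N(mu, tau2 + v_i)
    c = z_crit * np.sqrt(v)                      # raw effect-size cutoff per study
    p_sig = 1 - stats.norm.cdf((c - mu) / sd)
    A = p_sig + delta * (1 - p_sig)

    log_dens = stats.norm.logpdf(y, loc=mu, scale=sd)
    return -np.sum(np.log(w) + log_dens - np.log(A))

def fit_3psm(y, v):
    """Fit the model; y = observed effects, v = their sampling variances."""
    start = np.array([np.average(y, weights=1 / v), np.log(0.01), 0.0])
    res = optimize.minimize(neg_log_lik, start, args=(y, v), method="Nelder-Mead")
    mu, log_tau2, logit_delta = res.x
    return {"mu": mu, "tau2": np.exp(log_tau2), "delta": 1 / (1 + np.exp(-logit_delta))}

# Illustrative example: simulate publication-biased data and recover mu
rng = np.random.default_rng(1)
n_studies, true_mu, true_tau = 60, 0.2, 0.1
v = 1.0 / rng.integers(20, 200, n_studies)       # sampling variances from modest sample sizes
y = rng.normal(true_mu, np.sqrt(true_tau**2 + v))
keep = (y / np.sqrt(v) > 1.96) | (rng.random(n_studies) < 0.3)  # suppress ~70% of non-significant results
print(fit_3psm(y[keep], v[keep]))
```

The three estimated parameters are the mean effect, the between-study variance, and the relative publication probability of non-significant results; an estimated delta well below 1 indicates selection on statistical significance.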

License: CC-By Attribution 4.0 International

This project represents an accepted preprint submitted to PsyArXiv.


Citation

osf.io/rf3ys


