Effect Sizes, Power, and Biases in Intelligence Research: A Meta-Meta-Analysis


Description: We analyzed 2,439 effect sizes from 131 meta-analyses in intelligence research to estimate the average effect size, median power, and evidence for bias in this field. We found that the typical effect size was a Pearson's correlation of .26, and the median sample size was 60. We calculated the power of each primary study by using the corresponding meta-analytic effect as a proxy for the true effect. The median power across all studies was 48.8%, with only 29.8% of the studies reaching a power of 80% or higher. We documented differences in average effect size and median power between subfields of intelligence research (correlational studies, studies of group differences, experiments, toxicology, and behavior genetics). Across all meta-analyses, we found evidence for small study effects, highlighting potential publication bias. The evidence for the small study effect being stronger for studies from the US than for non-US studies (a US effect) was weak at best. We found no clear evidence for the decline effect, early extremes effect, or citation bias across meta-analyses. Even though power in intelligence research seems to be higher than in other fields of psychology, the field does not appear immune to the replicability problems documented in psychology.
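The power calculation described above can be sketched as follows. This is a minimal illustration, not the authors' actual analysis code: it assumes a two-sided test of a Pearson correlation at alpha = .05, approximated via Fisher's z transformation, and plugs in the typical values reported in the abstract (r = .26, n = 60).

```python
from math import atanh, sqrt
from statistics import NormalDist

def correlation_power(r: float, n: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided test of H0: rho = 0,
    treating the meta-analytic effect r as the true effect.

    Uses Fisher's z transformation: atanh(r) is approximately
    normal with standard error 1 / sqrt(n - 3)."""
    nd = NormalDist()
    ncp = atanh(r) * sqrt(n - 3)              # noncentrality under H1: rho = r
    z_crit = nd.inv_cdf(1 - alpha / 2)        # two-sided critical value
    return nd.cdf(ncp - z_crit) + nd.cdf(-ncp - z_crit)

# Typical study in this literature per the abstract: r = .26, n = 60
print(round(correlation_power(0.26, 60), 2))  # roughly 0.5, consistent
                                              # with the reported median power
```

With the typical effect size and sample size, power lands near 50%, which is in line with the reported median power of 48.8% across primary studies.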

License: CC-By Attribution 4.0 International

