Main content

Date created: 2017-12-20 07:54 AM | Last Updated: 2020-09-28 07:07 AM

Identifier: DOI 10.17605/OSF.IO/Z8EMY

Category: Project

Description: In this meta-study, we analyzed 2,442 effect sizes from 131 meta-analyses in intelligence research, published from 1984 to 2014, to estimate the average effect size, median power, and evidence for bias. We found that the average effect size in intelligence research was a Pearson’s correlation of .26, and the median sample size was 60. Furthermore, across primary studies, we found a median power of 11.9% to detect a small effect, 54.5% to detect a medium effect, and 93.9% to detect a large effect. We documented differences in average effect size and median estimated power between different types of intelligence studies (correlational studies, studies of group differences, experiments, toxicology, and behavior genetics). On average, across all meta-analyses (but not in every meta-analysis), we found evidence for small study effects, potentially indicating publication bias and overestimated effects. We found no differences in small study effects between different study types. We also found no convincing evidence for the decline effect, US effect, or citation bias across meta-analyses. We conclude that intelligence research does show signs of low power and publication bias, but that these problems seem less severe than in many other scientific fields.
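The power figures above can be illustrated with a standard calculation. The sketch below estimates the power of a two-sided test of a Pearson correlation via the Fisher z approximation. It is illustrative only: the effect-size benchmarks (r = .10/.30/.50, Cohen's small/medium/large) and the use of n = 60 (the reported median sample size) are assumptions on my part; the paper's medians are computed across all primary studies, not from the median n alone, so the numbers will not match the abstract exactly.

```python
# Illustrative sketch: power of a two-sided test of a Pearson correlation,
# using the Fisher z approximation. Benchmarks r = .10/.30/.50 (Cohen's
# small/medium/large) and n = 60 (the reported median sample size) are
# assumptions, not the paper's exact procedure.
import math
from scipy.stats import norm

def correlation_power(r, n, alpha=0.05):
    """Approximate power to detect a true correlation r with n pairs."""
    z_effect = math.atanh(r) * math.sqrt(n - 3)  # Fisher z times 1/SE
    z_crit = norm.ppf(1 - alpha / 2)
    # Probability that the observed test statistic falls outside the
    # two-sided critical bounds under the true effect
    return norm.cdf(z_effect - z_crit) + norm.cdf(-z_effect - z_crit)

for label, r in [("small", 0.10), ("medium", 0.30), ("large", 0.50)]:
    print(f"{label} (r = {r}): power ≈ {correlation_power(r, 60):.3f}")
```

With these assumed inputs, the small-effect power at n = 60 comes out close to the reported 11.9%, which shows how badly underpowered a typical study of that size is for small effects.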

License: CC-By Attribution 4.0 International

Wiki

See the meta-data file for a detailed explanation of all files.

Files

Files can now be accessed and managed under the Files tab.

Citation

Components

Data



Analysis



Effect Sizes, Power, and Biases in Intelligence Research: A Meta-Meta-Analysis



Tags

bias, citation bias, decline effect, early-extremes effect, effect size, intelligence, meta-meta-analysis, meta-regression, power, publication bias, small study effect, US effect

