Video of talk given at ISBA 2018, Edinburgh
It is well-known in statistics (e.g., Gelman & Carlin, 2014) that treating a result as publishable just because the p-value is less than 0.05 leads to overoptimistic expectations of replicability. These overoptimistic expectations arise due to Type M(agnitude) error: when underpowered studies yield significant results, effect size estimates are guaranteed to be exaggerated and noisy. These effects get published, leading to an overconfident belief in replicability. We demonstrate the adverse consequences of this statistical significance filter by conducting seven direct replication attempts (268 participants in total) of a recent paper (Levy & Keller, 2013). We show that the published claims are so noisy that even non-significant results are fully compatible with them. We also demonstrate the contrast between such small-sample studies and a larger-sample study; the latter generally yields a less noisy estimate but also a smaller effect magnitude, which looks less compelling but is more realistic. We reiterate several suggestions from the methodology literature for improving best practices.
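Type M error is easiest to see by simulation. The sketch below (in Python; the true effect size, standard error, and study count are illustrative assumptions, not values from Levy & Keller or our replications) shows how conditioning on statistical significance inflates effect size estimates in an underpowered design:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative assumptions (not from the paper): a small true effect
# measured with a large standard error, i.e., an underpowered design.
true_effect = 10.0   # hypothetical true effect
se = 40.0            # hypothetical standard error of the estimate
n_sims = 100_000     # number of simulated studies

# Each simulated study yields one noisy estimate of the true effect.
estimates = rng.normal(true_effect, se, n_sims)

# Apply the statistical significance filter: keep only studies where
# |estimate / SE| exceeds 1.96 (two-sided p < 0.05).
significant = estimates[np.abs(estimates / se) > 1.96]

print(f"Power: {significant.size / n_sims:.3f}")
print(f"Mean |estimate| among significant results: "
      f"{np.abs(significant).mean():.1f} (true effect: {true_effect})")
```

Under these assumed numbers, power is only about 6%, and the estimates that survive the filter average roughly nine times the true effect in magnitude; this exaggeration ratio is the Type M error of Gelman & Carlin (2014).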