Preprint: "New statistical metrics for multisite replication projects"

Contributors:
  1. Tyler J. VanderWeele


Category: Project

Description: [Recorded talk on this material: https://www.youtube.com/watch?v=xhexCDRKKW4] Increasing interest in replicability in the social sciences has led to the emergence of novel designs for replication projects in which multiple independent sites replicate an original study. Such "many-to-one" designs are now commonplace in Registered Replication Reports, and they have unique potential to help address central questions about replicability. Namely, they can estimate whether the original study is statistically consistent with the replications, and they can help re-assess the strength of evidence for the psychological effect of interest. However, existing statistical analyses applied to many-to-one designs have important shortcomings: they fail to account for heterogeneity (sometimes leading to unduly pessimistic conclusions about replication success), can be challenging to interpret and compare across replication projects, and often only peripherally address the questions of greatest scientific interest. We therefore propose new statistical metrics representing: (1) the probability that the original study's estimated effect size would be as extreme or more extreme than it actually was, if in fact the original study is statistically consistent with the replications; (2) the probability of a true effect of scientifically meaningful size in the same direction as the estimate of the original study; and (3) the probability of a true effect of meaningful size in the direction opposite the original study's estimate. Unlike existing metrics, the proposed metrics account for all relevant sources of statistical uncertainty, have intuitive interpretations, and harness the specific strengths of many-to-one designs (though they can also be useful in other types of replication research). All analyses are easy to conduct manually or with the R package "Replicate".
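Metric (1) above can be sketched as follows. This is a minimal illustration, not the "Replicate" package's API: it assumes a normal approximation in which the original estimate is compared against the replications' pooled (meta-analytic) estimate, with the denominator combining the original's standard error, the pooled estimate's standard error, and the between-site heterogeneity. All variable names and the numeric example are illustrative.

```python
# Hedged sketch of metric (1): the probability of an original estimate as
# extreme as (or more extreme than) the one observed, if the original study
# is statistically consistent with the replications.
from math import sqrt, erf


def normal_cdf(x: float) -> float:
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))


def p_orig(y_orig: float, se_orig: float,
           mu_hat: float, se_mu_hat: float, tau_hat: float) -> float:
    """Two-sided probability under a normal approximation.

    y_orig, se_orig   -- original study's point estimate and standard error
    mu_hat, se_mu_hat -- replications' pooled estimate and its standard error
    tau_hat           -- estimated between-site heterogeneity (SD scale)
    """
    # Total uncertainty in the difference between the original estimate
    # and the replication pooled estimate, including heterogeneity.
    denom = sqrt(tau_hat ** 2 + se_orig ** 2 + se_mu_hat ** 2)
    z = abs(y_orig - mu_hat) / denom
    return 2.0 * (1.0 - normal_cdf(z))


# Illustrative example: original estimate 0.6 (SE 0.2) vs. a replication
# pooled estimate of 0.1 (SE 0.05) with heterogeneity tau = 0.1.
print(p_orig(0.6, 0.2, 0.1, 0.05, 0.1))
```

A small probability here indicates the original estimate is surprisingly extreme relative to the replications; note that, unlike a naive comparison against the pooled estimate alone, the heterogeneity term keeps the metric from being unduly pessimistic when true effects genuinely vary across sites.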

License: CC-By Attribution 4.0 International

This project represents a preprint.


Citation

osf.io/apnjk
