False alarm? A comprehensive reanalysis of “Evidence that psychopathology symptom networks have limited replicability” by Forbes, Wright, Markon, and Krueger

Category: Project

Description: Forbes, Wright, Markon, and Krueger (2017) state that “psychopathology networks have limited replicability” and that “popular network analysis methods produce unreliable results”. These conclusions are based on an assessment of the replicability of four different network models for symptoms of major depression and generalized anxiety across two samples; in addition, Forbes et al. (2017) analyze the stability of the network models within the samples using split-halves. Our re-analysis of the same data with the same methods led to results directly opposed to those of Forbes et al. (2017): All network models replicate very well across the two datasets and across the split-halves. We trace the differences between Forbes et al.’s (2017) results and our own to the fact that they do not appear to have implemented all network models accurately, and that they use debatable metrics to assess replicability. In particular, Forbes et al. (2017) deviate from existing estimation routines for relative importance networks, do not acknowledge that the skip structure used in the interviews strongly distorted correlations between symptoms, and incorrectly assume that network structures and metrics should be expected to be the same not only across the different samples but also across the different network models used. In addition to a comprehensive re-analysis of the data, we conclude with a discussion of best practices for future research into the replicability of psychometric networks.
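
The actual network models are implemented in the analysis code under Components below. Purely as an illustration of the split-half stability check described above, here is a minimal Python sketch: it estimates a partial-correlation network on each random half of a dataset and correlates the corresponding edge weights. The simulated two-factor data and all names are hypothetical, not part of the project's analyses, which cover several further model types (e.g., relative importance networks).

```python
import numpy as np

rng = np.random.default_rng(2017)

# Hypothetical data: two latent factors drive ten "symptoms", so both
# halves share a true underlying network that the estimates can recover.
n, p = 2000, 10
loadings = rng.uniform(0.4, 0.8, size=(2, p))
latent = rng.normal(size=(n, 2))
X = latent @ loadings + rng.normal(scale=0.6, size=(n, p))

def partial_corr_network(data):
    """Edge weights as partial correlations, read off the precision matrix."""
    prec = np.linalg.pinv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)
    np.fill_diagonal(pcorr, 0.0)  # no self-loops
    return pcorr

# Random split-half, mirroring the within-sample stability assessment
idx = rng.permutation(n)
net_a = partial_corr_network(X[idx[: n // 2]])
net_b = partial_corr_network(X[idx[n // 2:]])

# One simple replicability summary: correlate corresponding edge weights
edges = np.triu_indices(p, k=1)
r = np.corrcoef(net_a[edges], net_b[edges])[0, 1]
print(f"split-half edge-weight correlation: r = {r:.2f}")
```

On structured data like this, the edge-weight correlation comes out close to 1, which is the kind of split-half agreement the re-analysis reports; which summary metric to use is itself one of the points of contention with Forbes et al. (2017).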

License: CC0 1.0 Universal

Components

Codes: False alarm? A comprehensive reanalysis of “Evidence that psychopathology symptom networks have limited replicability” by Forbes, Wright, Markon, and Krueger.
