
Category: Project

Description:

Introduction: Multitrait-multimethod (MTMM) data can be analyzed with single-indicator confirmatory factor analysis (CFA-MTMM) models. Most single-indicator CFA-MTMM models imply, but do not allow testing, the restrictive assumption that method biases generalize (correlate) perfectly across different traits for a given method.

Methods: To examine the validity of this assumption, we identified and reviewed 20 published applications of multiple-indicator CFA-MTMM models, which allow testing this assumption. Using simulated data, we demonstrate the consequences of violating the assumption of perfectly general method effects under the CT-C(M − 1) approach (Eid, 2000; Eid et al., 2003).

Results: We extracted 111 heterotrait-monomethod method factor correlation estimates, which varied between |.01| and |1.0| (mean = .52), with most correlations being substantially smaller than |1|. The results of our review and simulations show that violations of the assumption of perfectly general method effects (1) are very common, (2) are difficult to detect based on model fit statistics, and (3) can lead to considerable bias in estimates of convergent validity, method specificity, reliability, and method factor correlations in single-indicator models.

Conclusion: We recommend that researchers abandon the use of single-indicator CFA-MTMM models and instead use multiple-indicator CFA-MTMM models whenever possible.
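The core idea above can be illustrated with a small simulation. The sketch below (a hypothetical illustration, not the authors' simulation code) generates method-effect scores for one method applied to two different traits with a heterotrait-monomethod correlation of .5, the review's mean estimate. The assumed value of `rho` and all variable names are illustrative; single-indicator models would implicitly treat this correlation as |1.0|.

```python
import math
import random

# Hypothetical sketch: method effects for ONE method measured on TWO traits,
# correlated at rho < 1 (single-indicator models implicitly assume rho = 1).
random.seed(0)
n = 20_000
rho = 0.5  # assumed true heterotrait-monomethod method-effect correlation

# Generate two standard-normal method-effect scores with correlation rho.
m1 = [random.gauss(0, 1) for _ in range(n)]
m2 = [rho * x + math.sqrt(1 - rho**2) * random.gauss(0, 1) for x in m1]

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)
    sa = math.sqrt(sum((x - ma) ** 2 for x in a) / len(a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b) / len(b))
    return cov / (sa * sb)

r = corr(m1, m2)
print(round(r, 2))  # close to the assumed rho of 0.5, well below 1.0
```

In a multiple-indicator CFA-MTMM model such as the CT-C(M − 1) model, `rho` corresponds to a freely estimated heterotrait-monomethod method factor correlation; fixing it implicitly to 1, as single-indicator models do, misrepresents data generated this way.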




