A hierarchical Bayesian state trace analysis for assessing monotonicity while factoring out subject, item, and trial level dependencies


Category: Project

Description: State trace analyses assess the latent dimensionality of a cognitive process by asking whether the means of two dependent variables conform to a monotonic function across a set of conditions. Under an assumption of independence between the measures, recently proposed statistical tests address bivariate measurement error, allowing both frequentist and Bayesian analyses of monotonicity (e.g., Davis-Stober, Morey, Gretton, & Heathcote, 2016; Kalish, Dunn, Burdakov, & Sysoev, 2016). However, statistical power to reject monotonicity can be biased by unacknowledged dependencies between measures, particularly when the data are insufficient to overwhelm an incorrect prior assumption of independence. To address this limitation, we developed a hierarchical Bayesian model that explicitly captures the separate roles of subject-, item-, and trial-level dependencies between two measures. Monotonicity is then assessed by fitting separate models that do or do not allow a non-monotonic relation between the condition effects (i.e., same vs. different rank orders). The Widely Applicable Information Criterion (WAIC) and Pseudo Bayesian Model Averaging, cross-validation measures of model fit, are used for model comparison, providing an inferential conclusion regarding the dimensionality of the latent psychological space. We validated this new state trace analysis technique with model recovery simulation studies that assumed different ground truths regarding monotonicity and the direction/magnitude of the subject- and trial-level dependence. We also provide an example application of this new technique to a visual object learning study that compared performance on a visual retrieval task (forced-choice part recognition) versus a verbal retrieval task (cued recall).

License: CC-By Attribution 4.0 International

Wiki

The code to run the analyses in this article can be found in the linked GitLab repository, sadilcowellhuber2019. The repository defines a set of R functions as well as the main Stan files (in src). The package can be built with RStudio, which compiles the Stan models for future use. Note that the analyses themselves relied heavily on the MGHPCC (Massachusetts Green High Performance Computing Center). Without access to a cluster, most analyses would take prohibitive...
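As a rough illustration of the WAIC comparison described above, the following sketch computes WAIC from a matrix of pointwise log-likelihood draws, using the standard definition (lppd minus the effective-parameter penalty, multiplied by -2). This is a minimal Python/NumPy illustration with toy normal data, not the project's R/Stan code; the function name and example values are assumptions for demonstration only.

```python
import numpy as np

def waic(log_lik):
    """WAIC from an S x N matrix of pointwise log-likelihoods,
    where log_lik[s, i] = log p(y_i | theta_s) for posterior draw s.
    Lower WAIC indicates better expected predictive fit."""
    # log pointwise predictive density: log of the mean likelihood per data point
    lppd = np.sum(np.log(np.mean(np.exp(log_lik), axis=0)))
    # effective number of parameters: posterior variance of the log-likelihood per point
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# toy example: standard-normal data, posterior draws of a normal mean (sd fixed at 1)
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, size=50)
draws = rng.normal(0.0, 0.1, size=(1000, 1))
log_lik = -0.5 * np.log(2 * np.pi) - 0.5 * (y[None, :] - draws) ** 2
print(waic(log_lik))
```

In the article's setting, one log-likelihood matrix would come from the model that enforces a common rank order of condition effects and one from the model that allows different rank orders; the model with the lower WAIC is preferred.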

