
Category: Project

Description: The Cognitive Reflection Test (CRT), which measures intuition inhibition and cognitive reflection, has become extremely popular because it reliably predicts reasoning performance, decision-making and beliefs. Across studies, the response format of CRT items sometimes differs, implicitly assuming construct equivalence between tests with open-ended and multiple-choice items (the equivalence hypothesis). Evidence and theoretical reasons, however, suggest that the cognitive processes measured by these response formats, and the associated performances, might differ (the non-equivalence hypothesis). We tested the two hypotheses experimentally by assessing performance on tests with different response formats and by comparing their predictive and construct validity. In a between-subjects experiment (n = 452), participants answered an open-ended, a two-option, or a four-option response format of stem-equivalent CRT items and completed tasks on belief bias, denominator neglect and paranormal beliefs (benchmark indicators of predictive validity) as well as actively open-minded thinking and numeracy (benchmark indicators of construct validity). We found no significant differences between the three response formats in the number of correct responses or in the correlational patterns with the indicators of predictive and construct validity, and none in the number of intuitive responses except that the two-option version elicited more intuitive responses than the other tests. All three test versions were similarly reliable, but the multiple-choice formats were completed more quickly. We speculate that the specific nature of the CRT items helps to establish construct equivalence among the different response formats. We recommend using the validated multiple-choice version of the CRT presented here, particularly the four-option CRT, for practical and methodological reasons.

License: CC-By Attribution 4.0 International
