

Category: Project

Description: Not all individuals put in the required thought and effort when responding to self-report survey items, and participant carelessness is a source of invalidity in psychological data (Huang, Liu, & Bowling, 2015). Many techniques have been developed to screen for this carelessness (Curran, 2016; Johnson, 2005), including items that researchers presume thoughtful respondents will answer in a given way (e.g., disagreement with “I am paid biweekly by leprechauns”; Meade & Craig, 2012). However, no studies have examined whether these items always identify those who are careless, or whether some individuals have legitimate and justifiable reasons for picking out-of-bounds responses. This paper reports two studies in which individuals read a series of these items aloud and verbalized their responses and the justifications for those responses. Coding of these responses found that (a) individuals do occasionally report valid justifications for presumably invalid responses, (b) this behavior varies considerably across items, and (c) items developed specifically for this purpose tend to work better than those drawn from other sources or created ad hoc. These results suggest that care should be taken when implementing these types of items to screen for carelessness.

License: CC BY 4.0 (Attribution 4.0 International)

