**Principal Investigators:**

David Scott Yeager
The University of Texas at Austin
Email: yeager@psy.utexas.edu
Homepage: http://www.utexas.edu/cola/centers/prc/directory/faculty/yeagerds

Jon A. Krosnick
Stanford University
Email: Krosnick@Stanford.edu
Homepage: http://comm.stanford.edu/faculty-krosnick/

**Sample size:** 1,852

**Field period:** 7/9/2009 – 9/14/2009

**Abstract**

Researchers often measure attitudes and beliefs using "some/other" questions ("Some people think that … but Other people think that …") instead of asking simpler "direct" questions. This paper reports studies testing the hypothesis that the some/other question form yields improved response validity. Eight experiments embedded in national surveys (two administered via TESS) provided no evidence in support of this hypothesis. Instead, validity was greatest when employing the direct question format and presenting response options in an order that did not violate conversational conventions. Furthermore, some/other questions take longer to ask and answer and, because they involve more words, require greater cognitive effort from respondents. Therefore, the some/other format seems best avoided; direct questions that avoid unconventional response option ordering are preferable to maximize data quality.

**Hypotheses**

When respondents are asked a some/other question, three effects might ensue: (1) respondents might infer that a greater proportion of other people endorse the unpopular view on the issue, (2) this inference might allow respondents to feel comfortable honestly reporting opinions they might otherwise hesitate to acknowledge, and (3) this comfort would in turn increase the validity of their self-reports. On the other hand, employing the some/other format might increase respondent burden or shift responses through processes of social influence, and might even reduce the validity of self-reports.
In doing this research, we explored whether the some/other question format might have different effects depending upon the order in which the response options were offered to respondents. We predicted that offering response options in an unconventional order may distract respondents and compromise the validity of the answers they provide.

**Experimental Manipulations**

Our studies randomly assigned respondents to be asked a question either in a direct form or in a some/other form, and we assessed the psychometric properties of the answers provided. We also randomly rotated the response order in the question stem and in the response options.

**Outcomes**

We examined the effects of the some/other and response order manipulations on three outcomes:

1. Respondents' estimates of the percentage of other adults who hold the most popular opinion for each target item.
2. The observed distributions of responses to the target items.
3. The strength of associations between responses to the target items and concurrent validity criteria.

**Summary of Results**

A meta-analysis of eight experiments (two of which were administered via TESS) showed that the some/other form did in fact communicate to respondents that fewer other Americans held the most popular opinion, consistent with Harter's (1982) and Schuman and Presser's (1981) presumptions. However, conveying this message did not change the observed distributions of respondents' reports of their own opinions, challenging the notion that the some/other format encouraged people to honestly report holding unpopular opinions. More importantly, response validity was compromised by employing the some/other format instead of the direct format and by offering response options in an unnatural order instead of a natural order. These results therefore recommend employing the direct format with response option orders that do not violate conversational conventions. This conclusion is reinforced by a set of practical considerations.
Holbrook et al. (2000) reported evidence that people answer questions offering response options in unnatural orders more slowly than they answer questions employing natural response option orders. This means that more survey time is spent on such questions, time that could instead be spent asking other questions. And, of course, some/other questions also take longer to read and answer because they entail many more words than direct questions do. For example, in the experiments reported here, the some/other questions averaged 44 words each, compared to just 23 words for the direct forms of the same questions. Not surprisingly, then, a web-administered pilot study we conducted suggests that people answer direct format questions more quickly than they answer some/other format questions, t(355) = 1.81, p < .05. Therefore, on practical grounds as well, the direct format with conventional response option orders seems preferable.

**Conclusions**

Taken together, these findings indicate that the some/other question format does not improve, and can in fact compromise, the validity of survey self-reports, while also consuming more survey time and requiring greater cognitive effort from respondents. Direct questions with response options presented in a conventional order are therefore preferable for maximizing data quality.

**References**

Yeager, D. S., & Krosnick, J. A. (2011). Does mentioning "some people" and "other people" in a survey question increase the accuracy of adolescents' self-reports? *Developmental Psychology*, 47, 1674–1679.

Yeager, D. S., & Krosnick, J. A. (2012). Does mentioning "some people" and "other people" in an attitude question improve measurement quality? *Public Opinion Quarterly*, 76, 131–141.