This paper is in press at Psychological Science (06/08/18). Read the related in-press paper [here][1].

Abstract: One of the mind's most fundamental tasks is interpreting incoming data and weighing the value of new evidence. Across a wide variety of contexts, we show that when summarizing evidence, people exhibit a binary bias: a tendency to impose categorical distinctions on continuous data. Evidence is compressed into discrete bins, and the difference between categories forms the summary judgment. The binary bias distorts belief formation, such that when people aggregate conflicting scientific reports, they attend to valence and inaccurately weight the extremity of the evidence. The same effect occurs when people interpret popular forms of data visualization, and it cannot be explained by other statistical features of the stimuli. The effect is not confined to explicit statistical estimates but also influences how people use data to make health, financial, and public-policy decisions. These studies (total N = 1,851) support a new framework for understanding information integration across a wide variety of contexts.

[1]: