This paper is in press at Psychological Science (06/08/18). Read the related in-press paper [here][1].
Abstract: One of the mind’s most fundamental tasks is interpreting incoming data and weighing the value of new evidence. Across a wide variety of contexts, we show that when summarizing evidence, people exhibit a binary bias: a tendency to impose categorical distinctions on continuous data. Evidence is compressed into discrete bins, and the difference between categories forms the summary judgment. The binary bias distorts belief formation, such that when people aggregate conflicting scientific reports, they attend to valence and inaccurately weight the extremity of the evidence. The same effect occurs when people interpret popular forms of data visualization, and it cannot be explained by other statistical features of the stimuli. The effect is not confined to explicit statistical estimates; it also influences how people use data to make health, financial, and public-policy decisions. These studies (total N = 1,851) support a new framework for understanding information integration across a wide variety of contexts.
[1]: https://osf.io/gxmwz/
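
The contrast between binning evidence by valence and weighting its extremity can be made concrete with a small sketch. The Python example below is purely illustrative and uses hypothetical report values on an assumed -10 to +10 evidence scale; it is not the authors' model or stimuli. It shows how a count-of-categories summary and an extremity-weighted mean can point in opposite directions for the same set of conflicting reports.

```python
# Illustrative sketch only (not the paper's model or data): contrasts a
# "binary bias" summary, which bins each report by valence and judges by the
# difference in counts, with a mean that weights each report's extremity.
# Report values are hypothetical, on a -10 (strongly negative) to
# +10 (strongly positive) evidence scale.

def binary_bias_summary(reports):
    """Compress reports into positive/negative bins; judge by the count difference."""
    positive = sum(1 for r in reports if r > 0)
    negative = sum(1 for r in reports if r < 0)
    return positive - negative

def extremity_weighted_summary(reports):
    """Weight each report by its magnitude: the ordinary mean of the values."""
    return sum(reports) / len(reports)

# Three mildly positive reports vs. two strongly negative ones.
reports = [2, 1, 2, -9, -8]

print(binary_bias_summary(reports))         # 1    -> leans positive (more positive bins)
print(extremity_weighted_summary(reports))  # -2.4 -> leans negative (extremity matters)
```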