According to traditional approaches inspired by the heuristics-and-biases research tradition (e.g., Gilovich, 1993; Plous, 1993; Stanovich, 2012), people are fundamentally flawed in their ability to reason about chance events because they rely on suboptimal heuristics and intuitions when assessing data. One of the most frequently cited examples of such a flaw is the over-interpretation of coincidences, which leads people to accept false causal theories. Griffiths and Tenenbaum (2007) proposed an alternative explanation of people's reactions to coincidences and modeled this cognitive phenomenon within the Bayesian framework of rational statistical inference. Within this framework a coincidence is not just an over-interpreted "fluke"; it plays a crucial role in theory discovery and revision, because it provides an opportunity to make a discovery that is inconsistent with the current theory of how the world works. Griffiths and Tenenbaum re-defined coincidence, usually conceptualized as an unlikely event, as "*an event that provides support for an alternative to a currently favored causal theory, but not necessarily enough support to accept that alternative in light of its low prior probability*" (Griffiths & Tenenbaum, 2007, p. 180). They tested qualitative and quantitative predictions of this account and concluded that false conclusions drawn from coincidences are not caused by an inability to properly assess the available evidence (as reflected in the likelihood ratio of the alternative and null hypotheses) but arise from over-estimating the a priori plausibility of the alternative theories (as reflected in the prior odds of the alternative and null hypotheses). They speculated that this over-estimation may be a relic of childhood, when the disposition to believe in unexpected causal relationships is highly adaptive, given that small children are surrounded by events that really do involve novel causal relationships.
In the context of the current state of knowledge, however, such a disposition more often leads to false beliefs than to genuine discoveries. This speculation has received some empirical support: compared with adults, children are better at learning unusual abstract causal principles (overhypotheses), because they are less biased by prior assumptions (based on previous experience) and pay more attention to current evidence (Lucas et al., 2014). In my research I want to apply Griffiths and Tenenbaum's Bayesian model of causal reasoning from perceived coincidences and to test the hypothesis that the a priori plausibility of alternative explanations can be influenced situationally, in the context of the exploration-exploitation trade-off. The basic idea behind this hypothesis is that it would be adaptive for people to be able to change their disposition to believe in unexpected causal relationships in response to the perceived (un)availability of common explanations for observed events. To test this hypothesis I will use two groups of people (randomly assigned to one group or the other) who will differ in their probability of experiencing a cognitive impasse, i.e. a subjective feeling of not knowing what to do while trying to solve a problem. To achieve this effect I will use matchstick algebra problems that differ in their level of difficulty (Knoblich et al., 1999; Öllinger et al., 2008). Participants will also solve tasks from Griffiths and Tenenbaum's original study (2007), which will enable me to estimate the prior probability participants ascribe to the alternative hypothesis ("Psychokinesis" experiment) and how they assess the available evidence ("Bombing of London" experiment).
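Griffiths and Tenenbaum's account can be stated compactly in odds form: posterior odds = likelihood ratio × prior odds. A minimal sketch of this decomposition, assuming illustrative numbers that are not taken from the original study:

```python
import math

def posterior_odds(likelihood_ratio, prior_odds):
    """Bayes' rule in odds form: posterior odds of the alternative
    (causal) hypothesis over the null = likelihood ratio * prior odds."""
    return likelihood_ratio * prior_odds

# Illustrative values only (assumptions, not from Griffiths & Tenenbaum, 2007):
# the evidence favors the alternative 1000:1, but the prior odds are one in
# ten million -- so the posterior odds stay far below 1.
lr = 1000.0    # likelihood ratio P(data | alternative) / P(data | null)
prior = 1e-7   # prior odds P(alternative) / P(null)
post = posterior_odds(lr, prior)

print(math.isclose(post, 1e-4))  # True: still a "mere coincidence"
```

On this decomposition, my first prediction targets only the `prior` term (the impasse manipulation should raise it), while the second predicts no group difference in the `lr` term.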
I expect that 1) participants with an induced cognitive impasse will show a higher propensity to perceive alternative explanations as a priori more probable than participants who do not experience a cognitive impasse (i.e., the experimental manipulation should influence the prior odds of the alternative and null hypotheses); and 2) both groups of participants will be comparable with respect to the perceived suspiciousness of coincidences (i.e., the experimental manipulation should not influence the likelihood ratio). To measure the effectiveness of the experimental manipulation inducing the cognitive impasse in the experimental group, I will also administer the Personal Need for Structure (PNS) inventory (Thompson, Naccarato, & Parker, 1992; Neuberg & Newsom, 1993) at the very end of the experimental session. This use of the PNS inventory is based on Whitson and Galinsky's (2008) finding that lack of control, caused for example by an inability to solve presented problems, increases the need for structure (and thus also the probability of illusory pattern perception). I expect that the cognitive-impasse manipulation will lead to higher PNS scores in the experimental group than in the control group.

*References:*

Gilovich, T. (1993). *How we know what isn't so: The fallibility of human reason in everyday life*. New York: Free Press.

Griffiths, T. L., & Tenenbaum, J. B. (2007). From mere coincidences to meaningful discoveries. *Cognition*, 103, 180–226.

Knoblich, G., Ohlsson, S., Haider, H., & Rhenius, D. (1999). Constraint relaxation and chunk decomposition in insight problem solving. *Journal of Experimental Psychology: Learning, Memory and Cognition*, 25, 1534–1555.

Lucas, C. G., Bridgers, S., Griffiths, T. L., & Gopnik, A. (2014). When children are better (or at least more open-minded) learners than adults: Developmental differences in learning the forms of causal relationships. *Cognition*, 131, 284–299.

Neuberg, S. L., & Newsom, J. T. (1993). Personal need for structure: Individual differences in the desire for simple structure. *Journal of Personality and Social Psychology*, 65(1), 113–131.

Öllinger, M., Jones, G., & Knoblich, G. (2008). Investigating the effect of mental set on insight problem solving. *Experimental Psychology*, 55, 269–282.

Plous, S. (1993). *The psychology of judgment and decision making*. New York: McGraw-Hill.

Stanovich, K. (2012). *How to think straight about psychology* (10th ed.). Boston: Pearson.

Thompson, M. M., Naccarato, M. E., & Parker, K. E. (1992). Measuring cognitive needs: The development and validation of the Personal Need for Structure (PNS) and Personal Fear of Invalidity (PFI) measures. Manuscript submitted for publication.

Whitson, J. A., & Galinsky, A. D. (2008). Lacking control increases illusory pattern perception. *Science*, 322, 115–117.