Academic journal article Cognitive, Affective and Behavioral Neuroscience

Feeling We're Biased: Autonomic Arousal and Reasoning Conflict

Human reasoning is often biased by intuitive beliefs. A key question is whether the bias results from a failure to detect that the intuitions conflict with logical considerations or from a failure to discard these tempting intuitions. The present study addressed this unresolved debate by focusing on conflict-related autonomic nervous system modulation during biased reasoning. Participants' skin conductance responses (SCRs) were monitored while they solved classic syllogisms in which a cued intuitive response could be inconsistent or consistent with the logically correct response. Results indicated that all reasoners showed increased SCRs when solving the inconsistent conflict problems. Experiment 2 validated that this autonomic arousal boost was absent when people were not engaged in an active reasoning task. The presence of a clear autonomic conflict response during reasoning lends credence to the idea that reasoners have a "gut" feeling that signals that their intuitive response is not logically warranted. Supplemental materials for this article may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.

In the spring of 2009, fears of the H1N1 virus swept the world. The media commonly referred to the new virus as "swine" or "Mexican" flu, although it was no longer harbored in swine and had already spread across the world at the time of the outbreak; hence, eating pork or having dinner at your local Mexican restaurant did not pose any clear health risk. The World Health Organization tried hard to inform the public, but the mere intuitive association with the name of the virus seemed to have an irresistible pull on people's behavior: Many of us stopped eating at Mexican restaurants, Haitian officials rejected a ship carrying Mexican food aid, pork belly futures collapsed on Wall Street, and the Egyptian government even ordered its farmers to kill all of their pigs (Alexander, 2009; Ballantyne, 2009). From a logical point of view, none of these measures was effective in stopping the spread of the virus or avoiding contamination, but, intuitively, people nevertheless felt they were better off simply avoiding contact with Mexicans or pork.

People's overreaction to the swine flu threat is a dramatic illustration of a general human tendency to base our judgment on fast intuitive impressions rather than on more demanding, deliberative reasoning. This tendency biases people's performance in a wide range of classic logical and probabilistic reasoning tasks (Evans, 2003; Kahneman, 2002). One of the most famous and most studied examples is the belief bias phenomenon in syllogistic reasoning. Belief bias refers to the intuitive tendency to judge the validity of a syllogism by evaluating the believability of the conclusion (Oakhill, Johnson-Laird, & Garnham, 1989). Often this is problematic, because the believability of the conclusion conflicts with its logical status. Consider the following example: "All birds have wings. Crows have wings. Therefore, crows are birds." Although the conclusion in the example is logically invalid and should be rejected, many people will nevertheless intuitively tend to accept it because it fits with their prior beliefs. Sound reasoning requires that people abandon this merely intuitive, or so-called "heuristic," thinking and engage in more deliberate, analytic thinking. Unfortunately, this turns out to be quite hard for most people; just as in the swine flu case, many reasoners end up being biased by their intuition.

Although it is a well-established fact that people are often biased, the nature of this bias is unclear. The crucial issue boils down to whether or not people detect that they are biased. Sound reasoning requires that people monitor their intuitions for conflict with more logical considerations. According to one view, people would be very bad at this monitoring (e.g., Kahneman & Frederick, 2005). Because of lax monitoring, people would simply not detect that their intuitions are invalid. …
