Rational Mind Designs: Research into the Ecology of Thought Treads on Contested Terrain

As a visiting scholar at Stanford University in 1990, Gerd Gigerenzer once joined in an animated lunchtime discussion with several colleagues about the implications of the voluminously reported pitfalls in human reasoning. An economist tersely concluded the conversation with the zinger, "Look, either reasoning is rational or it's psychological."

That seemed to Gigerenzer, a psychologist fascinated by the history of statistical methods in science, a fair summary of how many social scientists treat the mind. Experimenters typically borrowed mathematical equations, or algorithms, from probability theory and treated them as "optimal" standards of rational decision making. According to this view, optimal judgments in uncertain situations rest on calculations that take all pertinent and available information into account. Human thinkers, with their often wayward psychological assumptions, could only flail at these ideals.

Beginning in the 1970s, researchers repeatedly found that volunteers grappling with various judgment tasks failed to reason according to the optimal mathematical formulas. That discrepancy led many psychologists to posit the existence of simple rules of thumb for making judgments that result in biased or irrational conclusions. They use the term "heuristics" for these rules. Heuristics have attracted blame for encouraging beliefs in astrology, faith healing, and UFOs, among other things.

For the past decade, in scientific papers and in lectures at U.S. and European universities, Gigerenzer has tried to yank heuristics off their throne. Although not the first critic of the idea that highly fallible assumptions rule human thought, he is perhaps the most aggressive, and he comes armed with alternative theories to explain human reasoning. As a result, the German scientist has attracted both ardent supporters and detractors, and his wake has roiled the waters of psychological research.

"We want to determine simple psychological principles that minds actually use and then examine how these principles exploit the structure of real-world environments," asserts Gigerenzer, director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Psychological Research in Munich. "Such principles are rational, in the sense that they can be accurate and work quickly."

A tradition extending from 18th-century philosophers and mathematicians to modern social scientists assumes that rational, intelligent thinking conforms to the laws of probability theory and logic, Gigerenzer contends. This exemplifies a broader tendency among behavioral researchers to think of the mind as a reflection of their tools, from statistical techniques to computer methods.

For instance, psychologists have for 35 years repeatedly assessed whether people make inferences according to Bayes' theorem, a mathematical formula for updating the estimated probability that a particular event will occur in light of new evidence. Tests have consistently shown that people do not estimate probabilities according to Bayes' rule, typically because they fail to account for the base rates of the events they want to predict.
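
For reference, the two-outcome form of Bayes' theorem at issue in these studies combines a test's hit rate with the base rate and the false-alarm rate (the notation below is added here for clarity; it is not drawn from the article):

\[
P(H \mid D) \;=\; \frac{P(D \mid H)\,P(H)}{P(D \mid H)\,P(H) + P(D \mid \neg H)\,P(\neg H)},
\]

where H is the hypothesis (say, having a disease), D is the observed datum (say, a positive test result), and P(H) is the base rate that subjects tend to neglect.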

In one example, researchers ask volunteers to calculate the probability that a 40-year-old woman who has a positive mammography test actually has breast cancer. The correct answer involves not only the accuracy of the test, which gives a positive result in 80 percent of women with breast cancer, but also the base rate of the disease (1 percent among 40-year-old women) and the rate at which a mammography falsely reports breast cancer in healthy women. Usually, volunteers asked to make this type of estimate focus on the first factor to the exclusion of the others and produce unduly high estimates, around 80 percent. Bayes' theorem, in which the base rate factors into a relatively complex calculation of the probability of having breast cancer given a single positive mammography, yields a prediction of 7. …
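
A minimal sketch of that calculation is given below. The 80 percent hit rate and 1 percent base rate are the figures quoted above; the false-alarm rate is not stated in this excerpt, so the 9.6 percent used here is only an illustrative value taken from the standard version of the problem, and the exact output depends on that choice.

```python
# Sketch of the mammography example via Bayes' theorem.
# Hit rate (80%) and base rate (1%) come from the article; the 9.6%
# false-alarm rate is an assumed, illustrative value not given in
# this excerpt.

def posterior(base_rate, hit_rate, false_alarm_rate):
    """P(disease | positive test) from a two-outcome Bayes update."""
    true_positives = hit_rate * base_rate                   # P(+ | disease) * P(disease)
    false_positives = false_alarm_rate * (1.0 - base_rate)  # P(+ | healthy) * P(healthy)
    return true_positives / (true_positives + false_positives)

p = posterior(base_rate=0.01, hit_rate=0.80, false_alarm_rate=0.096)
print(f"P(cancer | positive mammogram) = {p:.3f}")  # about 0.078 with these inputs
```

With these inputs the posterior lands near 8 percent, far below the roughly 80 percent that volunteers typically report.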