Academic journal article Group Facilitation

The Logic of Failure: Why Things Go Wrong and What We Can Do to Make Them Right

Article excerpt

by Dietrich Dorner Metropolitan Books, Henry Holt and Co., New York, 1996, translated by Rita and Robert Kimber, ISBN: 0805041605, $25.00. Paperback edition: The Logic of Failure: Recognizing and Avoiding Error in Complex Situations, Perseus Press, Cambridge, MA, 1997, 222 pages, ISBN: 0201479486, $16.00.

Book Review by Nancy S. Hewison

Lots of things go wrong in Dietrich Dorner's The Logic of Failure. Nuclear reactors fail, communities break down, and entire societies crash. Complex situations go awry despite, or because of, the planning and subsequent actions of intelligent and well-intentioned people. Dorner would have much to talk about with Peter Senge (1990) and others working in the area of systems thinking. The Logic of Failure is packed with illustrations of the ways humans approach complex situations and fail to solve them because of an inability to think systemically. Dorner, a professor of psychology at the University of Bamberg, Germany, utilizes computer simulation exercises to study problem solving, decision making, human information processing, and the psychological aspects of planning, and to elucidate the cognitive behaviors that lead to failure in these arenas. Rather than burden the reader with tedious detail about the computer simulations, however, he interweaves learnings from them with numerous real-life examples which enhance the reader's comprehension, contribute to readability, and provide illustrations of potential use to facilitators.

The first chapter presents a number of scenarios that capture the reader's attention. In one example, each participant in a computer simulation experiment was given dictatorial powers to increase the well-being of the occupants of Tanaland, a fictitious region in West Africa. Over the course of ten computer-simulated years, a participant had six opportunities to intervene in the region's agricultural practices, medical care, and access to water, electricity, and motorized equipment. At each intervention point, information could be gathered to feed into planning. Of twelve participants, only one succeeded in stabilizing population growth while creating overall improvement in the standard of living. The decisions of the average participant, on the other hand, initially improved life in the region and then led to one or another catastrophe. In one case, a vigorous campaign to eradicate the rodents and monkeys that were eating the crops deprived the local leopards of their normal food supply. The leopards then turned to feeding on the farmers' cattle. In another case, crop yield increased due to motorized plows and artificial fertilizer, with the result that the population grew until it exceeded the capacity of the food supply.

Why did one participant succeed while all the others failed? None of them had any particular expertise and the experiment's problems did not require any specialized knowledge. The answers, according to Dorner, lie in the way humans think and in the fact that, when dealing with complex systems, "...we cannot do just one thing. Whether we like it or not, whatever we do has multiple effects."

In discussing a real-life example, the 1986 catastrophe at the atomic energy plant in Chernobyl, Dorner suggests, "We cannot find a single example of [human] failure. No one who should have stayed awake fell asleep... overlooked a signal... [or] accidentally flipped a wrong switch." The plant operators, however, exhibited many of the characteristics of participants in the Tanaland experiment. They thought in terms of linear networks of causation rather than considering potential side effects of their decisions and actions, and they did not understand that exponentially developing processes move extremely rapidly. Furthermore, under time pressure, they applied overdoses of established measures.

Having stripped away any hope the reader may have harbored that the cognitive pitfalls of decision making reside only in such simulated situations as Tanaland, Dorner sets out, in chapter two, to demystify the demands of complexity, dynamics, and intransparency which planners and decision makers regularly face. …
