Magazine article: The Futurist

Decision Making under Pressure: From Emergency Rooms to Space Missions, Many Decision-Making Situations Allow No Room for Error. An ER Physician Reflects on What Went Wrong as Flight Managers Assessed the Potential Damage to the Space Shuttle Columbia

I was working a late-night shift as an emergency-room physician in February 2003, shortly after the space shuttle Columbia disaster that resulted in the deaths of seven astronauts. As I reflected on the disaster, one persistent thought troubled me: If the best and brightest of NASA management could not avoid such a disastrous outcome from their decision making, what hope was there for me and my decision skills in the emergency room? What could I learn from this disaster?

My "Shuttle Thinking" model resulted from those rare, quiet moments when I would put my feet up on my desk and try to analyze my own decision-making process, searching for ways to improve it. I studied the Columbia disaster and compared it to my own style of making decisions. If the Columbia had been a patient, what would I have done differently? How could I improve my own decision process and then share it with others? "Shuttle Thinking" is what I now call a set of five common pitfalls that I believe undermine our critical decision-making process.

Five Pitfalls in Critical Decision Making

To improve my decision-making process, I now consciously examine the impact of Shuttle Thinking on every high-level decision I make, using the Columbia disaster as an example. Other events could serve equally well to illustrate these common pitfalls: the meltdown of large financial institutions, government decisions surrounding Hurricane Katrina, and the sinking of the Titanic all followed the same path of poor decision making that doomed Columbia.

As you recall, shortly after Columbia's launch, a piece of insulating foam about the size of a large briefcase apparently broke off from the external fuel tank, hitting the shuttle's left wing. The extent of the damage to the left wing was not known. NASA managers felt that no action was needed, and the Columbia was allowed to return to Earth. A normal, uncomplicated reentry was expected. However, after the loss of the Columbia and crew, the Columbia Accident Investigation Board (CAIB) found fault with the decisions of NASA management.

Pitfall One: Unique Situation

Unique situations, by definition, have no learning curve. NASA management had no training-manual solution for the space shuttle Columbia incident. Instead, NASA management evaluated the situation as it unfolded; they became the learning curve. As is often the case with bad decisions in unique situations, the eventual horrific outcome was never even an initial consideration.

Key lesson: Unique situations must be approached cautiously, treated as inherently risky and dangerous, and regarded as invitations to poor decision making. Scenario planning may help identify potential sources of trouble, but unique situations demand extra attention.

Pitfall Two: Data Deficit

Sometimes there is simply not enough data to support a wise decision, and important choices must be made on little or no information. In the case of Columbia, no information was available to determine whether the craft's left wing had been damaged. The wing had only limited structural sensors, and no direct visual inspection of the wing from the shuttle was attempted.

So not only was minimal data available, but there were few options for obtaining more. An extravehicular activity (space walk) or launching another shuttle to fly by Columbia and visually check for damage were not simple options, even if they had been considered. The option recommended by engineers was to use Defense Department technologies to obtain high-resolution images of the wing; however, NASA management did not exercise this option, believing that the damage was likely too minor. The CAIB investigators later concluded that the decision-making process itself contributed to the disaster.

Key lesson: A data deficit, in which the available information is inadequate for a critical decision, makes it mandatory to obtain additional data. …
