Intellectual Hazard: How Conceptual Biases in Complex Organizations Contributed to the Crisis of 2008
Miller, Geoffrey P., Rosenfeld, Gerald, Harvard Journal of Law & Public Policy
INTRODUCTION
I. INTELLECTUAL HAZARD
   A. Complexity Bias
   B. Incentive Bias
   C. Asymmetry Bias
   D. Intellectual Hazards in Financial Markets
II. INTELLECTUAL HAZARD AND THE CRISIS OF 2008
   A. Banks
   B. The Fed
   C. Rating Agencies
   D. The Basel Committee
   E. Regulators
III. POSSIBLE REFORMS
   A. Complexity Bias
   B. Corporate Governance Reforms
   C. Education
   D. Government Reforms
   E. Stress Tests
CONCLUSION
This Article identifies an important but previously unrecognized systemic risk in financial markets: intellectual hazard. Intellectual hazard, as we define it, is the tendency of behavioral biases to interfere with accurate thought and analysis within complex organizations. Intellectual hazard impairs the acquisition, analysis, communication, and implementation of information within an organization and the communication of such information between an organization and external parties. We argue that intellectual hazard was a cause of the Crisis of 2008 and suggest that this risk may be an important factor in all financial crises. We offer tentative suggestions for reforms that might mitigate intellectual hazard going forward.
NASA's Mars Climate Orbiter, launched from Cape Canaveral with great expectations in December 1998, reached Mars on September 23, 1999. The spacecraft passed behind the planet and out of radio contact at 9:04 UTC (1) and should have reestablished contact twenty-one minutes later. It never reappeared. (2) An investigation revealed that one of the two navigation teams assigned to the mission had been using metric units while the other had been using English units. Because of the mismatch between measurement units, the spacecraft approached the planet at too low an altitude and was destroyed by atmospheric stress and friction. (3)
On February 20, 1995, Dr. Rolando R. Sanchez, a surgeon in Tampa, Florida, scrubbed and entered the operating room for a routine leg amputation. (4) A blackboard in the operating room specified the leg to be amputated, as did the operating room schedule and the hospital's computer system. (5) When Dr. Sanchez entered the room the patient had already been prepped for surgery, with one of her legs draped and sterilized. The doctor performed the surgery, only to learn that he had cut off the wrong leg. It turned out that other paperwork available in the operating room, including the patient's consent form and medical history, specified the proper leg. Dr. Sanchez had apparently relied on the more commonly used sources of information about the procedure and never consulted the materials that could have prevented the mistake. (6)
Each of these disasters resulted from a common, dangerous, but little-recognized phenomenon. These events took place within complex organizations--a bureaucratic agency with numerous teams and subcontractors working on the same project, and a hospital with its network of physicians, nurses, equipment, and systems for medical and financial record-keeping and control. The mistakes were elementary--so elementary that if a single person had been carrying out the task, rather than a complex team, they never would have happened. Yet the consequences of those mistakes were devastating. The problem in both cases was the failure of the complex organization to properly acquire, communicate, analyze, and implement information pertinent to risk and crucial to the success of the operation.
The catastrophic events in financial markets during the fall of 2008 (7)--events we will refer to hereafter as the "Crisis of 2008"--were more complicated than these disasters, but there are also significant parallels. Financial markets today are among the most sophisticated, well-funded, well-informed, and technologically advanced institutions in the world. They process trillions of dollars in transactions each year. Many highly trained, hardworking, brilliant people work in the industry. …