Preventing Medical Errors
LUCIAN L. LEAPE
Most people first became aware of the problem of medical errors in late 1999, when the National Academy of Sciences' Institute of Medicine (IOM) released To Err Is Human, which announced that up to ninety-eight thousand people die each year from medical errors (IOM 2000). Although the shocking mortality figures came from studies published up to eight years earlier (Leape et al. 1991; Brennan et al. 1991; Thomas et al. 2000), they were new to most readers and now came from an impeccable source. Congress promptly scheduled hearings, and shortly thereafter the president called on all federal health agencies to implement the IOM recommendations (Quality Interagency Coordination Task Force 2000).
The IOM brought to public attention a slow-growing safety movement that began in 1995 with the coincidence of disaster and opportunity. The disaster was a series of highly publicized serious medical errors, most notoriously, the death of Betsy Lehman, a health reporter for the Boston Globe, from a massive overdose of chemotherapy at the respected Dana-Farber Cancer Institute. That such a tragic error could happen at such a prestigious medical institution shook both public and professional confidence. The opportunity was the discovery by health care leaders of the potential for preventing errors by using industrial human factors approaches, particularly the recognition that the cause of most human errors is neither carelessness nor incompetence, but defects in the systems in which people work (Leape 1994). For example, system characteristics such as look-alike labels and sound-alike names, conditions of work (long hours and heavy workloads), and managerial style (diffused responsibility and lack of teamwork) make it more likely that an individual will make a mistake. Errors can be reduced by redesigning those systems.
The implications of this concept for medicine are profound because it runs counter to classical medical training that focuses on faultless individual performance, reinforced by shaming and blaming. However, the systems approach is based on a wealth of studies in cognitive psychology and human factors engineering, as well as substantial experience in industries such as aviation, which have