What Has Managed Care Really Done to the U.S. Health Care System?
National Bureau of Economic Research
For most of the last century, the U.S. health care system was financed primarily through traditional indemnity health insurance plans that paid doctors, hospitals, and other health care providers on a fee-for-service basis. By the 1960s, most Americans received insurance of this type through either their employers or government programs, such as Medicare and Medicaid (HIAA 1991). Amid economic prosperity that placed few constraints on the revenues they could collect, and facing the then comparatively low cost of health care, health insurers and the government provided ample funding for the widespread provision of ever more advanced health care. In the process, they subsidized and encouraged the training of new physicians, the building of new infrastructure, and the development of increasingly advanced, and almost always more expensive, technologies. By all accounts, these developments contributed significantly to medicine's capacity to cure disease and to improve the health and functioning of patients. By the 1970s and 1980s, however, rapidly rising costs gave rise to a number of cost-containment efforts. Perhaps the most prominent of these has been the growth of managed care, encompassing a range of changes in the practices of health insurers that have eroded the pillars of the traditional fee-for-service financing system.
The growth of managed care has raised important questions about its impact on the well-being of patients. An increasing number of opponents argue that the expansion of managed care has put cost cutting