When Medicare and Medicaid were passed in 1965, less than 6 percent of the nation's output was devoted to health care. Today, that figure exceeds 14 percent and is rising rapidly. This article looks at some of the reasons behind the relentless rise in medical costs over the last several decades and examines how government policy has both contributed to these costs and tried to rein them in.
In 1965, Congress enacted Medicare and Medicaid to ensure that poor and elderly Americans would not be denied access to health care. In that year, 5.9 percent of the nation's total output was spent on medical services. By 1992, this share had soared to approximately 14 percent, and by the year 2000, it is projected to reach almost 19 percent.(1) The growing fraction of the economy devoted to health care is one reason why many advocate a major overhaul of the current system.
A second is that although federal and state governments are projected to spend $365 billion this year providing medical care for the poor and aged, 15 percent of the population still lacks health coverage--a situation many find deplorable.(2) A third criticism stems from the fact that 91 percent of private health insurance is handled through employers.(3) This has created what is known as "job lock," or the reluctance of workers to change jobs or industries because they fear losing their medical benefits. Although it is hard to quantify the impact of job lock on the economy, public opinion polls suggest that between 10 and 30 percent of workers feel tied to their current companies for this reason.(4)
These problems have led to increasing pressure on the federal government to pass some type of health care reform package. Congress, the administration, and the American public are currently debating what form this legislation should take. Before adopting any new system, however, we need to understand the forces--both market and government--that have shaped and are currently shaping our approach to health care. This Economic Commentary examines these forces by looking at the history of medical care in the United States since the early part of the century.
* HEALTH CARE: 1913 TO 1966
Many claim that the market for health care is unlike that for most goods and services. While this may be true, prior to the Depression, the health care market in the United States operated much like any other, with customers paying doctors and hospitals directly out of their own pockets. The major difference between health care and other goods and services today is insurance: Third-party payers now contribute 78 cents of every medical dollar spent.
The U.S. health insurance industry can trace its roots to 1929, when Baylor Hospital began offering prepaid hospital coverage to 1,200 teachers. This was the beginning of what later became known as Blue Cross.(5) The dramatic growth in health insurance did not occur until World War II, however. By 1943, 43 Blue Cross plans were in effect nationwide.
Blue Cross originally based its premiums on the cost of insuring specific geographic areas, with each resident charged the same amount. This practice, known as the community ratings system, was gradually superseded by experience-rated premiums, which were based on the expected cost of insuring an individual or subgroup. Thus, payments began to depend on a person's age, sex, and health status.
The advent of experience ratings was inevitable in light of adverse selection. Adverse selection refers to the fact that under community-rated plans, those in generally poor health or with chronic illnesses have a greater incentive to purchase medical insurance. Because of this selection bias, insurers could not charge actuarially fair premiums to healthy individuals, who were thus driven out of the market. To bring them back in, it became necessary to move toward an experience-rated system. Insurance companies hence started spending resources to ascertain each individual's risk class, which effectively priced some people out of the market. …
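The unraveling that adverse selection produces under community rating can be illustrated with a stylized calculation. The population shares, cost figures, and willingness-to-pay margin below are hypothetical, chosen only to show the mechanism, not drawn from the historical record:

```python
# Stylized adverse-selection example with two risk groups.
# All numbers are hypothetical and serve only to illustrate
# why a community-rated pool can unravel.

healthy = {"share": 0.8, "expected_cost": 500}    # annual expected claims
sick    = {"share": 0.2, "expected_cost": 3000}

# Community rating: every resident pays the pooled average cost.
pooled_premium = (healthy["share"] * healthy["expected_cost"]
                  + sick["share"] * sick["expected_cost"])
# 0.8 * 500 + 0.2 * 3000 = 1000

# Suppose risk-averse buyers will pay up to 1.5x their own expected cost.
MARGIN = 1.5
healthy_willing_to_pay = MARGIN * healthy["expected_cost"]  # 750

# The pooled premium (1000) exceeds what healthy buyers will pay (750),
# so they exit the pool; the break-even premium for those who remain
# rises to the sick group's expected cost, confirming their exit.
if pooled_premium > healthy_willing_to_pay:
    premium_after_exit = sick["expected_cost"]  # 3000

# Experience rating avoids the unraveling by charging each group
# its own actuarially fair premium.
experience_premiums = {
    "healthy": healthy["expected_cost"],  # 500
    "sick":    sick["expected_cost"],     # 3000
}
```

In this sketch the community-rated premium is double what healthy buyers will pay, so the pool collapses to the high-cost group alone; pricing each group separately keeps the healthy group insured at its own fair rate, which is the logic behind the shift to experience-rated premiums described above.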