There is perhaps no greater tragedy in a parent's life than learning that one's child is terminally ill. Today, more than at any time in the past, when conventional treatment fails, dying children are given access to experimental treatment. To a surprising extent, society takes for granted the participation of dying children in medical experiments. In part, this is because we have come to view participation in clinical trials as a potential benefit. This view contrasts sharply with the dominant perception of the mid to late 20th century, which viewed medical research as a potential threat to vulnerable populations. Upon closer scrutiny, both of these perspectives carry with them some important truths. This Article seeks to build upon those truths by undertaking a critical analysis of contemporary ethical and legal policies governing the inclusion of terminally ill children in clinical research.
Well-documented abuses of human subjects in medical experimentation, including research with children, generated concern in the latter decades of the 20th century. These abuses fostered the perception that children were vulnerable, given their inability to protect their own interests, and that parental informed consent alone was insufficient to safeguard children from the potential harms of medical experimentation. Federal regulations put into place in 1981 sought to provide the necessary additional safeguards. One might expect such rules to have resulted in a relatively small percentage of sick children enrolling in clinical trials. Yet today, in pediatric oncology, a strikingly large majority of U.S. children with cancer are enrolled in Phase III clinical trials and thus receive their therapy under experimental conditions. The contrast with adults with cancer, of whom only approximately fifteen percent participate in clinical trials, is stark: the overwhelming majority of children with cancer become subjects in medical experiments.
These numbers should perhaps be unsurprising, given that pediatric cancer research involves children suffering from life-threatening diseases for which, until the 1960s, virtually no effective therapy existed. In the mid-twentieth century, clinicians and families agreed that clinical studies offered the best chance of saving the lives of affected children. The partnership between pediatric cancer care and clinical research proved remarkably effective. In less than half a century, clinical research in pediatric oncology produced great progress: the most common form of childhood leukemia went from being a nearly always fatal disease to one cured more than seventy-five percent of the time. Such success no doubt contributed to a willingness to permit children to become the subjects of medical experiments, and it perhaps reflected a more general shift in American thinking about the nature of medical research. Beginning in the mid-1980s, in response to scientific progress achieved through clinical research in cancer and AIDS, Americans began to demand access to clinical trials.
These factors help to explain the high percentage of children with cancer who participate in clinical studies, and may also explain the limited ethical and legal scrutiny this issue has received. It may seem needlessly academic to analyze the pros and cons of enrolling pediatric cancer patients in clinical trials. Nonetheless, it is far from clear that the sickest of children, those whom conventional treatment cannot cure, personally benefit from enrollment in early-phase studies.
To the extent that we fail to explore the legal and ethical issues surrounding children's participation in clinical studies, particularly in Phase I clinical trials, we run the risk of replicating the historical abuses of this exploitable population. To that end, this Article undertakes a critical analysis of the legal and ethical norms governing the enrollment of sick children in early-phase trials. The Article begins with a brief overview of the history of medical experimentation involving children, assessing both its great promise and its inherent limitations. …