Derr, Thomas Sieger, First Things: A Monthly Journal of Religion and Public Life
Global warming has achieved the status of a major threat. It inspires nightmares of a troubled future and propels apocalyptic dramas such as the summer 2004 movie The Day After Tomorrow. Even were the Kyoto treaty to be fully implemented, it wouldn't make a dent in the warming trend, which seems to be inexorable. Doom is upon us.
Except that maybe it isn't. You might not know it from ordinary media accounts, which report the judgments of alarmists as "settled science," but there is a skeptical side to the argument. Scientists familiar with the issues involved have written critically about the theory of global warming. The puzzle is why these commentators, well-credentialed and experienced, have been swept aside to produce a false "consensus." What is it that produces widespread agreement among both "experts" and the general public on a hypothesis which is quite likely wrong?
The consensus holds that we are experiencing unprecedented global warming and that human activity is the main culprit. The past century, we are told, has been the hottest on record, with temperatures steadily rising during the last decades. Since human population and industrial activity have risen at the same time, it stands to reason that human activity is, one way or another, the cause of this observed warming. Anything wrong with this reasoning?
Quite a lot, as it turns out. The phrase "on record" doesn't mean very much, since most records date only from the latter part of the nineteenth century. Even without accurate records, there are ways of discovering the temperatures of past centuries, and these methods do not confirm the theory of a steady rise. Reading tree rings helps (the rings are farther apart when the temperature is warmer and the trees grow faster). Core samples from drilling in ice fields can yield even older data. Some historical reconstruction can help, too--for example, we know that the Norsemen settled Greenland (and named it "green") a millennium ago and grew crops there, in land which is today quite inhospitable to settlement, let alone to agriculture. Other evidence comes from coral growth, isotope data from sea floor sediment, and insects, all of which point to a very warm climate in medieval times. Abundant testimony tells us that the European climate then cooled dramatically from the thirteenth century until the eighteenth, when it began its slow rewarming.
In sum, what we learn from multiple sources is that the earth (and not just Europe) was warmer in the tenth century than it is now, that it cooled dramatically in the middle of our second millennium (this has been called the "little ice age"), and that it then began warming again. Temperatures were higher in medieval times (from about 800 to 1300) than they are now, and the twentieth century represented a recovery from the little ice age to something like normal. The false perception that the recent warming trend is out of the ordinary is heightened by its being measured from an extraordinarily cold starting point, without taking into account the earlier balmy medieval period, sometimes called the Medieval Climate Optimum. Data such as fossilized sea shells indicate that similar natural climate swings occurred in prehistoric times, well before the appearance of the human race.
Even the period for which we have records can be misread. While the average global surface temperature increased by about 0.5 degrees Celsius during the twentieth century, the major part of that warming occurred in the early part of the century, before the rapid rise in human population and before the consequent rise in emissions of polluting substances into the atmosphere. There was actually a noticeable cooling period after World War II, and this climate trend produced a rather different sort of alarmism--some predicted the return of an ice age. In 1974 the National Science Board, observing a thirty-year-long decline in world temperature, predicted the end of temperate times and the dawning of the next glacial age. …