One of the most contentious issues of the day is global warming. Those who openly discuss the subject fall into one of two camps. First, there are the environmental alarmists who only see the world in terms of urban sprawl, deforestation, and pollution. For this group, global warming provides the much-needed justification to curtail, or reverse, our current level of earth-unfriendly economic activity. The other group sees no evidence of harmful global warming. They view the draconian anti-business remedies as both unjustified and misguided.
Given the high stakes (from both a monetary and an emotional perspective), it should come as no surprise that there is a temptation for the first group to play fast and loose with the available scientific data. Findings that support global warming are highlighted, and those that do not are downplayed, omitted, or politicized. Global-warming computer models are frequently little more than high-tech "crystal balls." With the multitude of variables and assumptions that come into play, these computer models become highly suspect. In deciding which model to use, the critical question becomes: "How scary do you want the future to be?"
Ground zero in the global warming debate is the 1997 Kyoto Protocol Agreement. This treaty, yet to be ratified by the United States, calls for a reduction in greenhouse gases and fossil-fuel emissions to a level 5 to 7 percent below the 1990 benchmark year by 2012.1 The estimated compliance cost for the United States would be $300 billion a year.2 But global solidarity on ending global warming suffered a temporary setback last November at The Hague, where participating countries were unable to work out the details.
The Kyoto Protocol seems to be built on the following two assumptions: First, global warming is a function of human activity (with the biggest villains being automobiles, factories, and power plants), and second, we are currently experiencing unprecedented levels of global warming. However, a review of the earth's most recent "geological history" brings into question both assumptions and puts the entire subject in a different light.
For over a million years, the earth has undergone a succession of glacial and interglacial periods. Each glacial period lasted anywhere from 70,000 to 100,000 years. In the most recent one, ice covered all of what is now Canada and the northern third of the United States.3 To date, each glacial period has been followed by a very warm, yet much shorter, interglacial period of 10,000 to 30,000 years. In some of these interglacial periods, ice covered less area than today.
The last ice age ended approximately 10,000 years ago. This was followed by a period of significant global warming that lasted approximately 5,000 years. The average temperature in this time frame was 2 to 3 degrees Celsius higher than we find today. This caused the sea level to rise over 100 feet. The warmer climate also made it possible for broad-leafed forests to grow in latitudes much farther north than they do currently. In the most recent 5,000-year period, there have been numerous periods of distinct global warming and global cooling.4 However, the overall long-term climatic trend indicates that the earth has been getting cooler, not warmer.
There was a very pronounced medieval warm period from 700 AD to 1400 AD. Indirect evidence suggests that the average temperature was as much as 1.5 degrees Celsius warmer than today. In Europe, agriculture flourished at latitudes farther north and at higher elevations than today. Vineyards, which require sunny and warm conditions, existed in areas 300 miles north of the present limits. The cultivation of grapes for winemaking was extensive throughout the southern portions of England from about 1100 to around 1300. The amount of English wine produced was enough to provide significant competition with the French. As further evidence of a much warmer climate, the tree line in the Alps was 300 meters higher than we find today. …