When he announced his bid for the presidency back in 1991, the then-governor of Arkansas, Bill Clinton, spoke movingly of the teacher who had most influenced his thinking, a Georgetown University professor named Carroll Quigley. Quigley was known for ripping apart a copy of The Republic while he denounced Plato as the intellectual father of totalitarianism. But it was not the classroom pyrotechnics that most impressed the future president. Rather, it was Quigley's emphasis on the future in his foundation course on Western civilization. Clinton never forgot the professor's preoccupation--and not just because he was one of only two students in the class to receive an A.
"The thing that got you into this classroom today is belief in the future, a belief that the future can be better than the present and that people will and should sacrifice in the present to get to that better future," said Quigley. "That belief has taken man out of the chaos and deprivation that most human beings toiled in for most of history to the point where we are today. One thing will kill our civilization and way of life--when people no longer have the will to undergo the pain required to prefer the future to the present. That is what got your parents to pay this expensive tuition. That is what got us through two wars and the Depression. Future preference. Don't ever forget that."
It is tempting to dismiss this preference for the future as a truism, an instinct for clan survival hard-wired into the genes of all living creatures. Adults of every species exert themselves to feed and protect their helpless young. Hunter-gatherers learn to salt and dry today's meat against tomorrow's hunger, and the most primitive peasants learn to save precious seed corn for next year's harvest. But advanced societies have embellished and refined the instinct into something much grander: an array of deliberate policy choices. These include investment in police and standing armed forces, education and economic infrastructure, and social health and welfare. Advanced societies extend welfare provisions even to the elderly, though they know that there is little genetic advantage to be gained from such expenditure on those beyond breeding age. They make these substantial income transfers from the working population to the retired for reasons of social cohesion and human decency--and possibly also from an acute sense of the propensity of the elderly to vote. Whatever the cause, this is an act of general political will that has little to do with the individual demands of our genes and everything to do with what we might call Quigley's Law: Successful societies are defined by their readiness to allow consideration of the future to determine today's choices.
The United States is a successful society today because over the past two or three generations it has applied Quigley's Law more thoroughly and more widely than any other society in history, and, in doing so, has shaped much of the world. Until 1940, the United States was not much more Quigley-minded than most other great powers. But the challenges of global war from 1939 to 1945, and the Cold War thereafter, persuaded successive administrations of both parties to apply Quigley's principles on a global scale. There had been a hesitant precedent in the way that the British Empire crushed piracy, abolished the slave trade, established the principle of freedom of the seas, and built lighthouses and ports available to all. But the strategy by which the United States waged the Cold War was altogether more grandiose in conception and more transforming in its application.
That extraordinary generation of policymakers gathered around Presidents Franklin Roosevelt and Harry Truman--George Marshall, Dean Acheson, George Kennan, Paul Nitze, Paul Hoffman, and others--established, with bipartisan support, a series of global institutions that, in effect, created The West, the global economic machine that brought together the wealth, markets, and ingenuity of North America, Western Europe, and Japan. …