For years to come, historians are likely to ponder the reasons that led the United States to invade Iraq in March 2003, an invasion that has cost many lives and vast treasure but has yet to result in the establishment of a stable, legitimate regime. Many reasons, official and unofficial, have been advanced to explain the 2003 invasion, but none has proved entirely persuasive. The original explanation given for US military action, to eliminate the threat supposedly posed by Saddam Hussein's stockpiles of weapons of mass destruction (WMD), no longer holds any credibility, as it has long since been shown that US officials were aware before the invasion that Iraq possessed few, if any, WMD munitions. The main explanation provided after the invasion, to foster democracy in Iraq and neighbouring countries, has also lost credibility because of the increasingly violent hostility between Iraq's religious and sectarian factions. Other explanations, drawing on the ideological and personality traits of George W. Bush and other key figures involved, have also been offered. As time goes on, however, analysts are sure to consider the deeper historical forces that gave rise to the 2003 invasion, and this mode of analysis is likely to lead us back to the "Carter doctrine" of 1980.
The Carter doctrine was first enunciated in then-President Jimmy Carter's State of the Union address of 23 January 1980, in which he informed Congress and the American people that access to the Persian Gulf's oilfields was essential to the health of the US economy: any hostile effort to block such access would therefore be considered an assault on America's "vital interests" and would be resisted by "any means necessary, including military force." To implement this policy, Carter established the Rapid Deployment Joint Task Force (RDJTF) and deployed a permanent US naval presence in the Gulf. And while they may have employed different language, all the presidents who succeeded Carter have reaffirmed the basic premises of his 1980 doctrine and have taken steps to enhance America's capacity to project military force into the greater Persian Gulf region.
Many formative expressions of America's Cold War policy, including the Truman doctrine, the Eisenhower doctrine, and the Nixon doctrine, have been rendered moot by the collapse of the Soviet Union; the Carter doctrine, however, continues to govern US foreign policy today. As will be argued below, the 2003 US invasion of Iraq, along with other US military moves in the Persian Gulf area, can be viewed as a natural extension of the Carter doctrine. But the logic of the Carter doctrine is no longer being applied to the Gulf alone: increasingly, access to oil in other producing areas is being viewed in Washington as a "vital interest," and thus as something that must be protected by military force whenever necessary. Indeed, the globalization of the Carter doctrine may prove to be one of the most significant developments of the post-Cold War era.
ORIGINS OF THE CARTER DOCTRINE
The origins of the Carter doctrine can be traced back to February 1945, when the United States first established a protectorate over Saudi Arabia and committed itself to the use of military force in protecting Persian Gulf oil. This move was triggered by Washington's concern over the nation's declining oil output and a desire to ensure the safety of its overseas supplies. Until 1945, the United States was largely self-sufficient in petroleum production and, in most years, produced a large enough surplus to satisfy the needs of many foreign consumers as well. But the requirements of wartime consumption plus intimations of an eventual decline in US output led President Franklin D. Roosevelt to seek control over foreign sources of petroleum. By 1943, he had concluded that Saudi Arabia was likely to assume the role of America's principal foreign supplier after World War II, and, by 1945, had determined that the United States must extend some sort of protective umbrella over Saudi Arabia's prolific oilfields. …