In recent years, the United States has entered a particularly active phase in its use of military force. Since 1989, the United States has intervened in Panama, Kuwait, northern Iraq, Somalia, Bosnia, Haiti, Kosovo, Afghanistan, and Iraq. And with the promulgation of the Bush Doctrine after the attacks of September 11, 2001, the United States appears poised to continue intervening actively in the future.
Much is new in the world that helps to explain this increased frequency. First, the United States now stands unrivaled in the international system. This concentration of power permits US decision makers to consider the use of force in almost any crisis. Second, since the end of the Cold War, a spate of violent regional and civil wars, combined with new information technologies, has enabled human rights activists to collect evidence of gross violations of international humanitarian law and to launch intense advocacy campaigns across the world. This activism has generated a flurry of new norms of humanitarian intervention and has pressured the United States to use its massive military arsenal to alleviate extreme abuses. Third, a new wave of interventions has occurred in response to emerging threats associated with terrorism and the illicit proliferation of weapons of mass destruction.
Despite all that has changed since 1989 and, more recently, since September 11, 2001, much remains the same. US citizens have always been divided about when and where the United States should use military force. Except on rare occasions of extreme national emergency, US decisions concerning the use of force are almost always contested. Political elites, in particular, differ about the nature of threats and about the costs and efficacy of using force. As a result, in almost every instance when a US President considers committing US forces to overseas combat missions, intense political debates erupt among US foreign policy elites about whether force should be used. Ultimately, decisions to intervene are almost always based on tenuous coalitions, not consensus. Because rhetorical campaigns are such an integral part of the process of mobilizing public and political support, the public frequently develops unrealistic expectations about the nature, likely cost, and efficacy of military intervention. These unrealistic expectations have profound implications not only for the intervention itself, but also for the long-term commitment to postwar reconstruction.
The Politics of Intervention
Famed Christian theologian Reinhold Niebuhr once noted that "every nation is caught in the moral paradox of refusing to go to war unless it can be proved that the national interest is imperiled, and of continuing in the war only by proving that something much more than national interest is at stake." For the United States, this acute moral paradox has had a peculiar political twist. The debates over Federalism and Republicanism between Alexander Hamilton and Thomas Jefferson that surfaced in the early days of the republic were never resolved. They marked only the beginning of an enduring tension and confluence between realism and idealism that remains the distinctive essence of US foreign policy today. Over the past two centuries, we have grown accustomed to the differences between realists and idealists, interventionists and anti-imperialists, isolationists and internationalists, hawks and doves, among others. Today, we hear common references to hardliners, selective engagers, liberal internationalists, humanitarians, isolationists, and pacifists. Whatever labels scholars or journalists attach to these differences, however, they all reflect one enduring element of US foreign policy: there is no singular or monolithic conception of the national interest or of American values and principles.
World War II is the only instance in history when the nature of the threat was so clear and unambiguous that it generated as close to a consensus as possible in US society. …