Supreme Command in the 21st Century. (Joint Warfighting)
Cohen, Eliot A., Joint Force Quarterly
The term supreme command figures in a book by the same title that is too rarely read today: a memoir of World War I by Maurice Hankey. (1) A small, neat, bald man, Hankey was a former Royal Marine officer and model civil servant known to two generations of British politicians as "the man of secrets." From 1912 to 1938 he served as secretary to the Committee of Imperial Defence and the Cabinet, a position that gave him a unique perspective on supreme command. Ironically, this man of secrets struggled with the censors to get his sober memoir published. The tale Hankey tells is that of supreme command as bureaucratic process: interwoven political and military decisionmaking at the top levels of government. The British, masters of the art of committee work, established the modern pattern of supreme command in the Committee of Imperial Defence, a rough model for the National Security Council created in the United States in 1947.
Supreme command as bureaucratic process consists of three elements. The first is the development of specialized, trained military staffs, which began in the 19th century and matured in the 20th. As late as the interwar period, some American war plans called for Washington-based staffs to sally forth into the field or establish command posts at sea, but by the outbreak of World War II those ideas were understood to be impractical if not downright dangerous. War is a complex bureaucratic effort that requires evaluating intelligence reports, managing the flow of materiel, and preparing strategic and operational plans that look out six months to a year or more. Thus supreme command as process requires modern strategic command posts as centers of activity in the White House and Pentagon when war breaks out.
The second aspect of contemporary supreme command, standing committees to coordinate the work of the military and later of government agencies, was primarily a result of World War II, though the practice did not spread to some regions of the world until the end of the century. While the war gave birth to both the Joint Chiefs of Staff and a permanent secretariat to support them, it took nearly 40 years for the Joint Staff to assume its current form. Similarly, the National Security Council and its web of committees and multilevel working groups did not mature for decades and continues to evolve today with the organization of a homeland security department.
Finally, communication from the field to the center of government has progressed from Abraham Lincoln's use of the War Department telegraph office as the first situation room to today's live video feeds into presidential airborne or buried command posts. As world politics adjusted to instantaneous television coverage, so did the requirements of supreme command. Despite fears of overcentralized decisionmaking, the impulse to pull more information to the highest level persists, and it does not appear to lag behind technological advances in the civilian sector.
However, supreme command is not only a set of vital mechanisms, procedures, and innovations, but also a more fundamental phenomenon. In this sense, it consists of the relationship between civilian leaders and military commanders; it is civil-military relations at the top in wartime, and as such it involves problems as old as war itself. To paraphrase Winston Churchill, the story of supreme command is one of reciprocal complaints by politicians and generals. In the United States, politicians fret over military options while soldiers complain about micromanagement, interference, and ambiguous guidance.
The Normal Theory and Unequal Dialogue
Implicit in this latter set of complaints (the former gain scant attention) is a common view of what a healthy civil-military relationship should look like--that is, what one might call the normal theory of civil-military relations. This theory holds that there should be a division of labor between soldiers and statesmen. …