According to Gordon Craig and Alexander George:
Cold War is a descriptive term that was generally adopted in the late forties to characterize the hostile relationship that developed between the West and the Soviet Union. While loosely employed, the term had an exceedingly important connotation: it called attention to the fact that, however acute their rivalry and conflict, the two sides were pursuing it by means short of another war and that, it was hoped, they would continue to do so. As some commentators noted, however bad the Cold War was, it was better than a hot one, and few would deny that the Cold War was an acceptable substitute for a thermonuclear war with the Russians, if that indeed were the only alternative. 1
For nearly half a century, the Cold War defined the parameters of international relations in general and American foreign policy in particular. Rooted in the fissures that existed among the Allies during the Second World War, the Cold War matured and evolved to accommodate, among many new factors, nuclear weaponry, the rise of a distinct version of Communism in Asia, and the aspirations of a postcolonial generation of underdeveloped nations.
Initially, the Cold War reflected a number of important points of departure that defined the western alliance clustered around U.S. leadership and the Soviet-led Communist bloc. One of these points of departure involved the strategic shift in power that followed the Second World War. After 1945, the former great powers of the nineteenth century, particularly Great Britain, were in retreat throughout the world, creating significant power vacuums in their wake. In the postwar period, these powers were progressively forced to surrender their former domains by default (e.g., in the case of Greece and Turkey) or as a result of war (e.g., in the case of French Indochina). In this context, the Cold War became a struggle over the population and resources of the devolving colonial periphery as well as the core nations formerly in possession of it.