The detonation of atomic bombs over the Japanese cities of Hiroshima and Nagasaki in August 1945 announced to the world the arrival of the 'atomic age'. Twelve years later, a team in New York led by E. Donnall Thomas reported the first use of bone marrow transplantation in cancer patients. These striking but very different developments may appear unconnected, but both are in fact intimately linked to the history of stem cell biology. In different ways, each was important for the identification in the late 1950s of the blood stem cell: the origin of the entire blood system and the first stem cell to be recognised. In the decades that followed, stem cell biology was to develop into a cutting-edge biomedical enterprise that today offers hope for a host of prevalent, debilitating diseases. Amid the contemporary expectations surrounding the prospect of new and powerful stem cell therapies, the atomic heritage of stem cell biology has largely been forgotten.
The Manhattan Project
The atomic bombs dropped on Japan in 1945 were the culmination of a top-secret American enterprise, the Manhattan Engineer District, more commonly known as the Manhattan Project. The initiative was prompted by growing fears that Nazi Germany might be developing an atomic bomb, following a number of advances in nuclear physics that suggested the theoretical possibility of such a weapon. Authorised by President Roosevelt in the summer of 1942, the project was placed under military control and commanded by General Leslie R. Groves. The Manhattan Project was a scientific mission unprecedented in scale: lasting three years, it involved over 150,000 personnel and cost over $2 billion. The vast and complex organisation spanned a number of sites across the US; secrecy was paramount and maintained by stringent security, including a policy of strict compartmentalisation at and between sites. Knowledge of the project was restricted to a handful of senior figures: even President Truman, who over the summer of 1945 agreed with his advisors on the use of the atomic weapon, learned of it only after succeeding Roosevelt in April.
Mobilisation of the project was swift and involved military personnel, scientists, commercial contractors and a large construction workforce. Each site had responsibility for a specific task: initial small-scale development of the nuclear reactor, for example, took place at the University of Chicago. Establishing 'proof of principle' for the chain-reacting nuclear reactor was key to the entire project: bomb production rested wholly upon a supply of fissile material, namely enriched uranium obtained by isotope separation and plutonium bred from uranium within the reactor. Reactor scale-up, together with uranium separation, was the province of a new installation at Oak Ridge, Tennessee. Here a quiet valley west of Knoxville was cleared of its few inhabitants and transformed in under a year into the fifth largest city in the state, with a population of 55,000 drawn largely from neighbouring towns and states. Two thousand miles away, towering reactors were under construction at Hanford, Washington, principally for plutonium production. The final phase, weaponisation, was overseen by the charismatic theoretical physicist J. Robert Oppenheimer at the remote site of Los Alamos in the New Mexico desert. The first nuclear bomb test, code-named 'Trinity', was carried out in July 1945 at Alamogordo, 200 miles from Los Alamos. Timing was crucial: the successful 'Trinity' test allowed Truman to hint to Stalin, at the Potsdam Conference held in July and August 1945, at the existence of a powerful new weapon. It also preceded by less than a month the use of the bombs against Japan.
A new era of radiation biology
The Manhattan Project culminated in events that profoundly shaped the postwar geopolitical map and in technologies that formed the basis of a new world order. It has been seen as exemplifying 'Big Science', a concept that highlights the changing scale of science after the Second World War and draws attention to shifts in its organisation, culture and practice. …