AMERICA'S DIGITAL REVOLUTION
In 1957, about a decade after the invention of the transistor at Bell Laboratories, AT&T's Western Electric semiconductor plant in Allentown, Pennsylvania, employed 4,000 workers. They produced an unprofitable five transistors per worker per day, all used by the telephone company. By 1983, when AT&T decided to sell chips on the open market for the first time, the Allentown plant still employed about 4,000 workers. They were still manufacturing transistors. But now they produced some 6.4 trillion of them a year, integrated on microchips, or some 5.3 million transistors per worker per day: a productivity increase by a factor of 1.06 million. Each of the roughly 10 million transistors sold from Allentown in 1957 cost about $2.50; in 1983 they sold for a few thousandths of a cent apiece. A decade later, in 1992, transistors would sell for 500 millionths of a cent.
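A rough sanity check of these figures is possible, as a sketch only: it assumes the 6.4 trillion transistors are an annual total and roughly 300 working days per year, neither of which is stated above beyond what the per-day figure implies.

    # Back-of-the-envelope check of the productivity figures cited above.
    # Assumptions (not stated in the text): 6.4 trillion is annual output,
    # and a year contains roughly 300 working days.
    workers = 4_000               # Allentown headcount in both 1957 and 1983
    per_worker_day_1957 = 5       # cited 1957 output per worker per day
    total_1983 = 6.4e12           # cited 1983 output, integrated on microchips
    working_days = 300            # assumed working days per year

    per_worker_day_1983 = total_1983 / workers / working_days
    factor = per_worker_day_1983 / per_worker_day_1957

    print(f"1983 output per worker per day: {per_worker_day_1983:,.0f}")  # ~5,333,333
    print(f"productivity increase factor:   {factor:,.0f}")               # ~1,066,667

Under those assumptions the arithmetic reproduces the "some 5.3 million transistors per worker per day" and the factor of roughly 1.06 million quoted above.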
The increase in productivity and the decline in price were accompanied by a radical rise in quality. Each transistor made in 1992 was far cheaper to operate, far more reliable, and incomparably more useful than the earlier devices. In the late 1950s, the transistor was a relatively rare and expensive component, used in pocket radios, hearing aids, and a few other specialized products in addition to telephone equipment. By 1992, connected by the millions in integrated circuits less than a quarter-inch square, transistors had heralded the finally triumphant computer revolution.
This entire industry was invented in the United States, developed here, and exported to the world. The Information Age is