Microcomputers: From Movement to Industry

The microcomputer is the child of an odd marriage between the military-industrial complex and counter-culture hackers from the fringes of the new left. Ironically, this product of the anti-authoritarian visions of its inventors has become the darling of financiers and a symbol of hope for the renewed dynamism of U.S. capitalism. The history of the microcomputer industry is thus a case study in both the constraints on innovation within large capitalist firms and the ability of those firms to co-opt and control the products of renegade inventors. It is also the story of the creative energies unleashed by the liberatory impulses of the late 1960s, and of the idealistic but fallacious belief that technical innovation per se can challenge the centralization of information and power, and the foundations of capitalist rule.

The Origins

Scientifically, the microcomputer represents a series of gradual developments rather than a sudden breakthrough. During the Second World War, teams sponsored by the military and intelligence agencies in the United States, Germany, and Britain developed the first digital computers. After the war, orders from the Pentagon, the Census Bureau, the Atomic Energy Commission, and other government agencies propelled the private sector to develop increasingly complex computers. Initially the circuitry was based on vacuum tubes, which were bulky and unreliable. A 1950-vintage machine with as much computing power as the word processor on which I am typing was the size of a room, broke down several times a day, cost millions, and required many full-time attendants. Early on, IBM established a dominant position in the computer industry, accounting for three quarters of worldwide sales during the 1950s and 1960s.

In 1947, scientists at Bell Laboratories, the research arm of AT&T, invented the transistor, which eventually replaced the vacuum tube and opened the way to more reliable, smaller, and cheaper machines. This invention evolved directly from solid state physics research sponsored by the military. The Pentagon continued to play a critical role in the development and dissemination of semiconductors throughout the 1950s and 1960s by sponsoring research and providing the major market for the resulting products. IBM quickly incorporated the new technology into its machines, and by 1960 had become the largest non-military customer of virtually every U.S. semiconductor manufacturer.

In contrast to the computer industry, which was dominated by IBM, the manufacture of the semiconductor components of computers (and other electronic equipment) was intensely competitive.

Major electronics firms, such as GE, RCA, and Sylvania, chose not to invest heavily in semiconductors, which competed with and undermined their existing vacuum tube product lines, leaving an opening for smaller newcomers such as Texas Instruments (TI) and Fairchild Camera. In the late 1950s, engineers at TI and Fairchild invented the integrated circuit, which combined many transistors on a single chip of silicon, allowing dramatic decreases in the size and cost of electronic components. These tiny, cheap, powerful circuits made possible devices that had hitherto been unthinkable, and fired the imagination of the firms' engineers. This blossoming of technical possibilities, combined with the relatively small amount of capital then needed to manufacture integrated circuits, led to a proliferation of new electronics companies, many of them in California's "Silicon Valley." The low cost and small size of components made them practical not only for military and industrial hardware, but for consumer electronics gadgets as well. Japanese firms gained a large share of the consumer electronics market, starting in 1957 with the export of transistor radios, though U.S. firms continued to dominate the world semiconductor market until the late 1970s. …