Computer Manufacturing: Change and Competition


A historical study of the computer industry reveals that computer manufacturers add workers to their payrolls and then shed them as the products they manufacture undergo technological change and face increased demand and international competition; what is the impact on workers in the industry?

Technological breakthroughs in the computer industry have been dramatic. For example, musical birthday cards have more computing power than existed anywhere in the world prior to 1950.(1) In addition, computing power that once cost millions of dollars can now be had for hundreds. Gordon Moore, chairman of Intel, has stated: "If the auto industry had moved at the same speed as our industry, your car today would cruise comfortably at a million miles an hour and probably get a half a million miles per gallon of gasoline. But it would be cheaper to throw your Rolls Royce away than to park it downtown for an evening."(2)

Technology's effects on employment in the computer industry have been as dramatic as its effects on the cost and speed of computers. After the introduction of the world's first personal computer in 1975, the industry enjoyed many years of phenomenal employment growth. Between 1960 and 1984, employment in the manufacturing of computers and computer equipment rose by 259 percent,(3) compared with a 74-percent increase in total nonfarm payroll employment. However, from 1984 to 1995, the computer manufacturing industry changed course, losing 32 percent of its work force - one of the swiftest declines among all manufacturing industries during the period. This article discusses some of the reasons behind the decline, which include a shift in focus from computer hardware to software, changing industry dynamics, international competition, and emerging technologies.

Early computers

Technological innovations that began as early as the 1600's - such as the first mechanical calculating device - have ushered our society into what is now called "the information age."(4) The concept of the computer can be credited to English mathematician Charles Babbage, who in 1823 designed what became the theoretical model for modern computers. His design included devices for entering and storing data, performing calculations, and displaying the results. It was not until 1945 that J. Presper Eckert and John W. Mauchly built the first electronic computer - the ENIAC (Electronic Numerical Integrator and Computer). This huge machine measured 30 feet by 50 feet, weighed 30 tons, and contained 18,000 vacuum tubes, 6,000 switches, 1,500 relays, and hundreds of plug wires. It could add 5,000 10-digit numbers in one second, a rate 1,000 times faster than any other calculator in existence.(5) Although it was designed to calculate ballistic trajectories, it was used to decide whether it was possible to build a hydrogen bomb; the calculation took 6 weeks and half a million punch cards, and the answer was correct. Inventions such as the transistor in 1947 and the microprocessor in 1971 led to the creation of the world's first personal computer in 1975 by Ed Roberts. Not far behind were the Apple and IBM versions of the personal computer. Since then, the computer has evolved rapidly, with new technologies not only reducing size and cost, but also increasing memory and processing power tremendously.

Computer manufacturing

The developments in computer technology prior to 1957 led the Office of Management and Budget to create "computer manufacturing" as an industry classification - SIC 357.(6) This industry is defined as those establishments primarily engaged in the manufacture of electronic computers, peripherals, and storage devices. In 1994, the industry employed 354,000 people and accounted for 1.9 percent of all manufacturing employment. This compares with 4.9 percent employed in motor vehicles and equipment, 2.6 percent in aircraft and parts, and 3. …