Automation: A Familiar Technology Grows in Importance
Computers have become commonplace over the past decade, owing in part to high-powered advertising and mass media coverage. It is easy to forget that computers are not new but have been in use for several decades, although their focus has changed over the years. Perhaps the first example of the use of a computer for large-scale data processing was the Hollerith machine, which processed data collected in the census of 1890. The birth of the electronic digital computer, of course, followed that of mechanical devices for data processing, but it too has been with us for some time. An article in the Washington Post, "Honoring the Father of the Computer, 50 Years Late," gave credit to John V. Atanasoff for his work inventing computers at Iowa State University in 1937. 1 Further work by others, such as John Mauchly, resulted in the first digital computer, the Electronic Numerical Integrator and Computer (ENIAC), used by the U.S. Army in 1946. Although many computer buzzwords have come and gone, the term automation is generic and covers the evolution of computers from the first Hollerith machine to the latest microcomputer or personal computer (PC). Despite the trends and fads that have characterized the various phases automation has passed through, certain commonalities apply wherever it is used.
Automation in particular is not new to management analysts, who have been centrally involved with it since it began to spread throughout the federal government just after World War II. In fact, the methods examiners and efficiency experts who were the predecessors of today's management analysts almost certainly were involved with the use of the Hollerith machine to process census data. Automation