Information Technology Management
Richard L. Nolan
Evolving from the earlier periods of electromechanical automated data processing (ADP) technologies described in chapter 6, the modern digital computer came into its own as a form of information technology during the period from 1960 to 2000. Heralded by the advances of the digital computer, the acronym IT, short for information technology, is thought to have been introduced by Europeans.1 The term, which Americans rapidly adopted, signified the digital convergence of data, voice, and video. Also during this time, organizations continuously reinvented and assigned new functions to the computer as improved economics and organizational learning dictated. Eventually these changes accumulated into an information revolution that changed the way companies structured and managed themselves.
The Stages Theory, first proposed in 1973,2 has been widely used as a normative theory for the management of IT. The theory is based on the notion that the complicated nature of computer technology would produce a body of knowledge on the effective management of IT within an organization. Accordingly, the assimilation of computer technologies, and more broadly of information technologies, required bold experimentation, out of which emerged four stages of organizational learning.
These four distinct stages of organizational learning formed an "S-shaped" curve. Stage I: Initiation was characterized by limited investment and contained experimentation aimed at proving the technology's value to the organization. Following initiation, the steep part of the S-shaped curve (Stage II: Contagion) represented a period of high