The use of automation in complex sociotechnical systems has proved to be a double-edged sword. It is a technology that, perhaps more so than any other, speaks with a forked tongue to system designers. On the one hand, it promises unprecedented reliability, reduced workload, improved economy, and fewer errors. On the other hand, it whispers of less tangible, but no less real, costs to operators in terms of skill degradation, mental isolation, and monitoring burdens (Hirschhorn, 1984; Norman, 1990; Parasuraman, Molloy, Mouloua, & Hilburn, 1996).
The human factors response to automation has been hesitant, perhaps on account of the discomforting discrepancy between the advantages and disadvantages just noted. Automation technology seems to possess a momentum that cannot be deterred, not to mention controlled. The warnings of human factors professionals examining automation effects seem to go unheeded by control engineers swept up in the wave of microprocessor and software innovations that sometimes overwhelm concerns for human limitations. Control systems automation is the epitome of a technology-driven enterprise. To be sure, it has been very effective in achieving some of its goals. For this reason, it is difficult to question its application. However, a number of authors have pointed out that automation introduces problems of its own (e.g., Wiener & Curry, 1980). How can engineers be convinced that automation technology is a tool and not a panacea for human-machine interaction problems (Billings, 1997)? The ironic solution that we adopt in this article is to use automation designers' own language, control theory, to point out the role of human factors considerations in the design of automation.
Reviews of the evolution of automation in complex systems and of the empirical research on human interaction with automation are already available in the literature (Billings, 1997; Moray, 1986; Mouloua & Parasuraman, 1994; Parasuraman & Mouloua, 1996; Parasuraman & Riley, 1997; Sheridan, 1987, 2002; Wickens, Mavor, Parasuraman, & McGee, 1998; Wiener, 1985). The novel contribution of this article is a domain-independent, control-theoretic framework for investigating human factors issues in automation. We demonstrate that the framework provides critical insight for evaluating the design of existing automation interfaces in nuclear power plants. We also show the value of this generic framework by using it to critically review the literature dealing with mode proliferation, mode transitions, mode awareness, and mode error. Finally, the control-theoretic framework is used to propose a set of design principles that augment existing guidelines to more effectively contribute to the design process.
A CONTROL-THEORETIC FRAMEWORK FOR STUDYING AUTOMATION
The following definitions will be employed in this article. In a departure from the journal's usual editorial style, these terms, and a few others, are capitalized throughout the manuscript to ensure that their use is not confused with other uses of the same or similar terms with which the reader may be familiar.
System: Plant + Controllers + Instrumentation + Interface.
Plant: Final Control Components + Process.
Final Control Components: controllable equipment that is used to influence a Process.
Process: the entity being controlled.
Controller: the automated means by which action is exerted on the Final Control Components.
Instrumentation: the means by which data about the Plant and Controllers are gathered.
Interface: Displays + Controls.
Displays: the devices through which Operators obtain information about the System.
Controls: the devices through which Operators take Action on the System.
Human Operator: the human actor who interacts with the System to achieve goals.
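To make the part-whole relations among these definitions explicit, the hierarchy can be sketched as a set of composed data types. This is only an illustrative rendering of the definitions above, not part of the framework itself; all class and field names are our own assumptions.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Process:
    """The entity being controlled."""
    name: str


@dataclass
class FinalControlComponents:
    """Controllable equipment used to influence a Process."""
    equipment: List[str]


@dataclass
class Plant:
    """Plant = Final Control Components + Process."""
    final_control_components: FinalControlComponents
    process: Process


@dataclass
class Controller:
    """The automated means by which action is exerted
    on the Final Control Components."""
    name: str


@dataclass
class Instrumentation:
    """The means by which data about the Plant and
    Controllers are gathered."""
    sensors: List[str]


@dataclass
class Interface:
    """Interface = Displays + Controls."""
    displays: List[str]
    controls: List[str]


@dataclass
class System:
    """System = Plant + Controllers + Instrumentation + Interface."""
    plant: Plant
    controllers: List[Controller]
    instrumentation: Instrumentation
    interface: Interface


@dataclass
class HumanOperator:
    """Deliberately modeled outside System, mirroring the
    distinction drawn in the text."""
    name: str
```

Note that, as in the definitions, the Human Operator is not a field of System: the Operator interacts with the System through its Interface rather than being a component of it.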
Note that the Human Operator has not been included within our definition of System, not because the Operator is outside our scope of analysis but because there is value in distinguishing the Human Operator from the System for the purposes of this article. …