Human Factors

Team Play with a Powerful and Independent Agent: Operational Experiences and Automation Surprises on the Airbus A-320

INTRODUCTION

In a variety of domains, the development and introduction of advanced automated systems have led to an increase in the efficiency and precision of operations. At the same time, unexpected problems with human-automation interaction, related to "communication with machines rather than operation of machines" (Card, Moran, & Newell, 1983), have been observed. Effective communication with advanced automation technology is becoming increasingly important because these systems are no longer passive tools but, rather, agent-like devices that operate at a high level of autonomy and authority (Billings, 1997; Norman, 1990; Sarter & Woods, 1994). Advanced automation can initiate actions without immediately preceding or directly related operator input (i.e., autonomy), and it can modulate or override user input (i.e., authority). These properties of modern technology impose high attentional and knowledge demands on operators, who must maintain awareness of the automation's status, behavior, intentions, and limitations in order to coordinate their activities with the system efficiently.

Close observation of human-automation interaction in a variety of domains, particularly aviation (Sarter, Woods, & Billings, 1997), has shown that operators are often unable to anticipate and track automation activities and changes. The result is automation surprises, in which actual system behavior violates operators' expectations. Automation surprises begin with misassessments and miscommunications between the automation and the operator(s), which create a gap between the operators' understanding of what the automated systems are set up to do and how the systems are or will be handling the underlying process(es). This gap leaves the crew surprised later, when the system's behavior does not match the crew's expectations. The critical question is whether operators detect unexpected and undesirable process behavior in time to prevent or recover from negative consequences.

Past work has shown that three interrelated factors give rise to the gap between the actual system configuration and activities and the operators' expectations about automation behavior: (a) poor mental models, (b) low system observability, and (c) highly dynamic and/or nonroutine situations (Sarter & Woods, 1995; Woods, 1996). The increasing capabilities and complexity of automated systems create new knowledge demands for people charged with supervising system activities. Operators need to know more about how the system works and how to work the system in a variety of operational contexts (Abbott, Slotte, & Stimson, 1996). Increasing knowledge demands, coupled with static or decreasing training investments, can easily lead to significant gaps or misconceptions in users' mental models of the automated systems they manage.

Observability refers to the ability of available feedback to actively support operators in monitoring and staying ahead of system activities and transitions. Observability is more than mere data availability; it depends on the cognitive work needed to extract meaning from the available data. Automated systems often provide little or poor feedback about their current or future activities (Norman, 1990; Sarter & Woods, 1995). As a result, operators must perform significant cognitive work to infer what the automated systems are doing and what they will do next as operational circumstances change. Low observability also impairs long-term learning and the gradual improvement of operators' mental models with experience (Sarter & Woods, 1994).

The analysis of automation surprises reveals one more contributing or enabling factor: automation surprises tend to occur primarily in situations that involve a high degree of dynamism, nonroutine elements, or both. Such circumstances impose high demands on the human's attentional resources and therefore carry a high risk of losing track of system behavior (Sarter & Woods, 1994; Woods, Johannesen, Cook, & Sarter, 1994). …
