Management by Consent in Human-Machine Systems: When and Why It Breaks Down

Article excerpt

This study examined the effects of conflict type, time pressure, and display design on operators' ability to make informed decisions about proposed machine goals and actions in a management-by-consent context. Thirty B757 pilots were asked to fly eight descent scenarios while responding to a series of air traffic control clearances. Each scenario presented pilots with a different conflict that arose from either incompatible goals contained in the clearance or inappropriate implementation of the clearance by automated flight deck systems. Pilots were often unable to detect these conflicts, especially under time pressure, and thus failed to disallow or intervene in proposed machine actions. Detection performance was particularly poor for conflicts related to clearance implementation; these conflicts were most likely to be missed when automated systems did more than the pilot expected of them. Performance and verbal protocol data indicate that the observed difficulties can be explained by a combination of poor system feedback and pilots' difficulty in generating expectations of future system behavior. The results are discussed in terms of their implications for the choice and implementation of automation management strategies in general and, more specifically, for the risks involved in envisioned forms of digital air-ground communication in the future aviation system. Actual or potential applications of this research include the design of future data link systems and procedures, as well as the design of automated systems in any domain that rely on operator consent as a mechanism for human-machine coordination.

INTRODUCTION

In many domains the introduction of highly autonomous and powerful automated systems has resulted in an increased need for human-machine communication and coordination. Problems such as automation surprises (Sarter, Woods, & Billings, 1997) or a lack of mode awareness (Sarter & Woods, 1995) indicate, however, that the systems creating this need are not designed to support it. Current attempts to address the problem focus on improved system feedback or enhanced training for the human operator. In contrast, relatively few studies have examined the appropriateness and efficacy of various automation management strategies (i.e., different methods of shared human-machine control) for different tasks and task contexts.

The two automation management strategies that are implemented in most existing flight deck systems are management by consent and management by exception (Billings, 1997). In the case of management by exception, the machine is allowed to initiate and perform actions on its own. This approach requires relatively little explicit and observable human-machine interaction, but it imposes considerable monitoring demands on the operator and involves the risk of losing system and mode awareness. In contrast, management by consent refers to an approach in which the automation must ask for and receive explicit operator permission before taking any action (Billings, 1997). This approach presumably increases operators' awareness of and control over automation behavior, but it does so at the cost of increased communication demands.
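To make the contrast between the two strategies concrete, the following minimal sketch (illustrative only, not part of the original study; all names and the veto-window parameter are assumptions) models each strategy as a gate around a proposed automation action. Under management by consent, the automation blocks until the operator explicitly approves; under management by exception, it announces its intent and proceeds unless the operator intervenes within a veto window.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class ProposedAction:
        """An action the automation intends to take (e.g., begin a descent)."""
        description: str
        execute: Callable[[], None]

    def manage_by_consent(action: ProposedAction,
                          operator_approves: Callable[[str], bool]) -> bool:
        """Management by consent: the automation must obtain explicit
        operator permission before taking any action."""
        if operator_approves(f"Approve: {action.description}?"):
            action.execute()
            return True
        return False  # consent withheld; the action is disallowed

    def manage_by_exception(action: ProposedAction,
                            operator_vetoes: Callable[[str, float], bool],
                            veto_window_s: float = 10.0) -> bool:
        """Management by exception: the automation announces its intent and
        proceeds unless the operator takes exception within the veto window."""
        if operator_vetoes(action.description, veto_window_s):
            return False  # operator intervened; the action is suppressed
        action.execute()  # operator silence is treated as acceptance
        return True

The defaults make the trade-off described above visible: consent turns every machine action into an explicit communicative exchange, whereas exception treats operator silence as acceptance and thereby shifts the burden from communication to continuous monitoring.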

As a first step in a line of research to determine the desirability and effectiveness of possible coordination strategies, a survey of airline pilots was conducted to examine the reasons underlying their preferences for and experiences with the two aforementioned automation management strategies (Olson & Sarter, 1998). Overall, pilots expressed a strong preference for the management-by-consent approach. However, high levels of time pressure, workload, and task complexity, as well as low task criticality, led to a shift toward a preference for management by exception. Pilots' general preference for management by consent can be explained, in part, by their assumption of a higher level of control over machine actions under this approach. …
