Why Better Operators Receive Worse Warnings

Human Factors (journal article excerpt)

INTRODUCTION

The introduction of advanced technologies to practically all domains of life has increased the availability of dynamic warnings, alarms, and decision aids that are designed to help people perform tasks or to alert them about potential hazards. Warnings can be considered a basic form of automation through which a system informs the operator about the possible existence of a problem. Systematic research and informal observations show that operators may come to rely to different degrees on warnings and alarms. One extreme is the so-called cry-wolf phenomenon (Breznitz, 1983), in which an operator ceases to respond to a valid warning (see also Bliss, Gilson, & Deaton, 1995). Even when operators continue to respond, their responses are likely to be slow when a high percentage of warnings are not justified (Getty, Swets, Pickett, & Gonthier, 1995). In extreme cases the operator may even actively disable a warning system if it gives too many invalid warnings (Sorkin, 1988). The other extreme is akin to automation bias (Mosier, Skitka, Heers, & Burdick, 1998) or the notion of complacency when using automation (Parasuraman, Molloy, & Singh, 1993). Here operators rely strongly on automation and miss instances in which the automation fails. In the context of warnings, the operator may come to rely entirely on the warning system and cease to attend to additional information.

The fact that operators respond in complex ways to warnings and may alter their responses over time poses a major challenge for system designers, who need to specify the characteristics of warning systems that will be implemented in a device. It would be ideal to be able to predict how people will respond to warnings, given the characteristics of the warning and other relevant variables.

Warning systems are usually viewed as having specific, fixed levels of validity. The present paper demonstrates that this view may not always be appropriate. We suggest that the occurrence of hazard warnings in complex systems often depends on the operators' characteristics, and that the diagnostic value of a warning system will usually decrease for better operators.

CHARACTERIZING WARNING SYSTEMS

For the simplest warning system there are two possible states of the monitored variable, such as normal operation and failure, and a binary warning indicates the state of the system (e.g., a green or red indicator light, or a warning signal that is either on or off). We will refer to the two system states as N for normal operation and F for a system failure. Accordingly, the output of the warning system will be labeled n when it indicates normal operation and f when it indicates a failure. The two states of the system and of the warning form four combinations, which can be labeled as true positive (TP) for a warning that is given when there is a failure (f|F), true negative (TN) for no warning when the system is intact (n|N), false positive (FP) for a warning that is given when the system is intact (f|N), and false negative (FN) for no warning when there is actually a failure (n|F). Other terms that are frequently used are hit for TP, correct rejection for TN, false alarm for FP, and miss for FN. See Figure 1 for a schematic depiction of the four combinations.
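To make the four combinations concrete, the following minimal Python sketch (ours, not from the article; the function and argument names are illustrative) classifies a single warning output against the true system state:

```python
def classify(warning_given: bool, failure: bool) -> str:
    """Map one warning output / system state pair onto the four combinations.

    warning_given: True if the warning output is f (warning on),
                   False if it is n (no warning).
    failure:       True if the monitored system is in state F (failure),
                   False if it is in state N (normal operation).
    """
    if warning_given and failure:
        return "TP"  # hit: warning given during a failure (f|F)
    if warning_given and not failure:
        return "FP"  # false alarm: warning given while the system is intact (f|N)
    if not warning_given and failure:
        return "FN"  # miss: no warning despite a failure (n|F)
    return "TN"      # correct rejection: no warning, system intact (n|N)
```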

One common way to describe the diagnostic value of a system is through the probability of hit, which is the percentage of TP out of all failure events:

p(hit) = p(f|F) = TP / (TP + FN).

In medical terms this measure is referred to as the "sensitivity" of a diagnostic test. It is usually accompanied by the probability of false alarm, which is the percentage of FP out of all normal events:

p(FA) = p(f|N) = FP / (FP + TN).

The corresponding medical term is

1 - p(FA) = p(n|N) = TN / (FP + TN),

which is the "specificity" of the test (Swets & Pickett, 1982). …
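As a brief illustration (ours, not from the article; the function name and the example counts are hypothetical), the three measures defined above can be computed directly from the raw counts of the four outcomes:

```python
def warning_measures(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Diagnostic measures of a binary warning system, computed from event counts."""
    return {
        "p(hit)":      tp / (tp + fn),  # sensitivity, p(f|F)
        "p(FA)":       fp / (fp + tn),  # false-alarm probability, p(f|N)
        "specificity": tn / (fp + tn),  # p(n|N) = 1 - p(FA)
    }

# Example: 90 hits, 10 misses, 30 false alarms, 870 correct rejections
print(warning_measures(tp=90, fp=30, tn=870, fn=10))
# p(hit) = 0.9, p(FA) ≈ 0.033, specificity ≈ 0.967
```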
