Academic journal article Human Factors

Cognitive Anchoring on Self-Generated Decisions Reduces Operator Reliance on Automated Diagnostic Aids

Article excerpt


The use of automated diagnostic aids is becoming increasingly common within a variety of complex systems, such as aviation, nuclear power, and health care. The ultimate effectiveness of these aids, however, is seldom realized because operators tend to undertrust imperfectly reliable automation, resulting in its disuse (Parasuraman & Riley, 1997; Wickens & Hollands, 2000). In the case of automated diagnostic aids, such disuse is often manifested in the form of operator disagreements with the aid, even when aided performance is statistically more accurate than unaided performance (Wiegmann, Rich, & Zhang, 2001).

Disuse of automated aids can occur even when systems are opaque (i.e., provide little or no information to the operator upon which to base a diagnosis; Wiegmann, 2002; Wiegmann et al., 2001). For example, handheld analyzers are now commonly used to help airline luggage inspectors detect the presence of hidden contraband such as narcotics and explosives in passenger luggage. Given that luggage is generally opaque, operators must rely almost entirely on the automation to perform this analysis. Although such analyzers are fairly accurate, they do have a tendency to produce false alarms. Consequently, operators generally undertrust these devices and have reportedly ignored analyzer alarms (Parasuraman, Hancock, & Olofinboba, 1997). Anecdotally, operators formulate an opinion concerning the likelihood that a particular passenger might be dangerous and then interpret the automated alarm in light of this hypothesis--a phenomenon known as cognitive anchoring.
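The corrosive effect of false alarms on trust can be made concrete with a simple base-rate calculation (the rates below are hypothetical, chosen only for illustration and not taken from the article): when true threats are rare, even a sensitive analyzer produces mostly false alarms, so most alarms an inspector encounters are indeed spurious.

```python
# Hypothetical rates, for illustration only -- not from the article.
base_rate = 0.001          # P(contraband) for a given bag
hit_rate = 0.95            # P(alarm | contraband)
false_alarm_rate = 0.05    # P(alarm | no contraband)

# Bayes' rule: P(contraband | alarm)
p_alarm = hit_rate * base_rate + false_alarm_rate * (1 - base_rate)
posterior = hit_rate * base_rate / p_alarm
print(round(posterior, 4))  # 0.0187: fewer than 2 in 100 alarms are true
```

Under these assumed numbers, an operator who ignores alarms is "right" more than 98% of the time, which helps explain why undertrust of an objectively useful aid can feel locally rational.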

Cognitive Anchoring

A great deal of empirical evidence in cognitive and social psychological research suggests that when people are faced with discrepant information after choosing a certain hypothesis, they have a strong tendency to bias their belief revisions in favor of the initially chosen hypothesis (Einhorn & Hogarth, 1982; Hogarth & Einhorn, 1992; Tversky & Kahneman, 1974; Wickens & Hollands, 2000). They often behave as if they have attached a "mental anchor" to the initial hypothesis and do not easily shift it to the alternative, giving the impression that "first impressions are lasting." Likewise, operators of imperfectly reliable automation might attempt to circumvent automation errors by generating their own independent decisions about system states prior to receiving information from the diagnostic aid. Consequently, there may be a strong tendency for operators to "cognitively anchor" their final decisions onto their initial hypotheses, thus increasing the probability that operators will disagree with an aid that provides alternative information, regardless of the objective accuracy of the aid's diagnoses.
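One way to make the anchoring intuition concrete is an anchoring-and-adjustment update in the spirit of the belief-adjustment literature cited above (the specific function and weights below are an illustrative simplification, not the fitted Hogarth & Einhorn, 1992, model): the revised belief moves only a fraction of the way from the initial hypothesis toward the discrepant evidence, so a strong anchor leaves the final decision close to the first impression.

```python
def revise_belief(anchor, evidence_strength, adjustment_weight):
    """Illustrative anchoring-and-adjustment update: the new belief
    shifts from the anchor toward the evidence by only the fraction
    `adjustment_weight` (0 = total anchoring, 1 = full revision)."""
    return anchor + adjustment_weight * (evidence_strength - anchor)

# Operator's initial confidence that a bag is safe (hypothetical value):
anchor = 0.9
# The aid alarms, i.e., evidence strongly favoring "not safe":
evidence = 0.1

# A low adjustment weight models a strong anchor: belief barely moves,
# so the operator still judges the bag likely safe and disagrees with the aid.
print(round(revise_belief(anchor, evidence, 0.2), 2))  # 0.74
```

In this sketch, only an adjustment weight near 1 would let the alarm overturn the initial hunch, which mirrors the prediction that self-generated hypotheses increase disagreement with the aid regardless of the aid's accuracy.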

Self-anchoring. Decision makers have an implicit tendency to spontaneously generate their own independent hypotheses about states of the world based on available raw data and to anchor their final decisions onto these initial hypotheses (see Tversky & Kahneman, 1974). However, opaque systems afford automation users little or no access to raw data on which to base diagnostic decision making. Operators' hypotheses about system states are at best intuitive guesses or "hunches." The extent to which users of these opaque systems will independently generate (or guess) their own hypotheses about system states and cognitively anchor their agreements with an automated diagnostic aid onto these initial "hunches" is still unknown.

Automation utilization strategies are often influenced by the interaction of the user's perceived reliability of the automated aid and self-biases (Dzindolet, Pierce, Beck, Dawe, & Anderson, 2001; Lee & Moray, 1992; Riley, 1989). In general, humans tend to underestimate the reliability of imperfect automated aids (see Dzindolet, Pierce, Beck, & Dawe, 2002; Wiegmann et al., 2001). This lowered perceived reliability interacts with the operator's self-biases to lead to the choice of a particular automation utilization strategy. …
