Academic journal article Human Factors

Trust in Automation: Designing for Appropriate Reliance

Article excerpt

INTRODUCTION

Sophisticated automation is becoming ubiquitous, appearing in work environments as diverse as aviation, maritime operations, process control, motor vehicle operation, and information retrieval. Automation is technology that actively selects data, transforms information, makes decisions, or controls processes. Such technology exhibits tremendous potential to extend human performance and improve safety; however, recent disasters indicate that it is not uniformly beneficial. On the one hand, people may trust automation even when it is not appropriate. Pilots, trusting the ability of the autopilot, failed to intervene and take manual control even as the autopilot crashed the Airbus A320 they were flying (Sparaco, 1995). In another instance, an automated navigation system malfunctioned and the crew failed to intervene, allowing the Royal Majesty cruise ship to drift off course for 24 hours before it ran aground (Lee & Sanquist, 2000; National Transportation Safety Board, 1997). On the other hand, people are not always willing to put sufficient trust in automation. Some operators rejected automated controllers in paper mills, undermining the potential benefits of the automation (Zuboff, 1988). As automation becomes more prevalent, poor partnerships between people and automation will become increasingly costly and catastrophic.

Such flawed partnerships between automation and people can be described in terms of misuse and disuse of automation (Parasuraman & Riley, 1997). Misuse refers to the failures that occur when people inadvertently violate critical assumptions and rely on automation inappropriately, whereas disuse signifies failures that occur when people reject the capabilities of automation. Misuse and disuse are two examples of inappropriate reliance on automation that can compromise safety and profitability. Although this paper describes reliance on automation as a discrete process of engaging or disengaging, automation can be a very complex combination of many modes, and reliance is often a graded rather than binary process; the simplification nevertheless makes the discussion of misuse and disuse more tractable. Understanding how to mitigate disuse and misuse of automation is a critically important problem with broad ramifications.

Recent research suggests that misuse and disuse of automation may depend on certain feelings and attitudes of users, such as trust. This is particularly important as automation becomes more complex and goes beyond a simple tool with clearly defined and easily understood behaviors. In particular, many studies show that humans respond socially to technology, and reactions to computers can be similar to reactions to human collaborators (Reeves & Nass, 1996). For example, the similarity-attraction hypothesis in social psychology predicts that people with similar personality characteristics will be attracted to each other (Nass & Lee, 2001).

This finding also predicts user acceptance of software (Nass & Lee, 2001; Nass, Moon, Fogg, Reeves, & Dryer, 1995). Software that displays personality characteristics similar to those of the user tends to be more readily accepted. For example, computers that use phrases such as "You should definitely do this" will tend to appeal to dominant users, whereas computers that use less directive language, such as "Perhaps you should do this," tend to appeal to submissive users (Nass & Lee). Similarly, the concept of affective computing suggests that computers that can sense and respond to users' emotional states may greatly improve human-computer interaction (Picard, 1997). More recently, the concept of computer etiquette suggests that human-computer interactions can be enhanced by recognizing how the social and work contexts interact with the roles of the computer and human to specify acceptable behavior (Miller, 2002). More generally, designs that consider affect are likely to enhance productivity and acceptance (Norman, Ortony, & Russell, 2003). …
