There was no difference in the number of correct diagnoses between conditions, but the times to diagnosis indicate that the key difference was not whether the subjects determined the cause but when they did so. As predicted, when the subjects knew what was going on, they were better able to make the correct system response.
One interesting result not shown above was the response of two subjects in the Faultfinder condition who correctly diagnosed the leak and correctly shut down the engine but did not divert to the nearest alternate as mandated by ETOPS rules (both subjects were ETOPS rated). In the baseline case, all three subjects who shut down the engine chose to divert. This raises the possibility that the additional systems information may affect mission performance. This finding agrees with Rogers et al.'s (1996) discussion of the propagation of information between different operational levels. Because this response was unexpected, it was not addressed in the standard experiment debriefing, and therefore the reason the pilots did not divert may never be known. However, one could speculate that the subjects knew that the problem was a leak, that the leak was secured, and that the engine was still good (although handicapped). Therefore, the subsystem was operational, just not operating. This, coupled with the fact that the alternate was only 100 miles closer and the passengers were expecting to go to Honolulu, may have led the subjects to press on to the destination. This was not necessarily a wise decision given that the right engine had failed earlier in the flight, but it is understandable.
Only one subject (Faultfinder condition) felt that the engine failure and fuel loss were related. This subject elected not to shut the engine down because he believed there was some contaminant in the fuel. Most, but not all, of the information provided by Faultfinder could have supported this hypothesis. When the FUEL DISAGREE message appeared, the subject persevered in his "contaminant" hypothesis even though it did not explain the fuel loss.
In summary, it appears that providing this additional fault management information has operational value even when the displays are not optimized for the information. As Tenney et al. (1997) pointed out in their analysis of the data, the additional information may have assisted the crew in performing more knowledge-based processing rather than simply skill- or rule-based processing. To be sure, there is more research to be done on issues such as false alarms, missed alerts, and over-reliance on such a system; however, this research and others like it (Trujillo, 1997) demonstrate significant potential.
The authors wish to acknowledge the support of John Barry, Dr. William Rogers, Dr. Yvette Tenney, Capt. Skeet Gifford (ret.), Capt. Dave Simmon (ret.), and Myron Sothcott, without whom this experiment would not have been possible.