Article #16, 2000
 
 
 
Dealing with Conflicting Information

Kathleen L. Mosier, Jeffrey Keyes, and Roberta Bernhard
San Francisco, CA

Omission and commission errors resulting from automation bias, the tendency to rely on automated cues as a heuristic replacement for vigilant information seeking and processing, have been documented in professional pilots and students, and in both one- and two-person crews. The underlying causes of omission errors have been traced in part to vigilance problems: crews monitoring flight progress and system status often "miss" events that are not pointed out to them by automated systems. The causes of commission errors are harder to track. It has been hypothesized that they may be related to pilots' desire to "take action," since proactivity has typically been associated with superior crew performance. In this work, which focused on regional (Part 135) operations, we investigated how pilots choose among conflicting sources of information, first through an analysis of ASRS (Aviation Safety Reporting System) reports and then through a paper-and-pencil scenario study.

Data for the ASRS study were obtained from the ASRS CD containing the database for the years 1994-1998. Using several broad search terms, such as "automation" (or the names of specific automated displays or instruments) and "conflict," we created a preliminary sample of 1,200 reports that were potentially relevant to our study. Each of these was screened for appropriateness, and we identified 189 ASRS reports in which automation was involved. Incidents were coded with respect to the sources of information cited in connection with the critical incident, whether those sources provided consistent or conflicting information, and how the incident was resolved. We were particularly interested in incidents involving conflicting information from different sources, and found that most of these (N = 24) were traffic incidents involving a conflict between TCAS information and some other source (ATC or visual cues). Traffic incidents were also those most often cited as involving high risk. Analyses indicated that, when TCAS information entailed taking evasive action, crews typically followed the TCAS recommendation, even when visual information contradicted the need for the maneuver. These incidents supported the notion of a "take action" tendency.
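To make the two-stage screening concrete, here is a minimal sketch in Python of how such a first-pass keyword filter might look. The search terms and function names are illustrative assumptions, not the study's actual materials, and the second (coding) stage was done by hand.

    # Hypothetical sketch of the first-pass keyword screen. The real study
    # searched the ASRS CD database; the terms below are illustrative only.
    AUTOMATION_TERMS = {"automation", "autopilot", "fms", "tcas"}
    CONFLICT_TERMS = {"conflict", "disagree", "contradict", "discrepancy"}

    def is_candidate(report_text: str) -> bool:
        """Flag a report that mentions both an automated system and a conflict."""
        text = report_text.lower()
        return (any(t in text for t in AUTOMATION_TERMS)
                and any(t in text for t in CONFLICT_TERMS))

    def screen(reports):
        """Build the preliminary sample; surviving reports are then read and
        coded by hand for the sources cited, consistency vs. conflict, and
        how the incident was resolved."""
        return [r for r in reports if is_candidate(r)]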

For the scenario study, 125 regional airline pilots were asked to respond to a packet of scenarios. Each scenario described a situation involving conflicting information from two sources: an automated source and either a human source or a traditional instrument indication. Information from one source suggested making some change (action); information from the other suggested maintaining the status quo. Seven of the scenarios were matched between packets; that is, the same scenario was manipulated so that in one packet the information from the automated source suggested action, and in the other packet the information from the other source suggested the same action. One scenario contained conflicting action recommendations: an automated source indicated that one of two engines was on fire, while traditional indications suggested that it was actually the other engine that was damaged. Pilots saw only one version of each scenario. They were asked to choose between two decision options and to rate both their confidence in the choice and the risk involved in the scenario.
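As a way of visualizing the matched-packet manipulation, the sketch below pairs the two versions of one scenario. The scenario content and source labels are hypothetical; only the counterbalancing structure is taken from the text.

    from dataclasses import dataclass

    @dataclass
    class Scenario:
        description: str
        action_source: str      # the source recommending a change
        status_quo_source: str  # the source recommending no change

    def matched_pair(description, automated_source, other_source):
        """Return the packet-A and packet-B versions of one matched scenario:
        the same conflict, with the action recommendation swapped between
        the automated source and the other source."""
        return (
            Scenario(description, automated_source, other_source),  # packet A
            Scenario(description, other_source, automated_source),  # packet B
        )

    # Hypothetical example; each pilot would see only one of the two versions.
    version_a, version_b = matched_pair(
        "Approach speed conflict", "FMS advisory", "ATC instruction")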

We found no systematic evidence of a preference for automated information; in fact, in no scenario pair was the automated source followed in both packet versions. Rather, we saw a pronounced scenario effect: in most scenarios there was high agreement across packets on the preferred option, the risk level of the scenario, and the confidence with which pilots chose an option. For the pair of scenarios containing conflicting engine-fire indications (which engine was on fire), pilots most often believed the traditional indicators. We did not find evidence of a systematic preference for action (which was, in most cases, the more conservative option), although the higher the estimated risk of a scenario, the more likely pilots were to choose action, and the more confident they were in their choice.
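The reported link between estimated risk and choosing action is the kind of relationship a simple logistic regression would capture. Below is a sketch using statsmodels; the inputs are illustrative placeholders, not study data, and the variable names are assumptions.

    import numpy as np
    import statsmodels.api as sm

    # Illustrative inputs only (not study data): each row is one response,
    # with the pilot's risk rating and whether the action option was chosen.
    risk_rating = np.array([1, 2, 2, 3, 4, 4, 5, 6, 6, 7])
    chose_action = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

    X = sm.add_constant(risk_rating)            # intercept + risk predictor
    fit = sm.Logit(chose_action, X).fit(disp=False)
    print(fit.params)  # a positive risk coefficient means higher perceived
                       # risk predicts a higher probability of choosing action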

The results of this study are encouraging in that they suggest we may be able to mitigate automation bias if we train pilots early enough in their careers to evaluate automated cues in the context of other cues. However, we need to be cautious about generalizing from the paper-and-pencil format. It presents information differently than the cockpit does, and allows that information to be processed in a less biased, more analytical way. Additionally, we have previous evidence that, when encountering a situation in an actual or simulated aircraft environment, pilots do not always do what they say they would do. Follow-up studies will be required to determine whether the results of the paper-and-pencil study hold in other venues.

Contact Kathleen L. Mosier
