Mistakes – The Failure of Rules

Whereas slips and lapses are a function of inadequate attention applied to skill-based behaviour, mistakes flow from the selection of future courses of action. Mistakes are the result of both rule-based and knowledge-based behaviour. In this section, we will look at the application of rules to the management of work. Quite simply, rules are repertoires of ‘IF … THEN’ statements acquired in training or through direct experience. Unfortunately, rule selection can be problematic. We are now moving into the realm of conscious, premeditated activity (although rule-based behaviour can also be incorporated into fairly automatic routines).

In October 2004, a B-737-300 failed to pressurise after take-off and the crew elected to land and change aircraft. Later, the captain was surprised to discover, from the output of the flight data recorder, that the aircraft pressurisation had not been turned on in the first place. On reflection, the captain recalled that he had checked the pressurisation switches and had seen that they were both in the ‘off’ position. However, as the checklist was underway and it called for the switches to be moved one at a time, he assumed that this action was pending and did not think to query it. In effect, he had sampled the work process at what he thought was too early a stage in its progression. After speaking to his FO, he was surprised to find that the FO, too, had seen the switches in the ‘off’ position but had assumed that the captain had moved them for some reason. Both pilots had created a ‘rule’ to explain the configuration of the aircraft. The behaviour of the FO also points to the role of social hierarchies in shaping action.
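
For readers who find a concrete illustration useful, the idea of a rule repertoire can be sketched as a small set of condition-action pairs. The sketch below is purely illustrative: the observations, stored responses and the Python representation are invented for this example and do not describe any real procedure.

    # Purely illustrative sketch of a rule repertoire as IF ... THEN pairs.
    # The observations and stored responses are invented for the example.
    rule_repertoire = {
        # IF <observed condition>          THEN <stored explanation or action>
        "pressurisation switches off":     "checklist step still pending - leave them",
        "switches in unexpected position": "the other pilot must have set them",
    }

    def apply_rule(observation: str) -> str:
        """Return the stored response for an observation, if one exists."""
        return rule_repertoire.get(observation, "no stored rule - query it")

    # Both pilots matched what they saw to a plausible stored explanation,
    # so neither queried the switch positions.
    print(apply_rule("pressurisation switches off"))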

The crew of the Helios aircraft were engaged in more formal rule application. As the aircraft climbed through 12,000ft, the cabin altitude warning horn sounded. The horn alerts the pilots when the cabin altitude exceeds 10,000ft, the point at which the partial pressure of atmospheric oxygen starts to fall below that required to support normal human functioning. From the flight data recorder, we can see that the crew first disengaged the autopilot and then the autothrottles. The thrust levers were briefly retarded, then the automatics were reset and the aircraft continued its climb. What the crew did not do was recognise that the cabin was not pressurising. Instead, they called dispatch for advice.

The warning horn actually serves two purposes. At lower levels, it warns the crew if the aircraft is incorrectly configured for take-off or landing. The usual context for hearing this horn is on approach with the power levers at idle and the undercarriage still up. The use of the same warning for two different conditions is an example of an engineering solution (Level 3) to satisfy an airworthiness requirement (Level 4), the assumption being that the differences between these two circumstances will avoid confusion. The crew of the Irish aircraft departing Cork also experienced the horn sounding at 10,000ft but the captain chose to disregard it as spurious. He had misidentified the cause of the horn, presumably because of its stronger association with the undercarriage on approach. The actions of the Helios crew in moving the power levers suggest that they also associated the horn with a configuration problem rather than the cabin altitude.

If we return to the information processing model discussed in a previous chapter, cues from the environment are detected and attended to. They are then brought into working memory and processed. The crews of both the Helios and the Irish aircraft mapped the cue onto sets of examples, held in long-term memory, of when the horn would sound. They then applied the strongest association - configuration warning - and disregarded the input as spurious. The use of a wrong rule (IF <warning horn> THEN <configuration problem>) caused them to stop searching for other possible associations. We can see the role of bias in decision-making at work in these situations.

In October 2004, the crew of a 737-300 experienced problems with the autothrottle and the transponder in the climb. Suspecting a failure of the air/ground sensing system, the crew were fault-finding when they got an intermittent sounding of the warning horn. Assuming that this was the configuration warning, they were able to partly rationalise the symptoms they were encountering. Although, in this case, the cabin pressurisation had been set correctly, the system had actually failed. While the crew were distracted, the cabin altitude had been slowly rising until it set off the horn. This is an example of ‘confirmation bias’: information was used to confirm the current assessment of the situation when, in fact, the new information was completely unrelated. The captain of the aircraft later commented, ‘after a few minutes the FO remembered that the take-off warning horn doubles as the cabin altitude warning. He checked the cabin altitude and saw that it was above 10,000 ft and that we were slowly losing pressure’. In this case, the two pilots were individually processing cues from their environment. They were fitting information into their model of aircraft systems in order to develop a plausible explanation for the symptoms they were encountering. However, the FO then recalled an additional piece of information that offered a completely different narrative. Whereas the initial response was to look for commonality between the existing problem and any new information as it arrived, the correct solution emerged when the FO treated the horn in isolation and sought other associations that could be made.

It is important to recognise that breaking out of one mode of sense-making and developing an alternative strategy is effortful. We also need to consider what we referred to as ‘startle’ in Chapter 2. In November 2003, a captain reported, ‘I was amazed at how long - 10-15 seconds - it took us to fully realise what was happening. I always thought I would instantly recognise the take-off warning as the cabin altitude warning once I was airborne. Not so!’
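
The sequence described above - a cue is detected, mapped onto stored examples and resolved to the strongest association, at which point the search stops - can be caricatured in code. The following sketch is only a teaching aid: the cue labels and association strengths are invented, but it shows how selecting the most familiar match and then stopping is exactly what allows a wrong rule, and with it confirmation bias, to persist.

    # A caricature of rule selection by strongest association.
    # The association strengths are invented; in reality they reflect how often
    # a cue has been paired with a cause in training and line experience.
    associations = {
        "warning horn": [
            ("take-off/landing configuration problem", 0.9),  # the familiar pairing
            ("cabin altitude above 10,000 ft", 0.2),           # rarely met airborne
        ],
    }

    def interpret(cue: str) -> str:
        """Apply the strongest association and stop searching - the error trap."""
        candidates = sorted(associations.get(cue, []), key=lambda c: c[1], reverse=True)
        return candidates[0][0] if candidates else "unknown - keep searching"

    # The strongest (but, once airborne, wrong) association wins, and no further
    # hypotheses are considered unless the crew deliberately re-examine the cue.
    print(interpret("warning horn"))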

These examples have all involved crews applying the wrong rule to a situation in order to make sense of a warning. I suggested in the previous chapter that decision-making is a stream of responses to the environment, typically driven by formal and informal rule sets. This section has looked at how rules shape interpretation - the rule was used in retrospect - but wrong rules can also be used prospectively to select the next course of action in a sequence. The fitness of the rule, given the circumstances, will determine the effectiveness of the action.

 