‘Wrong Work’ and Violations as Improvisations
The final category of performance I proposed was ‘wrong task’ behaviour. Unlike the first two categories, when crew engage in ‘wrong work’ they are, in fact, creating their own ways to achieve the task and, at times, such behaviour can be problematic. Reason (1990) suggests that a violation is a ‘deliberate error’, while LOSA uses the more cumbersome formulation of ‘intentional non-compliance’. Whatever the label, both capture the idea of aberrant behaviour being purposeful. Table 6.6 shows the percentage of errors deemed to be intentional in a sample of audits. Because it is difficult to be absolute about intentions (and LOSA does include some ‘tests’ to apply before using this category), LOSA observers can use a ‘not sure’ category if they are in doubt. The table only includes events where the observer was certain that the act was intentional.
An example of a violation is the case of a B-777 that landed at Toronto International Airport and missed its nominated exit. The options available were to call for a tug
to manoeuvre the aircraft on the runway or to take another exit that would require a much longer taxi route around the airport and back to the terminal. The FO was about to call for the tug when the captain instructed him to make sure his feet were clear of the toe brakes, whereupon the captain used reverse thrust to back the aircraft across the runway and make room to turn back to the planned exit. The use of reverse thrust in this context is expressly forbidden with two specific prohibitions contained in manuals. The risk is that the aircraft could tip on its tail if the brakes are touched inadvertently. The action clearly violated the published requirements. That it also deviated from social norms relating to responsibility and professionalism reflects the fact that violations are also socially constructed. The behaviour of the captain can, however, be adequately explained in terms of an individual selecting a personal goal as an expedient despite a fully understood proscription and a known risk. Violations often need to be seen in terms of the individual’s sense of self and their personal interpretation of the latitude afforded to them.
A less clear-cut example of violating behaviour was seen in the study that looked at the new aircraft type entering service (Fleet C in Table 6.6). Because it was being operated as a common type rating, crew were getting used to operating both their current and the new aircraft. Differences between the two types resulted in an intermixing of procedures, often inadvertently, but with one captain going so far as to adapt the old procedures, with which he was very familiar, to the new aircraft. Some crew did not use the new head-up display as intended. Here we see individuals shaping new processes and technologies to suit their preferred approach to the task but in a way that exceeded company expectations.
The ‘wrong task’ category of behaviours, while it may include violations, captures active intervention by the crew in the work process to modify the task in a manner that was considered ‘illegal’ by the observer. Such acts comprised 63.09% of ignored and 20.6% of undetected errors. Examples of wrong task acts include:
• CN turned off the flight deck recirculation fans during set-up to reduce distracting noise, even though he acknowledged that this was against SOP
• During weather avoidance, turns were made without reference to, or clearance from, ATC
• ATC advised of the availability of a new ATIS, but the crew did not check the update
• CN let the FO park the aircraft on a bay that only had left-seat guidance
Wrong tasks can also be interpreted as improvisation. In the sample of 86 sectors, these behaviours were seen on nine occasions (10.4% of flights). Automation was
deliberately circumvented to achieve a goal, an alternative action or expedient was implemented, or a non-standard tool was used. One type of intentional error common to all fleets was PF/PM duty exchanges. For example, during ground preparation, the FO will often do tasks formally assigned to the captain in situations where the captain is dealing with, say, a problem and needs to speak to the engineer. On examination, it is clear that crew operate outside of the published procedures in order to maintain the workflow. The fact that such a high percentage of this type of event was ignored by the crew suggests that the need for flexibility and adaptability is well understood and accepted by crew, which brings us back to the nature of work. Normative models fail to take into consideration the need for work to be flexible and adaptive to cope with change. While work is undertaken within familiar frameworks, it is also a socially constructed, negotiated process.
Table 6.7 presents data in relation to passive management and violations. Where crews failed to detect an error, the observer was of the opinion that the bulk of the actions were unintentional. This suggests either that crew knowledge was flawed (they were unaware of the correct action) or that they were distracted by other tasks. Where acts were ignored, it was the opinion of the observer that, in most cases, crew were fully aware that they were operating outside of procedures. The data reinforce the view that crews construct solutions to problems in a deliberate manner.
I suggested in Chapter 5 that pilots engage in activity directed at achieving congruence between the desired goal state and the current status of the task. From a resilience perspective, LOSA data seem to support the idea that crew are routinely engaged in repair action to sustain progress towards the desired goal. Crew are also actively engaged in modifying and creating alternative solutions to the demands of the task.
If buffering does, indeed, describe the ability of the system to cope with crew performance variability, then the fact that most of the observed errors were inconsequential reflects the fact that, under normal circumstances, the system is able to cope. In the next section, I want to look at crews approaching the boundary of the system.