Organisational Error

The fact that accidents can have an organisational genesis is recognised in Reason’s model of accident causation (Reason, 1990). The model is based on the elements of a production system and is illustrated in Figure 2.2. At the top level, decision-makers make strategic decisions about the structure and function of the organisation. Unfortunately, senior managers cannot anticipate all the ways in which things might go wrong and, so, some of their decisions will contain the seeds of a future failure. The design of the ATR system on the MD-83 described earlier is an example of a fallible decision: a plausible design solution to a problem that was subsequently found to exacerbate rather than moderate a situation under very specific circumstances. These are the latent errors and resident pathogens referred to earlier. Control of operations on a daily basis is delegated to line supervisors. In order to meet production targets, supervisors may turn a blind eye to discrepant production activities. The third level in the model captures the underlying vulnerability of human operators, who are susceptible to stress and fatigue. In many ways, the level of psychological precursors is a boundary layer between management and production. The fourth level is that of production. While doing work, people take short cuts, modify procedures and commit errors.

FIGURE 2.2 Reason’s model.

Finally, systems put in place measures to keep the system safe. Local defences are, themselves, the product of human design activity and, so, are potentially flawed (think of the B-1900 engine fire extinguishers). The model represents the manner in which failings at one level may be trapped at the next level in the system. So, a flawed work process may be ameliorated by increased supervisory oversight. Operator failings will be captured by warning systems. However, it is conceivable that, as the figure illustrates, flaws at every level can align to allow a propagation path for failure.

The model is often described as a variation of the linear approach, and it is not hard to see why. Hollnagel, however, describes it as an ‘epidemiological’ model, and I have to agree with that. In simple terms, the model not only creates an agenda for investigating aspects of the organisation that can contribute to failure but also provides a framework for making comparisons between organisations that can then throw light on why things fail. For example, it has long been recognised that aviation in Alaska is riskier than in the other states of the USA. Over the years, the NTSB has conducted various studies trying to understand why the accident rate is so high. At one point, as a commercial pilot, you were 20 times more likely to be killed in an accident in Alaska than if you were flying elsewhere. One study found that operators with a high turnover of maintenance managers had higher accident rates. Applying Reason’s framework across a range of organisations could reveal epidemiological risk factors in the same way.

We can apply the model to the Pelee Island accident. First, we can start with the decision-makers. The division of responsibilities in TC had allowed Georgian Express to commence operations without any oversight from its POI. The initial trial flights to gain approval to run the service were based on each sector being a discrete entity, with a complete dispatch process conducted before each flight, but we have seen how quickly that concept appears to have changed. Line supervision was delegated to the pilot, who failed to complete some of the processes associated with the dispatch of the aircraft. Equally, the migration from two scheduled sectors to a single flight with an intermediate stop was not picked up by management.

The pilot had had very little sleep the night before, having returned from a weekend away and got up very early to start work; fatigue was a probable psychological precursor. Despite having his attention drawn to the ice on his aircraft, and with freezing rain falling at the time of departure, the pilot chose to take no action. Here we see a risky action that derives, possibly, from a desire to get the work done and perhaps avoid getting stuck on Pelee Island. Georgian Express provided equipment for de-icing aircraft as a local defence, but this equipment was left behind on the day and Pelee Island Airport lacked alternative facilities. Nor did the airport have scales for weighing passengers, which would have been another ‘defence’ of sorts had they been present. Of course, even if these things had been available, there is nothing to say they would have been used. Like linear models, Reason’s model allows us to arrange factors in related clusters. However, dynamic aspects of accident propagation are not explained, largely because it is not a model but, rather, a metaphor for how production takes place in an organisation (Hollnagel, personal communication). The NAT and HRO concepts also tend to look at the structural properties of systems rather than the behaviour of individuals. From a competence perspective, we need to recognise that humans work within complex and unpredictable sets of relationships involving components such as technology, meteorological phenomena, and commercial and social policy goals. Each individual’s performance must carve a path through these often competing forces.

 