Overview
Introduction

Human error has been shown to be the single largest contributing factor in aviation, transportation, high-technology, and industrial mishaps (1). A great deal of attention has been paid to understanding human error in complex systems, including issues related to design, communication, and judgment. Less attention, however, has been paid to the impact of organizational influences on safety and performance. These surveys examine organizational climate using a human factors framework.

Background

An early model depicting organizational influences on loss control was introduced by Frank Bird (1974). His "Domino Theory" posited that a loss (i.e., a mishap) results from a sequence of events, each influencing the next, much as a row of falling dominoes topples when each domino knocks over the one that follows it.(2) Figure 1 depicts the steps in Domino Theory.

[Figure 1: The steps in Bird's Domino Theory]

In the ensuing years, several other researchers expanded upon Bird's work. Turner (1978) observed that organizational factors leading to a mishap could go unnoticed for long periods by a system's designers or users. He called this dormant period of unforeseen disaster an "incubation period," which could persist for years before a "triggering event" produced a mishap. Turner noted that this triggering event could be mistaken for a "causal factor" of the mishap rather than recognized as merely the last event in the sequence leading to it.(3)

Perrow (1984) contended that as organizations and their technologies become more complex, they also become vulnerable to accidents [mishaps] stemming from unforeseen or misunderstood events. He termed this type of accident a "normal accident," in the sense that eventual failure is an inherent property of complex systems. Perrow highlighted the essential role that organizations and management play in managing these systems.(4)

James Reason (1990) built upon earlier error management research. He divided errors into two categories: (1) "active errors" that are almost immediately recognizable, and (2) "latent errors" that might lie dormant for a period of time until some precipitating event triggers the mishap (similar to Turner's incubation period). Reason introduced the "Swiss cheese" model of mishap causation. Figure 2 depicts the four tiers of Reason's model.

[Figure 2: The four tiers of Reason's "Swiss cheese" model]

Breakdowns in the interactions between these tiers create "holes" in their respective defenses. It is these holes in the layered defenses that give the model its "Swiss cheese" name.(5)
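The defensive logic of the model can be sketched as a toy simulation (this is purely an illustration, not part of Reason's work): a mishap occurs only when a "hole" happens to be open in every layer at once. The layer names follow Reason's four tiers, but the failure probabilities below are invented for demonstration.

```python
import random

# Hypothetical per-layer failure probabilities -- the tier names follow
# Reason's model, but these numbers are invented purely for illustration.
LAYERS = {
    "organizational influences": 0.10,
    "unsafe supervision": 0.08,
    "preconditions for unsafe acts": 0.15,
    "unsafe acts": 0.20,
}

def mishap_occurs(rng: random.Random) -> bool:
    """A mishap happens only when the holes in ALL layers line up."""
    return all(rng.random() < p for p in LAYERS.values())

def estimate_mishap_rate(trials: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of how often all defenses fail together."""
    rng = random.Random(seed)
    return sum(mishap_occurs(rng) for _ in range(trials)) / trials

if __name__ == "__main__":
    # Independent layers multiply: 0.10 * 0.08 * 0.15 * 0.20 = 0.00024,
    # so even individually weak defenses combine into a strong barrier.
    print(f"estimated mishap rate: {estimate_mishap_rate():.5f}")
```

The point the sketch makes is the model's central one: no single layer need be reliable for the system as a whole to be robust, because a mishap requires a simultaneous breach of every tier.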

Wiegmann and Shappell (1996) fleshed out Reason's human error model by defining subordinate categories within each of the model's four tiers. Their contribution resulted in the Human Factors Analysis and Classification System (HFACS), a model now recognized throughout the DoD and gaining general acceptance in industry (refer to Figure 3).(6)

[Figure 3: The Human Factors Analysis and Classification System (HFACS)]
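Because HFACS is at heart a taxonomy, its tier-and-category structure can be sketched as a simple lookup table. The four tier names are Wiegmann and Shappell's; the subordinate categories shown reflect one commonly presented version of the taxonomy, and the dictionary and helper function are hypothetical, not official HFACS tooling.

```python
# Illustrative sketch of the HFACS taxonomy as a lookup table.
# Tier names follow Wiegmann and Shappell; the subcategories reflect one
# common presentation, and this code is not part of any official tool.
HFACS = {
    "Unsafe Acts": ["Errors", "Violations"],
    "Preconditions for Unsafe Acts": [
        "Environmental Factors",
        "Condition of Operators",
        "Personnel Factors",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision",
        "Planned Inappropriate Operations",
        "Failure to Correct a Known Problem",
        "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Management",
        "Organizational Climate",
        "Organizational Process",
    ],
}

def classify(tier: str, category: str) -> str:
    """Return a normalized 'tier / category' label, or raise if the pair
    is not part of the taxonomy sketched above."""
    if tier not in HFACS:
        raise ValueError(f"unknown HFACS tier: {tier!r}")
    if category not in HFACS[tier]:
        raise ValueError(f"{category!r} is not a category under {tier!r}")
    return f"{tier} / {category}"

if __name__ == "__main__":
    print(classify("Unsafe Acts", "Violations"))
```

Structuring the taxonomy this way mirrors how HFACS is used in practice: an investigator assigns each causal factor to exactly one category under exactly one tier, which makes mishap data comparable across investigations.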




Notes:

(1) Duffey, R. and Saull, J. (2004). The probability and management of human error (draft). Proceedings of the 12th International Conference on Nuclear Engineering, April 25-29, 2004, Virginia.
(2) Bird, F. (1974). Management guide to loss control. Institute Press, Loganville, Georgia.
(3) Turner, B. (1978). Man-made disasters. Wykeham Publications, London, England.
(4) Perrow, C. (1984). Normal accidents: Living with high-risk technologies. Basic Books, New York.
(5) Reason, J. (1990). Human error. Cambridge University Press, New York.
(6) Wiegmann, D. and Shappell, S. (1996). A human error approach to aviation accident analysis: The human factors analysis and classification system. Ashgate, Great Britain.