
* Corresponding author. Present address: NASA Langley Research Center, Crew/Vehicle Integration Branch, M/S 152, Hampton, VA 23681-0001, USA. Tel.: +1-757-864-2030; fax: +1-757-864-7793.

E-mail address: k.a.latorella@larc.nasa.gov (K.A. Latorella).

1 Present address: The Eastman Kodak Company, 901 Elmgrove Rd., Rochester, NY 14653, USA.

International Journal of Industrial Ergonomics 26 (2000) 133-161

A review of human error in aviation maintenance and inspection

Kara A. Latorella (a, *), Prasad V. Prabhu (b, 1)

(a) Department of Industrial Engineering, State University of New York at Buffalo, Buffalo, NY 14260, USA
(b) Galaxy Scientific Corporation, 2310 Parklake Dr., Atlanta, GA 30345, USA

Received 1 June 1997

Abstract

Aviation safety depends on minimizing error in all facets of the system. While the role of flightdeck human error has received much emphasis, recently more attention has been directed toward reducing human error in maintenance and inspection. Aviation maintenance and inspection tasks are part of a complex organization, where individuals perform varied tasks in an environment with time pressures, sparse feedback, and sometimes difficult ambient conditions. These situational characteristics, in combination with generic human erring tendencies, result in varied forms of error. The most severe result in accidents and loss of life. For example, failure to replace horizontal stabilizer screws on a Continental Express aircraft resulted in in-flight leading-edge separation and 14 fatalities. While errors resulting in accidents are most salient, maintenance and inspection errors have other important consequences (e.g., air turn-backs, delays in aircraft availability, gate returns, diversions to alternate airports) which impede the productivity and efficiency of airline operations and inconvenience the flying public. This paper reviews current approaches to identifying, reporting, and managing human error in aviation maintenance and inspection. As a foundation for this discussion, we provide an overview of approaches to investigating human error, and a description of aviation maintenance and inspection tasks and environmental characteristics.

Relevance to industry

Following an introductory description of its tasks and environmental characteristics, this paper reviews methods and tools for identifying, reporting, and managing human error in aviation maintenance and inspection. © 2000 Elsevier Science B.V. All rights reserved.

Keywords: Aviation; Maintenance; Inspection; Human factors; Human error

0169-8141/00/$ - see front matter © 2000 Elsevier Science B.V. All rights reserved. PII: S0169-8141(99)00063-3

1. Introduction

In the opening remarks of the 1995 FAA Aviation Safety conference, US Secretary of Transportation, Federico Peña, challenged the airline industry to meet the goal of zero accidents. Given the complexity of the aviation system, this goal is ambitious. Trends in the airline industry and the aviation environment exacerbate the difficulty of achieving this goal. Maintenance costs, passenger miles flown, and the number of aircraft have all exceeded the overall growth of the aviation maintenance technician (AMT) work force (Air Transport Association, 1994). Also, as the US commercial aviation fleet ages, aircraft require more inspection and maintenance. The concurrent trends of increased maintenance and inspection workload and a decreased work force seem to forecast increasing safety issues associated with human errors in maintenance and inspection.

Fortunately, technological advances have buffered, to some degree, the effects of these trends. Fail-safe systems, improved hardware, better software design, better maintenance equipment and methods, and other technological advancements have improved safety and, in some ways, reduced maintenance and inspection workload. While it is tempting to think of such technological advancements as necessarily improving overall safety, one must consider that innovations also require the humans in the system to acquire new skills and knowledge, and may induce additional opportunities for human error. The focus of improving aviation maintenance and inspection has traditionally been to improve the technology used in these tasks. Because this focus introduced additional human error concerns, more recent attempts to improve aviation safety have focused on reducing inspector and repair personnel error (Reason and Maddox, 1995). Managing human errors has become a critical aspect of the aviation industry's drive towards increasing the safety and reliability of the commercial aviation system. As evidence of this focus, human error was a major theme in the aircraft maintenance and inspection workshop held during the 1995 National Aviation Safety conference (FAA, 1995).

While the generics of human error, or genotypes (Hollnagel, 1991), are essentially constant across work domains, the specifics, or phenotypes (Hollnagel, 1991), are influenced by characteristics of the task and environmental context. This paper reviews general approaches to the study of human error and the characteristics of work in aviation maintenance as a foundation for describing the nature, incidence, and consequences of human error in this domain. We review current efforts towards detecting, reporting, and managing human errors in aviation maintenance. Finally, we conclude by discussing future directions for reducing the incidence and mitigating the effects of human error in aviation maintenance and, thereby, improving aviation safety and the efficiency of aviation operations.

2. Human error

Understanding the role of human error in an accident or incident is fundamentally different from simply attributing such an event to an inherently fallible human operator. Human error has been variously characterized as: any member of a set of human actions that exceeds some limit of acceptability (Swain and Guttman, 1983); any human action or inaction that exceeds the tolerances defined by the system with which the human interacts (Lorenzo, 1990); the failure to achieve an intended outcome beyond the influence of random occurrence (Reason, 1990); a necessary outcome of allowing humans to explore and understand systems (Table 1) (Rasmussen, 1990; Reason, 1990); and a derivative of operators' social experience of responsibility and values (Taylor, 1987). These definitions convey the multifaceted nature of human error. Principally, however, they suggest two complementary proposals: (1) that human operators are organic mechanisms with failure rates and tolerances analogous to hardware/software elements of a system, and (2) that human error is a pejorative term for normal human behavior in often unkind environments, where only the outcome determines whether this behavior is deleterious. These two perspectives are also reflected in the two major approaches developed to address human error in accident and incident analyses: human reliability assessment (HRA) and human error classifications (Kirwan, 1992a). Summarized below, HRA methods and human error classifications are more fully reviewed and contrasted by Kirwan (1992a,b).

2.1. Human reliability analysis

The HRA approach is an extension of probabilistic risk assessment (PRA). Probabilistic risk assessments identify all risks (including human error) to which a system is exposed, describe associations among these risks, quantify risk likelihood, and express this information in a fault-tree or event-tree representation. Human reliability analyses provide a more detailed assessment of the human-related risks inherent in systems. Human reliability analyses identify human errors as the failure to perform an action, failure to perform an action within the safe operating limits (e.g., time, accuracy), or performance of an extraneous act which degrades system performance. A human error probability (HEP) is then defined for each identified error as the ratio of the number of errors occurring in a certain interval to the number of opportunities for occurrence. Human error probabilities may derive from either actual observation or simulation techniques. In addition to more traditional simulations (Seigel et al., 1975), simulation approaches have been developed which computationally represent a dynamic model of the human operator, tasks, and external situational factors (e.g., Woods et al., 1987; Cacciabue et al., 1993). Extensions of traditional HRA recognize the importance of considering the influence of environmental characteristics on the propensity for human errors and have included these "performance shaping factors" (PSFs) in calculations of HEPs (Kirwan, 1992a) and in rich simulations of the environment. One disadvantage of the HRA approach is that it requires that the system exist in order to observe these error modes and collect data on error rates. As such, this method is of limited use in designing safe systems from inception. Simulation approaches provide the advantage of real-time generation of (simulated) human errors in response to (simulated) external conditions and events, and thereby avail predictive information by changing simulated conditions.

Table 1
Multifaceted human error taxonomy (adapted from Rasmussen, 1982)

Factors affecting performance: subjective goals and intentions; mental load, resources; affective factors.
Situation factors: task characteristics; physical environment; work time characteristics.
Causes of human malfunction: external events (distraction, etc.); excessive task demand (force, time, knowledge, etc.); operator incapacitation (sickness, etc.); intrinsic human variability.
Mechanisms of human malfunction: discrimination; input information processing; recall; inference; physical coordination.
Personnel task: equipment/procedure design, installation, inspection, etc.
Internal human malfunction: detection; identification; decision; action.
External mode of malfunction: specified task not performed; commission of erroneous act; commission of extraneous act; accidentally coincidental events (sneak path).
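A human error probability of this ratio form, degraded by multiplicative PSF weights, can be sketched in a few lines. The counts, PSF multipliers, and function names below are illustrative assumptions for exposition only; actual HRA methods use more elaborate quantification (Kirwan, 1992a).

```python
# Minimal sketch of a human error probability (HEP) calculation with a
# multiplicative performance-shaping-factor (PSF) adjustment. All
# numbers and names are illustrative, not values from the literature.

def hep(errors, opportunities):
    """HEP = errors observed in an interval / opportunities for error."""
    if opportunities <= 0:
        raise ValueError("need at least one opportunity for error")
    return errors / opportunities

def adjust_for_psfs(base_hep, psf_multipliers):
    """Degrade a nominal HEP by PSF multipliers (>1 worsens performance),
    capping the result at 1.0 so it remains a probability."""
    p = base_hep
    for m in psf_multipliers:
        p *= m
    return min(p, 1.0)

# Example: 3 missed defects in 1000 inspection opportunities (nominal
# HEP of 3/1000), degraded by poor lighting (x2.0) and time pressure (x1.5).
base = hep(3, 1000)
shaped = adjust_for_psfs(base, [2.0, 1.5])   # roughly 0.009
```

The cap at 1.0 reflects the fact that a shaped HEP is still a probability; treating PSFs as independent multipliers is itself a simplifying assumption.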

2.2. Human error classifications

The second major approach to investigating human error is more qualitative: classifying types of human errors. It is beyond the scope of this paper to review the full spectrum of human error classification systems (see Reason, 1990, and Woods et al., 1995, for more detailed treatments). Rather, this section describes three basic forms that these systems may take, provides representative examples, and concludes by emphasizing the need for a holistic approach to classifying human error.

Human error classification schemes have been described as behavioral, contextual, or conceptual in nature (Reason, 1990). Behavioral classifications describe human errors in terms of easily observed surface features. Behavioral classifications partition human errors on such dimensions as their formal characteristics (omission, commission, extraneous) (e.g., Swain and Guttman, 1983), immediate consequences (nature and extent of injury or damage), observability of consequences (active/immediate vs. latent/delayed) (Reason and Maddox, 1995), degree of recoverability, and responsible party. These classifications provide no mapping of surface characteristics to causal mechanisms. Contextual classifications begin to address causality by associating human errors with characteristics of the environmental and task context. These classification systems are valuable because they emphasize the complex interactions among system components, including human operators, and generally result in richer data collection of the circumstances surrounding incidents and accidents. However, contextual classifications are really correlational and not necessarily indicative of causal relationships. Further, these correlational classifications cannot, alone, explain why similar environmental circumstances do not deterministically produce repeatable errors (Reason, 1990).

Conceptual classification systems attempt to establish causality in terms of more fundamental, and likely predictable, characteristics of human behavior. Norman's (1981) distinction between slips and mistakes is perhaps the simplest classification scheme in this category. Slips are failures in executing the correct intention. Mistakes result from mistaken intentions. More elaborate classifications typically begin with a model of human information processing and define error types based on the failure modes of information processing stages or mechanisms (e.g., Reason, 1990; Rouse and Rouse, 1983). Categories of systematic error mechanisms (effects of learning, interference among competing control structures, lack of resources, and stochastic variability) have been identified for each of three levels of cognitive control (skill, rule, knowledge) (Rasmussen and Vicente, 1989). Prabhu et al. (1992b) organize error shaping factors by these same levels of cognitive control. Although most useful for establishing causality and for predicting human error, conceptual classifications rely on theoretical inferences rather than observable data and are therefore more open to argument. These human error classifications address the error-proneness of the information processing mechanisms of an individual operator.

A smaller contingent cautions that such efforts may drive human error research to be "individualistically myopic" (Quintanilla, 1987); that is, to stress the cognition of the individual to the exclusion of social and organizational factors. Reason (1990) classifies human errors based on social values to distinguish between, strictly, errors and violations. Errors are, as described above, unwitting deviations. Violations are deliberate deviations from operating procedures, recommended practices, rules, or standards. Reason (1990) also distinguishes between violations and sabotage. Violators intend the deviating acts, but not their potential for bad consequences. Saboteurs, however, intend both the deviating act and its bad consequences. Violations are shaped by three inter-related social factors: behavioral attitudes, subjective norms, and behavioral control (Reason and Maddox, 1995). Operators who commit attitudinal violations do so simply because they know they can, consciously weighing perceived advantages against possible penalties or risks. Operators who are concerned with the perceptions of subjective norms may commit violations because they seek the approval of others. Finally, operators may not be able to exercise the behavioral control not to commit a violation. Other organizational factors which contribute to human error relate to the manner in which groups of individuals interact (e.g., Reason, 1987).
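Reason's three-way distinction turns on two binary questions: did the operator intend the deviating act, and did the operator intend its bad consequences? A minimal sketch of this logic follows; the function name and boolean encoding are our own illustration, not Reason's notation.

```python
# Sketch of Reason's (1990) error/violation/sabotage distinction,
# keyed on intention toward the act and toward its consequences.

def classify_deviation(act_intended, harm_intended):
    if not act_intended:
        return "error"      # unwitting deviation
    if not harm_intended:
        return "violation"  # deliberate act, unintended bad consequences
    return "sabotage"       # both the act and its harm are intended
```

For example, classify_deviation(True, False) labels a knowingly skipped procedural step, performed without any intent to cause harm, as a violation.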

These classification schemes characterize various aspects of human errors. To fully address the causes and effects of human error, however, a more holistic approach is required. Rasmussen (1982) emphasizes this need to place errors in a rich context. His taxonomy (Table 1) considers not only mechanisms of human information processing malfunction, but also the task, situation factors (task, physical, and work time characteristics), operator affect and intentions, and, ultimately, the external expression of the error.

2.3. Responding to human error

Perhaps the single most important contribution of human error investigation methods, both HRA and human error classifications, is to emphasize that the goal of such investigations is not to attribute blame. Rather, an error is traced beyond the operator who committed it to identify predisposing characteristics of the environment and task. Typically, accident investigations backtrack until a cause is identified (Rasmussen, 1985). Because it is usually possible to identify cases where an operator's failure to take compensatory actions allows a developing failure to become expressed, these failures are often attributed to human error. Such naive attributions of blame to the generic fallibility of human operators often halt accident investigations prematurely, obviating the opportunity to identify other causal or performance-shaping factors and develop interventions (Woods et al., 1995; Reason, 1990; Lorenzo, 1990). Reason (1990) suggests that in order to break out of this "blame cycle" we must recognize that: (1) human performance is shaped by situational and environmental contexts, (2) simply instructing someone not to take an action they did not intend to take in the first place is not very effective, (3) errors result from multiple contributing factors, and (4) the relevant characteristics of situations and environments are usually easier to alter than the relevant characteristics of operators (Allen and Rankin, 1995).

Of course, the most obvious response to a human error is to identify the causal mechanisms and alter the system so that the error cannot recur. Unfortunately, this requires a sophisticated error-detection system, capable of identifying complex interactions, and the impractical assumption that human variability is minimal. Some would argue that even if eliminating all human error were possible, it may not be desirable. Rasmussen (1990) argues that what we call errors are unavoidable side effects of operators' exploration of the boundaries of acceptable performance. He contends that errors, or near errors, serve a valuable purpose in developing and maintaining operator expertise. Along similar lines, Senders and Moray (1991) suggest that eliminating the opportunity for error severely limits the range of possible behavior, inhibits trial-and-error learning, and reduces the flexibility of human operators. They argue that the key is to reduce the undesirable consequences of the error, and not necessarily the error itself. They therefore postulate the concept of an error-forgiving design, rather than an error-free design, as a goal.

Hollnagel (1990) also argues that error tolerant system designs are necessary. He points out that knowledge about the limitations of human capacity (e.g., with regard to perception, attention, discrimination, memory) is used in making reasonable assumptions for system design. Such design decisions reduce the probability of system-induced human errors, those errors that can be traced back to particular configurations of the human-machine system (e.g., the interface design, the task design). However, Hollnagel (1990) suggests that system designers often overlook the fact that human capacity is variable and the actual variance could be larger than the expected variance in many situations. In other words, there is a category of errors that results from the inherent variability of human cognition. One approach to addressing this problem would be to reduce the performance requirements until they were met in a preponderance of situations. However, this could mean that the system would perform below capacity in the majority of cases and the human operator might be underloaded. An alternate approach would be to design the system with error tolerance. Hollnagel (1990) proposes that an error tolerant system would:

- provide the user with appropriate information at the appropriate time to minimize the opportunity for system-induced erroneous actions,
- compensate for human perceptual dysfunction by providing information in redundant and simplified forms,
- compensate for human motor (and cognitive) dysfunction by maintaining the integrity of input data (through anticipation and context-dependent interpretation),
- contain provisions for detecting erroneous actions and for instigating corrective procedures,
- allow easy correction and recovery of erroneous actions by providing a forgiving environment.
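Two of these provisions, detecting erroneous actions and allowing easy correction and recovery, can be illustrated with a toy data-entry field. The class name, the range check, and the undo mechanism are our own illustrative assumptions, not designs from Hollnagel (1990).

```python
# Toy illustration of two error-tolerance provisions: detect an
# erroneous action (a range check) before it propagates, and allow
# easy correction of accepted-but-wrong entries (undo).

class ForgivingEntryField:
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.history = []          # accepted values, most recent last

    def enter(self, value):
        if not (self.low <= value <= self.high):
            # Detect the erroneous action and prompt correction.
            return "rejected: value outside [%s, %s]" % (self.low, self.high)
        self.history.append(value)
        return "accepted"

    def undo(self):
        # Forgiving environment: recovery from an erroneous entry.
        if self.history:
            self.history.pop()
            return True
        return False

field = ForgivingEntryField(0, 100)
field.enter(150)   # rejected: out of range, caught before propagating
field.enter(42)    # accepted
field.undo()       # an accepted slip remains recoverable
```

The point of the sketch is the division of labor: validation blocks system-induced erroneous input, while undo makes the remaining errors cheap to reverse.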

Similarly, Rasmussen and Vicente (1989) provide guidelines for designing system interfaces that tolerate human error:

- Make limits of acceptable performance visible while still reversible.
- Provide feedback on the effects of actions to help cope with time delays.
- Make latent conditional constraints on actions visible.
- Make cues for actions, and represent the necessary preconditions for their validity.
- Supply operators with tools to perform experiments and test hypotheses.
- Integrate cues for action.
- Support memory by externalizing effective mental models.
- Present information at the most appropriate level for decision making.
- Present information embedded in a structure that serves as an externalized mental model.
- Support memory of items, acts, and data that are not integrated into the task.

Reason (1990) suggests that systems could be designed to minimize violations by changing the organizational culture, social norms, and individual beliefs/values.

3. Aviation maintenance tasks and environments

Aviation maintenance and inspection tasks occur within the larger context of organizational and physical environmental factors. A system model of aviation maintenance and inspection (Latorella and Drury, 1992) defines four interacting components in this system (operators, equipment, documentation, and task) and suggests that these components interact over time as well as within both physical and social, or organizational, environments. In addition to considerations of the tasks performed, the system model emphasizes the interactions among operators, the interactions of operators with the equipment and documentation used, and the physical and job environment in which these tasks occur. Operator classifications include inspection and repair personnel at various organizational levels (line operators, lead operators, foreman-level operators) as well as production foremen and engineers. The equipment used in inspection and maintenance tasks ranges from common tools (e.g., flashlights, mirrors, rulers) to more elaborate equipment requiring specialized training, such as that required for non-destructive testing/inspection (NDT/NDI) (e.g., eddy-current, magnetic resonance, dye-penetration techniques). The documentation, or more broadly the information environment, used in inspection and maintenance includes not only the documents required to perform specific inspection and maintenance tasks (e.g., graphics and procedures in work cards, reference materials, defect reporting forms), but also those necessary to coordinate inspection and maintenance activities (e.g., shift turnover forms). The physical environment is defined by parameters such as temperature; noise level and type; lighting level, color, and distribution; and the presence of potential physical and chemical hazards to operators. For example, precautions must be exercised to ensure that personnel are not exposed to radiation during X-ray NDT inspections, or to excessive fumes when inspecting inside a fuel tank. The physical environment includes not only these ambient characteristics but also characteristics of the workspace, such as the adequacy of task lighting provided by a flashlight (Reynolds et al., 1993). The organizational environment is equally important and is receiving increased attention in aviation maintenance. The following sections more fully characterize aviation maintenance and inspection tasks and the surrounding organizational setting.

3.1. Aviation maintenance and inspection tasks

The typical definition of human error in maintenance and inspection refers to the activities of the inspector or repair person. Drury (1996) describes the functions at this level of the system as (1) planning, (2) opening/cleaning, (3) inspection, (4) repair, and (5) buy-back. Initially, a team including the FAA, the aircraft manufacturer, and start-up operators defines maintenance and inspection procedures for commercial aviation airlines.

Work items are defined by predictive models of equipment and material wear, and are informed by prior observations as well as incident and accident investigations. Airlines then define actual schedules in a process that must meet legal requirements (Shepherd et al., 1991). This process requires considering interference between inspection and repair tasks due to required access, equipment, and/or personnel constraints, in an effort to minimize total aircraft out-of-service time. There are typically four types of inspection/maintenance checks. These range in the degree of inspection and maintenance work from the least detailed (flight-line checks and A-checks) to the heaviest level (D-checks). The result of this planning function is the packaging of maintenance and inspection items into a check, and the generation of work cards which specify these items and the procedures for accomplishing their inspection/repair/replacement.

Table 2
Examples of inspection functions: visual and eddy-current (Drury, 1996)

Initiate
  Visual: get work card; read and understand requirements.
  Eddy-current: get work card and equipment; read and understand requirements; calibrate eddy-current equipment.
Access
  Visual: locate area on aircraft; assume correct position for viewing.
  Eddy-current: locate area on aircraft; position equipment.
Search
  Visual: view area systematically; stop if any indication detected.
  Eddy-current: move probe over rivet heads systematically; stop if any indication detected.
Decide
  Visual: compare indication against remembered standards (e.g., for corrosion).
  Eddy-current: re-probe while closely watching signal on equipment monitor.
Respond
  Visual: if indication exceeds standards, mark defect and create repair sheet; else, continue searching.
  Eddy-current: if indication confirmed, mark defect and create repair sheet; else, continue searching.

The opening/cleaning function prepares the inspection/repair area. Prior to inspection, the area must be cleaned and devoid of any oil, hydraulic fluid, or other visual interference. Typically, access panels are removed and internal cleaning must be performed. Cleaning and area preparation are usually performed by different personnel than inspection and repair operations. Inspection can be performed either visually (also using tactile and auditory cues) or using non-destructive testing methods (e.g., eddy-current, ultrasonic, magnetic resonance, X-ray, and dye penetration) (Latia, 1993), which provide an abstracted or enhanced signal for visual interpretation. Regardless of the method, the inspection function includes the following sub-functions: (1) initiate, (2) access, (3) search, (4) decide, (5) respond (Drury, 1996, 1994; Drury et al., 1990). Inspection begins with an inspector obtaining a work card and any equipment required for the job it specifies. After obtaining this equipment and understanding the requirements of the work card, the inspector locates the inspection site. Inspection requires searching the target area either visually or with the appropriate equipment until all items are addressed or the entire area is searched. As indications appear, inspectors must determine whether or not they are faults. Faults identified during search are recorded for repair and buy-back inspection. Table 2 provides an example of these inspection functions for both visual and eddy-current inspection (Drury, 1996).
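The search/decide/respond loop of the inspection function can be sketched as a simple filter over indications. The indication scores, the threshold, and the function names are illustrative assumptions for exposition, not part of Drury's (1996) model.

```python
# Sketch of the inspection search/decide/respond loop: examine each
# indication, decide whether it exceeds the standard, and record
# confirmed defects on a repair sheet for repair and buy-back.

def inspect(indications, exceeds_standard):
    repair_sheet = []
    for indication in indications:           # search
        if exceeds_standard(indication):     # decide
            repair_sheet.append(indication)  # respond: mark defect
        # else: continue searching
    return repair_sheet

# Example: hypothetical corrosion severity scores against a 0.7 threshold.
faults = inspect([0.2, 0.9, 0.5, 0.8], lambda score: score > 0.7)
# faults == [0.9, 0.8]
```

The decision criterion is passed in as a function because, as the text notes, it differs by method: a remembered standard for visual inspection versus an instrument signal for eddy-current inspection.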

The repair function also begins with a work card. Repair work cards specify the repair job and the procedure for repair, and note any additional reference materials required. The repair function can be decomposed into sub-functions similar to the inspection sub-functions: (1) initiate, (2) site access, (3) part access, (4) diagnosis, (5) replace/repair, (6) reset systems, (7) close access (Drury, 1996). Repair begins with access of the appropriate work card, equipment, and parts for the repair. The repair person then locates the site of the repair and removes any additional parts to access the element requiring repair. Removed items are inspected and stored. Technicians perform the diagnostic procedures specified by the work card and determine whether to repair or replace the target element. Once this determination is made, technicians may need to obtain additional parts before actually repairing or replacing the element. After the repair/replacement, the relevant systems are reset, fluid levels are restored, and the system is adjusted to specification. The repair function concludes by closing the access to the target area and making final adjustments. Table 3 shows a decomposition of the repair function (Drury, 1996). Repairs may be performed on the aircraft itself, or as a sub-component process off-line.


Table 3
Generic repair functions and tasks (Drury, 1996)

Initiate: read and understand work card; prepare tools and equipment; collect parts and supplies; inspect parts and supplies.
Site access: bring parts, supplies, tools, and equipment to work site.
Part access: remove items to access parts; inspect/store removed items.
Diagnosis: follow diagnostic procedures; determine parts to replace/repair; collect and inspect more parts and supplies if required.
Replace/repair: remove parts to be replaced/repaired; repair parts if needed; replace parts.
Reset systems: add fluids/supplies; adjust systems to specification; inspect adjustments.
Close access: refit items removed for access; adjust items refitted; remove tools, equipment, parts, and excess supplies.

A second maintenance person, usually an inspector, may re-inspect, or "buy-back", a repair before the item is closed (Drury, 1996). Prior to returning an aircraft to service, all scheduled items and additional items resulting from inspection must be either certified as complete or logged as deferred. Maintenance deferral is only possible in certain pre-defined situations, for items which are not safety-critical (Drury, 1996). These items are treated on the next scheduled or event-driven maintenance cycle.

3.2. Organizational context

The aviation maintenance and inspection system includes not only the individual technicians performing the functions above, but also personnel at the organizational level of the airline as well as at regulatory agencies, aircraft manufacturers, and component vendors. The organizational context in which aircraft maintenance and inspection occurs is equally important to an understanding of human error as the inspection and repair functions themselves.

At a higher level, the planning function translates organizational requirements (i.e., those imposed by regulatory agencies and manufacturers) into requirements for airline carriers, and consequently translates these requirements into a schedule of local activities for inspection or repair personnel. Another function of the organizational level is to provide quality control and assurance of the inspection and maintenance processes. Quality control in aircraft maintenance and inspection results from surveillance and auditing actions of regulatory agencies (e.g., the US Federal Aviation Administration (FAA), the UK Civil Aviation Authority (CAA)), as well as quality assurance functions in the airline companies (Drury, 1996). Quality assurance functions include: checking engineering change orders, auditing inspection and maintenance activities for errors, auditing components (and vendors) used for replacements, and examining record keeping (Drury, 1996). Failures of quality assurance as well as regulatory policies/practices allow maintenance and inspection human errors. These errors, and the propagating effects of these errors, decrease aviation safety.

In addition to these specific functions, general organizational characteristics influence performance at the individual level. Organizational factors (e.g., definition of work groups/isolation of workers, reporting structures, payoff structures, and issues of trust and authority) demonstrably affect patterns of work in aviation maintenance operations (Taylor, 1990). More specifically, human error in a major airline carrier's maintenance facility is influenced by characteristics such as organizational structure, people management, provision of quality tools and equipment, training and selection, commercial and operational pressures, planning and scheduling, maintenance of building and equipment, and communications (Rogan, 1995a).

4. Human error in aviation maintenance

The previous section briefly outlines the functions involved at the individual and organizational levels of aviation maintenance and inspection. However, these simple descriptions do not convey the complexity of this system. Modern aircraft and the systems embedded in them are increasingly technologically complex. New methods for inspecting and diagnosing these systems are increasingly specialized. Further, inspecting and maintaining commercial aircraft is organizationally complex, emerging from a socio-technical process in which hundreds, even thousands, of people are directly involved (Taylor, 1990).

These conditions combine to produce a work environment that predisposes the humans working in this system to err. For example, given that there are 14 different kinds of attachment lock mechanisms in a narrow-body aircraft seat, the chances of overlooking a poorly locked seat are very high (Lutzinger, 1992). Attempts to simultaneously accomplish the competing goals of safety, timeliness, and profit result in implicit time pressures. Organizational/economic pressures may motivate operators to violate inspection/maintenance practices. Consequences of errors are not immediately obvious (Graeber and Marx, 1993). For example, in one accident a faulty maintenance action did not have an observable effect until 17 months after it occurred (NTSB, 1990). Delayed feedback dramatically reduces the ability of operators to learn from errors. Such delays also impede accident investigation because situational factors surrounding the offending "human" error are lost. In addition, because different types of maintenance problems present themselves randomly to individual operators, it is difficult for any one operator to identify what may be a systematic problem in an aircraft type or mechanism (cf. Inaba and Dey, 1991). Aircraft maintenance often spans multiple days and multiple shifts, making coordination of activities and information amongst different operators over different shifts very difficult. Quality control audits and inspections and error reporting systems obtain data on inspection and repair performance. However, they do not typically provide consistent or timely feedback to operators on actual errors. Further, feedback during training for inspection tends to focus on procedural aspects of the task (e.g., setting up NDT equipment and troubleshooting rules) rather than providing feedback for other, more cognitive, aspects of the inspection task (e.g., making perceptual judgments) (Prabhu and Drury, 1992).

Given these complexities, it is not surprising that human operators in this system commit errors. Aviation maintenance and inspection errors can be described in terms of their most immediate, observable effects on aircraft equipment, their ultimate effects on flight missions (incidents/accidents), and their secondary effects on the airline carrier industry. Further, forms of errors in aviation maintenance and inspection are defined as failure modes of the tasks involved in their functions. The incidence and forms of aviation maintenance and inspection human errors are described in these terms below.

4.1. Effects of maintenance errors on aircraft equipment

Several studies have identified the most common, immediate effects of human error in aviation maintenance. A major airline shows the distribution of 122 maintenance errors over a period of three years to be: omissions (56%), incorrect installations (30%), wrong parts (8%), and other (6%) (Graeber and Marx, 1993). A three-year study by the Civil Aviation Authority (CAA) found the eight most common maintenance errors to be: incorrect installation of components, the fitting of wrong parts, electrical wiring discrepancies (including cross connections), loose objects (tools, etc.) left in the aircraft, inadequate lubrication, cowlings, access panels and fairings not secured, fuel caps and refuel panels not secured, and landing gear ground lock pins not removed before departure (UK CAA, 1992, cited in Allen and Rankin, 1995). In-flight engine shutdowns on Boeing 747s in 1991 were due to the following human errors, in order of occurrence frequency (Pratt and Whitney study cited in Graeber and Marx, 1993):

- missing or incorrect parts,
- incorrect installation of parts or use of worn/deteriorated parts,
- careless installation of O-rings,
- B-nuts not safety wired,
- nuts tightened but not torqued or over-torqued,
- seals over-torqued,
- not loosening both ends of connecting tube replacement,
- replacing tube assembly without first breaking connections between matching parts,
- not tightening or replacing oil-tank caps,
- not cleaning or tightening cannon plugs,
- dropping foreign objects into engines,
- allowing water in fuel, allowing Skydrol in oil system.
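The three-year distribution reported above (122 errors: omissions 56%, incorrect installations 30%, wrong parts 8%, other 6%) is a simple category tally. The sketch below illustrates the arithmetic; the raw counts are back-computed from the published percentages and are our reconstruction, not data from Graeber and Marx (1993):

```python
# Tally maintenance-error categories and express them as rounded
# percentages. Raw counts are back-computed from the published
# percentages of 122 errors (our reconstruction).
from collections import Counter

counts = Counter({"omission": 68, "incorrect installation": 37,
                  "wrong part": 10, "other": 7})
total = sum(counts.values())  # 122 errors over three years
distribution = {cat: round(100 * n / total) for cat, n in counts.items()}
print(total, distribution)
```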

Data collected at a major US airline (Prabhu and Drury, 1992) revealed several major categories of human errors in maintenance and inspection tasks; these are:

- defective component (e.g., cracked pylon, worn cables, fluid leakage),
- missing component (e.g., bolt-nut not secured),
- wrong component (e.g., incorrect pitot static probes installed),
- incorrect configuration (e.g., valve inserted backwards),
- incorrect assembly sequence (e.g., incorrect sequence of inner cylinder spacer and lock ring assembly),
- functional defects (e.g., wrong tire pressure, over-tightening nuts),
- tactile defects (e.g., seat not locking in position),
- procedural defects (e.g., nose landing gear door not closed).

4.2. Accidents due to maintenance errors

The types of maintenance errors contributing to accidents range from glaring omissions that are the direct cause, to more minor errors which combine with other off-normal occurrences to create these accidents (Graeber and Marx, 1993). Estimates of the contributions of maintenance factors to the incidence of aviation accidents and incidents vary. Sears (1986, cited in Graeber and Marx, 1993) states that maintenance was a contributing factor in 12% of the international accidents that occurred between the years 1959 and 1983 (Graeber and Marx, 1993), and flawed maintenance practices were the major factor in 3% of these cases (Boeing, 1993a). In a more recent survey, Boeing (1993b) found that changes in maintenance and inspection could have prevented approximately 16% of the hull losses and almost 20% of all accidents that occurred between 1982 and 1991. For this period, maintenance and inspection factors were implicated in 47 accidents in total, these accidents resulting in 1481 onboard fatalities (Graeber and Marx, 1993).

Several fairly recent accidents have been attributed, at least in part, to human error in maintenance and inspection operator tasks. Failure to install O-ring seals on an L-1011's engines allowed oil to leak out, eventually resulting in two of the three engines ceasing operation due to oil starvation (NTSB, 1984; Strauch and Sandler, 1984). The pilot had shut down the third engine earlier in the flight in response to a low oil indication, but was able to restart this engine and successfully reached the airport. Maintenance on a Continental Express EMB-120 was not adequately transferred to a second shift (NTSB, 1992). Forty-seven screws on the left horizontal stabilizer were not replaced, causing the leading edge to separate in flight and resulting in 14 fatalities (Shepherd and Johnson, 1995; NTSB, 1992). Failure to detect a pre-existing metallurgical defect allowed a fatigue crack to form in a critical area of a DC-10's fan disk. Compounding the initial error, the resulting fatigue crack itself went unnoticed during inspection. In flight, this situation resulted in separation and discharge of the rotor assembly, catastrophic failure of the #2 engine, consequent severing of the flight-control hydraulic systems, and ultimately the loss of 111 lives and many other injuries (NTSB, 1990). In the Aloha Airlines accident (NTSB, 1989), failure to detect multiple-site damage resulting from joining cracks resulted in hull failure and a crash landing. This error caused the catastrophic loss of the aircraft, and only exceptional pilot performance prevented numerous fatalities in this accident.

In addition, accident investigators note the contributions of organizational factors to aviation accidents. Reason and Maddox (1995) describe the concept of an organizational accident/incident as one in which management decisions, emerging from the corporate culture, generate latent failures that are transmitted to the workplace, where they create a local climate that promotes errors and violations. These latent failures occasionally break through engineered safety features (e.g., design, standards, procedures) to cause an accident/incident.


For example, while the Continental Express accident described above was primarily caused by a repair person's error (i.e., not replacing the horizontal stabilizer screws), the accident could not have occurred if both the air carrier's quality assurance functions and the FAA's regulatory surveillance had been attentive. Similarly, analysis of the Sioux City accident (NTSB, 1990) implicated inspection and quality control procedures during the engine overhaul process. Although correctly performed maintenance could have prevented the Aloha Airlines accident, this accident was attributed to both the failure of inspectors to detect cracks, and the failure of fatigue models to correctly anticipate crack growth and indicate required inspection (Drury, 1996).

For every aircraft design, routine inspections, and occasionally replacements/repairs, are required at pre-defined intervals to prevent failures from developing (Hagemaier, 1989). Models of crack propagation, based on structural and material fracture mechanics, define these intervals (Goranson, 1989). Drury (1996) notes that this process invites misinterpretation of failures in model prediction as fundamentally human errors of inspection or repair. Accident investigation of TWA Flight 843, which was destroyed following an aborted take-off, found the precipitating event to be a malfunction in an angle-of-attack (AOA) sensor in the stall warning system. Although other factors were involved, the investigation report emphasizes that the precipitating cause was the failure of TWA's maintenance and quality assurance trend-monitoring programs to detect the intermittent AOA malfunction.

While the self-auditing role of air carriers is undeniably important, the role of an independent and attentive regulatory agency is critical. Only one example is necessary to underscore this point. In 1990, Eastern Airlines and many of its top executives were indicted by a grand jury for falsifying maintenance records, disregarding FAA requirements that repairs be examined, and creating false records to indicate that required, scheduled maintenance had been performed (Nader and Smith, 1994). Without an independent regulatory agency's critical review of airline and manufacturer practices, the opportunity exists for the unscrupulous to optimize economy at the expense of safety.

4.3. Other effects of human errors in aviation maintenance

Accidents due to human error forcefully underscore the potentially dire consequences of maintenance errors on aviation safety. The relative rarity of these accidents, however, does not imply that maintenance errors are as rare. Accidents typically result from the combination of causal factors and must overcome several lines of defense. In addition to those that precipitated or facilitated the above accidents, maintenance errors with no direct consequences have been detected. For example, the accident investigation of China Airlines Flight 583 (which became unstable after leading edge slats were deployed at cruise altitude) detected that a rubber plug was left in the maintenance position for rigging the slat control system (NTSB, 1993a). Douglas Aircraft Company engineers stated that although the plug should be removed after maintenance, its presence did not affect the operation of the slat system. Thus, maintenance errors may exist that, given that other defense mechanisms built into the system are in place, are not consequential and therefore not usually detected. It is tempting to dismiss such seemingly inconsequential errors as unimportant. They are mentioned here for two reasons. First, errors which seem inconsequential may, in other circumstances, interact with other off-normal situations to result in a "sneak-path" accident. Second, even if a particular erroneous result is not damaging in effect, it may indicate a predisposing condition in the environment, task, or operator's knowledge that may result in an error of greater consequence.

Although accidents are the most salient and poignant effects, human errors in maintenance activities have other important consequences. For example, maintenance errors have required air turn-backs, delays in aircraft availability, gate returns, in-flight shutdowns, diversions to alternate airports, maintenance rework, damage to maintenance equipment, and injury to maintenance personnel. Gregory (1993) finds that 50% of all engine-related flight delays and cancellations are due to improper maintenance. Thirty-three percent of all military aviation equipment malfunctions result from poor prior maintenance or improperly applied maintenance procedures (Ruffner, 1990). These consequences ultimately affect customer satisfaction and airline company productivity and profit. The negative effects of human error in aviation maintenance, as reflected in accidents, incidents, and other operational inefficiencies, are clear.

4.4. Predicting forms of inspection and maintenance human errors

Extensions of human reliability and human error classification methods to the aviation maintenance and inspection area are rare. This section presents three approaches to studying aviation maintenance and inspection errors beyond simply cataloging their overt consequences to equipment. Most similar to the HRA approach, one study employs a fault tree analysis to investigate and quantify human error in aircraft inspection (Lock and Strutt, 1985). Lock and Strutt (1985) develop a flowchart to describe the inspection process (Fig. 1). Analysis of this flowchart yields six potential errors in the inspection process (Table 4), which may co-occur in more complex error forms. These errors are then represented in a fault tree (Fig. 2). Lock and Strutt (1985) identify five PSFs relevant to aircraft inspection (area accessibility, general area lighting, access and visual enhancement tools, motivation/attitude, and work method) and provide relative weights to indicate their importance for each inspection step. However, noting the difficulty of quantifying the probabilities needed to complete this fault tree, the authors do not actually perform an HRA analysis of human error probabilities.
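Had the basic-event probabilities been available, quantifying the fault tree would follow standard fault tree arithmetic: independent inputs to an OR gate combine as 1 − ∏(1 − p_i), while inputs to an AND gate (co-occurring errors) combine as ∏ p_i. The sketch below illustrates this; the per-error probabilities are hypothetical placeholders, since Lock and Strutt (1985) deliberately did not quantify them:

```python
# Minimal fault-tree gate arithmetic, assuming independent basic events.
# The probabilities assigned to E1-E6 below are hypothetical placeholders.
from math import prod

def or_gate(probs):
    """P(at least one input event occurs) = 1 - prod(1 - p_i)."""
    return 1.0 - prod(1.0 - p for p in probs)

def and_gate(probs):
    """P(all input events co-occur) = prod(p_i)."""
    return prod(probs)

# Hypothetical per-task error probabilities for the six errors E1-E6.
p = {"E1": 0.01, "E2": 0.02, "E3": 0.005, "E4": 0.01, "E5": 0.02, "E6": 0.01}

# Top event "inspection fails to report a defect correctly":
# here modeled as an OR gate, since any single error suffices.
p_top = or_gate(p.values())
print(f"P(top event) = {p_top:.4f}")
```

More complex error forms (errors that must co-occur) would appear as AND gates feeding the same top-level OR gate.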

Drury (1991) describes human error phenotypes based on the previously described model of maintenance and inspection functions. He defines these human errors by decomposing inspection functions into tasks, and identifying the failure modes of these tasks. These error categories were refined through observation of inspections, and discussions with inspectors, supervisors, and quality control personnel (Drury et al., 1990; Drury, 1991). Table 5 presents a sample of this error taxonomy. This taxonomy only presents errors as observed, that is, phenotypes of erroneous actions in aviation maintenance and inspection. Drury (1991) also provides a classification scheme for genotypes, the underlying mechanisms of human error in aviation maintenance and inspection. He bases this classification scheme on Rouse and Rouse's (1983) error framework (Table 6).

In a different vein, Foyle and Dupont (1995) identify the twelve most common causes of maintenance personnel's "judgment interference." The motivation for this effort is a quote by Jerome Lederer, President Emeritus of the Flight Safety Foundation: "(Maintenance) error is not the cause of an accident. The cause is to be found in whatever it was that interfered with the (maintenance person's) judgment at a critical moment, the outcome of which was a maintenance error" (Foyle and Dupont, 1995). Table 7 explains this judgment interference in terms of a specific accident: on July 11, 1991, an aircraft maintenance technician (AMT) who was aware that at least two tires on a Nationair DC-8 had low pressure boarded the aircraft and perished with 260 other persons.

5. Managing human error in aviation maintenance

Great strides have been made towards Secretary Peña's goal of zero accidents. On a national level, the 1995 National Safety Conference (FAA, 1995) called for additional human factors research focused on error detection and prevention. Specifically, it suggested that FAA Flight Standards should establish a national database for aviation human factors research, develop a maintenance error analysis tool prototype, and develop a system for maintenance personnel based on the same principles as the Crew Resource Management (CRM) program for the cockpit. This section reviews recent efforts towards detecting and managing human error.

5.1. Detecting human errors in aviation maintenance and inspection

Detecting systemic human errors, and associated performance shaping characteristics, is difficult due to low error rates, distributed occurrence over the system, and variability of error phenotypes for the same error genotype. There are two fundamental methods for detecting errors in aviation maintenance. The first detects human errors retroactively, and is based on error reporting responses to system-defined performance deviations. In contrast, there are other, more pro-active detection methods. This section reviews error reporting and pro-active error identification methods of detecting human error in aviation maintenance and inspection in the commercial aviation system.

Fig. 1. Inspection Model Flowchart (adapted from Lock and Strutt, 1985).

Table 4
Potential errors in the inspection process (Lock and Strutt, 1985)

Error (flowchart location)     Definition
Scheduling (E1)                Wrong execution of either of the two tasks: "identify next inspection" or "move to location"
Inspection (E2)                Not seeing a defect when one exists
Inspection (E3)                If human induced, due to either forgetting to cover an area, covering an area "inadequately", or a scheduling error
Engineering Judgment (E4)      An error in deciding whether the area in which a defect is found is significant or not
Maintenance Card System (E5)   Arises because the work cards themselves may not be used to note defects on the hangar floor immediately as they are found
Noting Defect (E6)             The error is noted incorrectly or not noted at all

Fig. 2. Inspection Error Fault Tree (Lock and Strutt, 1985).

5.2. Reporting errors in aviation maintenance and inspection

Error reporting systems characterize, to varying degrees, perceived causes of a negative event. Error reporting systems, then, may identify human error contributions for a particular incident and may identify, as a result of this incident, systemic problems. These systems are, therefore, reactive to errors inherent in the system. There are many different forms of error reporting systems. Most current error reporting systems share some common features (Drury, 1991; Latorella and Drury, 1992):

- They are event driven: the system captures data only if a difficulty arises or a defect is found.
- Aircraft type and structure serve as the major classification type for reporting.
- Data are further classified, and their urgency determined, using expert assessment of error criticality.
- Feedback is not well formatted to be useful to operators.
- Error reports can result in changes to maintenance and inspection procedures.

Table 5
Sample of aircraft maintenance and inspection errors by task step (Drury, 1991)

Tasks for "Initiate" function              Error(s)
1.1 Correct instructions                   1.1.1 Incorrect instructions
                                           1.1.2 Incomplete instructions
                                           1.1.3 No instructions available
1.2 Correct equipment procured             1.2.1 Incorrect equipment
                                           1.2.2 Equipment not procured
1.3 Inspector gets instructions            1.3.1 Fails to get instructions
1.4 Inspector reads instructions           1.4.1 Fails to read instructions
                                           1.4.2 Partially reads instructions
1.5 Inspector understands instructions     1.5.1 Fails to understand instructions
                                           1.5.2 Misinterprets instructions
                                           1.5.3 Does not act on instructions
1.6 Correct equipment available            1.6.1 Correct equipment not available
                                           1.6.2 Equipment is incomplete
                                           1.6.3 Equipment is not working
1.7 Inspector gets equipment               1.7.1 Gets wrong equipment
                                           1.7.2 Gets incomplete equipment
                                           1.7.3 Gets non-working equipment
1.8 Inspector checks/calibrates equipment  1.8.1 Fails to check/calibrate
                                           1.8.2 Checks/calibrates incorrectly

Table 6
Example of possible errors in calibrating NDI task step (Drury, 1991)*

Level of Processing              Possible Errors
1. Observation of system state   Fails to read display correctly
2. Choice of hypothesis          Instrument will not calibrate; inspector assumes battery is low
3. Test of hypothesis            Fails to use knowledge of NiCads to test
4. Choice of goal                Decides to search for new battery
5. Choice of procedure           Calibrates for wrong frequency
6. Execution of procedure        Omits range calibration step

*This example is for task 1.8.2 shown in Table 5.
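A task-step taxonomy like Drury's (Table 5) can be encoded directly as a lookup structure, so that reported errors carry both the task step and the failure mode. The sketch below reproduces a few Table 5 entries for illustration; the `classify` helper is our own naming, not part of any published tool:

```python
# Illustrative encoding of a task-step error taxonomy (cf. Drury, 1991,
# Table 5). Codes map to (task step, failure mode); `classify` is a
# hypothetical helper for tagging reported errors.
TAXONOMY = {
    "1.3.1": ("Inspector gets instructions", "Fails to get instructions"),
    "1.4.1": ("Inspector reads instructions", "Fails to read instructions"),
    "1.4.2": ("Inspector reads instructions", "Partially reads instructions"),
    "1.8.1": ("Inspector checks/calibrates equipment", "Fails to check/calibrate"),
    "1.8.2": ("Inspector checks/calibrates equipment", "Checks/calibrates incorrectly"),
}

def classify(code: str) -> str:
    """Return a human-readable 'task step: failure mode' string for a code."""
    task, error = TAXONOMY[code]
    return f"{task}: {error}"

print(classify("1.8.2"))
```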

This section identifies two forms of error reporting. First, accident investigations and incident reports are considered as a form of error reporting. Second, we consider error reporting systems in the most conventional sense: those residing within airline companies and more formal regulatory reporting systems.

5.2.1. Accident investigations

The National Transportation Safety Board (NTSB) conducts investigations of aviation accidents in which there is any loss of life or significant aircraft damage. These investigations routinely consider whether documentation reflects that maintenance was performed as scheduled and was appropriate for the aircraft involved in the accident. Beyond this minimal investigation of maintenance performance, these investigations also consider whether inspection/maintenance personnel could have identified/repaired a problem that was not necessarily related to scheduled performance. Further, if an inspection/maintenance problem is implicated, the investigation extends to consider possible deficiencies in airline quality control and assurance programs, in the regulatory mechanism of the FAA, and in manufacturers' recommendations for scheduling inspection/maintenance tasks. These investigations result in recommendations to the airline companies (e.g., inspection/maintenance practices, environmental considerations, training, quality control and assurance practices), the FAA (e.g., ensure airlines adhere to maintenance schedules), and manufacturers (e.g., scheduling new or increasing the frequency of inspections/repairs/replacements). Investigation of the Continental accident (NTSB, 1992), described above, resulted in recommendations at the individual operator level (shift changeover), the airline level (quality assurance programs), and for the FAA (ensuring that airline quality assurance programs are sound).

Table 7
Judgment interference in aviation maintenance personnel (Foyle and Dupont, 1995)

Judgment Interference    Example
Lack of communication    The AMT did not inform the pilot that tire pressures were low, even after the pilot stated "we got a flat tire, you figure?" on the final takeoff roll
Complacency              The tires had had low pressure for several days without attention
Lack of knowledge        Ignorance of the manufacturer's specifications for tire pressure
Distraction              (Any distraction to the AMT is unknown.)
Lack of teamwork         The AMT and the cockpit crew were flying together, yet did not appear to know the maintenance status of the aircraft
Fatigue                  AMTs did not go to sleep until at least 11 pm and were called up at 3 am. They had also been traveling with the aircraft and working on it during down time
Lack of resources        Nitrogen to inflate the tires was not readily available
Pressure                 The AMT was apparently under much personal pressure to keep the aircraft flying for job security reasons
Lack of assertiveness    The base manager, who had no authority over the AMT, told him to "forget it" when it appeared they would be delayed 30 min if the AMT inflated the low tires
Stress                   The AMT was counting on the success of this deployment to enable him to advance with the company and to be able to settle in Canada
Lack of awareness        The AMT was unaware that tire pressure was critical to the aircraft's safety
Norms                    Company procedures allowed the AMT to make an error as an acceptable practice

Accident investigations can also result in the institution of specific inspections/repairs/replacements of components, or increase the frequency of inspecting/repairing/replacing certain components. For example, after identifying premature deterioration of seat cushion fire-blocking materials as a contributing factor in the China Eastern Airlines accident, the NTSB suggested that principal maintenance inspectors should inform operators of the need to "periodically inspect fire-blocking materials for wear and damage, and to replace defective materials" (NTSB, 1993a). Following the in-flight separation of an engine and engine pylon from a Boeing 747 due to a crack in the pylon forward firewall web, Boeing issued a service bulletin calling for a detailed visual inspection of this area on Boeing 747 aircraft with similar engines (NTSB, 1993b). While NTSB reports thoroughly investigate the causal mechanisms of accidents and provide useful recommendations for addressing these failures in the system, relying on accident reports to identify these mechanisms is, aside from the obvious catastrophe of any loss of life, extremely inefficient. In addition, because of the time required to perform the detailed analyses of accidents, feedback at the operator level is diminished by its delay, and contributing situational factors are therefore often missed.

5.2.2. Anonymous incident reports

The NTSB conducts formal investigations of accidents. Incidents are reported to the Aviation Safety Reporting System (ASRS), which is maintained by NASA through an independent contractor. The ASRS allows individuals to anonymously report incidents and problems to a centralized, non-regulatory repository. After clarifying details in the reports, ASRS personnel remove identifying information, allowing reporters to remain anonymous. Pilots are the primary users of this system and typically report incidents occurring during operation of the aircraft. Although mechanics also enter reports into the system, such contributions are much less frequent than the contributions of pilots. Maintenance/inspection issues, however, also arise from pilot-initiated reports. ASRS reports are searchable and are reviewed for salient or recurrent problems in a periodical, "Callback". Researchers can request key-word-sorted compilations of ASRS reports. Thus, ASRS reports provide a means of identifying contributions of inspection and maintenance human error to aviation incidents.

One must exercise caution in using ASRS data. Several factors limit interpretation of these data: (1) reports are based on one person's perspective, and this perspective may be biased; (2) results of ASRS searches are not typically all-inclusive, rather they provide a sample of reports containing search terms; (3) the probability of reporting is ostensibly not equal over all types of incidents or potential reporters, therefore reports contained in the ASRS do not necessarily represent a statistically valid sample of actual occurrences. It is also important to note that, in exchange for reporting incidents, pilots receive limited immunity from FAA prosecution. Finally, although there is a mechanism for maintenance personnel to report incidents, ASRS reports are predominantly supplied by pilots. Despite these limiting considerations, ASRS reports provide useful information. They detail more examples of error phenotypes than NTSB reports, and provide a user's view of the factors involved, usually relatively soon after the incident. Allen and Rankin (1995) suggest that the aviation system would be well served by an anonymous reporting system specifically for aviation inspection and maintenance technicians to report incidents during the maintenance process.

5.2.3. Internal error reporting systems

Airlines also maintain their own error reporting systems. Recent efforts have attempted to overcome three problems with prior generations of error reporting at the airline level: (1) error reporting systems have not been integrated within an airline, (2) performance shaping factors surrounding error occurrences are not well documented in error reports, and (3) error reporting systems have not been well integrated across airlines.

Most airlines monitor incidents and accidents in the system very stringently. Records are kept on such performance metrics as personnel injuries, aircraft or equipment damage, and delays. However, most of the error reporting systems are used separately by different departments and rarely used together to analyze the system as a whole (Wenner and Drury, 1996). Therefore, separate error reporting systems catalog different error phenotypes. This dissociation by error phenotype limits the ability to recognize what may be a common error genotype. Wenner and Drury describe a situation that highlights this problem. If a mechanic drops a wrench on his foot, the incident would be recorded as an "On the Job Injury". If a mechanic drops a wrench on an aircraft, damaging it severely, the incident would be recorded as "Technical Operations Ground Damage". If the wrench was dropped on the aircraft, causing no damage, the incident would not be recorded at all. Finally, if a ground operations employee drops a wrench on an aircraft, the incident would be recorded as "Ground Operations Ground Damage". In each of these scenarios, the error (genotype) was exactly the same; only the final consequences (error phenotype) differed, differentiating the way in which each of these incidents is recorded. Clearly, the use of multiple reporting systems maintained by different departments makes root error detection more difficult. Thus, incident investigation and reporting tools must be developed so that they can be applied across airline systems. Wenner and Drury describe

K.A. Latorella, P.V. Prabhu / International Journal of Industrial Ergonomics 26 (2000) 133-161

a prototype system, called the Unified Incident Reporting System (UIRS), that has a common reporting form and an outcome-specific form. The common reporting form gathers data for all incidents, irrespective of whether the incident was a paperwork error, an injury, ground damage, or even had no adverse outcome. This form then directs the user to one of the outcome-specific forms based on the hazard patterns developed.
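The dropped-wrench scenarios above can be sketched as a small classification routine. This is an illustrative sketch only, with category names taken from the example; it shows how one error genotype is routed to different logs (phenotypes) by consequence and department, which is exactly what hides the common root cause from a department-level analysis.

```python
# Illustrative sketch of phenotype-based incident routing (categories
# modeled on Wenner and Drury's dropped-wrench example; not a real
# airline reporting schema).

def file_incident(actor_dept, target, damaged):
    """Return the log an incident lands in, or None if never recorded."""
    if target == "foot":
        return "On the Job Injury"
    if target == "aircraft" and damaged:
        return ("Technical Operations Ground Damage"
                if actor_dept == "maintenance"
                else "Ground Operations Ground Damage")
    return None  # dropped on aircraft, no damage: not recorded at all

# Four incidents, one genotype ("dropped wrench"), three different records:
incidents = [
    ("maintenance", "foot", False),
    ("maintenance", "aircraft", True),
    ("maintenance", "aircraft", False),
    ("ground_ops", "aircraft", True),
]
logs = [file_incident(*i) for i in incidents]
```

Because each scenario lands in a different log (or none), no single department's records ever accumulate enough "dropped wrench" entries to reveal the trend, which is the integration problem a common reporting form addresses.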

Airline error reporting schemes typically have been concerned with establishing accountability for an error and its consequences rather than understanding the causal factors and the situational context of the error. Analyzing data collected from an error reporting system that was designed simply to note consequences and assign accountability is usually fruitless. In these cases, accident/incident investigation usually terminates as soon as blame can be attributed to some human in the system. Similarly, error control methods derived from such an approach usually take the form of reprimanding and further training the operator, and instituting an additional regulatory check for that specific occurrence. With such simplistic error reporting schemes, then, the situational context at the point of error is lost, and with it, the opportunity to more intelligently characterize and sensitively manage the true error mechanisms. A useful error reporting system must embody a general theory of the task as well as the situational factors that may affect task performance. The Maintenance Error Decision Aid (MEDA), developed by the Boeing Customer Services Division in cooperation with several airlines and the FAA, is an error reporting system that tries to capture the causality of an incident in terms of contributing factors. The objectives of MEDA are to (Allen and Rankin, 1995):

- Improve airline maintenance organizations' understanding of human performance issues.
- Provide line-level and organizational maintenance personnel a standard method for analyzing errors.
- Identify maintenance system deficiencies that increase exposure to error and decrease efficiency.
- Provide a means of error-trend analysis for commercial airline maintenance organizations.

MEDA has five categories for reporting an error occurrence:

- General data (e.g., airline, aircraft type, tail/fleet number, date of incident, time of incident),
- Operational event data (e.g., flight delay, gate return, in-flight shutdown, aircraft damage),
- Maintenance error classification (e.g., improper installation, improper servicing),
- Contributing factors analysis (e.g., factors related to information, equipment, airplane design, job, skills, environment, organization, supervision, and communication; these include queries on correct information, inaccessible aircraft space, new task, inadequate task knowledge, time constraints, environment, and poor planning),
- Corrective actions (i.e., intervention strategy in terms of reviewing existing procedures and policies to prevent such incidents, and identifying new corrective actions at the local level).
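The five categories above can be pictured as a structured report record. The sketch below is an assumption-laden illustration, not the actual MEDA form: the individual field names within each group are invented, but grouping the record this way shows how a common structure enables the error-trend analysis MEDA aims for.

```python
# Sketch of a MEDA-style report record grouping the five categories
# listed above; field names within each group are illustrative.
from dataclasses import dataclass, field

@dataclass
class MEDAReport:
    # General data
    airline: str
    aircraft_type: str
    incident_date: str
    # Operational event data (e.g., "gate return", "flight delay")
    event: str
    # Maintenance error classification (e.g., "improper installation")
    error_class: str
    # Contributing factors analysis
    contributing_factors: list = field(default_factory=list)
    # Corrective actions
    corrective_actions: list = field(default_factory=list)

def factor_counts(reports):
    """Tally contributing factors across reports for trend analysis."""
    counts = {}
    for r in reports:
        for f in r.contributing_factors:
            counts[f] = counts.get(f, 0) + 1
    return counts
```

Because every report shares one schema, contributing factors can be tallied across incidents (and, in principle, across airlines), which is the error-trend objective listed above.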

In a recent field evaluation, MEDA appeared to meet its objectives (Allen and Rankin, 1995). Survey responses and evaluation of technicians' report forms indicated that MEDA provided a useful standardized investigation methodology to the maintenance organization. Technicians used response forms in a manner consistent with MEDA's standardized investigation methodology. The MEDA analysis identified some maintenance deficiencies. Finally, survey data indicated that maintenance personnel's understanding of human performance issues improved after using MEDA (Allen and Rankin, 1995). The field test also underscored the difficulty of instituting a new technology and process into an organization, and the need for human factors and process-specific training. Currently, MEDA is available to customer airlines as a means of improving their own maintenance operations and of improving communications with Boeing regarding design and manufacturing issues. MEDA promotes improvements in error reporting by more effectively capturing concomitant situational factors. In addition, by providing a common platform for error reporting, MEDA provides the opportunity for increased integration of error reporting among airlines and with manufacturers.


Results from internal error reporting systems reach the FAA via formal channels, such as Service Difficulty Reports and Voluntary Disclosures. The FAA, then, may disseminate information to manufacturers regarding specific maintenance problems by issuing Advisory Circulars and Airworthiness Directives. Advisory Circulars provide guidance for controlling the maintenance process and are non-mandatory. In contrast, the FAA only issues Airworthiness Directives if it has conclusive proof of a problem, and adherence to Airworthiness Directives is mandatory. In addition to these reporting systems, manufacturers issue service newsletters and bulletins which contain some information on human error incidents.

5.3. Pro-active error detection methods

Error reporting systems are most useful for developing error management strategies to prevent the specific error addressed from recurring. In some cases, to the degree that situational factors and human error generating mechanisms are captured in these error reporting schemes, these systems may identify more generalizable systemic errors. However, these methods are basically reactive; that is, an accident, incident, or other system-defined deviation must occur to precipitate these analyses. The safety of the aviation system would be much improved if we were able to identify systemic errors, and the performance shaping factors of these errors, before incurring the costs associated with these precipitating events. The methods described below have recently been employed to pro-actively identify error-generating situations and characteristics of aviation maintenance and inspection operations.

5.3.1. Audits

The Flight Standards service of the FAA performs periodic audits of airline inspection and maintenance programs. Most FAA audits are for regulatory purposes. These emphasize ensuring that airlines follow prescribed inspection and maintenance procedures, and have appropriate quality assurance programs. In addition to these formal, regulatory audits, errors may be detected by a human factors audit.

A human factors audit is a methodology for identifying lapses in work practices, inadequacies in the work environment, and human-task mismatches that can lead to human errors. Most audits are conducted using checklists and questionnaires. As part of a human factors audit, human factors experts with domain expertise may directly observe operators performing their jobs. These observers note the type, frequency, and cause of human errors. Observations are typically recorded according to a classification of failure modes guided by task analysis and an understanding of relevant situational factors in the environment. As an example, Drury's aforementioned definitions of inspection and maintenance tasks were the foundation for identifying failure modes of these tasks. This classification scheme provided a structure for observing actual technicians' error patterns (Drury et al., 1990).

To conduct a human factors audit, one must first define: (1) how to sample (frequency and distribution of samples), (2) what factors to measure, (3) how to evaluate a sample (standards, good practices, and principles for comparison), and (4) how to communicate results (Drury, 1997; Chervak and Drury, 1995). An audit must demonstrate validity, reliability, sensitivity, and usability. When properly used, audits can be an important means of pro-actively assessing error likelihood. However, audits can be somewhat difficult to conduct. They can be intrusive to the normal work environment. They must be performed by an individual with both human factors and domain expertise, usually a rare combination. It can be difficult to obtain a large enough sample size for a useful audit. Finally, the usual trade-offs exist between the breadth and depth of analysis and the time available to conduct the analysis. Drury (1997) describes implementation considerations and factors affecting the success of audits. Galaxy Scientific Corporation has developed a computerized version of a human factors audit for aviation maintenance and inspection, ERNAP (Meghashyam, 1995).
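The four audit design decisions above (how to sample, what to measure, how to evaluate, how to communicate) can be sketched in code. This is a minimal illustration under stated assumptions: the factors, the 1-5 rating scale, and the per-factor standards are invented for the example and are not a validated audit instrument.

```python
# Minimal sketch of the four audit design decisions (sample, measure,
# evaluate, communicate). Factors, scale, and standards are assumptions.
import random

FACTORS = {            # factor -> standard (minimum acceptable rating, 1-5)
    "lighting": 4,
    "workcard legibility": 4,
    "access to reference manuals": 3,
}

def sample_tasks(tasks, n, seed=None):
    """(1) How to sample: draw n tasks at random for observation."""
    return random.Random(seed).sample(tasks, n)

def evaluate(observations):
    """(2)-(3) What to measure and how to evaluate: flag factors whose
    mean observed rating falls below the standard."""
    findings = {}
    for factor, standard in FACTORS.items():
        scores = [obs[factor] for obs in observations if factor in obs]
        if scores and sum(scores) / len(scores) < standard:
            findings[factor] = round(sum(scores) / len(scores), 2)
    return findings  # (4) communicate: the report lists flagged factors
```

The design choice here mirrors the text: evaluation is always a comparison against an explicit standard or good practice, so the audit output is a short list of factors needing attention rather than raw observations.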

5.3.2. Subjective evaluations of system reliability

Some error detection systems use subjective rating scales to determine if the task environment is error-prone. This methodology assumes that a system's error-proneness can be deduced by having people who work in the environment subjectively assess a set of factors (similar to the concept of performance shaping factors or error shaping factors). While such assessments can be used effectively to complement reactive detection methods, implementing such systems requires caution. One must be careful to avoid biased questions, and to motivate assessors to provide factual, unprejudiced assessments. In addition, one must determine the sample size of the data, how the resulting data will be analyzed and interpreted, and how results will be implemented. It is also important to provide proper feedback to volunteer assessors. Without visible and effective feedback and actual resultant actions, such methods tend to lose credibility and can become ineffective.

James Reason developed MESH (Managing Engineering Safety Health) as a pro-active subjective assessment tool for British Airways (Rogan, 1995b). MESH assumes that the system's intrinsic resistance to accident-producing factors is due to the interplay of several factors at both the local workplace level and the organizational level. It affords regular measurement of the maintenance workforce's and management's perceptions of the local and organizational factors (Reason and Maddox, 1995). Randomly selected assessors periodically make simple subjective ratings of certain system factors through an anonymous computer-based survey. A given group of assessors operates for a limited period of time and is then replaced by a new group. The MESH program accumulates these inputs and summarizes the factors that could contribute to accidents or incidents. Quality control groups within the airline identify and prioritize issues in the MESH profiles. Technicians assess local factors and technical management personnel assess organizational factors. Technicians receive feedback in the form of newsletters and notice boards (Rogan, 1995b).
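The MESH-style cycle described above, with randomly selected assessor groups serving for a limited period and anonymous ratings accumulating into a prioritized profile, can be sketched as follows. The factor names, rating scale, and ranking rule are illustrative assumptions, not the actual MESH instrument.

```python
# Sketch of a MESH-style assessment cycle: rotating random assessor
# groups, anonymous ratings, and a profile ranking factors for review.
# Factor names and the low-score-is-worse scale are assumptions.
import random

LOCAL_FACTORS = ["tools and equipment", "procedures", "time pressure"]

def select_assessors(workforce, n, seed=None):
    """Draw a fresh, random assessor group for the next rating period."""
    return random.Random(seed).sample(workforce, n)

def mesh_profile(ratings):
    """ratings: list of {factor: score} dicts (anonymous, no assessor ids).
    Returns factors ranked worst-first by mean score."""
    totals = {}
    for r in ratings:
        for factor, score in r.items():
            totals.setdefault(factor, []).append(score)
    means = {f: sum(s) / len(s) for f, s in totals.items()}
    return sorted(means, key=means.get)
```

Rotating the assessor group (rather than polling everyone continuously) limits survey fatigue while still sampling the whole workforce over time; keeping only factor scores, never assessor identities, preserves the anonymity the method depends on.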

5.3.3. Simulation approaches

Direct observation, either by an expert or through subjective evaluation by individuals in a system, requires intrusion into the workplace. In addition, interpreting these observations for causal mechanisms is often difficult due to the number of situational and operator factors that vary, and interact, within a real system. Simulation methods offer an alternative approach for predicting human errors, allowing more careful control/isolation of situational and operator variables. Simulation implementations range from part-task simulators, in which one observes operators performing simulated tasks, to virtual environments, which include simulations of ambient conditions, to simulations of the system that include a simulated operator. In the aviation maintenance and inspection environment, part-task simulations provide the opportunity to observe operators perform visual and eddy-current rivet inspections (e.g., Latorella et al., 1992). More advanced simulations have not yet been developed for aviation maintenance and inspection, although they have been applied to other work domains. For example, a simulation exists of maintenance personnel in nuclear power plants (Gertman and Blackman, 1993).

5.4. Addressing human error control in aviation maintenance and inspection

Once one determines that human error is a factor in an accident, incident, event, or error-likely situation, one must address how to control, or manage, this error. Most situations can be addressed through a variety of interventions. Further, interventions are most effectively implemented when used in combination. Interventions for error reduction include: selection, training, equipment design, job design, and aiding (Rouse, 1985). More detailed lists of interventions specifically intended for the aviation maintenance and inspection environment have been classified as short-term and long-term interventions (Shepherd et al., 1991). This section reviews some of the techniques and approaches that have been proposed and applied to reduce the probability of human error in the aviation maintenance environment. This review considers the broad categories of: (1) training, (2) job design and organizational considerations, (3) workspace and ambient environment design, (4) task equipment and information design, and (5) automation. Finally, we describe an approach to identifying, selecting among, and justifying intervention strategies for managing errors in aviation maintenance and inspection.


5.4.1. Training

Training at the individual operator level continues to improve and take advantage of new tools and methodologies. However, there are still opportunities for improving individual training. For example, inspection training tends to focus on procedural aspects (e.g., setting up NDT equipment and troubleshooting rules) rather than providing feedback for other, more cognitive aspects (e.g., making perceptual judgments) (Prabhu and Drury, 1992). Transport Canada developed a workshop for training maintenance technicians to avoid the aforementioned "judgment interferences" (Foyle and Dupont, 1995). Based on the Transactional Analysis model, the Dupont Model distinguishes between rational ("adult") and emotional ("child") motivations. This workshop provides examples of how rational/emotional interactions can affect a person's judgment during work, identifies technicians' behavioral types, and emphasizes the effects of stress, fatigue, and lack of communication on human performance (Foyle and Dupont, 1995). While this form of training has been provided to flightdeck crews for some time, it is only now being extended to the maintenance environment.

In addition, efforts have been directed toward providing aviation inspection and maintenance technicians with computer-based training (CBT). Three years ago, the Office of Aviation Medicine instituted a research and development plan to demonstrate and evaluate the use of CBT techniques for these technicians. The prototype system provided maintenance technicians with instruction for diagnosing the environmental control system (ECS) of a Boeing 767-300 (Johnson et al., 1992). In addition to providing diagnostic instruction and practice, the CBT system allows users access to all appropriate cockpit and maintenance bay controls for the ECS and access to interactive pages in the Boeing fault isolation manual (FIM). The prototype system was distributed to most of the world's airlines, via the Air Transport Association's (ATA) Maintenance Training Committee, and to most FAA-certified aviation maintenance technical schools through the Aviation Technician Education Council (ATEC) (Johnson and Shepherd, 1993). Results from an evaluation study demonstrated that students trained with the CBT system showed the same level of post-training knowledge as those trained in the traditional instructor-led method. Subjective evaluations indicated that technicians preferred a combination of human and CBT instruction (Johnson and Shepherd, 1993).

A large portion of aircraft maintenance and inspection activities is accomplished by technicians working in teams. Thus, a technician has to learn to be a team member and to coordinate and communicate effectively to accomplish the team objective. While teams can enhance performance, there are also many opportunities for human error in this work structure. Such functions as decision-making, knowledge-sharing, and communicating goals and objectives play a crucial role in improving team performance. Recent efforts in training focus on enhancing teamwork in aviation maintenance and inspection. The success of Crew Resource Management (CRM) in improving team performance on the commercial flight deck provides a model for improving collaborative work in the inspection/maintenance environment. The FAA has proposed extending the CRM approach to Maintenance Resource Management (MRM), or Technician Resource Management (TRM), to encourage teamwork and effective problem solving in maintenance crews (FAA, 1991, 1995). Evaluations of the few MRM training programs attempted at airlines have shown success in these efforts (Taggert, 1990; Galaxy Scientific, 1993) in both objective performance measures and subjective measures of technician attitudes (Galaxy Scientific, 1993). Gramopadhye et al. (1996) describe a computer-based multimedia team training tool, the Aircraft Maintenance Team Training software (AMTT), which includes a team skills instructional module. This module addresses the key dimensions of team skills: communication, decision-making, interpersonal relationships, and leadership. AMTT also has a task simulation module that allows users to apply learned team skills in a simulated aircraft maintenance situation.

Endsley and Robertson (1996) analyze situation awareness (SA) requirements for aircraft maintenance teams by asking operators to interpret elements in the environment and describe their spatial and temporal positions and trajectories (Endsley, 1988). Results suggested improvements for technician team training to reduce error and enhance performance (Endsley and Robertson, 1996). Teams should be trained to establish shared mental models; that is, the team should have a clear understanding of what other teams know and do not know. Teams should be trained to better verbalize the decision-making process. Technicians must be trained to conduct better shift meetings and to perform as team members; that is, there should be explicit training for team leads to convey to the team common goals, an understanding of who is doing what, and expectations regarding teamwork. Managers, leads, and technicians should be trained to provide better feedback (both positive and negative) on prior attempts. Additionally, training programs should address problems which cause a loss of situation awareness. One evaluation of the CRM approach in aviation maintenance found that such team training could also be improved by helping technicians transfer new MRM skills to their actual working environment, by focusing directly on training technicians how to voice disagreements, and by planning and publicizing recurrent MRM training (Galaxy Scientific, 1993).

5.4.2. Job design and organizational considerations

Training has great appeal since it can rapidly reach a whole department or company. Drury (1996) states that training can easily be made airline-specific by using case studies of accidents or errors from that airline or from similar operations in other airlines. Drury describes an effort to bring about ergonomic/human factors changes via training in aviation maintenance. This effort required establishing and training a human factors task force, comprised of both management and the hangar work force. A human factors expert coordinates the team and acts as the advisor for human factors issues. The training program utilized the SHEL (software, hardware, environment, and "liveware" (humans)) model (Edwards, 1972) to organize the human factors material. Results from this effort indicate several organizational factors that are critical to the success of these task forces: focusing on issues at the right level, having a champion in the organization for the whole effort, and maintaining trust that management actions would follow recommendations. Allen and Rankin (1995) also suggest organizational changes to improve aviation maintenance and inspection. Disciplinary actions should be uniform throughout an organization and should be structured to complement limited-immunity policies in conjunction with incident reporting (Allen and Rankin, 1995). At a higher level of organization, the outputs of safety and information programs should be shared among airlines and manufacturers (Allen and Rankin, 1995).

5.4.3. Workspace and ambient environment design

Several aspects of the ambient environment affect maintenance performance. Some maintenance tasks are performed in extremely cold conditions. At times this forces technicians to wear gloves during performance of their tasks, further complicating manipulation and tactile sensation. Volatile hydrocarbons in fuel tanks and cleaning agents at other access areas produce noxious fumes for inspectors and maintenance technicians. Recently, new solvents have been identified that are less noxious to operators (Drury, 1996). Research has most fully addressed issues of lighting adequacy and postural/biomechanic hazards associated with aviation maintenance and inspection tasks.

Ninety percent of all inspection is visual inspection (Johnson and Shepherd, 1993). A general methodology has been developed, in cooperation with an airline partner, to recommend optimal illumination equipment for individual inspection tasks (Johnson and Shepherd, 1993). The resulting methodology uses task analysis data, lighting evaluations, subjective input from inspectors, and evaluation of illumination sources to specify better portable area lighting, task lighting, and ambient illumination.

Maintenance tasks often require technicians to assume difficult postures and to work in restricted spaces. Reynolds et al. (1994) investigate the effects of performing in restricted spaces on effort and performance. Vertical restrictions on an inspection task demonstrably increase postural stress and respiration rate variability, a measure of decreased attentiveness (Reynolds et al., 1994). Sagittal restrictions, however, appear to improve operator performance over unrestricted control conditions, indicating that some physical restriction may improve task focus (Reynolds et al., 1994). This


Table 8
Usability criteria for user interface evaluation (Ravden and Johnson, 1989)

- Visual clarity: Information displayed on screen should be clear and well organized.
- Consistency: The way the system looks/works should be consistent at all times.
- Informative feedback: Wherever appropriate and possible, the user should be given clear, informative feedback on where they are in the system, what actions they have taken, whether these actions have been successful, and what actions are to be taken next.
- Explicitness: The system structure and working should be clear to the user.
- Appropriate functionality: The system should meet the needs of the user when carrying out the task.
- Flexibility and control: The interface should be flexible in the way information is presented to users.
- Error prevention and control: The system should be designed to minimize the possibility of user error.
- User guidance and support: Informative, easy-to-use, and relevant guidance and support should be provided to help users understand the system.

research has developed a program for identifying maintenance tasks where postural demands exceed human capability while ensuring safe and reliable performance, and for suggesting solutions to identified problematic tasks (Johnson and Shepherd, 1993).

Unfortunately, this reactive approach is the only real solution for improving postural problems for most of the aircraft being serviced today. However, future aircraft may inflict fewer such postural problems. Understanding that an aircraft design which forces awkward postures, restricts access to components, and requires frequent raising and lowering of equipment can lead to increased maintenance-related error frequency, Boeing has developed specific design criteria for the B-777 to facilitate access and maintenance (Marx, 1992). While this effort is promising, a more general effort toward developing standard guidelines for maintainability and inspectability would be most useful. MIL-STD-1472C provides some guidelines for maintainability. Mason (1990) describes the Bretby Maintainability Index, which modifies the SAE maintainability index (SAE, 1976) to include such issues as the weight of components, the size and position of access apertures, and restricted access for tools. This index has been developed mainly for construction machinery; however, it could be extended to focus on aviation maintainability.

5.4.4. Task equipment and information design

Some "human errors" in aviation maintenance and inspection derive from poorly designed interfaces to equipment and information. Therefore, one method for controlling these errors is to redesign equipment and interfaces to information systems (including those implemented on paper). Interface issues are also important in the development of new equipment. Aviation maintenance and inspection tasks are increasingly computer-based and include the use of new tools and techniques. As such, the human/machine interface to computer-based systems and new equipment is increasingly important. In general, the design of these interfaces would benefit from usability assessment and engineering (e.g., Ravden and Johnson, 1989; Nielsen, 1993) (Table 8). Some specific human/machine interface improvements to aviation inspection and maintenance equipment include using templates for interpreting and calibrating NDI signals, and improving the interface to work cards. Technological improvements have also been applied to functions at higher organizational levels. For example, one computer-based system allows FAA inspectors to record information while auditing maintenance operations using a stylus input device (Layton and Johnson, 1993).

Work cards have been improved in three fundamental ways. First, the form of the work card has been redesigned to present information in a more readable and organized manner. Second, work cards can be improved by providing the appropriate content in a usable form (e.g., graphically), and by affording easy access to reference materials. Third, the physical interface to the work card is important and must be considered. Patel and


Table 9
Examples of automation in aviation maintenance and inspection (Drury, 1996)

- Planning: Automated stock control and parts ordering to ensure that lead times for obtaining parts do not extend out-of-service time; optimization heuristics for packaging required items into the length of a check visit.
- Opening/cleaning: NDI devices eliminate the need to open the aircraft in some cases.
- Inspection, initiate: Automated information presentation (e.g., Marx, 1992); hypertext workcards allow access to background documentation (Lofgren and Drury, 1994); IMIS system in military aircraft (Johnson, 1990).
- Inspection, access: Climbing robot performs eddy-current scanning of lap-splice joints (Albert et al., 1994).
- Inspection, search & decision: Efforts toward automating signal processing and aiding final decision making (Johnson, 1989).
- Inspection, response: Hypertext workcards allow integration of inspection performance with documentation of the response (Lofgren and Drury, 1994).
- Repair, initiate: Aircraft Visit Management System (AVMS) at United Airlines integrates inspection and repair activities (Goldsby, 1991).
- Repair, diagnosis: Variety of AI and expert system approaches to diagnosis aiding (e.g., Husni, 1992); computer-based training with intelligent tutoring for diagnosis (Johnson et al., 1992); on-board diagnostic system improvements (e.g., Hessburg, 1992).
- Repair, repair/replace: Automation mostly limited to individual repair shop lines; CNC machining and robotic welding systems (Goldsby, 1991).
- Repair, reset: (none)
- Buyback/return to service: Integration of inspection performance and response and maintenance actions makes buyback easier; bar codes on badges and workcards to automate job control notation (Goldsby, 1991); fully electronic logbook proposed.

colleagues (Patel et al., 1993, 1994; Drury, 2000) investigate these issues extensively. This research identifies human factors guidelines for formulating a more technician-centered work card. In a field evaluation, aviation inspectors preferred work cards redesigned according to these guidelines over the original work cards (Patel et al., 1994). Drury (2000) extends this research to improve the physical interface of the work card, implementing aviation inspection work cards in a computer-based hypertext system on a portable computer. Field evaluations of this prototype system demonstrated a further improvement with this implementation.

Technicians often must refer to reference manuals in the course of performing inspections and maintenance work. Providing this reference information in a more accessible and usable form facilitates performance (e.g., Inaba, 1991).

5.4.5. Automation

Although some automation interventions appear useful, their development has typically been technology-driven, rather than human-centered and requirements-driven. This approach has resulted in the development of automation systems that are not well integrated (Drury, 1996).


Table 9 presents automation approaches to aviation maintenance and inspection by function (Drury, 1996). In his review, Drury emphasizes the importance of considering the additional training requirements of automation interventions and of sensitively incorporating automation into the organizational context and individuals' jobs.

5.4.6. Comprehensive and integrated approaches to error management

In addition to managing human error in aviation maintenance, one must manage interventions. That is, how does one generate potential alternative intervention strategies, choose among these alternatives, and cost-justify these solutions to management? Drury et al. (1996) describe a prototype system for a more comprehensive approach to error reduction and management. This system begins with the assumption that pro-active error monitoring and reactive error reporting are both essential for effective error control. It is based on the premise that it is important to identify and organize error reporting information in a manner consistent with intervention strategies. The proposed system has five modules: (1) error reporting; (2) critical incident reporting (reports of situations where errors/incidents almost happened but were recovered without consequences); (3) error audit (auditing specific tasks to find human-system mismatches); (4) error assessment (anonymous assessment of the task environment by technicians and management); and (5) a solutions database (of information from industry sources and human factors experts for design changes). This prototype system is called PERS (Pro-active Error Reduction System) and is being developed under an FAA research grant.

6. Conclusion

Aviation maintenance is a complex organization in which individuals perform varied tasks in an environment with time pressures, minimal feedback, and sometimes difficult ambient conditions. Aircraft, as well as inspection and maintenance equipment, are becoming more complex. As the commercial aviation fleet ages and the work force of maintenance personnel diminishes, maintenance workload is increasing. These pressures exacerbate the likelihood of human error in aviation maintenance and inspection processes. These errors have various effects on the aviation system: from inconsequential slips, to those which affect airline efficiency and passenger convenience, to those few which ultimately result in an accident. In recognition of this, the focus is now more toward understanding the nature of human error in aviation maintenance and inspection, and improving methods for detecting and managing these errors. We review both reactive and pro-active methods of error detection, and several intervention strategies for controlling human errors in aviation maintenance and inspection. Future directions in this area would be to develop: (1) a maintenance incident reporting system with limited immunity for reporting (Allen and Rankin, 1995); (2) a standardized, but rich, vocabulary/indexing scheme for characterizing situational and operator factors in error reporting; (3) technologies to facilitate recognition of hazardous patterns in situational and operator factors; (4) aviation maintenance system simulation (cf. MAPPS in Gertman and Blackman (1993) and CES in Woods et al., 1987); (5) virtual environment simulations to support experimental investigations; (6) methodologies for identifying organizational structures and job design characteristics which dampen the likelihood and perseverance of human errors; and (7) truly human-centered, integrated task aiding, automation, and training.

Acknowledgements

The authors are most appreciative of Dr. Colin Drury for his useful comments on the organization and early drafts of this work, and of the reviewers for their careful reading and helpful criticisms. The first author gratefully acknowledges the contribution of financial support from the Industrial Engineering department at the University of Buffalo, through Dr. Colin Drury, for preparation of this work.


References

Air Transport Association, 1994. The Annual Report of the U.S. Scheduled Airline Industry (Brochure). Air Transport Association, Washington, DC.

Albert, C.J., Kaufman, W.M., Siegel, M.W., 1994. Development of an Automated Nondestructive Inspection (ANDI) System for Commercial Aircraft: Phase 1 Report (DOT/FAA/CT-94/23). National Technical Information Service, Springfield, VA.

Allen Jr., J.P., Rankin, W.L., 1995. A summary of the use and impact of the Maintenance Error Decision Aid (MEDA) on the commercial aviation industry. In: Proceedings of the Flight Safety Foundation International Federation of Airworthiness 48th Annual International Air Safety Seminar. Flight Safety Foundation, Arlington, VA, pp. 359–369.

Boeing Airplane Company, 1993a. Statistical Summary of Commercial Jet Aircraft Accidents, Worldwide Operations, 1959–1992. Boeing Commercial Airplane Group, Seattle, WA.

Boeing Airplane Company, 1993b. Accident Prevention Strategies: Removing Links in the Accident Chain. Boeing Commercial Airplane Group, Seattle, WA.

Cacciabue, P.C., Carpignano, A., Vivalda, C., 1993. A dynamic reliability technique for error assessment in man-machine systems. International Journal of Man-Machine Studies 38, 403–428.

Chervak, S.G., Drury, C.G., 1995. Human factors audit program for maintenance. In: Human Factors in Aviation Maintenance Phase V Progress Report. Federal Aviation Administration, Washington, DC, pp. 93–126.

Drury, C.G., 2000. Relative advantages of portable computer-based workcards for aircraft inspection. International Journal of Industrial Ergonomics 3(1), in press.

Drury, C.G., 1997. Chapter 49: Human factors audit. In: Salvendy, G. (Ed.), Handbook of Human Factors and Ergonomics. Wiley, New York, NY.

Drury, C.G., 1996. Automation in quality control and maintenance. In: Parasuraman, R., Mouloua, M. (Eds.), Automation and Human Performance: Theory and Applications. Lawrence Erlbaum Associates, Mahwah, NJ, pp. 407–426.

Drury, C.G., 1994. Function allocation in manufacturing. In: Robertson, S.A. (Ed.), Proceedings of the Ergonomics Society 1994 Annual Conference. Taylor & Francis, London, pp. 2–16.

Drury, C.G., 1991. Errors in aviation maintenance: taxonomy and control. In: Proceedings of the Human Factors Society 35th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 42–46.

Drury, C.G., Prabhu, P., Gramopadhye, A., 1990. Task analysis of aircraft inspection activities: methods and findings. In: Proceedings of the Human Factors Society 34th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 1181–1185.

Drury, C.G., Prabhu, P.V., Wenner, C., Ramamurthy, M., 1996. Proactive error reporting system. In: Human Factors in Aviation Maintenance Phase VI Progress Report. Federal Aviation Administration, Washington, DC.

Edwards, E., 1972. Man and machine: systems for safety. In: Proceedings of British Airline Pilots Association Technical Symposium. British Airline Pilots Association, London, pp. 21–36.

Endsley, M.R., 1988. Design and evaluation for situation awareness enhancements. In: Proceedings of the Human Factors Society 32nd Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 97–101.

Endsley, M.R., Robertson, M.M., 1996. Team situation awareness in aviation maintenance. In: Proceedings of the Human Factors and Ergonomics Society 40th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 1077–1081.

Federal Aviation Administration, 1995. Zero accidents: a shared responsibility. Aviation Safety Action Plan. U.S. Department of Transportation, Washington, DC.

Federal Aviation Administration, 1991. Assessment of Maintenance Performance Measures and A Demonstration of CRM Training Evaluation Analysis Plan. Task 3 Report. Federal Aviation Administration, Washington, DC.

Foyle, B., Dupont, G., 1995. Maintenance errors and their prevention. In: Proceedings of the Flight Safety Foundation International Federation of Airworthiness 48th Annual International Air Safety Seminar. Flight Safety Foundation, Arlington, VA, pp. 353–357.

Galaxy Scientific Corporation, 1993. Human Factors in Aviation Maintenance – Phase 2 Progress Report (NTIS DOT/FAA/AM-93/5). National Technical Information Service, Springfield, VA.

Gertman, D.I., Blackman, H.S., 1993. Human Reliability and Safety Analysis Data Handbook. Wiley, New York.

Goldsby, R.P., 1991. Effects of automation in maintenance. In: Proceedings of the Fifth Federal Aviation Administration Meeting on Human Factors Issues in Aircraft Maintenance and Inspection. National Technical Information Service, Springfield, VA, pp. 103–123.

Goranson, U.G., 1989. Continuing airworthiness of aging jet transports. In: Proceedings of the Second Annual International Conference on Aging Aircraft. Federal Aviation Administration, Washington, DC, pp. 61–89.

Graeber, R.C., Marx, D.A., 1993. Reducing human error in aircraft maintenance operations. In: Proceedings of the Flight Safety Foundation International Federation of Airworthiness 46th Annual International Air Safety Seminar. Flight Safety Foundation, Arlington, VA, pp. 147–160.

Gramopadhye, A., Kraus, D., Rao, P., Jebaraj, D., 1996. Application of advanced technology to team training. In: Proceedings of the Human Factors Society 40th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 1072–1076.

Gregory, W., 1993. Maintainability by design. In: Proceedings of the Fifth Annual Society of Automotive Engineers: Reliability, Maintainability, and Supportability Workshop. Society of Automotive Engineers, Warrendale, PA.

Hagemaier, D., 1989. Nondestructive testing of aging aircraft. In: Proceedings of the Second Annual International Conference on Aging Aircraft. Federal Aviation Administration, Washington, DC, pp. 186–198.


Hessburg, J., 1992. Human factors considerations on the 777 on-board maintenance system design. In: Proceedings of the Sixth Federal Aviation Administration Meeting on Human Factors Issues in Aircraft Maintenance and Inspection "Maintenance 2000". National Technical Information Service, Springfield, VA, pp. 77–91.

Hollnagel, E., 1991. The genotype and phenotype of erroneous action: implications for HCI design. In: Alty, J., Weir, G. (Eds.), Human-Computer Interaction and Complex Systems. Academic Press, London.

Hollnagel, E., 1990. The design of error tolerant interfaces. In: Proceedings of The First International Symposium on Ground Data Systems for Spacecraft Control. Darmstadt, Germany, June 26–29.

Husni, M., 1992. Advances in artificial intelligence for aircraft maintenance. In: Proceedings of the Sixth Federal Aviation Administration Meeting on Human Factors Issues in Aircraft Maintenance and Inspection "Maintenance 2000". National Technical Information Service, Springfield, VA, pp. 157–171.

Inaba, K., 1991. Converting technical publications into maintenance performance aids. In: Proceedings of the Second Annual International Conference on Human Factors in Aging Aircraft. Federal Aviation Administration, Washington, DC, pp. 167–193.

Inaba, K., Dey, M., 1991. Programmatic Root Cause Analysis of Maintenance Personnel Performance Problems. U.S. Nuclear Regulatory Commission, Washington, DC.

Johnson, W.B., 1990. An integrated maintenance information system (IMIS): an update. In: Final Report, Second Federal Aviation Administration Meeting on Human Factors Issues in Aircraft Maintenance and Inspection "Information Exchange and Communications". Federal Aviation Administration, Washington, DC, pp. 141–150.

Johnson, W.B., 1989. Research and development. In: Proceedings of the Second Annual International Conference on Aging Aircraft. Federal Aviation Administration, Washington, DC, pp. 30–35.

Johnson, W.B., Shepherd, W.T., 1993. The impact of human factors research on commercial aircraft maintenance and inspection. In: Proceedings from the Flight Safety Foundation 46th Annual International Air Safety Seminar. Flight Safety Foundation, Arlington, VA, pp. 187–200.

Johnson, W.B., Norton, J.E., Utsman, L.G., 1992. New technology for the schoolhouse and flightline maintenance environments. In: Proceedings of the Seventh FAA Conference on Human Factors in Aircraft Maintenance and Inspection (NTIS PB93-146975). Federal Aviation Administration, Washington, DC.

Kirwan, B., 1992a. Human error identification in human reliability assessment. Part 1: overview of approaches. Applied Ergonomics 23 (5), 299–318.

Kirwan, B., 1992b. Human error identification in human reliability assessment. Part 2: detailed comparison of techniques. Applied Ergonomics 23 (6), 371–381.

Latia, D.C., 1993. Nondestructive Testing for Aircraft (Order no. EA-415). IAP, Casper, WY.

Latorella, K.A., Gramopadhye, A.K., Prabhu, P.V., Drury, C.G., Smith, M.A., Shanahan, D.E., 1992. Computer-simulated aircraft inspection tasks for off-line experimentation. In: Proceedings of the Human Factors Society 36th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 92–96.

Latorella, K., Drury, C.G., 1992. A framework for human reliability in aircraft inspection. In: Proceedings of the Seventh FAA Meeting on Human Factors Issues in Aircraft Maintenance and Inspection. Federal Aviation Administration, Washington, DC.

Layton, C.F., Johnson, W.B., 1993. Job performance aids for the flight standards service. In: Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 26–29.

Lock, M.W.B., Strutt, J.E., 1985. Reliability of In-Service Inspection of Transport Aircraft Structures (Report 85013). Civil Aviation Authority, London.

Loftgren, J., Drury, C.G., 1994. Human factors advances at Continental Airlines. In: Proceedings of the Eighth FAA/OAM Meeting on Human Factors in Aviation Maintenance and Inspection "Trends and Advances in Aviation Maintenance Operations". National Technical Information Service, Springfield, VA, pp. 117–138.

Lorenzo, D., 1990. A Manager's Guide to Reducing Human Errors: Improving Human Performance in the Chemical Industry. Chemical Manufacturers Association, Washington, DC.

Lutzinger, R., 1992. Tomorrow's problems as seen by maintenance managers. In: Proceedings of the Sixth FAA Meeting on Human Factors Issues in Aircraft Maintenance and Inspection. National Technical Information Service, Springfield, VA, pp. 92–100.

Marx, D., 1992. Looking towards 2000: the evolution of human factors in maintenance. In: Proceedings of the Sixth FAA Meeting on Human Factors Issues in Aircraft Maintenance and Inspection. National Technical Information Service, Springfield, VA, pp. 64–76.

Mason, S., 1990. Improving plant and machinery maintainability. Applied Ergonomics 3, 15–24.

Meghashyam, G., 1995. Electronic ergonomic audit system for maintenance and inspection. In: Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 1071–1813.

Nader, R., Smith, W.J., 1994. Collision Course. TAB Books, Blue Ridge Summit, PA.

National Transportation Safety Board, 1984. Aircraft Accident Report, Eastern Airlines, Inc., L-1011, Miami, FL, May 5, 1983 (NTSB/AAR-84-04). National Transportation Safety Board, Washington, DC.

National Transportation Safety Board, 1989. Aircraft Accident Report, Aloha Airlines, Flight 243, Boeing 737-200, N73711, Maui, Hawaii, April 28, 1988 (NTSB/AAR-89-03). National Transportation Safety Board, Washington, DC.

National Transportation Safety Board, 1990. Aircraft Accident Report, United Airlines, Flight 232, McDonnell Douglas DC-10, Sioux Gateway Airport, Sioux City, Iowa, July 19, 1989 (NTSB/AAR-90-06). National Transportation Safety Board, Washington, DC.


National Transportation Safety Board, 1992. Aircraft Accident Report, Continental Express, Flight 135, Embraer 120, N33701, Eagle Lake, TX, September 11, 1991 (NTSB/AAR-92-04). National Transportation Safety Board, Washington, DC.

National Transportation Safety Board, 1993a. Aircraft Accident Report, China Eastern Airlines, Flight 583, MD-11, B-2171, Shemya, Alaska, April 6, 1993 (NTSB/AAR-93-07). National Transportation Safety Board, Washington, DC.

National Transportation Safety Board, 1993b. Aircraft Accident Report, Japan Airlines, Inc., Flight 46E, B-747-121, N473EV, Anchorage, Alaska, March 31, 1993 (NTSB/AAR-93-033). National Transportation Safety Board, Washington, DC.

Nielsen, J., 1993. Usability Engineering. Academic Press, Cambridge, MA.

Norman, D.A., 1981. Categorization of action slips. Psychological Review 88, 1–15.

Patel, S., Drury, C.G., Loftgren, J., 1994. Design of work control cards for aircraft inspection. Applied Ergonomics 25 (5), 283–293.

Patel, S., Drury, C.G., Prabhu, P., 1993. Design and usability evaluation of work control documentation. In: Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 1156–1160.

Prabhu, P., Drury, C.G., 1992. A framework for the design of the aircraft inspection information environment. In: Proceedings of the Seventh FAA Meeting on Human Factors Issues in Aircraft Maintenance and Inspection. Federal Aviation Administration, Washington, DC.

Prabhu, P., Sharit, J., Drury, C., 1992b. Classification of temporal errors in CIM systems: development of a framework for deriving human-centered information requirements. International Journal of Computer Integrated Manufacturing 5 (2), 68–80.

Quintanilla, S.A.R., 1987. New technologies and human error: social and organizational factors. In: Rasmussen, J., Duncan, K., LePlat, J. (Eds.), New Technology and Human Error. Wiley, New York.

Rasmussen, J., 1990. The role of error in organizing behavior. Ergonomics 33 (10/11), 1185–1199.

Rasmussen, J., 1985. Trends in human reliability analysis. Ergonomics 28 (8), 1185–1195.

Rasmussen, J., 1982. Human errors: a taxonomy for describing human malfunctions in industrial installations. Journal of Occupational Accidents 4, 311–333.

Rasmussen, J., Vicente, K., 1989. Coping with human errors through system design: implications for ecological interface design. International Journal of Man-Machine Studies 31, 517–534.

Ravden, S.J., Johnson, G.I., 1989. Evaluating Usability of Human-Computer Interfaces: A Practical Method. Halsted Press, New York.

Reason, J., Maddox, M.E., 1995. Human error. In: Human Factors Guide for Aviation Maintenance. U.S. Department of Transportation, Washington, DC (Chapter 14).

Reason, J., 1990. Human Error. Cambridge University Press, New York.

Reason, J., 1987. Collective planning and its failures. In: Rasmussen, J., Duncan, K., LePlat, J. (Eds.), New Technology and Human Error. Wiley, New York.

Reynolds, J.L., Drury, C.G., Sharit, J., Cerny, F., 1994. The effects of different forms of space restriction on inspection performance. In: Proceedings of the Human Factors and Ergonomics Society 38th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 631–635.

Reynolds, J.L., Drury, C.G., 1993. An evaluation of the visual environment in aircraft inspection. In: Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 34–38.

Rogan, E., 1995a. MESH: Managing Engineering Safety Health. Human Factors 2, 5–6. British Airways Engineering, London.

Rogan, E., 1995b. Human factors in maintenance and engineering. In: Human Factors in Aviation Maintenance Phase V Progress Report. Federal Aviation Administration, Washington, DC, pp. 255–260.

Rouse, W.B., 1985. Optimal allocation of system development resources to reduce and/or tolerate human error. IEEE Transactions on Systems, Man and Cybernetics 15 (5), 620–630.

Rouse, W.B., Rouse, S.H., 1983. Analysis and classification of human error. IEEE Transactions on Systems, Man and Cybernetics 13 (4), 539–549.

Ruffner, J.W., 1990. A Survey of Human Factors Methodologies and Models for Improving the Maintainability of Emerging Army Aviation Systems. US Army Research Institute for the Behavioral and Social Sciences, Arlington, VA.

Society of Automotive Engineers (SAE), 1976. Engineering Design Serviceability Guidelines – Construction and Industrial Machinery. Society of Automotive Engineers, Warrendale, PA.

Sears, R.L., 1986. A new look at accident contributions and the implications of operational training programs (unpublished report). Boeing Commercial Airplane Group, Seattle, WA. Cited in Graeber, R.C., Marx, D.A., 1993. Reducing human error in aircraft maintenance operations. In: Proceedings of the Flight Safety Foundation International Federation of Airworthiness 46th Annual International Air Safety Seminar. Flight Safety Foundation, Arlington, VA, pp. 147–160.

Siegel, A.I., Wolf, J.J., Lautman, M.R., 1975. A family of models for measuring human reliability. In: Proceedings of the Annual Reliability and Maintainability Symposium. Institute of Electrical and Electronic Engineers, New York, NY, pp. 110–115.

Senders, J.W., Moray, N.P., 1991. Human Error: Cause, Prediction, and Reduction. Lawrence Erlbaum Associates, Hillsdale, NJ.

Shepherd, W.T., Johnson, W.B., 1995. Human factors in airline maintenance and inspection. In: Proceedings of the International Air Transport Association Maintenance Meeting. International Air Transport Association, Montreal, Canada.


Shepherd, W.T., Johnson, W.B., Drury, C.G., Taylor, J.C., Berninger, D., 1991. Human Factors in Aviation Maintenance, Phase 1: Progress Report (DOT/FAA/AM-91/16). National Technical Information Service, Springfield, VA.

Strauch, B., Sandler, C.E., 1984. Human factors considerations in aviation maintenance. In: Proceedings of the Human Factors Society 28th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 913–915.

Swain, A., Guttman, H., 1983. Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications: Final Report (NUREG/CR-1278). United States Nuclear Regulatory Commission, Washington, DC.

Taggert, W., 1990. Introducing CRM into maintenance training. In: Proceedings of the Third International Symposium on Human Factors in Aircraft Maintenance and Inspection. Federal Aviation Administration, Washington, DC, pp. 93–110.

Taylor, J.C., 1990. Organizational context for aircraft maintenance and inspection. In: Proceedings of the Human Factors Society 34th Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 1176–1180.

Taylor, D.H., 1987. The hermeneutics of accidents and safety. In: Rasmussen, J., Duncan, K., LePlat, J. (Eds.), New Technology and Human Error. Wiley, New York, NY.

United Kingdom, Civil Aviation Authority (UK CAA), 1992. Flight Safety Occurrence Document, 92/D/12, June 9, 1992. Cited in Allen, J.P., Rankin, W.L., 1995. A summary of the use and impact of the maintenance error decision aid (MEDA) on the commercial aviation industry. In: Proceedings of the Flight Safety Foundation International Federation of Airworthiness 48th Annual International Air Safety Seminar. Flight Safety Foundation, Arlington, VA, pp. 359–369.

Wenner, C.L., Drury, C.G., 1996. A Unified Incident Reporting System for Maintenance Facilities. In: Human Factors in Aviation Maintenance Phase VI Progress Report. Federal Aviation Administration, Washington, DC.

Woods, D., Johannesen, L., Cook, R., Sarter, N., 1995. Behind Human Error: Cognitive Systems, Computers, and Hindsight. Crew Systems Ergonomics Information and Analysis Center, Wright-Patterson Air Force Base, OH.

Woods, D.D., Roth, E.M., Pople Jr., H.E., 1987. Cognitive Systems Engineering: An Artificial Intelligence System for Human Performance Assessment (NUREG-CR-4862). US Nuclear Regulatory Commission, Washington, DC.
