1
HFACS in Navy & Marine Corps Safety
Naval Safety Center
2
Overview: Human Factors
The study of how people perform in complex environments:
- Human capabilities and limitations
- Equipment and system design
- Team and organizational influences

Human error is a failure to achieve the intended outcome: either actions go as planned but the plan is inadequate, or the plan is satisfactory but performance is deficient. Roughly 80% of mishaps are caused by human error.

Experience has shown that human factors play a role in approximately 80% of mishaps; the remainder involve causes such as inadequate procedural documentation and material failures. Human error has been implicated in 60-80% of accidents in complex, high-technology systems, including aviation, nuclear power, oil, medical, rail, and marine transport. Although the overall rate of many industrial and transportation accidents has declined steadily during the past 20 years, reductions in human error-related accidents have not kept pace with those related to mechanical and environmental factors. Indeed, humans have played a progressively more important causal role in both civil and military aviation accidents as aircraft equipment has become more reliable (Nagel, 1988). It is not surprising, then, that human error has been implicated in 60-80% of accidents in aviation and other complex systems. In fact, while accidents solely attributable to environmental and mechanical factors have been greatly reduced over the last several years, those attributable to human error continue to plague organizations. FOUO
3
Aviation Safety Historical Perspective
Risk-mitigation strategies introduced over the years (plotted as mishaps per 100,000 flight hours by fiscal year):
- Acquisition system changes; organizational structure; policy and programs; maintenance and standardization; technical solutions; training; risk management for operations and maintenance
- Angled decks; Aviation Safety Center; squadron safety programs
- Naval Aviation Maintenance Program, 1959; FRS, 1961; NATOPS, 1961
- Ground Proximity Warning System (GPWS), 1978
- ACT, CRM, ORM, HFACS, MFOQA, ASAP, HF FOCUS

This chart offers a look at naval aviation risk-mitigation strategies over the years. On the vertical axis is the number of mishaps per 100,000 flight hours; on the horizontal axis is the fiscal year in which each strategy was introduced. Initial strategies focused primarily on engineering and technology. While the human in the loop was a consideration, it was primarily in the context of man-machine interaction. In the early 90s, a Navy research and development effort was undertaken to identify the most common behavioral factors related to mishaps. Aircrew Coordination Training (ACT) and Crew Resource Management (CRM) were subsequently developed to improve mission effectiveness by minimizing preventable errors among the crew. Not long after, Operational Risk Management (ORM) was developed to address the inherent risk encountered in training, missions, operations, and personal activities. ORM reduces or offsets risk by systematically identifying hazards, even within an established routine. It is essentially risk mitigation across the board, whether the task is the mission or the mundane. HFACS incorporates the entire system and is the best means we currently have to get to the "why" of a mishap. One thing you will notice on this chart is that the mishap rate appears to have leveled off, holding at roughly the same rate per 100,000 flight hours for about the last ten years. The question we continually ask is whether ALL mishaps are preventable. Application of HFACS suggests that, with very few exceptions, they are. However, the likelihood of getting to zero is extremely low as long as humans are in the mix. FOUO
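The mishap rate on the vertical axis is simply mishap counts normalized by flight hours. A minimal sketch of that arithmetic follows; the counts and hours in the example are invented for illustration, not actual Navy figures.

```python
# Hedged illustration: normalizing mishap counts per 100,000 flight hours.
# The counts and hours below are invented for the example, not actual data.
def mishap_rate_per_100k(mishaps: int, flight_hours: float) -> float:
    """Return mishaps per 100,000 flight hours."""
    return mishaps / flight_hours * 100_000

# Example: 12 mishaps against 1.2 million flight hours -> 1.0 per 100K hours
print(mishap_rate_per_100k(12, 1_200_000))
```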
4
Origins of DoD HFACS Analysis Tool
The goal has been to proactively reduce mishaps by addressing human factors and human error.
- Drs. Wiegmann and Shappell's HFACS system brings together human factors, operations, human systems, system safety, and engineering issues (Man, Machine, Medium, Mission, Management)
- Focus on the system instead of the individual
- The DoD-HFACS taxonomy and nano-codes are the follow-on product
- Secretary of Defense memo of 19 May 2003 called for a 50% reduction in mishaps, later changed to 75%
- Established the Defense Safety Oversight Council (DSOC)
- DSOC established the Aviation Safety Improvement Task Force (ASI-TF)
- Establishment of the Human Factors Working Group (all branches of the DoD plus the Coast Guard)

Drs. Wiegmann and Shappell's HFACS system brings together human factors, operations, human systems, system safety, and engineering issues. In other words, they developed a system to help understand how humans interact with machines, in various mediums, missions, and types and levels of management. The purpose is to focus on the system rather than on the individual who committed the unsafe acts. Their work produced the HFACS model adopted by the DoD safety community in a memorandum of agreement signed in 2005 by the Joint Services Safety Chiefs. FOUO
5
Reason’s “Swiss-cheese” Model
Inputs flow through each level of the system:
- Organizational Factors (latent failures/conditions)
- Unsafe Supervision (latent failures/conditions)
- Preconditions for Unsafe Acts (active and latent failures/conditions)
- Unsafe Acts (active failures/conditions)
- Failed or absent defenses lead to the accident and injury: the MISHAP

This figure illustrates James Reason's (1990) model of how humans contribute to the breakdown of safe operations. In this model, system failures can be of two types: active or latent. An active failure (or active condition) is an unsafe act that has an immediate adverse effect on the system. Such unsafe acts are usually made by the front-line operator. A pilot raising the landing-gear lever instead of the flap lever exemplifies this failure type (ICAO, 1993). Other active failures may include misreading the FMS display or not consulting an approach plate. These active failures represent deviations from safe operations and can therefore be conceptualized as "holes" in the system; they are failures in the system's defenses. Like safe and productive behavior, however, unsafe acts are also set up by preconditions within the system. Preconditions for unsafe acts may be such things as a loss of situational awareness by the pilot or poor CRM practices by the crew. These preconditions are themselves set up by poor supervisory practices, for example inadequate training or improper crew pairing. Decisions by management on how pilots are selected or how resources are allocated within the organization also contribute to these system failures. For example, management may cut back on money spent on training because of a reduction in passengers or costs escalating with inflation. Finally, the regulatory body may fail to inspect or monitor airport operations adequately, which further degrades the integrity of the system. All of these factors are referred to as latent conditions and can lie dormant for a long time. They may not even be harmful if they occur in isolation. However, when all of these latent and active conditions interact, they create a "window of opportunity" for a mishap to occur.

Layers of Swiss cheese are the conditions at each level; some layers have more holes than others. The goal is more "solid cheese": countermeasures to avoid the holes. The Human Factors Analysis and Classification System (HFACS) defines the holes in the cheese! Adapted from Reason (1990). FOUO
6
Where do Commanders and others usually look to prevent mishaps?
Operators. System failures are like dominos: the failure of one "domino" topples the next, and the end result is the mishap or injury. When this happens, however, we often forget that the mishap itself is the last "domino" in the sequence, and that many dominos fell well before the mishap occurred. As a result, we tend to focus almost exclusively on the people responsible for front-line operations (i.e., the crew or individual). Unfortunately, this has led mishap operators to feel severely scrutinized, as if they are being placed under a microscope or interrogated for a crime. Rather than scrutinizing the failure of a single system component, we must take a step back and look at the entire sequence of events that led to the mishap. A systems perspective requires that we examine blemishes or faults throughout the entire system. After all, it is often the failure of multiple components, in combination, that produces a mishap. Some people may ask, "Why stop at the organizational or even industry level? Does the system's boundary really end there? Presumably everything has a prior cause, so we could trace the cause of a mishap all the way back to the Big Bang. Stopping at the organizational level is just arbitrary." Theoretically this may be true, but we need to be practical. In seeking the reasons for a mishap, we should search far enough back to identify factors that, if corrected, would render the system more tolerant to, or even prevent, subsequent encounters with the conditions that produced the original mishap. The people most concerned, and best equipped to do this, are those within the organization (Reason, 1990). Unsafe Acts → Mishap. FOUO
7
DoD Human Factors Analysis and Classification System (DoD HFACS)
The Utilization of DoD-HFACS. So how is it used? The answer may seem simple, but without a structured and scientifically validated framework it is often difficult to compare one accident to another. For example, how do you compare cannon crew members entering the wrong information into the fire control system with utilities engineers loading the wrong chemicals into a field water supply treatment unit? By using the HFACS framework, these two events can be compared not only by the psychological origins of the unsafe acts, but also by the latent conditions within the organization that allowed those acts to happen. When events are broken down into their underlying causal factors, common trends within an organization can be identified, and the organization can start to identify where interventions are needed. While traditional approaches to intervention development are generally successful at preventing the same or similar accidents from happening again, how do we use our historical data to protect against a variety of accidents? After all, isn't the goal of accident investigation to reduce the frequency and severity of accidents? By using HFACS, an organization can identify where hazards have arisen historically and begin to guard against them, leading to improved human performance and decreased accident and injury rates.

HFACS is used in a variety of industries. While the HFACS framework has seen remarkable success in industries such as mining, construction, rail, and healthcare, the first successful use of the framework occurred where it originated: the Navy. The Navy was experiencing a high percentage of aviation accidents associated with human performance issues. Using the HFACS framework, the Navy was able to identify that nearly one-third of all accidents were associated with routine violations. Once this trend was identified, the Navy was able to implement interventions that not only reduced the percentage of accidents associated with violations, but sustained that reduction over time. By using the HFACS taxonomy in conjunction with accident investigation, organizations are able to identify the breakdowns within the entire system that allowed an accident to occur. HFACS can also be used proactively, by analyzing historical events to identify recurring trends in human performance and system deficiencies. Both methods allow organizations to identify weak areas and implement targeted, data-driven interventions that ultimately reduce accident and injury rates. The key to explaining the use, and more importantly the application, of HFACS is the fact that it is data driven. Data is the one thing all DoD services have in ample supply.

Human factors are the end-user cognitive, physical, sensory, and team-dynamic abilities required to perform system operational, maintenance, and support job tasks. Human factors engineers contribute to the defense acquisition process by ensuring that the program manager provides for the effective utilization of personnel by designing systems that capitalize on, and do not exceed, the abilities (cognitive, physical, sensory, and team dynamic) of the user population. The human factors engineering community integrates the human characteristics of the user population into the system definition, design, development, and evaluation processes to optimize human-machine performance for both operation and maintenance of the system.
Human factors engineering is primarily concerned with designing human-machine interfaces consistent with the physical, cognitive, and sensory abilities of the user population. Human-machine interfaces include:
- Functional interfaces (functions and tasks, and the allocation of functions to human performance or automation)
- Informational interfaces (information, and characteristics of information, that provide the human with the knowledge, understanding, and awareness of what is happening in the tactical environment and in the system)
- Environmental interfaces (the natural and artificial environments, environmental controls, and facility design)
- Cooperational interfaces (provisions for team performance, cooperation, collaboration, and communication among team members and with other personnel)
- Organizational interfaces (job design, management structure, command authority, and policies and regulations that impact behavior)
- Operational interfaces (aspects of a system that support successful operation, such as procedures, documentation, workloads, and job aids)
- Cognitive interfaces (decision rules, decision support systems, provisions for maintaining situation awareness, mental models of the tactical environment, provisions for knowledge generation, cognitive skills and attitudes, and memory aids)
- Physical interfaces (hardware and software elements designed to enable and facilitate effective and safe human performance, such as controls, displays, workstations, worksites, accesses, labels and markings, structures, steps and ladders, handholds, and maintenance provisions)

DoD Human Factors Analysis and Classification System (DoD HFACS): an invaluable investigation and hazard identification tool. FOUO
8
DoD HFACS Guiding Principles
Principle 1: The DoD is similar in nature to other complex productive systems.
Principle 2: Human errors are inevitable within such a system.
Principle 3: Blaming an error on the service member is like blaming a mechanical failure on the equipment.
Principle 4: A mishap, no matter how minor, is a failure of the system.
Principle 5: Mishap investigation and error prevention (i.e., risk management) go hand in hand.

HFACS is based on five fundamental "principles." These principles reflect our underlying philosophy and assumptions about aviation operations.

Principle 1: Aviation is similar in nature to other complex productive systems. (IS THE MARINE CORPS A SYSTEM? – YES) As such, the framework commonly used to describe productive systems can also be used to understand flight operations. Using a systems approach also helps identify the underlying causes of mishaps and provides a better understanding of how system components may interact to affect safety.

Principle 2: Human errors are inevitable within productive systems. (NOTE THE PERCENTAGE OF HUMAN ERROR IN ALL MISHAPS) To err is human, and therefore we should strive to reduce the consequences of human errors rather than trying to prevent them entirely.

Principle 3: Blaming an error on the pilot is like blaming a mechanical failure on the aircraft. Aircrew often serve as the last barrier that stops a sequence of events from causing a mishap. When errors do occur, they are often only a symptom of the system's underlying problems.

Principle 4: A mishap, no matter how minor, is a failure of the system. Systemic problems are often the cause of aircrew error, and we must search the system to determine why the errors occurred. We need to look at the entire sequence of events and the multiple factors that contributed to the mishap. (INCLUDES ALL NEAR MISSES, FIRST AID, AND CLASS D EVENTS)

Principle 5: Mishap investigation and error prevention go hand in hand. Searching for why an error occurred is not done to reassign blame or liability, nor to excuse the error, but to identify the underlying system deficiencies that might cause a mishap to occur again. Prevention, not punishment, should be our goal. FOUO
9
Why use DoD HFACS? Structured Analysis of Human Error
Sophisticated and complete; detects error patterns. Aids in the development of interview questions. Gets to the "why," not just the "what." A framework for more insightful root-cause determination: the big picture. Targets the need for specific interventions, leading to better command decisions and more effective ORM.

Benefits of DoD HFACS: Structured – easy to follow. Why – it gets down to why something happened, and it is an effective tool for enhancing ORM strategies. Standardized – it allows all the services and the DoD to identify trends common across similar or dissimilar processes, rather than within a single service. Valuable – again, it supports ORM strategies.

The cause of most mishaps is systemic in nature. Therefore, if improvements in safety are to be realized, the occurrence and/or consequences of both latent system failures and active aircrew errors need to be addressed. However, current mishap investigation and prevention procedures focus almost exclusively on active failures. As we have just discussed, HFACS provides a framework for understanding the big picture. It forces us to examine failures throughout the system and prevents us from focusing solely on the pilot. HFACS gives us an effective tool for understanding how mishaps originate and, therefore, how they might be prevented. However, there are other questions that still need to be addressed. For example, what exactly are the important human factors safety issues and their interrelationships? In other words, what are the holes in the cheese? Aren't they too numerous to define? In the next section of this handout, we present a framework that has been developed to answer these questions. We then illustrate its utility using an actual aviation mishap scenario. Finally, there remains the question of how HFACS can be used as a proactive tool for analyzing data and pinpointing safety problems before a mishap occurs. We briefly touch upon these issues in the intervention section of this workshop. FOUO
10
Why use DoD HFACS? (cont.)
A standard, data-driven approach across the DoD:
- Supports research across the forces
- Applicable to existing data
- Easily applied to new mishaps and near misses
- Valuable beyond the operational setting
- Applies to both on- and off-duty events
FOUO
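Because the taxonomy is standardized, the same analysis can run across existing mishap data from any service. Below is a minimal sketch of that kind of data-driven trend detection; the records and nano-code assignments are hypothetical, invented only to show the mechanics.

```python
from collections import Counter

# Hypothetical historical mishap records, each tagged with DoD HFACS nano-codes.
# The events and code assignments here are illustrative only.
mishaps = [
    {"event": "Vehicle rollover",   "codes": ["AE103", "PC102", "SI001"]},
    {"event": "Maintenance injury", "codes": ["AE103", "OR008"]},
    {"event": "Ground mishap",      "codes": ["PC102", "SI001", "AE103"]},
]

# Count how often each nano-code appears across the data set.
code_counts = Counter(code for m in mishaps for code in m["codes"])

# The most frequent codes point analysts toward where targeted,
# data-driven interventions are likely to pay off.
for code, count in code_counts.most_common(3):
    print(f"{code}: {count} occurrences")
```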
11
The DoD HFACS model defines four tiers and their categories:
- ACTS: Performance-Based Errors; Judgment & Decision-Making Errors
- PRECONDITIONS: Environmental Factors (Physical Environment, Technological Environment); Condition of Individuals (Physical Problem, State of Mind, Sensory Misperception, Mental Awareness); Personnel Factors (Teamwork)
- UNSAFE SUPERVISION: Inadequate Supervision; Planned Inappropriate Operations; Supervisory Violations
- ORGANIZATIONAL INFLUENCES: Resource Problems; Personnel Selection & Staffing; Policy & Process Issues; Climate / Culture Influences

Unfortunately, what was missing from James Reason's "Swiss cheese" model was a definition of the holes. In effect, we were left with a better understanding of how human error occurs, but no specifics about the failures. What we did was define the holes in the cheese using a classification system called the Human Factors Analysis and Classification System (HFACS). As decisions are made, taskers "roll downhill," and there is often a snowball effect of residual risk that piles up on the individual. The desired activity is to have these risks identified and assessed, with this additional information communicated back up the chain of command. Hazards identified need to be brought to the attention of the appropriate level of command for guidance and informed decision making. It is important that the correct information detailing operational risk gets to the appropriate level of command for an informed decision.

TO INSTRUCTOR: Have students turn to Chapter 5 and Appendix A in MCO P5102.1B and relate this slide to the SIREP Part B and Appendix A.

Drawing upon Reason's (1990) and Wiegmann and Shappell's (2003) concepts of active failures and latent failures/conditions, a new DoD taxonomy was developed to identify hazards and risks: the DoD Human Factors Analysis and Classification System. DoD-HFACS describes four main tiers of failures/conditions: 1) Acts, 2) Preconditions, 3) Supervision, and 4) Organizational Influences. A brief description of the major tiers with associated categories and sub-categories follows, beginning with the tier most closely tied to the mishap. Attachment 1 is the in-depth reference document and contains all the currently accepted definitions for the sub-codes that fall within the four major tiers of human error. This document is subject to review and update every six months by the Human Factors Working Group of the Joint Services Safety Chiefs.
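One way to picture the four tiers and their categories is as a simple nested mapping, sketched below from the labels on this slide; the authoritative definitions and nano-codes remain in Attachment 1.

```python
# Sketch of the DoD HFACS tier/category structure as listed on this slide.
# Definitions and nano-codes live in the official reference (Attachment 1).
DOD_HFACS = {
    "Acts": [
        "Performance-Based Errors",
        "Judgment & Decision-Making Errors",
    ],
    "Preconditions": [
        "Environmental Factors",     # physical and technological environment
        "Condition of Individuals",  # physical problem, state of mind, sensory misperception, mental awareness
        "Personnel Factors",         # teamwork
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision",
        "Planned Inappropriate Operations",
        "Supervisory Violations",
    ],
    "Organizational Influences": [
        "Resource Problems",
        "Personnel Selection & Staffing",
        "Policy & Process Issues",
        "Climate / Culture Influences",
    ],
}

# Example: list the categories an investigator works through at the Acts tier.
print(DOD_HFACS["Acts"])
```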
12
Questions ATAQ
13
How DoD HFACS is applied STEP 1: What is / are the unsafe act(s)?
Unsafe acts can be divided into two main categories: errors and violations. Errors are generally defined as mental or physical activities that fail to achieve their intended outcome. There are three basic error types: skill-based, perceptual, and decision. Violations are actions that represent a willful disregard for rules and regulations. There are two basic types of violations: routine and exceptional.

Errors. Violations – willful disregard for rules and regulations. FOUO
14
STEP 1 (second question):
UNSAFE ACTS
- Errors: mental or physical activities that fail to achieve their intended outcome (Performance-Based Errors; Judgment & Decision Errors)
- Violations: willful disregard for rules and regulations

STEP 1 (second question): Is the act an ERROR or a VIOLATION? FOUO
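As a hedged illustration of the Step 1 decision, the sketch below records an unsafe act and tags it as an error or a violation; the class and field names are hypothetical and not part of any official DoD HFACS form.

```python
from dataclasses import dataclass

# Hypothetical record of a Step 1 determination; illustrative only.
@dataclass
class UnsafeAct:
    description: str
    willful_disregard: bool  # did the act involve willful disregard for rules?

    @property
    def act_type(self) -> str:
        # Violations involve willful disregard for rules and regulations;
        # otherwise the act is an error (it simply failed to achieve its intent).
        return "Violation" if self.willful_disregard else "Error"

act = UnsafeAct("Descended below briefed minimum altitude despite the restriction", True)
print(act.act_type)  # -> Violation
```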
15
FOUO
16
How DoD HFACS is applied
PRECONDITIONS FOR UNSAFE ACTS
STEP 2: Why did the person commit the unsafe act(s)?

Step 2 involves finding out why an unsafe act occurred. At this level especially, there may be numerous factors at play. There usually isn't a "smoking gun" when it comes to risks to safety. Instead, there may be dozens of factors identified that contributed to the unsafe act(s). FOUO
17
Preconditions for Unsafe Acts

PRECONDITIONS FOR UNSAFE ACTS: Environmental Factors (Physical Environment, Technological Environment); Conditions of Individuals (Physical Problem, State of Mind, Sensory Misperception, Mental Awareness); Teamwork

Latent preconditions of operators directly influence their performance. In this framework, there are two main categories of preconditions for unsafe acts: substandard conditions and substandard practices. Some are controllable by the operator; some are not. A design decision may interfere with safe operation, but there is not a whole lot the end user can do about that, and it often leads to workarounds that can compromise safety. Other factors should be easily controlled by the operator to reduce risks. FOUO
18
Personnel Factors/Communication/Environmental (Fog)
The deadliest crash in aviation history occurred in 1977, killing 583 people. It happened on the island of Tenerife, between two 747s (Dutch airline KLM and Pan Am). A bomb explosion at their intended landing airport, and the threat of a second bomb, caused many aircraft to be diverted to Los Rodeos Airport. Among them were KLM Flight 4805 and Pan Am Flight 1736, the two aircraft involved in the accident. At Los Rodeos, air traffic controllers were forced to park many of the airplanes on the taxiway, thereby blocking it. Further complicating the situation, while authorities waited to reopen Gran Canaria, a dense fog developed at Tenerife, greatly reducing visibility. The planes were invisible to each other as well as to the control tower. Pan Am was actually on the incorrect taxiway and planning to fix this fog-induced mistake when the mishap occurred. In short, after non-standard terminology was used by KLM, an ATC route clearance was mistaken for a takeoff clearance. The captain, Jacob Van Zanten, was eager to take off and increasingly annoyed at having to wait. After mistaking the clearance, Van Zanten said, "We're going," and began accelerating down the foggy runway. "And we're still taxiing down the runway," relayed the Pan Am first officer. At the same instant, the tower radioed a message to KLM: "Okay," said the controller. "Stand by for takeoff. I will call you." Either of these transmissions would have been enough to stop Van Zanten in his tracks; he would still have had time to discontinue the roll. The problem is that because the transmissions occurred simultaneously, they overlapped and were thereby canceled out. The next communication was from the tower to Pan Am: "Report when runway clear." "We'll report when we're clear," said the Pan Am captain. Van Zanten and his first officer missed this communication, but the second officer, sitting behind them, did not. Alarmed, with their plane now racing forward at a hundred knots, he leaned forward. "Is he not clear?" he asked. "That Pan American?" "Oh, yes," Van Zanten answered emphatically. Both the KLM first and second officers had questioned Van Zanten's actions at some point in the moments prior to the collision. But, while we know there should be no rank in the cockpit, Jacob Van Zanten was the airline's top 747 instructor pilot and a KLM celebrity; he was literally the poster boy for their advertising. This may have contributed to a lack of assertiveness on the part of the other two crewmembers. This is a far-simplified analysis of the mishap, but it highlights the fact that any one of these small deviations from normal operations would likely not have resulted in the mishap on its own. Together, however, they created a perfect storm for disaster.

Personnel Factors / Communication / Environmental (Fog):
- PE: Environmental Conditions Affecting Vision
- PC: Personality Style
- PP: Nonstandard Terminology
- PE: Communication Equipment Inadequate
- PP: Rank/Position Intimidation ("Captain is God" mentality)
- PP: Lack of Assertiveness
FOUO
19
How DoD HFACS is applied
STEP 3: Who knew about the person's preconditions but did not take the proper steps to avoid the unsafe act(s)/condition(s)? In other words, who failed to apply the principles of risk management?

RM Principles:
- Accept risk when benefits outweigh the costs
- Accept no unnecessary risks
- Anticipate and manage risk by planning
- Make risk decisions at the right level
FOUO
20
Unsafe Supervision: First-Line Supervisors

UNSAFE SUPERVISION: Inadequate Supervision; Planned Inappropriate Operations; Supervisory Violations

Unsafe supervision is a factor in a mishap if the methods, decisions, or policies of the supervisory chain of command directly affect the practices, conditions, or actions of individuals and result in human error or an unsafe situation. Discuss that "supervisory" is focused on first-line supervisors, who set the example (positive or negative). It is important to point out that, except in the case of violations, leaders may not know they are making an error.

The Human Factors Working Group has determined that a mishap event can often be traced back to the supervisory chain of command. As such, there are three major categories of Unsafe Supervision. Deficiencies in supervision and line management directly influence the unsafe conditions and actions of operators. Unforeseen: unsafe management and/or supervisory practices that go unnoticed, yet are not the result of negligence or adverse behavior; this pertains to middle management (not the CEO or higher organizational level). Known: unsafe management of operations that was a direct result of supervisory action or inaction. (Note: "known" in this instance does not imply that the supervisor intentionally did something wrong; rather, it refers to those instances in which the supervisor erred in managing a known aspect of the operation.)

Unsafe Supervision (first-line supervisors) → Preconditions to Unsafe Acts → Unsafe Acts. FOUO
21
Inadequate Supervision
Example of Inadequate Supervision: SI007, Failed to Correct Risky Practices. LtCol Holland was a "hot stick" (i.e., hot dog) pilot known for taking the B-52 to its extreme limits. On one previous mission he put so much stress on the airframe that it popped 500 rivets; on another he pulled up so steeply (going straight up) that fuel ran out of the vent hose. He was also known to put the B-52 into death spirals (straight down), which was against strict Air Force policy. FOUO
22
TO INSTRUCTOR: Have students discuss the possibilities as to why this (leaving the gas pump with the hose attached) could happen. Questions. FOUO
23
How DoD HFACS is applied
ORGANIZATIONAL INFLUENCES
STEP 4: Are there organizational vulnerabilities that affected supervisory practices and/or set the stage for unsafe conditions and acts? In other words, did policies, training, communications, or funding have an impact? FOUO
24
Organizational Influences
Resource Problems; Personnel Selection & Staffing; Policy & Process Issues; Climate / Culture Influences

Organizational influences are factors in a mishap if the communications, actions, omissions, or policies of upper-level management directly or indirectly affect supervisory practices, conditions, or actions of the operator(s) and result in system failure, human error, or an unsafe situation. Fallible decisions of upper-level management directly affect supervisory practices, as well as the conditions and actions of operators. These latent conditions generally involve issues related to Resource/Acquisition Management, Organizational Climate, and Organizational Processes. Organizational Influences (latent conditions) are the fallible decisions of upper-level management.

Organizational Influences → Unsafe Supervision → Preconditions to Unsafe Acts → Unsafe Acts. FOUO
25
HFACS Example: An H-60 enters out-of-control flight after loss of tail rotor authority. The aircraft becomes uncontrollable, resulting in a hard landing and rollover. The engineering investigation reveals an improperly installed tail-rotor mechanism pin. FOUO
26
DoD HFACS Coding
- Act: Procedure performed incorrectly (AE103). Tail rotor cotter pin not correctly installed.
- Preconditions: Warning systems (PE202). Lack of a written warning for an improperly installed pin.
- Supervisory: Inadequate oversight (SI001). QA missed the improper pin placement.
- Organizational: Informational resources (OR008). Lack of an O-level diagram showing pin placement in relation to all other holes. Flawed doctrine (OP005). The Army had solved the problem several years before, but the information was not incorporated into Navy maintenance.
FOUO
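Captured as data, the coding above yields one finding per tier, plus a second organizational factor. The structure below is an illustrative sketch, not an official reporting format; the codes and descriptions are those given on the slide.

```python
# The H-60 example above, captured as one record per finding.
# The data structure is illustrative; the nano-codes come from the slide.
h60_coding = [
    {"tier": "Act",            "code": "AE103", "finding": "Procedure performed incorrectly: tail-rotor cotter pin not correctly installed"},
    {"tier": "Preconditions",  "code": "PE202", "finding": "Warning systems: no written warning for an improperly installed pin"},
    {"tier": "Supervisory",    "code": "SI001", "finding": "Inadequate oversight: QA missed the improper pin placement"},
    {"tier": "Organizational", "code": "OR008", "finding": "Informational resources: no O-level diagram showing correct pin placement"},
    {"tier": "Organizational", "code": "OP005", "finding": "Flawed doctrine: Army fix never incorporated into Navy maintenance"},
]

# Print a simple tier-by-tier summary of the coded mishap.
for entry in h60_coding:
    print(f"{entry['tier']:>14} | {entry['code']} | {entry['finding']}")
```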
27
Review: How to apply DoD HFACS
STEP 1: What did the operator do or not do to cause the mishap?
STEP 2: Why did the person commit the unsafe act?
STEP 3: Who knew about the person's preconditions but did not take steps to avoid the act?
STEP 4: Are there organizational vulnerabilities that affected supervisory practices and/or set the stage for unsafe conditions or acts?
FOUO
28
Summary: DoD HFACS provides procedures for investigating and reporting all DoD mishaps.
- Supports DODI
- Directs components to establish procedures to cross-feed human error data with a standard taxonomy
- Intended for use by all involved with mishap investigation
- Intended for use on all classes of mishaps

TO INSTRUCTOR: Have students read along with the executive summary in their workbook. Discuss how non-standard and simple communications can lead to misunderstandings and errors.

NOTE: DoD HFACS is an excellent hazard identification tool during planning and operations. FOUO
29
Questions? FOUO