Naval Safety Center Surgeon: Presentation Transcript

1 Naval Safety Center Surgeon
CAPT Jim Fraser, Naval Safety Center Surgeon, Naval Safety Center. I'm CAPT Jim Fraser and I have the good fortune to serve as the Naval Safety Center Surgeon. Today, I would like to talk to you about how we investigate and analyze human error in Naval Aviation. I would like to show you the Human Factors Analysis and Classification System (HFACS) that we've developed at the Naval Safety Center and how we've used this tool to identify dangerous trends and hazards. I'll also show you some of the subsequent intervention strategies that we've developed and tracked. Today, I will be talking about data obtained by analyzing the aviation database. However, HFACS is a tool that can be, and has been, used to identify human factors trends in maritime mishaps, nuclear power plant accidents, train mishaps, and mishaps in the trucking industry. I believe that a modified version of HFACS could easily be adapted to analyze human error in the medical environment.

2 Naval Aviation Mishap Rate (FY 50-99)
Slide annotations: 776 aircraft destroyed in 1954; 22 aircraft destroyed in 1999. Milestones: angled decks; Aviation Safety Center; Naval Aviation Maintenance Program (NAMP) established in 1959; RAG concept initiated; NATOPS program initiated 1961; squadron safety programs; system safety; designated aircraft; ACT. This slide depicts where we've been over the last five decades. Numerous technical initiatives and standardization programs have significantly reduced our mishap rate. Our safety program chronicles these successful risk assessments and implementations of controls, which were reactive in nature. All of these initiatives have contributed greatly to our low mishap rates today. However, the Naval Aviation Class A mishap rate, measured per 100,000 flight hours, has held essentially level for the last several years. On the surface, the dramatic decrease in the mishap rate from the 50's to the present would send many to the showers, patting each other on the back for a job well done. In fact, some have argued that the present mishap rate is simply the "cost of doing business." Well, at the Naval Safety Center, we believe that view is simply not true! Let's examine why…

3 Class A, B, & C Mishaps/100,000 Flight Hours
Chart: all Navy/Marine Class A, B, and C mishaps per 100,000 flight hours, 1977-1991, with separate curves for human-error and mechanical-failure mishaps. When you look at all A, B, and C mishaps over time, you can see that the rate of human error mishaps was about the same as the mechanical or solely material failure rate as recently as 1977. As you can see, through advances in engineering we've been able to bring the mechanical rate down to next to nothing. However, you can also see the human error mishap rate has only been reduced by half. Today, human error is the common denominator in about 80% of our mishaps. What makes matters worse is that human error is by definition preventable. The only question remaining is how?
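To make the rate arithmetic behind a chart like this concrete, here is a minimal sketch of how a mishap rate per 100,000 flight hours is computed. The counts and hours below are made-up placeholders, not Safety Center data.

```python
def mishap_rate_per_100k(mishap_count, flight_hours):
    """Mishaps per 100,000 flight hours."""
    return mishap_count / flight_hours * 100_000

# Illustrative placeholder numbers only (not actual Navy/Marine Corps data):
# 30 Class A-C human-error mishaps against 1.2 million flight hours.
print(mishap_rate_per_100k(30, 1_200_000))   # 2.5 mishaps per 100,000 hours
```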

4 …but wisdom is what we need!
"Knowledge is Good…" (credo, Faber College). Data → Information → Knowledge → Wisdom. So we've got to become smarter about the weakest link in the chain: the human being. We are looking at data on human error in new ways. Data extracted for a specific purpose becomes information; information analyzed becomes knowledge; knowledge applied becomes wisdom. I would like to briefly talk about how we at the NSC use the data to look at human error in Naval Aviation. By the way, does anyone know where our "Knowledge is good" credo or Faber College is found? Answer: the classic American movie 'Animal House'.

5 The “Swiss Cheese” Model of Accident Causation (Reason, 1990)
Diagram labels: Organizational Factors (excessive cost cutting, reduction in flight hours); Unsafe Supervision (deficient training program, improper crew pairing); Preconditions for Unsafe Acts (loss of situational awareness, poor CRM); Unsafe Acts (failed to scan instruments, penetrated IMC when VMC only); failures in the system; accident and injury (crashed into side of mountain). One of the ways in which we've tried to get smarter at the Naval Safety Center is to adopt the model of human error developed by James Reason, which simply shows that any mishap is the end result of a chain of events, not solely the failure of the aircrew. Take for example controlled flight into terrain. It can happen for many reasons, but often it results from failing to scan your instruments properly when in IMC (Unsafe Acts in the figure). Well, we can certainly point the causal finger at the aircrew, since they are ultimately responsible for the safety of flight, but this model says that other factors played a role. For example, if the aircrew was fatigued or spatially disoriented (Preconditions for Unsafe Acts in the figure), it's not hard to imagine that they would be prone to a breakdown in instrument scan or susceptible to poor judgment (flying into IMC when cleared for VMC only). So do we fix the pilot, or do we focus higher up in the system? In fact, we can often trace the causal chain of events back up the supervisory chain, to crew manning for instance (Unsafe Supervision in the figure). Is it any surprise to anyone here that if we pair a couple of rookies together on a low-level hop, at night, in marginal weather, we have a mishap? Who do we hold responsible in that case? The crew? Well, they paid the ultimate price; but sometimes we the supervisors are equally responsible. But the model doesn't stop there. What about those individuals making decisions at the highest levels? After all, squadron COs have to play the hand they're dealt. Take for example the CFIT mishap we're talking about. When squadrons are faced with the loss of flight hours or manning shortages, they are often left with few choices, and you may find yourselves weighing risk against benefit with the scales tipped in the wrong direction. Clearly, decisions made at the highest level weigh heavily on decisions made in the trenches and must therefore be accounted for. Ultimately then, these fallible decisions and human errors make holes in the "cheese". It's when they line up, as they did here, that tragedy often results. So, you're probably wondering, just what are these holes in the cheese?

6 Diagram (HFACS): Organizational Influences (resource management, organizational climate, organizational process); Unsafe Supervision (inadequate supervision, planned inappropriate operations, failed to correct problem, supervisory violations); Preconditions for Unsafe Acts, split into substandard conditions of operators (adverse mental states, adverse physiological states, physical/mental limitations) and substandard practices of operators (crew resource mismanagement, personal readiness); Unsafe Acts, split into errors (skill-based, decision, perceptual) and violations (routine, exceptional). At the Naval Safety Center, we have described the holes in the cheese by developing a classification system that we call the Human Factors Analysis and Classification System (HFACS). Now, I'm not going to go box by box through the HFACS model today; we could do that another time if you were interested. For purposes of this lecture, it is only important to understand that HFACS builds on Jim Reason's model and focuses on the latent causes of a mishap in addition to the ultimate unsafe act. We have now completed a post-accident analysis of the 249 human error related U.S. Navy/Marine Corps Class A aviation mishaps occurring during the fiscal years covered by our database. Using HFACS, we classified mishap causal factors, as determined by the Aircraft Mishap Board, into one of 17 categories at four levels of human failure (Unsafe Aircrew Acts, Preconditions for Unsafe Acts, Unsafe Supervision, and Organizational Influences). It was no surprise to find that the last fatal flaw in the chain of events was in the errors or violations category. The larger question is why these active failures (errors and violations) occurred. Using HFACS, we established that virtually none of the unsafe acts committed by aircrew happened in isolation -- nearly all were preceded by either preconditions inherent among operators, unsafe supervisory practices, organizational influences, or all of the above. This is what HFACS brings to the table. HFACS enables us to define the holes, or human failures, at each level and peel the human factors onion back until we can get at the root of the problem.
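For readers who want the taxonomy laid out explicitly, the sketch below encodes the four HFACS tiers and the 17 categories named on the slide as a simple Python data structure. The dictionary layout is mine for illustration, not an official Safety Center schema.

```python
# The four HFACS tiers and their 17 categories, as named on the slide.
# (The data structure itself is illustrative, not an official schema.)
HFACS = {
    "Organizational Influences": [
        "Resource Management", "Organizational Climate", "Organizational Process",
    ],
    "Unsafe Supervision": [
        "Inadequate Supervision", "Planned Inappropriate Operations",
        "Failed to Correct Problem", "Supervisory Violations",
    ],
    "Preconditions for Unsafe Acts": [
        "Adverse Mental States", "Adverse Physiological States",
        "Physical/Mental Limitations", "Crew Resource Mismanagement",
        "Personal Readiness",
    ],
    "Unsafe Acts": [
        "Skill-Based Errors", "Decision Errors", "Perceptual Errors",
        "Routine Violations", "Exceptional Violations",
    ],
}

assert sum(len(v) for v in HFACS.values()) == 17  # 17 categories at four levels
```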

7 Violations
Diagram (Unsafe Acts): errors (skill-based, decision, perceptual); violations. Examples of violations of orders, regulations, or SOP: failed to inspect ACFT after in-flight caution light; violated squadron SOP restricting flight below 500'; failed to comply with NATOPS during streaming; conducted night training and ops mission with PAX; elected to file VFR in marginal weather conditions; failed to use radar advisories from ATC; inadequate brief and limits on mission; HAC knowingly accepted non-current crew; failed to adhere to brief; not current/qualified for mission; improper procedure. I would like to briefly mention a few of the areas where HFACS has helped us to identify hazards and develop intervention strategies. I will start with the work we've done looking at violations. Violations in general are the willful departure from authority. We typically break this category of violations down into two subcategories: routine and exceptional. The first, routine violations (infractions), tend to be habitual in nature and are typically condoned and perpetuated by the chain of command (like everyone going 64 when the speed limit is 55). Less frequently we see exceptional violations, which are isolated departures from authority that are not necessarily indicative of an individual's typical behavior nor condoned by the chain of command (these are the folks going 120 in a 55 zone). Some of the more frequent types of violations that we've seen are listed on the slide. We all know that mishaps commonly have one or more violations as a causal factor; however…

8 U.S. Navy/Marine Corps Class A Mishaps
Violations (FY90-96), percentage of mishaps by community. I don't believe anyone knew how frequently violations were a contributing factor in our Class A mishaps until we looked closely at this last year using HFACS. As you can see, when we first looked at this in 1997, we found that violations were determined by our Aviation Mishap Boards to be a causal factor in our human factor mishaps about 40% of the time. Furthermore, in some of our tactical communities, violations represented 70% of all human factor mishaps. Needless to say, the response from Naval leadership when they were shown these surprising statistics was fast and furious. (Chart categories: USN Helo, USMC Helo, USN TACAIR, USMC TACAIR.)

9 Percentage of Human Error Mishaps Associated with Violations (FY 91-99)
FY91-FY99 human error mishaps: 182. FY91-FY99 mishaps with violations: 56, or 31% of all mishaps in the FY91-FY99 timeframe. This is the good news slide. Subsequent to our study of violations, we now have evidence of significant progress in reducing and containing violations. This initial reduction of violations has been accomplished with strong leadership and personal discipline throughout the Fleet. However, it was after this study that we first realized the true significance of a good or bad command culture in terms of Naval Aviation safety. It was at this point that we developed what we felt would be a more long-term intervention strategy: we developed and began providing command culture workshops to the fleet. These workshops have become increasingly popular over the past couple of years and are in such demand that we are quickly outstripping our resources to provide them. I'll tell you more about these at the end of my brief. (Chart: USN/USMC TACAIR and helo Class A mishaps, percentage by fiscal year.)
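As a sketch of the underlying bookkeeping, the snippet below shows how a figure like "56 of 182 mishaps, or 31%" falls out once each mishap is tagged with its HFACS causal categories. The records shown are invented placeholders, not Safety Center data.

```python
# Hypothetical mishap records, each tagged with the HFACS categories that the
# Aircraft Mishap Board cited as causal factors (placeholder data only).
mishaps = [
    {"id": 1, "factors": {"Skill-Based Errors", "Adverse Mental States"}},
    {"id": 2, "factors": {"Routine Violations", "Crew Resource Mismanagement"}},
    {"id": 3, "factors": {"Decision Errors"}},
]

def pct_with(category_substring, records):
    """Percent of mishaps with at least one causal factor matching the category."""
    hits = sum(any(category_substring in f for f in r["factors"]) for r in records)
    return 100.0 * hits / len(records)

print(f"{pct_with('Violations', mishaps):.0f}%")  # 33% of the 3 placeholder records
```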

10 Skill-Based Errors
Diagram (Unsafe Acts): errors (skill-based, decision, perceptual); violations (routine, exceptional). SKILL-BASED ERRORS: breakdown in visual scan (53); failed to see and avoid (12); poor technique (12); omitted checklist item (10); inadvertent operation of control (10); improper use of flight controls (10). However, despite our progress in reducing and containing violations, we have seen other types of human error increase. In particular, I would like to briefly talk about what we are seeing in terms of skill-based errors. In essence, skill-based errors are errors in the execution of a response that should be highly automated. SBEs typically occur during missions that are highly practiced and require minimal conscious thought. As a result, SBEs are particularly vulnerable to failures of attention, memory, or stick-and-rudder skills. The slide above shows you the number of times these sub-categories of SBEs occurred in the 182 tactical fixed-wing and helicopter Class A human error mishaps that occurred between FY91 and FY99.

11 Percentage of Human Error Mishaps Associated with Skill-based Errors (FY 91-99)
FY91-FY99 human error mishaps: 182. FY91-FY99 mishaps with skill-based errors: 96, or 56% of all mishaps in the FY91-FY99 timeframe. At the Naval Safety Center we have identified an alarming trend. It is now apparent that SBEs have been growing unchecked throughout this decade, appearing in an ever-larger percentage of our mishaps. Furthermore, our statistician has confirmed that not only has the percentage of mishaps involving SBEs risen, but the rate of SBEs has actually risen as well. If the trend continues this year, we will see SBEs in more than 65% of our human error mishaps, contrasted with only 44% involvement in the early 90's. This increase in both the percentage of human error mishaps associated with SBEs and the rate of SBEs represents low-hanging fruit. If we want to decrease the mishap rate, we must address the human error that is consistently found in 80% of our mishaps. In particular, we need to look at the SBEs, which represent the largest single category of human error. (Chart: USN/USMC TACAIR and helo Class A mishaps.)
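Below is a minimal sketch of the kind of trend check described here: group tagged mishaps by fiscal year, compute the yearly SBE percentage, and fit a simple linear trend. The yearly figures are invented to roughly mimic a rise from about 44% toward 65%, and the actual Safety Center analysis may well have used a different statistical test.

```python
from scipy.stats import linregress

# Invented (fiscal_year, human_error_mishaps, mishaps_with_SBE) tuples;
# not actual Safety Center data.
yearly = [(1991, 25, 11), (1993, 22, 11), (1995, 20, 11), (1997, 19, 12), (1999, 18, 12)]

years = [y for y, n, k in yearly]
pct_sbe = [100.0 * k / n for y, n, k in yearly]

fit = linregress(years, pct_sbe)
print(f"SBE share rising about {fit.slope:.1f} percentage points per year "
      f"(p = {fit.pvalue:.3f})")
```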

12 Back to the Basics
Focus on: reemphasizing the need for an efficient visual scan; prioritizing attention; recognizing extremis situations; refining basic flight skills (stick-and-rudder); practicing procedures. As we began to look at possible intervention strategies to reverse the current trend of increasing skill-based errors, it seemed intuitive that we needed to go back to the basics. But the problem is really much bigger than that. We know that stick-and-rudder skills, as well as attention and memory, are tied to experience, currency, and most of all, proficiency. Like any human skill, flying skills deteriorate if not practiced. The simple intervention strategies mentioned above are certainly a good start; however, keep in mind that they are directed only at aircrew. If we have learned anything from HFACS, we know that unsafe acts do not occur in isolation. They are made possible by various preconditions in the system.

13 Diagram (HFACS, with association rates): Unsafe Acts: errors (skill-based, decision, perceptual) and violations (routine, exceptional); Unsafe Supervision (inadequate supervision, planned inappropriate operations, failed to correct problem, supervisory violations); Preconditions for Unsafe Acts: substandard conditions of operators (adverse mental states (83%), adverse physiological states, physical/mental limitations) and substandard practices of operators (crew resource mismanagement (60%), personal readiness); Organizational Influences (resource management, organizational climate, organizational process). Before we focused on intervention strategies aimed at the unsafe acts tier, we needed to look at the latent failures that set the aircrew up for failure. When we looked at the data, we found several significant associations between the 103 SBEs and the various preconditions that preceded them. For instance, the most significant preconditions associated with SBEs were adverse mental states (83%) and crew resource mismanagement/ACT failures (60%). Adverse mental states include such things as channelized attention, task saturation, complacency, mental fatigue, haste, life stressors, and loss of situational awareness. Crew resource mismanagement encompasses failure to communicate and coordinate, failure to provide back-up, and failure to conduct an adequate brief. Similarly, these preconditions were preceded or influenced by unsafe supervisory practices 42% of the time, and by higher-level organizational influences 51% of the time. Clearly, we must develop intervention strategies which address not just the pilot in the cockpit, but these earlier links in the causal chain as well.
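The co-occurrence percentages quoted here (for example, adverse mental states preceding 83% of SBE mishaps) amount to conditional frequencies over the tagged mishap set. A hedged sketch, with invented records:

```python
# Placeholder mishap records tagged with HFACS categories (not real data).
mishaps = [
    {"Skill-Based Errors", "Adverse Mental States", "Crew Resource Mismanagement"},
    {"Skill-Based Errors", "Adverse Mental States"},
    {"Decision Errors", "Crew Resource Mismanagement"},
]

def co_occurrence(condition, given, records):
    """Of mishaps tagged with `given`, the percent also tagged with `condition`."""
    subset = [r for r in records if given in r]
    return 100.0 * sum(condition in r for r in subset) / len(subset)

# e.g. how often an adverse mental state accompanied a skill-based error:
print(co_occurrence("Adverse Mental States", "Skill-Based Errors", mishaps))  # 100.0
```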

14 Adverse Mental State
Diagram (Preconditions for Unsafe Acts): substandard conditions of operators (adverse mental states, adverse physiological states, physical/mental limitations); substandard practices of operators (crew resource mismanagement, personal readiness). ADVERSE MENTAL STATE: channelized attention/task saturation (48); fatigue (11); distracted (10); complacency (9); loss of SA (7). This is the breakdown of the major sub-categories of adverse mental states (AMS) that were identified as causal factors by the AMBs and preceded SBEs an amazing 83% of the time. As you can see, channelized attention and task saturation comprised the great majority of the AMS that preceded the SBEs.

15 CREW RESOURCE MISMANAGEMENT
Diagram (Preconditions for Unsafe Acts): substandard conditions of operators (adverse mental states, adverse physiological states, physical/mental limitations); substandard practices of operators (crew resource mismanagement, personal readiness). CREW RESOURCE MISMANAGEMENT: failed to communicate/coordinate (26); failed to back up (17); failed to conduct adequate brief (11). Finally, this is the breakdown of the sub-categories of CRM failures that preceded SBEs 60% of the time. In this case, the kinds of CRM failures identified by the AMBs were more or less evenly distributed.

16 Why are We Seeing an Increase in Skill-based Errors?
Lack of flight time? Quality of flight time? Decreasing experience? OpsTempo? PersTempo? Shortcomings in our training program? Shortcomings in our operational practices? Would increased simulator flight time be an effective intervention? So, we wondered, why are we seeing an increase in SBEs? Is it lack of flight time? Is it the quality of the flight time? Is it decreasing experience? Is it the OpsTempo or the PersTempo? Is it some shortcoming in our training program? Is it a shortcoming in our operational practices? And finally, would increased simulator time be an effective intervention?

17 Focus for Intervention Strategies
Skill-based error distribution by ACFT model; skill-based error by mission profile (administrative phase of flight, mission-related phase of flight); skill-based error by mishap characteristics (out-of-control flight (OOCF), controlled flight into terrain (CFIT), midair); skill-based error by pilot experience; AV-8 snapshot. This is a list of some of the areas where we searched for evidence of potential for an intervention strategy that would reduce SBEs. I won't go through the long version of this brief and show you all the data. Instead, I'll go straight to the part of the analysis where we looked at pilot experience. This is where we found what we felt was the best potential for an effective intervention strategy.

18 Tacair In-Model Flight Hour Distribution vs. Tacair Skill-Based Errors
As you can see, we thought there was value in separating TACAIR and helo. Here you are looking at the TACAIR in-model flight hour experience compared to the distribution of skill-based errors. Pilot in-model flight hours provide a rough measure of exposure and associated risk for the various groups. Not surprisingly, the take-away is that increased in-model experience reduces the SBE mishap risk. Over 500 hours, the community experience percentage and the SBE mishap percentage merge, and thereafter SBE mishap risk appears to decrease. (Chart: pilot flight hour distribution vs. percent of skill-based error mishaps, FY90-FY98, 65 mishaps.)
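One way to read a chart like this is to compare, band by band, the share of SBE mishaps against the share of exposure (in-model flight hours or pilots); a ratio above 1 flags an over-represented experience band. A sketch under that assumption, with invented shares rather than the actual chart data:

```python
# Invented example shares by in-model flight-hour band (not the actual chart data):
# fraction of community exposure vs. fraction of SBE mishaps in each band.
exposure_share = {"<500 h": 0.30, "500-1000 h": 0.30, ">1000 h": 0.40}
sbe_share      = {"<500 h": 0.55, "500-1000 h": 0.25, ">1000 h": 0.20}

for band in exposure_share:
    ratio = sbe_share[band] / exposure_share[band]
    flag = "over-represented" if ratio > 1 else "under-represented"
    print(f"{band}: mishap share / exposure share = {ratio:.2f} ({flag})")
```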

19 Helo In-Model Flight Hour Distribution vs. Helo Skill-Based Errors
Similar to TACAIR, helo in-model hours appear to decrease mishap risk, particularly above 1000 hours of experience. In the case of helos, we thought we had a very interesting finding when we saw that the 500-1000 hour in-model aviators seemed to be at greater risk for SBE mishaps than the less-than-500-hour aviators. That had us confused until we talked to our rotary wing analysts. The reason you see this is that the less-than-500-hour time-in-model helo drivers spend most of that initial time with experienced senior helo drivers and log most of their hours as co-pilot time. However, the 500-1000 hour in-model aviators are no longer paired with an experienced pilot and are well on their way to their FCF and HAC qualifications. They are also at a point in their careers when they may be tasked with multiple collateral duties and are no longer focused almost exclusively on learning to fly. They may be the new senior pilot in the aircraft, but they don't really have the hours to be truly proficient with their flying skills and are therefore prone to SBEs. (Chart: pilot flight hour distribution vs. skill-based errors, FY90-FY98, 21 mishaps.)

20 AV-8 In-Model Flight Hour Distribution vs. AV-8 Skill-Based Errors
Because the AV-8 community stood out in so many of our analytical subsets, we wanted to take a closer look at AV-8 experience. 28% of AV-8 pilots appear to have less than 500 hours of in-model flight time. However, 80% of the AV-8 SBE mishaps occur to pilots in this group. (Chart: AV-8 pilot flight hour distribution vs. AV-8 skill-based error mishaps by number, FY90-FY98, 10 mishaps.)
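Using the two percentages on the slide, a rough relative-risk calculation looks like this; it assumes the pilot-distribution share is a fair proxy for exposure, which is a simplification.

```python
# Slide figures: 28% of AV-8 pilots have <500 in-model hours,
# yet that group accounts for 80% of AV-8 SBE mishaps.
low_hours_pilot_share, low_hours_mishap_share = 0.28, 0.80

risk_low  = low_hours_mishap_share / low_hours_pilot_share               # ~2.86
risk_high = (1 - low_hours_mishap_share) / (1 - low_hours_pilot_share)   # ~0.28

print(f"Relative risk (<500 h vs >=500 h): {risk_low / risk_high:.1f}x")  # ~10.3x
```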

21 Tacair Conclusions
Experience counts (500+ in-model hours). Most prevalent skill-based errors: breakdown in visual/instrument scan, cross-checking, see-and-avoid. Most prevalent SBE preconditions: adverse mental state (channelized attention/task saturation); CRM.

22 Helo Conclusions
Experience counts (500/1000+ in-model hours). Most prevalent skill-based errors: breakdown in visual/instrument scan, cross-checking, see-and-avoid. Most prevalent SBE preconditions: adverse mental state (channelized attention/task saturation); CRM.

23 Intervention Strategies
Increase the in-model experience pool toward 500/1000+ hours: use simulator time to augment flight time and achieve earlier proficiency. Emphasize development of psychomotor skills: use simulator time to augment flight time and the development of a proper scan and stick-and-rudder skills. Emphasize avoidance of preconditions: use simulator time to augment flight time and the development of automated basic flight skills that enable an aviator to avoid channelized attention/task saturation and improve CRM skills. Both military and civilian flight simulation research have clearly demonstrated the cost-effective transfer of certain simulator-learned skills to actual flying skills. In other words, the use of simulators has been shown to be an effective supplement to actual flight time in the development of proficient flying skills. These simulator-learned flying skills include both perceptual-motor and procedural proficiency; that is, simulators help develop the kind of stick-and-rudder skills whose breakdown comprises the great majority of the SBEs (deficient scans, improper use of flight controls, poor technique, etc.). Similarly, simulators have been shown to accelerate the automation of these tasks and improve the aviator's time-sharing capability between tasks. Aviators using simulators learn more quickly the importance of each task and how to allocate resources between tasks. Therefore, the use of simulators has been shown to be effective in reducing the adverse mental states, such as channelized attention and task saturation, that frequently precede SBEs. The use of simulators in combination with video feedback (e.g., CAPAS) shows great promise for maintaining or improving proficiency as well as for maintaining or improving CRM skills. I'll talk more about CAPAS in a minute. The bottom line: with the effective use of simulators, we expect to see an even greater improvement in our safety rates and fewer losses of irreplaceable lives and aircraft. We will be tracking this closely.

24 CREW RESOURCE MISMANAGEMENT
Diagram (Preconditions for Unsafe Acts): substandard conditions of operators (adverse mental states, adverse physiological states, physical/mental limitations); substandard practices of operators (crew resource mismanagement, personal readiness). CREW RESOURCE MISMANAGEMENT: not working as a team; poor aircrew coordination; improper briefing before a mission; inadequate coordination of flight. The final example I'll briefly discuss of where we have used our analysis of human error pertains to crew resource mismanagement. You'll remember that I just talked about the fact that crew resource mismanagement appeared in 60% of skill-based error mishaps. We wanted to take a closer look at these CRM failures. As I mentioned earlier, crew resource mismanagement encompasses failure to communicate and coordinate, failure to provide back-up, and failure to conduct an adequate brief.

25 Percentage of Human Error Mishaps Associated with Crew Resource Management Failures (FY 90-98)
When we examined the 182 TACAIR and helicopter Class A human factor mishaps between FY91 and FY98, we found that 56% had at least one CRM failure. As you can see from the graph, not only is this figure high, but much like skill-based errors, the trend appears to be on the rise. This percentage is very similar to that observed prior to the implementation of aircrew coordination training in the Fleet. It appears that the initial benefits of the ACT program developed in the late 80's and early 90's have not persisted. We feel that this increasing trend of CRM failures represents a significant hazard to Naval Aviation. (Chart: percentage by fiscal year.)

26 Percentage of CRM Failures by Flight Conditions and Aircraft Community (FY 90-98)
We wanted to examine these CRM-related mishaps a little closer and determine whether the type of flight operations (routine, non-routine, emergency) played a role in the cause of CRM failures. Here is what we found: overall, 28% of the CRM mishaps involved at least one CRM failure during preflight, including the preflight brief. Percentages were relatively equal for the TACAIR (32%) and rotary wing (24%) communities. Lessons learned: "Brief the flight, fly the brief." Do not hurry or cut corners during the brief. Be assertive; do not accept inadequate briefs or fly maneuvers that have not been briefed. (Chart: TACAIR vs. helo, percentage of CRM failures by flight condition: preflight, routine operations, emergency.)

27 Percentage of CRM Failures by Flight Conditions and Aircraft Community (FY 90-98)
Almost two-thirds (65%) of the CRM mishaps involved at least one CRM failure during routine flight operations (no emergency or system malfunction). However, percentages were larger for rotary wing (78%) than for TACAIR (41%). Rotary wing aircrew might perceive CRM as a strategy for managing workload during emergency situations rather than as a process of sharing information, thoughts, and intentions during routine flight. Lesson learned: rotary wing CRM training should emphasize the importance of aircrew coordination during seemingly benign, routine flight operations. (Chart: TACAIR vs. helo, percentage of CRM failures by flight condition: preflight, routine operations, emergency.)

28 Percentage of CRM Failures by Flight Conditions and Aircraft Community (FY 90-98)
Approximately 35% of the CRM mishaps involved at least one CRM failure during non-routine (e.g., system malfunction) or extremis situations (e.g., loss of controlled flight). However, percentages were larger for TACAIR (49%) than for rotary wing (16%). TACAIR aircrew may be less likely to coordinate actions outside the cockpit during emergency situations or to communicate the need for assistance from wingmen, ATC, or other operational personnel. Lesson learned: TACAIR CRM training should focus on aircrew coordination during extremis situations. (Chart: TACAIR vs. helo, percentage of CRM failures by flight condition: preflight, routine operations, emergency.)
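Taken together, the three breakdowns above (28% preflight, 65% routine operations, 35% non-routine/extremis) can sum past 100% because a single mishap may involve CRM failures in more than one flight condition. A sketch of that bookkeeping, with invented records:

```python
# Each placeholder CRM mishap lists the flight conditions in which at least one
# CRM failure occurred (invented records, not Safety Center data).
crm_mishaps = [
    {"preflight", "routine"},
    {"routine"},
    {"routine", "emergency"},
    {"emergency"},
]

for condition in ("preflight", "routine", "emergency"):
    pct = 100.0 * sum(condition in m for m in crm_mishaps) / len(crm_mishaps)
    print(f"{condition}: {pct:.0f}% of CRM mishaps")
# Percentages sum past 100% because conditions overlap within a mishap.
```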

29 CRM SUMMARY Even after the systematic, fleet-wide implementation of ACT, over 50% of TACAIR and rotary wing human factor mishaps involved at least one instance of CRM failure. The need to tailor ACT to the specific needs of the fleet is clear, yet the data required for developing such a curriculum has been lacking. The findings in our analysis regarding the flight conditions under which CRM failures occurred may provide some direction for tailoring effective intervention strategies. Although more in-depth analyses are needed to determine the causes of, and solutions for, CRM failures, the use of simulation in combination with video feedback (e.g., CAPAS) shows great promise as an intervention strategy for improving CRM skills.

30 CAPAS: Computer-Aided Performance Analysis System
Simulation-based technology designed to: maintain/improve aircrew proficiency; maintain/improve CRM skills; identify unsafe trends; augment standardization of training efforts. CAPAS, or Computer-Aided Performance Analysis System, is a simulation-based technology designed to maintain and improve aircrew proficiency, maintain and improve CRM skills, identify unsafe trends, and augment standardization of training efforts.

31 HOW DO WE MATCH UP?
TACAIR Comparison (percent of Class A mishaps with factor)   USN/USMC (139)   USAF (72)
Unsafe Acts
  Skill-based Error                                           61%              60%
  Decision Error                                              55%              43%
  Perceptual Error                                            24%              31%
  Violations                                                  28%              7%
Preconditions for Unsafe Acts
  Adverse Mental State                                        73%              53%
  Adverse Physiological State                                 23%              31%
  Physical/Mental Limitation                                  6%               11%
  Crew Resource Management                                    53%              17%
  Personal Readiness                                          4%               4%
Unsafe Supervision                                            35%              8%
  Inadequate Supervision                                      23%              3%
  Planned Inappropriate Ops                                   12%              3%
  Failed to Correct Problem                                   6%               3%
  Supervisory Violation                                       9%               0%

Rotary Wing Comparison (percent of Class A mishaps with factor)   USN/USMC (60)   USA (62)
Unsafe Acts
  Skill-based Error                                               37%             48%
  Decision Error                                                  58%             37%
  Perceptual Error                                                33%             45%
  Violations                                                      48%             27%
Preconditions for Unsafe Acts
  Adverse Mental State                                            75%             74%
  Adverse Physiological State                                     28%             3%
  Physical/Mental Limitation                                      12%             6%
  Crew Resource Management                                        80%             39%
  Personal Readiness                                              3%              0%
Unsafe Supervision                                                50%             32%
  Inadequate Supervision                                          32%             23%
  Planned Inappropriate Ops                                       12%             8%
  Failed to Correct Problem                                       13%             5%
  Supervisory Violation                                           12%             3%

Data periods: USN/USMC FY90-FY98; USAF FY91-FY97; USA FY92-FY97.
Finally, I would like to briefly return to violations and show you some of our more recent intervention strategies. One of the things that HFACS enables us to do is speak a common language with our sister services, other DOD agencies, the civilian airlines, and academia. Two of the studies that we've completed reveal some very interesting findings. The table above compares HFACS factors in Class A mishaps: the Naval services against the Air Force for TACAIR and against the Army for helicopters. Although the number of years examined is approximately the same in both cases, the data were not collected from precisely the same time periods. The differences that are immediately apparent are found in the categories of violations, crew resource management, decision errors, and generally across the supervisory categories. First, and most pronounced, mishaps in the Naval services have had significantly more causal factors attributable to aircrew violations than their counterparts in the other services. In the TACAIR community, the Navy/Marine Corps showed 28% of their mishaps attributable to violations, vice 7% in the Air Force. Similarly, 48% of mishaps in the USN/USMC helicopter communities included violations, vice 27% in the Army. CRM, decision errors, and the supervisory categories also showed significant differences. As you might expect, there has been considerable discussion regarding these studies and the fact that they make the Navy/Marine Corps look like the "bad boys" of military aviation. I believe the answer lies, in part, with the different ways our aviation mishap boards are put together, how we subsequently send our mishap investigations through our endorsement chain, and how we ultimately report our conclusions. However, I also believe our Naval culture contributes to these differences. In the Navy/Marine Corps we have a 225-year-old tradition of giving our Commanders lots of flexibility and authority. As some have put it, we are not as rigid in the Navy/Marine Corps as are our sister services. In the Navy/Marine Corps the rule of thumb has traditionally been: "you can do it unless there is a rule that says you can't." In our sister services there is less flexibility and the rule of thumb seems to be: "you can't do it unless there is a rule that says you can."
To put this another way, the late Gene Roddenberry (the creator of Star Trek) was once asked why he selected the Navy as his model instead of the Air Force. His response was that he chose the Navy because it has always had the tradition, flexibility, and authority to take immediate action to accomplish the mission. He went on to say that had he chosen the Air Force model, he feared his audience would become bored while his characters waited for direction from higher authority.
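For the "significantly more" claims, one conventional check is a two-proportion z-test on the cited percentages and mishap counts. Here is a sketch using the TACAIR violations figures from the table above (28% of 139 USN/USMC mishaps vs. 7% of 72 USAF mishaps); the choice of test is mine, since the Safety Center's own statistical method is not specified in this brief.

```python
from math import sqrt
from scipy.stats import norm

# TACAIR violations from the table: 28% of 139 USN/USMC vs. 7% of 72 USAF mishaps.
x1, n1 = round(0.28 * 139), 139   # ~39 mishaps with violations
x2, n2 = round(0.07 * 72), 72     # ~5 mishaps with violations

p_pool = (x1 + x2) / (n1 + n2)
z = (x1 / n1 - x2 / n2) / sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
p_value = 2 * norm.sf(abs(z))     # two-sided

print(f"z = {z:.2f}, p = {p_value:.4f}")   # roughly z = 3.6, p < 0.001
```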

32 Command Culture
"A shared characteristic or characteristics of a particular social group, organization, or society…" This brings me to the final area that I would like to address: command culture. As you've just seen, we in the Navy have inherited a culture that is not always conducive to aviation safety. Therefore, the Naval Safety Center's aeromedical folks are actively engaged in the assessment and evaluation of a squadron's safety culture. We do this in two ways.

33 Squadron Safety Surveys
The aeromedical member of our aviation safety survey teams has always completed a mini cultural survey when he or she accompanies the Safety Center team on the safety survey that every squadron in the Navy/Marine Corps is required to receive on a recurring cycle. We administer a 20-question questionnaire to representative groups within the squadron. This instrument has 18 scaled questions intended to ballpark the squadron's attitudes on safety. The last two questions are open-ended: if the squadron had a flight mishap, how and why would it happen? There is a huge variation in results between squadrons, and the answers to those last two questions can really be enlightening. For example, in one squadron survey, two pilots responded that their next flight mishap was going to come from flying formation. They went on to state that they thought this was poor risk management, as there is no tactical reason for a P-3 to ever fly formation, especially near the airfield where one senior officer liked to lead those types of flights. In another squadron, a number of enlisted maintainers all suggested their next mishap would be the result of shoddy maintenance practices on the night shift. It's amazing: we offer members of the squadron confidentiality, ask them how and why their next mishap will occur, and they tell us! It's not rocket science, but it works! Slide points: cultural assessment questionnaires; informal interviews (aircrew/maintenance); confidential and anonymous; no written report, verbal feedback to the Skipper and officers of his choice.
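As a rough illustration of how the 18 scaled questions could be rolled up to "ballpark" a squadron's safety attitudes, here is a minimal sketch. The 1-5 scale, the per-question aggregation, and the sample responses are all assumptions for illustration, not the Safety Center's actual scoring method.

```python
from statistics import mean

# Invented responses: each row is one respondent's 1-5 ratings on the 18 scaled
# questions (higher = healthier safety attitude, by assumption).
responses = [
    [4, 5, 4, 3, 4, 5, 4, 4, 3, 4, 5, 4, 4, 3, 4, 4, 5, 4],
    [3, 4, 3, 2, 3, 4, 3, 3, 2, 3, 4, 3, 3, 2, 3, 3, 4, 3],
]

question_means = [mean(col) for col in zip(*responses)]
overall = mean(question_means)

# Flag questions whose average falls below an assumed threshold of 3.0.
flagged = [i + 1 for i, m in enumerate(question_means) if m < 3.0]
print(f"Overall attitude score: {overall:.2f}; questions to discuss with the CO: {flagged}")
```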

34 Culture Workshop (CWS)
Senior Naval Reserve officers serve as facilitators; Naval Safety Center to serve as model manager; CWS by command request; 2-day observation; small group meetings divided by rank; confidential and anonymous; no written report, verbal feedback to the Skipper and other officers of his/her choice. However, we are now taking a much more involved and closer look at our squadrons during our culture workshops. Experience has shown that degenerative organizational cultures often foster the development of unhealthy practices, or habits, which ultimately can contribute to or result in a mishap. The failure to correct underlying deficiencies is often an error of omission, vice commission, based on an inability to see the problem. The CWS premise is that safety exists on a foundation of trust, integrity, and leadership, created and sustained by effective communication. A trained senior Naval Reserve officer serves as the facilitator and uses directed individual and group discussions with command personnel to examine and quantify the underlying elements that form a unit's culture. Feedback on the relative strengths and weaknesses is presented to the unit's Commander during a frank and strictly confidential debrief. Specific squadron results are never disseminated outside the unit; however, dangerous trends that are identified are shared with the Commander of the Naval Safety Center. The two-day process is completely voluntary and conducted solely for the benefit of the participating unit. In summary, we at the Naval Safety Center feel that we can be of great service to our Fleet squadrons by showing Commanding Officers where their own squadron's culture may or may not be conducive to safety. If the Naval services can foster and maintain squadron cultures that are more conducive to safety, we'll lose fewer lives and aircraft.

35 "Whenever we talk about a pilot who has been killed in a flying accident, we should all keep one thing in mind. He...made a judgment. He believed in it so strongly that he knowingly bet his life on it. That his judgment was faulty is a tragedy. Every instructor, supervisor, and contemporary who ever spoke to him had the opportunity to influence his judgment, so a little bit of all of us goes with every pilot we lose." Anonymous

36 Questions

