Failure Modes and Effects Analysis
Failure Modes and Effects Analysis (FMEA)
● Purpose
  – Systematically and prospectively identify the ways in which a process (system) can fail: its failure modes.
  – Identify factors potentially contributing to failures.
  – Estimate the risk of each failure.
  – Identify countermeasures.
  – Develop requirements.
● Method (WSE workbook 2.0; sketched in code below)
  – Copy the IDEF0 model node list into the WSE [Overview] worksheet.
  – Select a task on which to perform FMEA.
  – Prepare a task FMEA worksheet from the [FMEA template] worksheet:
    ● copy the [FMEA template] worksheet
    ● copy/type the task name into the upper-left cell
    ● rename the new task FMEA worksheet to the task's A-number
  – Perform the FMEA.
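For concreteness, here is a minimal sketch of the worksheet-preparation steps, assuming the WSE workbook is an .xlsx file manipulated with openpyxl; the file name, task name, and A-number below are illustrative placeholders, not values from the actual workbook:

```python
from openpyxl import load_workbook

# Load the WSE workbook (illustrative file name).
wb = load_workbook("WSE_workbook.xlsx")

# Copy the [FMEA template] worksheet for the selected task.
template = wb["FMEA template"]
task_sheet = wb.copy_worksheet(template)

# Copy the task name into the upper-left cell, then rename the
# new worksheet to the task's A-number (both values illustrative).
task_sheet["A1"] = "Insert trocar"
task_sheet.title = "A23"

wb.save("WSE_workbook.xlsx")
```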
PRIS IDEF0 Model Nodes in WSE Workbook
PRIS FMEA in WSE Workbook
FMEA Fields
● Contributing Factors
  – Circumstances, conditions, events, factors... anything that might make a failure mode more likely.
● Potential Failure Mode
  – A way in which the process/system can fail:
    ● human error (see below)
    ● acute injury (e.g., cut, broken bone)
    ● cumulative injury (e.g., carpal tunnel syndrome)
    ● likely and significant equipment/tool failure
    ● etc.
● Potential Effects of Failure Mode
  – Consequences of the failure.
FMEA Fields (2)
● Severity
  – Numeric rating of the estimated severity of the failure's consequences, e.g.:
    1. negligible
    2. minor
    3. moderate
    4. major
    5. catastrophic
● Probability
  – Numeric rating of the estimated probability that the failure will occur, e.g.:
    1. remote
    2. unlikely
    3. occasional
    4. common
    5. frequent
(Both scales are sketched as Python enums below.)
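Since both scales are numeric ratings, they map naturally onto small integer enumerations. A minimal illustrative sketch in Python; the anchor names come from the slide above, but the encoding itself is an assumption, not part of the WSE workbook:

```python
from enum import IntEnum

class Severity(IntEnum):
    """Estimated severity of the failure's consequences."""
    NEGLIGIBLE = 1
    MINOR = 2
    MODERATE = 3
    MAJOR = 4
    CATASTROPHIC = 5

class Probability(IntEnum):
    """Estimated probability that the failure will occur."""
    REMOTE = 1
    UNLIKELY = 2
    OCCASIONAL = 3
    COMMON = 4
    FREQUENT = 5
```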
FMEA Fields (3)
● Nondetectability
  – Numeric rating of the difficulty of detecting the failure in time to prevent or mitigate its consequences, e.g.:
    1. Should the failure occur, it is virtually certain that the existing controls will detect it.
    2. Should the failure occur, there is a high probability that the existing controls will detect it.
    3. Existing controls have difficulty detecting the failure.
    4. Controls are weak; detection could depend on a lucky catch.
    5. No controls for this failure mode exist.
● RPN: Risk Priority Number (worked example below)
  – RPN = Probability × Severity × Nondetectability
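A worked example of the RPN formula, as a self-contained sketch: the ratings use the 1-5 scales above, and the failure modes and rating values are invented for illustration, not taken from the PRIS model. Sorting descending by RPN gives the priority order in which to address failure modes.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One illustrative row of an FMEA worksheet."""
    description: str
    severity: int          # 1 (negligible) .. 5 (catastrophic)
    probability: int       # 1 (remote) .. 5 (frequent)
    nondetectability: int  # 1 (certain detection) .. 5 (no controls)

    @property
    def rpn(self) -> int:
        # RPN = Probability x Severity x Nondetectability
        return self.probability * self.severity * self.nondetectability

# Invented example rows.
modes = [
    FailureMode("trocar penetrates too far", 5, 2, 4),    # RPN = 40
    FailureMode("task name typed incorrectly", 1, 3, 2),  # RPN = 6
]

# Highest-RPN failure modes are remediated first.
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:3d}  {fm.description}")
```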
FMEA Fields (4)
● Potential Remediations
  – Design strategies that could reduce the likelihood of the failure mode.
● Design Requirements
  – Abbreviated requirements statements for remediations.
  – Requirement numbers ([Requirements] worksheet).
Chapter 17: Human Error
Human Error in the
● Cockpit
● Operating Room
China Eastern Airlines Flight 583
Photo copyright Anthony Cheng, used with permission; downloaded from http://www.airliners.net/open.file/610387/M/ on 30 Jun 04.
MD-11 Cockpit
Photo copyright Harri Koskinen, used with permission; downloaded from http://www.airliners.net/open.file/463667/M/ on 30 Jun 04.
China Eastern Airlines Flight 583
● Background
  – McDonnell Douglas MD-11
  – 6 April 1993
  – Beijing → Los Angeles
  – 235 passengers, 20 crew on board
● Conditions
  – 950 nmi south of Shemya, Alaska
  – 33,000 ft MSL
  – 298 kt
● Events
  – inadvertent deployment of the leading-edge wing slats
  – series of violent oscillations
  – diverted to Shemya Air Force Base
China Eastern Airlines Flight 583 (continued)
● Consequences
  – 2 passengers fatally injured
  – 53 passengers, 7 crew seriously injured
  – 96 passengers with minor injuries
  – passenger cabin substantially damaged
● NTSB Findings
  – deficient flap/slat handle design
  – captain (in the right seat) inadvertently moved the flap/slat handle while making corrections to the MCDU
● Interpretation
  – System factors:
    ● highly mode-sensitive control
    ● positive control bias
    ● inadequate protection against inadvertent operation
    ● proximity to other controls
  – Human factors:
    ● limits to motor accuracy
    ● inability to divide attention / distraction
    ● cost of concurrence
Eastern Airlines Flight 401
Photo copyright Mike Sparkman, used with permission; downloaded from http://www.airliners.net/open.file/421710/L/ on 30 Jun 04.
L-1011 Cockpit
Photo copyright Gabe Pfeiffer, used with permission; downloaded from http://www.airliners.net/open.file/560839/L/ on 30 Jun 04.
Eastern Airlines Flight 401
● Background
  – Lockheed L-1011
  – 29 December 1972
  – JFK → Miami
  – 163 passengers, 13 crew on board
● Conditions
  – approaching Miami
  – below 2,000 ft MSL
● Events
  – landing gear handle down; nose gear green light failed to illuminate
  – climbed to 2,000 ft, set autopilot
  – entire flightcrew plus an observer tried to diagnose the problem:
    ● mechanical indicator in the electronics bay
    ● nose gear light
  – began a gradual descent
  – crashed into the Everglades
Eastern Airlines Flight 401 (continued)
● Consequences
  – 96 passengers, 5 crew fatally injured
  – 65 passengers, 10 crew non-fatally injured
● NTSB Findings
  – flightcrew was preoccupied with the nose gear problem and distracted from monitoring the instruments
  – captain failed to ensure that someone was monitoring altitude
● Interpretation
  – System factors:
    ● susceptibility to minor malfunctions
    ● low altitude
    ● high speed
    ● subtle autopilot behavior
  – Human factors:
    ● inability to divide attention
    ● stress-induced narrowing of attention
Human Error and Aviation Safety
(Chart: Primary Causes of Aircraft Accidents; Hull Loss Accidents, Worldwide Commercial Jet Fleet, 1994 through 2003. Source: Boeing Commercial Airplanes.)
Human Error and Operating Room Performance
● Patient Safety
  – In health care generally:
    ● 44,000–98,000 deaths in the US annually due to errors
    ● 8th leading cause of death in the US
    ● exceeds deaths from motor vehicle accidents, etc.
    ● $17–$29 billion in annual costs (Institute of Medicine, 2000)
  – In the hospital:
    ● 3.7% of hospital admissions experience an adverse event (Leape et al., 1991)
  – In the OR:
    ● 41% of hospital adverse events occur in the OR
Laparoscopy: Minimally Invasive Surgery of the Abdomen
Photo courtesy of Alex Gandsas, MD. Used with permission.
Laparoscopy
(Diagram: patient's abdominal cavity; overhead view and side view.)
Laparoscopy: Trocar Injuries
● The trocar should
  – penetrate the abdominal wall
  – just enter the abdominal cavity
● The trocar may
  – penetrate too far
  – nick the bowel (especially if adhesions are present from previous surgery)
  – damage major blood vessels (especially the aorta)
● Trocar injuries
  – US:
    ● ~10,000 trocar injuries, 1991–2001
    ● 50–240 fatalities per year
  – Europe:
    ● 1 injury per 1,000 surgeries (0.1%)
    ● 1 fatality per 100,000 surgeries (0.001%)
Patient Mortality in US Laparoscopic Cholecystectomies, 1991-2000 (source: Hunter, 2006)

Year   All cases   Laparoscopic (% of all)   Mortality, laparoscopic (%)
1990   494,541     N/A                       N/A
1991   517,433     270,988 (52.37%) a        900 (0.33%) b
1992   506,980     324,794 (64.06%)          1,424 (0.44%)
1993   470,692     319,837 (67.95%)          1,381 (0.43%)
1994   465,703     315,956 (67.84%)          1,097 (0.35%)
1995   429,439     296,245 (68.98%)          1,201 (0.41%)
1996   428,901     300,364 (70.03%)          1,305 (0.43%)
1997   421,583     299,710 (71.09%)          1,332 (0.44%)
1998   406,817     292,651 (71.94%)          1,685 (0.58%)
1999   405,367     299,099 (73.78%)          1,613 (0.54%)
2000   431,605     324,783 (75.25%)          1,630 (0.50%)

N/A: case volume not available due to the absence of a specific ICD-9 code for LC (laparoscopic cholecystectomy) or partial LC.
a LC volume for 1991 projected for the first 3 quarters, based on last-quarter data.
b Mortality derived from last-quarter 1991 data only.
Lessons Learned from Aviation and Medicine
● Causes of Adverse Events
  – Rarely:
    ● individual incompetence
    ● lack of due diligence
    ● bad luck
  – Commonly:
    ● vulnerable systems*
    ● fallible humans*

* From E.L. Wiener, "Fallible Humans and Vulnerable Systems: Lessons Learned from Aviation," 1987.
Human Fallibility + System Vulnerability → Human Error
● Human Error
  – an occasion "... in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency." (James Reason, Human Error)
● Broad Classes of Human Errors
  – Planning Failures: Mistakes
    ● inappropriate plan, misdirected intention
    ● plan carried out as intended
    ● e.g., the wrong medication prescribed and administered
  – Execution Failures
    ● Slips
      – appropriate plan and intention
      – part of the plan performed incorrectly
      – e.g., a trocar injury
    ● Lapses
      – appropriate plan and intention
      – part of the plan omitted
      – e.g., forgetting to remove a sponge before closing
Understanding Human Fallibilities: The AORTA (Stage) Model of Human Performance
Stages linking environmental stimuli to responses:
● Attend: attend to one task; attend to several tasks
● Observe: see/read, hear, feel (palpate); detect, discriminate, recognize, perceive
● Remember: memorize; recall (long-/short-term); maintain a mental model
● Think: calculate, decide, solve; develop alternatives, choose an alternative; select a response
● Act: reach, grasp, move/manipulate, speak, walk/run; respond
Some Common Human Fallibilities: Attend
● limited attentional resources
● attraction to salient but irrelevant cues
● inability to focus attention: distraction
● inability to divide attention: tunneling
Some Common Human Fallibilities: Observe
● detection thresholds
● limited visual field
● sensory impairments
● auditory masking
● discrimination thresholds
● vigilance loss
Some Common Human Fallibilities: Remember
● limited working memory capacity (7 ± 2 "chunks")
● limited working memory duration (< 20 sec)
● inefficient chunking
● verbal/spatial dominance
● weak long-term memory associations
Some Common Human Fallibilities: Think
● anchoring, confirmation bias
● recency bias
● tendency to treat all sources as equally reliable
● bias against absence of cues
● asymmetric valuation (gain/loss)
● overconfidence
● erroneous mental models
Some Common Human Fallibilities: Act
● anthropometric limits
● neuromuscular limits
● strength limits
● response time considerations
● speed/accuracy tradeoff
Reducing Human Error in the Cockpit
● The universality of pilot error is widely acknowledged.
● Most pilot error is due to innate human fallibility.
● The contribution of pilot error to aircraft accidents is well understood.
● Pilot error is anticipated, prevented, and mitigated.
● Countermeasures include, e.g.:
  – pointer-counter altimeter
  – moving-tape altimeter
  – moving map display
  – checklists
  – other mnemonics (memory aids)
  – Crew Resource Management (CRM) training
Three-Pointer Altimeter
Pointer-Counter Altimeter
Moving Tape Altimeter
Moving Map Display
Checklists (e.g., Boeing 757)
Other Mnemonics (Memory Aids)
● Before-Landing Mnemonic: GUMPS
  – Gas (fuel system configuration)
  – Undercarriage (landing gear down)
  – Mixture (engine fuel/air mixture)
  – Props (propeller pitch)
  – Systems (other system settings)
Crew Resource Management (CRM) Training
● Communication / Interpersonal Skills
● Situational Awareness
● Problem Solving / Decision Making / Judgment
● Leadership / "Followership" / Teamwork
● Task Management
● Stress Management
● Critique
Results (not attributable solely to Human Factors Engineering)
● Hull loss rate (per million departures)
  – 1959–1990: 1.91
  – 1994–2003: 0.96
● Hull loss accidents due to flightcrew error
  – 1959–1990: 75%
  – 1994–2003: 62%
Human Factors Engineering Principles to Reduce Human Error
● Display Design Principles
  – Typography
  – Display Coding (color, size, location, ...)
  – Pictorial Realism
● Control Design Principles
  – Control Coding (type, location, size, shape, ...)
  – Movement Compatibility
  – Prevention of Inadvertent Activation
● Workstation Design Principles
  – Accessibility
  – Accommodation
  – Frequency of Use
  – Order of Use
  – Spatial Compatibility
  – Standardization
● Job Performance Aids
  – Mnemonics, Memory Aids
  – Checklists
  – Decision Aids
  – Computation Aids
... and many more.
Work Systems Engineering Guidelines for Reducing Human Error (from text)
1. Get enough information.
2. Ensure that information is understood.
3. Have proper equipment/procedures/skill.
4. Don't forget.
5. Simplify the task.
6. Allow enough time.
7. Have sufficient motivation/attention.
8. Give immediate feedback on errors.
9. Improve error detectability.
10. Minimize consequences of errors.