Part 1: Learning from Unexpected Events
Michael P. Silver, MPH
Director, Scientific Affairs and Patient Safety, HealthInsight
The Design Challenge
“Every system is perfectly designed to get the results it achieves”
Benefits and harm are designed into health care systems
Design of health care systems and processes
Elements configured by designers include:
People – education, training, orientation, …
Materials – medications, supplies, …
Tools – medical equipment, information technology, forms, communication media, …
Methods – procedures, diagnostic and treatment processes, management practices, policies, communications practices, coordination of effort, …
Sources of design failure in complex systems
Design flaws are expected because (for example):
Actual operations are more complex than our design models
System elements interact in unexpected ways
Procedures, tools, and materials are used in ways not anticipated
Multiple designers have potentially different goals and assumptions
Safety features and defenses become degraded over time
Environmental conditions, expectations, and demands change over time
The world points out our design flaws to us
In the course of actual operations, design flaws will produce:
Errors, unsafe acts, procedure violations
Glitches
Near-misses
Accidents
Injury
Sentinel events/catastrophes
(We may also learn from other people’s failures)
We have a hard time listening to the world!
We are victims of (apparent) success:
We may not hear about many failures (especially “small” ones) or recognize them as associated with our decisions
Designs work most of the time
Dedicated staff negotiate hazards, improvise, and complete the design for us
Because of this, and other biases, failures and accidents may be understood as the product of individual failures rather than design flaws
It is difficult to attend to all of the lessons available:
Rush to closure
Review focused on immediate causes; reluctance to look deeper
There’s always something else that seems more pressing
A lesson from the Columbia accident
Brown, Clark, Anderson, Ramon, Husband, Chawla, McCool
Was foam insulation supposed to fall off the external fuel tank?
No! (of course not) Yet, it did.
Foam loss had occurred on about 80% of previous missions
A recognized, major source of damage to thermal protection tiles
A routine occurrence
Large pieces of foam had detached from the left bipod ramp in approximately 10% of previous missions
Only two missions earlier, a large piece of foam from the left bipod ramp impacted a ring that attaches the solid rocket boosters to the external fuel tank
During the Columbia mission, the foam strike was considered to be a significant maintenance issue, but not a mission safety issue
On-Line Exercise: You’re the Teacher
Identify an example from your clinical experience* that can be used to illustrate the “drift toward failure” observed in the Columbia disaster:
Warning signs observed, warning signs ignored
Normalization of deviance
Past successes used as evidence of future success
*If you have no such experience, develop a plan to connect with clinician(s) to identify examples.
Incident Investigation/RCA – In context
[Diagram] Event detection → Event reporting → Incident investigation/RCA → Improved understanding of systems/processes → Effective solutions/improved designs → Increased safety
The cycle operates within the organization’s Safety Culture:
Reporting culture
Assumptions about the meaning of error and accidents
Demonstrated organizational value and commitment to safety
Prospective (organizational) accountability
“Just” response to error, unsafe acts, and accidents
Organizational learning
Event Reporting
In order to learn from unexpected events, we must first learn of them:
Sentinel events
Patient harm
No-harm events
Near misses
Unsafe acts, errors, hazardous conditions (accidents waiting to happen)
Lesser events not only provide opportunities for learning, but by their sheer volume represent substantial waste, frustration, and re-work.
Off-Line Assessment: Event reporting in your facility
Gather a team to review how effectively event reporting is supported:
Does reporting place undue burdens on reporters?
Have expectations for reporting been clearly communicated? (“No harm, no report”? Consistent message from supervisors? “Not our shift/not our department”?)
How do staff know that there will be a “just” response to events identified?
How do we provide feedback to reporters (both immediate and in terms of actions taken)?
Learning from Recovery/Mitigation
System reliability and safety can be improved by:
Reducing failures, errors, and unsafe acts
Increasing the likelihood that these are detected and prevented from propagating
Both
Studying recovery and mitigation also promotes a better understanding of and appreciation for defenses
Part 1 Summary
Incident investigation and root cause analysis:
Is central to the ongoing system design process
Is difficult to do well
Depends on and reinforces event reporting
Is an outgrowth of and partially defines organizational safety culture
Is a key process of safety management
Part 2: Understanding the Causes of Events
Why event investigation is difficult
Natural reactions to failure
Tendency to stop too soon
Overconfidence in our re-constructed reality
“The root cause” myth
Our reactions to failure
Typical reactions to failure are:
Retrospective: hindsight bias
Proximal: focus on the “sharp end”
Counterfactual: lay out what people could have done
Judgmental: determine what people should have done (the fundamental attribution error)
On-Line Exercise: “any good nurse …”
In the course of event investigations and RCA you can expect to encounter the “any good nurse …” reaction.
Describe how this might negatively impact the investigation process.
How can you anticipate and/or respond to this reaction?
Stopping too soon
We lack training in event investigation
We don’t ask enough questions
We have a shallow understanding of the causes of events
We lack resources and commitment to thorough investigations
Overconfidence in our re-constructed reality
People perceive events differently
Common sense is an illusion:
Unique senses
Unique knowledge
Unique conclusions
“The root cause” myth
Accidents have multiple causes
Root cause analysis (RCA) is not about finding the one root cause
The “New View” of human error
Human error is not the cause of events; it is a symptom of deeper troubles in the system
Human error is not the conclusion of an investigation; it is the beginning
Events are the result of multiple causes
Creating the holes
Active failures:
Errors and violations (unsafe acts) committed at the “sharp end” of the system
Have direct and immediate impact on safety, with potentially harmful effects
Latent conditions:
Present in all systems for long periods of time
Increase the likelihood of active failures
“Latent conditions are present in all systems. They are an inevitable part of organizational life.”
James Reason, Managing the Risks of Organizational Accidents
Root Causes
A root cause is typically a finding related to a process or system that has potential for redesign to reduce risk
Active failures are rarely root causes
Latent conditions over which we have control are often root causes
On-Line Exercise: The failed RCA
The evidence suggests that, currently, most RCAs conducted in health care are ineffective.
How would you know that an RCA had failed? What are the characteristics of a failed RCA?
(Off-line) What RCA practices and procedures do you think would be likely to produce a failed RCA?
On investigating human error
“The point of a human error investigation is to understand why actions and assessments that are now controversial made sense to people at the time. You have to push on people’s mistakes until they make sense—relentlessly.”
Sidney Dekker
Getting Inside the Tunnel
[Diagram: the “tunnel” of an unfolding event, showing Possibility 1 and Possibility 2 branching from the path that led to the actual outcome]
Outside the Tunnel:
Outcome determines culpability
“Look at this! It should have been so clear!”
We judge people for what they did
Inside the Tunnel:
Quality of decisions not determined by outcome
Realize evidence does not arrive as revelations
Refrain from judging people for errors
Lessons from the Tunnel
We haven’t fully understood an event if we don’t see the actors’ actions as reasonable. The point of a human error investigation is to understand why people did what they did, not to judge them for what they did not do.
Summary
New view of human error:
Events are the result of many causes
Active failures and latent conditions create holes in our system’s defenses
Root causes are causes with potential for redesign to reduce risk
Active failures are rarely root causes; latent conditions are often root causes
Getting inside the tunnel will help us understand why events occur
Questions? Comments?
References
Dekker S. The Field Guide to Human Error Investigations. Burlington, VT: Ashgate, 2002.
Gano DL. Apollo Root Cause Analysis: A New Way of Thinking. Yakima, WA: Apollonian Publications (last accessed 7/16/06).
Reason J. Managing the Risks of Organizational Accidents. Brookfield, VT: Ashgate, 1997.