Building a Culture of Learning: Curators of the Just Culture Community


1 Building a Culture of Learning: Curators of the Just Culture Community
Presentation goals: give some background on Just Culture (JC), including when and where it started, and share some of the success stories of JC and its modeling techniques. A major airline cut installation errors coming out of heavy-check maintenance by 50%. Ground damage events were reduced by 80% at one major airline and by 50% at another. One hospital cut medication errors in half. A large manufacturing company reduced workers' compensation expenditures by 25%. John Westphal, Outcome Engineering, Curators of the Just Culture Community.

2 Agenda
What is Just Culture?
Aligning Beliefs
Managing System Design
Managing Behavior
Learning Through Events
Enterprise Risk Management
Just Culture Implementation
Review and Wrap-up

3 Introduction to the Just Culture
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

4 An Introduction to Just Culture
The single greatest impediment to error prevention in the medical industry is “that we punish people for making mistakes.” Dr. Lucian Leape, Professor, Harvard School of Public Health, Testimony before Congress on Health Care Quality Improvement.
Learning objective: This quote represents Leape's view (1998) that healthcare had taken a blame-and-shame approach to error.
Facilitator notes: A few years ago, Lucian Leape testified before Congress. He said that the single greatest impediment to error prevention in the medical industry is “that we punish people for making mistakes.” We don't have a learning culture. Generally, in our society, we only report what we cannot hide. We don't often learn when mistakes are made. One of the reasons is that we don't have a culture where we can raise our hand and say, “I have made a mistake.” We punish people for making a mistake. In the past, healthcare has been a punitive culture. How do we create that learning culture when we live in fear that the hospital board, the state regulator, the nursing board, or the department of health will come after us if they find out we made a mistake? The Institute of Medicine came out with a report in 1999, “To Err Is Human,” in which it said that 44,000 to 98,000 people lose their lives each year in our nation's hospitals due to medical error. It is characterized as the sixth leading cause of death in the nation. Lucian Leape said that part of the problem is that when anyone makes a mistake, we punish them. He said that this is just not the way forward. How do we hold each other accountable to get the best safety outcome, the best privacy outcome, the best customer satisfaction? That's what Just Culture is going to be about. When an employee makes a mistake there are two pieces we should look at. There is the emotional issue: how could this happen? Then there is the science of it: what is the best way to prevent it from happening again? In school I was taught that experts come in pairs. There are two sides to each issue and you can find experts on each side of an issue. I am going to share some polar views on accountability, and let's see what you think.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

5 An Introduction to Just Culture
“There are activities in which the degree of professional skill which must be required is so high, and the potential consequences of the smallest departure from that high standard are so serious, that one failure to perform in accordance with those standards is enough to justify dismissal.” Lord Denning, English Judge.
Facilitator notes: (Slide image: CG rendering of Pan Am 1736 about to be hit by KLM 4805.) The Tenerife disaster took place on March 27, 1977 at 17:06:56 local time (also UTC) when two Boeing 747 airliners, KLM 4805 and Pan Am 1736, collided on the runway at Los Rodeos Airport (now known as Tenerife North Airport) on the island of Tenerife in the Canary Islands, Spain. Five hundred and eighty-three people were killed, the highest number of fatalities (excluding ground fatalities) of any single accident in aviation history. All 248 people aboard the KLM aircraft died; 61 of the 396 aboard the Pan Am aircraft survived.
MS Herald of Free Enterprise was a roll-on roll-off (RORO) car and passenger ferry owned by Townsend Thoresen. She was one of three ships commissioned by the company to operate on the Dover-Calais route across the English Channel. The ferry capsized on the night of 6 March 1987, killing 193 passengers and crew, the worst peacetime maritime disaster involving a British-registered ship since the sinking of the Iolaire. On the day of the disaster, the Herald of Free Enterprise was working the route between Dover and the Belgian port of Bruges-Zeebrugge. The linkspan at Zeebrugge comprised a single deck and so could not be used to load decks E and G simultaneously. The ramp also could not be raised high enough to meet the level of deck E because of the high spring tides being encountered at that time. This was commonly known and was overcome by trimming the ship bow-heavy by filling forward ballast tanks; the Herald was due to be modified during its refit in 1987 to overcome this problem. Before dropping moorings, it was normal practice for a member of the crew, the Assistant Bosun, to close the bow doors, and the First Officer also remained on deck to ensure they were closed before returning to the wheelhouse. To keep on schedule, the First Officer returned to the wheelhouse before the ship dropped its moorings, leaving the closing of the doors the responsibility of the Assistant Bosun, Mark Stanley. Stanley had taken a short break after cleaning the car deck upon arrival at Zeebrugge; he had returned to his cabin and was still asleep when the ship dropped its moorings. The captain could only assume that the doors had been closed, since he could not see them from the wheelhouse due to their construction and had no indicator lights in the wheelhouse. There was confusion as to why no one else closed the doors.

6 An Introduction to Just Culture
“People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems will continue.” Don Norman, The Design of Everyday Things.
Learning objective: Display empathy with both of these views. There is a balance between the two. Is punishing people the way to prevent accidents? Maybe. Is an adverse event always due to system error? Not always. Just Culture is about examining both views.
Facilitator notes: Let's go to the flip side. Don Norman wrote a great book, The Design of Everyday Things, a great introduction to human factors. He is a former executive with Apple. Look at his quote. Who do you like? Is it always the system's fault? It's half Lord Denning, half Don Norman.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

7 What system of accountability best supports system safety?
The Problem Statement: What system of accountability best supports system safety? Support of system safety as applied to providers, managers, institutions, and regulators.
Learning objective: On one end of the spectrum, we have Lord Denning and the punitive culture, where one failure is enough to justify dismissal. This creates an environment where no one reports anything. You would just be in a place of fear all of the time. It would require that we not be human, because as humans, you and I are going to make mistakes. At the other end of the spectrum is Don Norman and the blame-free culture: it's not the fault of the individual; it is all the system. What does that say to our employees? Just do what you want to do; it's all a system issue. Somewhere in the middle is a system of accountability that best supports a system of safety as applied to providers, managers, institutions, and regulators.
(Diagram labels: Punitive Culture, Blame-Free Culture.)
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

8 Four Cornerstones of a Just Culture
Create a Learning Culture. Create an Open and Fair Culture. Design Safe Systems. Manage Behavioral Choices.
Learning objective(s): Cultural change depends on consistent messages to staff from managers, HR, and leadership. Champions will use education, feedback, and reinforcement to speed the "learning curve." A learning culture rests on the quality of event/error investigations and follow-up. Champions hold the data and are the "memory" to see that managers follow through. Concern: managers' will and motivation to deal with difficult issues. Champions will use their critical thinking skills to identify systems issues that cross departments or functions. Champions are in a unique position to evaluate managers' changed responses to errors/events and personnel behavioral change through SALTs, HR actions, and by encouraging unit managers to use system metrics to evaluate safety.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

9 Cornerstones of a Just Culture
Create a Learning Culture
Eager to recognize risk at both the individual and organizational level
Risk is seen through events, near misses, and observations of system design and behavioral choices
Without learning we are destined to make the same mistakes
OPTIONAL SLIDE
Learning objective(s): Create a Learning Culture: a key attribute in producing the best possible outcomes. Risk is seen through events, near misses, and observing the design of our systems and our own behaviors. Without a learning culture, the staff is destined to repeat previous mistakes. A learning culture provides the ability to identify the solutions having the greatest impact.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

10 Cornerstones of a Just Culture
Create an Open and Fair Culture
Move away from an overly punitive culture and strike a middle ground between punitive and blame-free
Recognize human fallibility: humans will make mistakes, and humans will drift away from what we have been taught
OPTIONAL SLIDE
Learning objective(s): Create an Open and Fair Culture. This is neither a "punitive" nor a "blame free" culture; it is somewhere in the middle. We must recognize our own fallibility, both errors and drift. All employees are accountable for their behavioral choices.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

11 Cornerstones of a Just Culture
Design Safe Systems
Reduce the opportunity for human error
Capture errors before they become critical
Allow recovery when the consequences of our errors reach the patient
Facilitate our employees making good decisions
OPTIONAL SLIDE
Learning objective(s): Design Safe Systems. We must put our employees in reliable systems. Systems should anticipate errors, capture errors before they become critical, and allow recovery when the consequences of our errors have reached the patient. Facilitate good decision making.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

12 Cornerstones of a Just Culture
Manage Behavioral Choices
Humans will make mistakes; we must manage behavioral choices in a way that allows us to achieve the outcomes we desire
Cultures will drift into unsafe places
Coach each other around reliable behaviors
OPTIONAL SLIDE
Learning objective(s): Manage Behavioral Choices. Humans will drift into unsafe places. We must productively coach our employees toward reliable behaviors and recognize when corrective actions will get us the results we desire.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

13 Managerial and Staff Choices
What is a Just Culture? It supports a learning culture and focuses on proactive management of system design and management of behavioral choices.
Learning objective: We have Adverse Events and Human Errors above the line. In Just Culture we refer to these as outputs. They are outputs of system design and of managerial and staff behavioral choices. Those are the two tools we have: system design and behavioral choices. We can help our employees make good behavioral choices and we can design good systems around them, but we live with the errors that occur and the outcomes that are produced. For a physician who makes a misdiagnosis, the misdiagnosis is the error, and we say we have to live with that. The fix for misdiagnosis is not simply to tell the physician to stop misdiagnosing. It is to go back and ask what was the system we designed and what behavioral choices did the physician and support staff make that led to that misdiagnosis. This will create a learning culture, and Just Culture helps us see what is happening in our system design and in our behavioral choices.
(Diagram labels: Adverse Events, Human Errors, System Design, Managerial and Staff Choices, Learning Culture, Just Culture.)
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

14 What We Must Believe About the Management of Risk
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

15 Our Beliefs About Risk Management
To Err is Human
To Drift is Human
Risk is Everywhere
We Must Manage in Support of Our Values
We Are All Accountable
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

16 To Err is Human Learning objective:
Understand human fallibility and the most effective way to mitigate the adverse outcome.
Facilitator notes: The starting point is that we walk into this class and say, you know what? To err is human; we're not perfect human beings. We are going to make mistakes. We must, as an organization, recognize that at the very beginning. That's the starting point. Does that mean we can't get to an ultra-high level of reliability? No, because we can. But you're not going to get there by requiring perfection. We have to embrace our fallibility and design systems around our fallibility. When regulators like the nursing board say that nurses are not allowed to make a mistake, they think they are helping the safety of the profession, and our answer is, "No, you're not." You're hurting the safety of the profession because you are preventing us from designing around that fallibility. We need the ability to say we are mistake-prone and to design systems that support us as fallible human beings.
Give empathy and understanding. We console the individual and examine the system.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

17 To Drift is Human Learning objective:
Understand drift as a part of human nature and the most effective way to mitigate an adverse outcome. Sometimes the better we are at what we do, the less likely we are to recognize that we have drifted (positive outcomes have reinforced at-risk behavior).
Facilitator notes: Steve Irwin: do you think that on this day Steve said, "I choose to put my child in harm's way"? Why would he do this? He grew so comfortable being in the pen that bringing his child in was just not that significant a risk. The rest of us in the world looked at that and said, "No! That was a dumb decision." Six billion people thought it was a dumb thing to do, but was he at a place where he perceived this as an OK risk to take? It's a behavioral choice. When we talk about "to err is human," we are talking about the inadvertent things we do: I didn't mean to do that. Drift is the choice that you and I make while convincing ourselves that we are in a safe place. Where were you taught to have your hands when you drive down the road? 10 and 2 (perhaps 9 and 3). Today the hands should be at 8 and 4. So where are your hands now when you drive? One at the top, one holding a Starbucks? Ladies, will you put on lipstick as you drive down the road? Will you put on mascara? You'll do lipstick but you won't do mascara because of the chance you might poke yourself in the eye. We all have our limits. We know where our hands are supposed to be: at 10 and 2. Guys, have you ever tied your tie while driving down the road? The issue is that we have been taught to do this one way, but we begin to drift. Maybe it's the pilot not using the checklist, saying, "I can do this from memory." They say that one of the hardest things to change in the airline industry is the procedure for checking the air pressure in the plane's tires: no one ever reads the procedure for checking the pressure in the first place, so how would they even know that the procedure has been changed? In healthcare we have provisions that say to use two patient identifiers in order to confirm our patients. But Martha has been in the same room for a week; I deal with her day in and day out. That's the drift. It's not the error. It's the choice to say, "I'm going to deviate from policy because I don't see the risk any more." I'm going to drift into what we call at-risk behavior. We as humans are set up to drift into riskier and riskier choices until we get burned. Then we say, "Oh my gosh, that does look like a bad decision (now)," but we didn't see it along the way.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

18 Risk is Everywhere
Risk can be a perception
Risk can be an absolute
Risk is not inherently bad
Learning objective: His perception of risk is around the truck falling on him, not the welding of the gas tank. His system design for keeping the truck from falling is not great either. His perception of risk in one area did not carry over into the other.
Supporting information: Risk is everywhere. Risk is an engineering term. Everything in life has risk. We are not going to try to convince you that risk management is only about minimizing risk. It is about taking the right risk. Risk itself is not a bad concept. We have worked with a hospital that has courage as one of its values; to some extent that ties into the willingness to take a risk. Risk is not a bad concept. It can be a perception: when Steve Irwin died, our perception about swimming with stingrays probably changed. It can be an absolute: one accident in every six million departures in aviation. But the idea here is that risk is not inherently bad. We are going to differentiate the notion of values from the notion of risk.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

19 We Must Manage in Support of Our Values
Facilitator notes: On the left is Steve and his son. On the right is a picture of a baby on a process called ECMO (extracorporeal membrane oxygenation). This device is used in some NICUs for newborns who are in pulmonary distress. Large catheters are placed in the baby's neck; blood is circulated through an oxygenator, warmed, and then returned to the baby. We bypass the heart and lungs until they have had a chance to develop. Because of the danger involved, it is usually used as a last resort. There is a 75% survival rate for babies on ECMO, but because of the dangers involved there is a chance that between 5% and 10% might die because of infection, a bubble in the system, the anticoagulants used, or some other factor. So the doctor comes to you and says: at this time, your baby is not equipped for life outside the womb. We have this machine that has a 75% chance of saving your baby's life. If you don't go on it, the prognosis is that your baby will die. If we use ECMO, there is perhaps a 10% chance that we will inadvertently take the life of your child. There are all sorts of risks involved. What would you say? Let's look at this. Is what Steve Irwin is doing (in the left picture) risky? Now ECMO: is ECMO risky? The flip side of risk is safety, or where we are unsafe. Is what Steve is doing unsafe? Yes; there's no great calculation, it just looks unsafe. What about ECMO? If you believe there is a 1-in-100 chance that we will inadvertently take the life of the child, would we consider ECMO to be an unsafe healthcare practice? Risk = Severity x Likelihood. Safety is a value related to the reasonableness of the risk we take. Being a safe hospital does not mean eliminating all the risks. We want to minimize the risks, but we also want to save as many people as possible. What we talk about in Just Culture is doing the right thing, and that's not always as easy as it seems.
Risk = Severity x Likelihood
Safety ~ Reasonableness of Risk
Copyright 2007, Outcome Engineering, LLC. All rights reserved.
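To make the slide's Risk = Severity x Likelihood idea concrete, here is a minimal Python sketch built around the ECMO discussion in the facilitator notes above. The 0-to-1 severity scale and the exact likelihood values are illustrative assumptions, not figures from the presentation.

```python
def risk(severity: float, likelihood: float) -> float:
    """Risk score as defined on the slide: severity times likelihood."""
    return severity * likelihood

# Severity of death scored as 1.0 on an arbitrary 0-to-1 scale (assumption).
SEVERITY_DEATH = 1.0

# Option 1: no ECMO -- the notes say the prognosis is that the baby will die.
risk_no_ecmo = risk(SEVERITY_DEATH, likelihood=1.0)

# Option 2: ECMO -- the notes cite roughly a 75% survival rate, i.e. about a
# 25% overall likelihood of death (which includes the 5-10% chance that the
# therapy itself causes harm).
risk_with_ecmo = risk(SEVERITY_DEATH, likelihood=0.25)

print(f"Risk without ECMO: {risk_no_ecmo:.2f}")   # 1.00
print(f"Risk with ECMO:    {risk_with_ecmo:.2f}")  # 0.25

# The lower score marks the more reasonable risk, which is the slide's point:
# safety is about the reasonableness of the risk we take, not the absence of
# risk.
```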

20 Our Values Overlapping values? Yes Competing values?
Still, we must prioritize and balance our support of our values.
(Values shown: Innovation, Integrity, Collaboration, Giving, Safety, Caring.)
Learning objective or facilitator note: With Just Culture you won't hear us teaching that safety is, or should be, your most important value. Certainly, it should be one of your values, but it must compete with your other organizational values for its place in your hierarchy of values.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

21 We Are All Accountable
Across All Departments
Across All Positions
Across All Behaviors: human error, at-risk behavior, reckless behavior
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

22 Managing System Design
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

23 The Safety Task: Managing System Reliability
Design for system reliability, knowing that systems will never be perfect:
Human factors design to reduce the rate of error
Barriers to prevent failure
Recovery to capture failures before they become critical
Redundancy to limit the effects of failure
(Chart: successful operation vs. system failure, from 0% to 100%, plotted against factors affecting system performance, from poor to good.)
Learning objective: We are talking about managing systems and the behaviors of those who manage or work within the systems. Systems will never be 100% perfect. We have to manage the factors that influence reliability.
Facilitator notes: Before we get into the model itself I want to talk about the safety task. Take a step back and ask what it is we are trying to do in safety and how accountability plays a role in that. There are really two pieces to review: system reliability and human reliability, humans being a component of the system. We are trying to design for system reliability, knowing that the system can never be perfect. We don't design nuclear power plants never to melt down; we don't design aircraft never to crash. We design to reduce the likelihood to a level such that we hope it's never going to happen in our lifetime or our kids' lifetime, but the notion of it never happening changes how we would manage the risk. The only way we could say that it is never going to happen is to believe that the components of the system are going to be perfect, and good system design says no: you're going to be imperfect, parts are going to fail, humans are going to be fallible, so how do we build in good system reliability? We use things like human factors design criteria. We put in things like barriers. On your car, by the gas cap, there are three barriers. Twenty-five years ago, about once a year, we would have to go to our local auto store to buy another gas cap because we left the gas cap on top of the pump. What manufacturers did to help us out was to tether it to the car so that you can't just drive away without it. They decreased the size of the filler hole for unleaded gas so that we can't put leaded gasoline into a tank for unleaded gas. And they put a device on there so that if you try to over-torque the cap, it will start to ratchet and won't over-torque. Those are barriers. We bring in recovery. Recovery is a downstream check to catch an upstream error. Nurses are our recovery mechanism for physician ordering errors or pharmacy dispensing errors, catching and correcting them before they reach the patient. And we bring in redundancy. You will notice that every commercial airplane you fly will have two engines, two pilots, three electrical systems (at least), and three hydraulic systems. The FAA rule is that no single failure on that plane can lead to the loss of the aircraft. We don't want to be one failure deep; it has to take multiple independent failures to lose the aircraft. A good place to see this is in surgical counts. We could say to the surgeon: look, you were taught to clean up your room as a kid, so just clean up after surgery, just don't leave things behind. You can be accountable for this. You're supposed to know what body part you're operating on and you're supposed to clean up when you're done. But that's not how we design our system to prevent the retention of objects (instruments). What we did is say we are going to have the scrub nurse count and we are going to have the circulator count, so if something is going to be left behind, it is going to take three independent errors: the surgeon is going to have to miss it, the scrub nurse is going to have to miss it, and the circulator is going to have to miss it. And if we ever feel that process doesn't work, what do we do? We say let's make everything visible via x-ray, and if there is ever any doubt, we will x-ray the patient and catch it. This is reasonable system design.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.
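The redundancy argument in the surgical-count example can be made concrete with a short sketch. The per-check miss probabilities below are hypothetical, since the presentation gives no numbers; the only point is that independent checks multiply.

```python
def prob_all_miss(miss_probs):
    """Probability that every independent check misses the retained object."""
    p = 1.0
    for miss in miss_probs:
        p *= miss
    return p

# Hypothetical assumption: each person misses a retained sponge 1 time in 100.
surgeon, scrub_nurse, circulator = 0.01, 0.01, 0.01

single_check = surgeon
triple_check = prob_all_miss([surgeon, scrub_nurse, circulator])

print(f"Surgeon alone:            {single_check:.6f}")   # 0.010000
print(f"Three independent counts: {triple_check:.6f}")   # 0.000001

# The gain only holds while the counts stay independent -- which is why the
# next slide warns against the scrub nurse and the circulator counting the
# same sponge together, an at-risk behavior that collapses the redundancy.
```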

24 The Safety Task: Managing Human Reliability
Design for human reliability, knowing humans will never be perfect. Factors affecting human performance include: information; equipment/tools; design/configuration; job/task; qualifications/skills; perception of risk; individual factors; environment/facilities; organizational environment; supervision; communication.
(Chart: successful operation vs. human error, from 0% to 100%, plotted against factors affecting human performance, from poor to good.)
Learning objective: Factors that improve the reliability of people's choices; factors affecting human performance. The second piece is human reliability. These go hand in hand: as your system reliability improves, the management of your human reliability must also improve. We've listed some factors here, but there are many more that you could put in place. Even though we have the surgeon, the scrub nurse, and the circulator doing their jobs, we need them to be as reliable as they can be. We don't want the scrub nurse and the circulator to fall into an at-risk behavior where the scrub nurse holds up the sponge and the circulator counts the same sponge. We want them counting independently so that we are three independent failures from harm. We want to maximize system reliability and we want to maximize human reliability, knowing that humans are never going to be perfect. We can't have an expectation of a perfect human being in our system, but we are going to design good, robust systems around them to give us the results we want.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

25 Seven Design Strategies Important to Managing Risk
“Make No Mistakes”
Knowledge and Skill
Performance Shaping Factors
Barriers
Redundancy
Recovery
Perception of High Risk
Learning objective: Most of you don't think of yourselves as system designers. More than likely, you didn't have input into the pieces of equipment you use every day. We think about system design in the larger sense, in that we are talking not only about the equipment but also about the policies and procedures that we design around our employees. So be thinking about how you as an executive play an important role in the design of your employees' systems.
Facilitator notes: We list seven design strategies. The first one, "Make No Mistakes," is put in quotation marks because it is somewhat tongue-in-cheek. However, we do see organizations that appear to expect their employees to be perfect. We don't advocate that as a strategy because we don't think it is a very effective one. We as humans are fallible. We're not perfect; we are all subject to human error and to drift. Don't think that just telling your employees not to make mistakes is a very effective strategy.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

26 Managing Behavior
10 and 2, or eight and a coffee: we drift, we lose sight of the risk. Just as Britney may have lost sight of the risk here in an attempt to escape the media (paparazzi).
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

27 The Behaviors We Can Expect
Human error – an inadvertent action; inadvertently doing other than what should have been done; a slip, lapse, or mistake.
At-risk behavior – a behavioral choice that increases risk where risk is not recognized, or is mistakenly believed to be justified.
Reckless behavior – a behavioral choice to consciously disregard a substantial and unjustifiable risk.
Learning objective: Humans are not always going to do the right thing. There are three big actions they are going to take. They are going to make mistakes: the human error, the inadvertent slip or lapse. They are going to drift into at-risk behavior, where they make a noncompliant choice when they thought they were in a safe place. Third, reckless behavior: I know I am in an unsafe place and I choose to stay there.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

28 Managing Behavior: Human Error
Slip, lapse, mistake: inadvertently doing other than what you should have done. Often a product of our current system design.
Learning objective: Human error: a slip, a lapse, a mistake.
Facilitator notes: This is a 2,000-pound bomb.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

29 Managing Human Error Two questions:
Did the employee make the correct behavioral choices in their task? Is the employee effectively managing their own performance shaping factors? If yes, the only answer is to console the employee – the error happened to them.
Learning objective: If the answer to these questions is yes, then we console the employee and look to the system we have placed them in to see how the system contributed to the outcome.
Facilitator notes: What is consoling? A learning conversation about why the event happened and what can be done to prevent it from happening again. Coaching, by contrast, is assessing the quality of their choice and providing the perception of risk that they lost.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

30 Managing Multiple Human Errors
Is the source of a pattern of human errors in the system? If yes, address the system. If no, can the repetitive errors be addressed through non-disciplinary means?
Learning objective: Will the person address the performance shaping factors that are causing multiple human errors?
Facilitator notes: If one person is making multiple mistakes, what does that tell you? If multiple people are making the same mistake, what does that tell you?
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

31 Managing Reckless Behavior
Conscious disregard of a substantial and unjustifiable risk. Manage through disciplinary action.
Learning objective: Reckless behavior is rare. How many people come to work wanting to do a bad job? The real issues are human errors and at-risk behavior.
Facilitator notes: When we have reckless behavior, we engage our disciplinary policies.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

32 Managing At-Risk Behaviors
A behavioral choice that increases risk where the risk is not perceived (unintentional risk taking), or where the risk is mistakenly believed to be justified. Driven by perception of consequences: immediate and certain consequences are strong; delayed and uncertain consequences are weak; rules are generally weak.
Learning objective: At-risk behavior (ARB) is where we make a choice, but we make it thinking we are in a safe place. You make a choice not to follow the law (policy or procedure), but you do not choose to have an adverse outcome. When we deviate from the standard, we call this drift. We drift thinking we are OK.
Facilitator notes: Where were you taught to place your hands on the wheel when you drive? 10 and 2. Where were they when you drove to work today? What about your speed? You're driving 69 in a 60 when that green Kawasaki Ninja comes flying by doing a wheelie at 100 mph. We look down our noses and say, "He's reckless! He's different from me; I'm a safe violator!" The lesson I learn from the Steve Irwin incident is that sometimes, the better you are at what you do as a professional, the less likely you are to recognize that you are in a risky place. In an interview following the incident, Steve Irwin said, "You people don't understand. I'm a professional. I would never do anything to put my son in harm's way," as if he were immune to making a mistake. We are, in a sense, the products of our own experience. Think about driving a car, talking on our cell phone, listening to the radio, drinking a Starbucks. We do this largely because nothing bad has happened to us so far. Each time we do this and nothing bad happens, we reinforce the at-risk behavior. We begin to think it's OK. We begin to think we are in a safe place. ARB is our biggest challenge in our hospitals, but it is also our biggest opportunity for improvement. When you think about your organization, think about the number of people engaged in ARB. That's where your opportunities will be. You will see people in your organization who make mistakes, and you will see a few who are occasionally reckless, but most of your organization will be engaged in ARB of one type or another at some time throughout the day.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

33 Managing At-Risk Behaviors
A behavioral choice. Managed by adding forcing functions (barriers to prevent non-compliance), by changing perceptions of risk, and by increasing situational awareness.
Learning objective: How do we manage ARB? We look at the system in which our employee is working. Is there anything contributing to the employee doing it in an unsafe manner? Are there barriers to prevent non-compliance? Are there things we can do to change the perception of risk: training, education, knowledge?
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

34 Managing At-Risk Behaviors
"The best car safety device is a rear-view mirror with a cop in it." Dudley Moore Copyright 2007, Outcome Engineering, LLC. All rights reserved. Copyright 2007, Outcome Engineering, LLC. All rights reserved.

35 The Three Behaviors: Console, Coach, Punish
Human Error – Product of our current system design. Manage through changes in processes, procedures, training, design, and environment. Response: console.
At-Risk Behavior – A choice: risk believed insignificant or justified. Manage through removing incentives for at-risk behaviors, creating incentives for healthy behaviors, and increasing situational awareness. Response: coach.
Reckless Behavior – Conscious disregard of an unjustifiable risk. Manage through remedial action or disciplinary action. Response: punish.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

36 The Role of Event Investigation
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

37 The Basics of Event Investigation
What happened? What normally happens? What does procedure require? Why did it happen? How were we managing it? (Each successive question adds increasing value to the investigation.)
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

38 Just Culture Rules of Investigation
1. Causal statements should clearly show the "cause and effect" relationship.
2. Negative descriptors (such as "poorly" or "inadequate") may not be used in causal statements.
3. Each human error should have a preceding cause.
4. Each procedural deviation should have a preceding cause.
5. Failure to act is only causal when there was a pre-existing duty.
The Five Rules of Causation: The five investigative rules were developed through Federal Aviation Administration research to improve the event investigation process by creating minimum standards for where an investigation should go, where it cannot go, and how the results should be documented. The rules were created in response to the very real biases we all bring to the investigation process. Each rule and its application are discussed below.
Rule 1 – Causal statements should clearly show the "cause and effect" relationship. This is the simplest of the rules. When describing why an event has occurred, the investigator should show each link in the chain of events, and each link should be clear to the reader of the report. The investigator should not merely state what they think may be a root cause, but should instead show each link from the causes to the undesirable outcome being investigated. Even a statement like "employee was fatigued" is deficient without a description of how and why this led to an adverse event. The bottom line: the reader needs to understand the investigator's logic in linking causes to the outcome being investigated.
Rule 2 – Negative descriptors (such as "poorly" or "inadequate") should not be used in causal statements. As humans, we try to make each job we have as easy as possible. Unfortunately, this human tendency works its way into the event investigation process. We may shorten our findings by saying "procedure was poorly written" when we really have a much more detailed explanation in our mind. To facilitate clear cause and effect descriptions (and avoid inflammatory, blaming statements), a negative descriptor should not be used as a placeholder for a more accurate, clear description. Even words like "carelessness" and "complacency" are bad choices because they are broad, negative judgments that do little to describe the actual conditions or behaviors that led to the mishap.
Rule 3 – Each human error must have a preceding cause. Most of our adverse events involve at least one human error. Unfortunately, the discovery that a human has erred does little to aid the prevention process. Investigations must search to determine WHY the human error occurred. Every human error in the causal chain should have a corresponding cause. It is the cause of the error that leads us to effective prevention strategies.
Rule 4 – Each procedural deviation must have a preceding cause. Procedural violations are like errors in that they are not directly manageable. Instead, it is the cause of the procedural violation that we can manage. If an employee is violating a procedure because it is the local norm, we will have to address the incentives that created the norm. If an employee is violating a procedure because he is not aware of the policy in the first place, we have to work on education. If there is a procedural deviation in the causal chain, it should have an explanation.
Rule 5 – Failure to act is only causal when there was a pre-existing duty to act. We can all find ways in which an adverse event would not have occurred, but this is not the purpose of causal investigation. Instead, we must search to find why this mishap occurred in the system as it is designed today. A nurse's failure to detect a pharmacy error can only be causal if the nurse was required to look for the error in the first place. The duty to perform may come from a policy, a standard of practice, or a commitment to a colleague. This rule keeps the investigator focused on cause, and away from our views of how the mishap might have been prevented. Prevention comes only after the cause has been determined.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

39 It’s About a Proactive Learning Culture
Events should be seen as opportunities to inform our risk model, both system risk and behavioral risk. Too often, events are seen simply as things to be fixed. In a proactive learning culture, management decisions are based upon where our limited resources can be applied to minimize the risk of harm, knowing our system is composed of sometimes faulty equipment, imperfect processes, and fallible human beings.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

40 Enterprise Risk Management

41 Acute Care Medication ST-PRA

42 Risk Reduction Table Medication Errors

43 Just Culture Implementation
Taking the Necessary Steps Copyright 2007, Outcome Engineering, LLC. All rights reserved.

44 What We Have Seen at Other Organizations
There is a small (5%) population of the staff that is openly opposed to management. There is a larger (20%) population that believes this is the right way to go. The remainder (75%) have expressed that they believe the program will work, but likely will not buy into the program until they see management start to adhere to the philosophy.
One fifth believe this is the right way to go; they are enthusiastic and want to be part of it. Seventy-five percent think the program can work, but are waiting to see if management will "walk its talk" before they take it seriously. They complain that there are too many initiatives at the same time, that there is a history of punitive culture, that managers aren't visible to line staff much of the time, that they want to see some results, or that they don't believe incident reports are being taken seriously, etc. Central is the desire to see managers "walk the talk," which is no surprise to any of you. So how do you do that? "Management accountabilities" sounds threatening, but what we're really talking about is expectations. I'm sure many of you could have written these; you've been trained and have been doing this job for years.
Copyright 2007, Outcome Engineering, LLC. All rights reserved.

45 Words to Describe Implementation
A Journey? An Intervention? A Program? A Set of Tools? A Model? A Foundation? A Lifestyle? Copyright 2007, Outcome Engineering, LLC. All rights reserved.

46 Measurements of Success
Error Rates Safe Systems Outcomes Culture Safe Choices Periodic gap analysis Self-reports, audits, observations, interviews, investigations Near misses Adverse events (iatrogenic harm) Copyright 2007, Outcome Engineering, LLC. All rights reserved.

47 A Multi-Cycle Improvement Process
Executives and Champions Training Just Culture Training for Managers Safe Choices Training for Staff Baseline Benchmarking: Behavioral Markers Benchmarking: Behavioral Markers First Cycle Tools Subsequent Cycle Tools Baseline Culture System Design Peer Review Baseline Gap Analyses Culture Change Managerial Accountabilities Post Gap Analyses Coaching and Mentoring for Managers Additional Tools and Training Event Investigation Event Investigation Monthly Organizational Coaching and Mentoring Denotes measurement and/or feedback loop Copyright 2007, Outcome Engineering, LLC. All rights reserved.

48 The Just Culture Algorithm
The Algorithm is designed to help you determine what to do when your employee has made an error or not met a duty to the organization. The Just Culture Algorithm allows us to independently assess each breach of a duty, and it addresses the conflicts that might arise with overlapping duties when you apply the social utility test. (An algorithm: a set of ordered steps for solving a problem.)
Copyright 2007, Outcome Engineering, LLC. All rights reserved.
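The proprietary Just Culture Algorithm itself is not reproduced in this deck, so the following Python sketch is only an illustration of how the three-behavior mapping from the earlier Console / Coach / Punish slide could be written as ordered steps; the class and function names are hypothetical, not part of the Outcome Engineering tool.

```python
from enum import Enum

class Behavior(Enum):
    HUMAN_ERROR = "human error"     # inadvertent slip, lapse, or mistake
    AT_RISK = "at-risk behavior"    # risk not recognized, or mistakenly believed justified
    RECKLESS = "reckless behavior"  # conscious disregard of a substantial, unjustifiable risk

def respond(behavior: Behavior) -> str:
    """Map a single breach of a duty to the managerial response described in the deck."""
    if behavior is Behavior.HUMAN_ERROR:
        return "Console the employee; examine and redesign the system."
    if behavior is Behavior.AT_RISK:
        return ("Coach the employee; remove incentives for the at-risk choice "
                "and restore the perception of risk.")
    return "Apply remedial or disciplinary action."

# Each breach of a duty is assessed independently, as the slide notes.
for b in Behavior:
    print(f"{b.value}: {respond(b)}")
```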

49 Review & Wrap Up

50 Just Culture - What’s it About?
It’s about both Error and Drift It’s both Pre- and Post-Event It’s about Executive Commitment It’s about Values and Expectations It’s about System Design and Behavioral Choices It’s for All Employees It’s Partnership with the Regulator It’s About Doing The Right Thing 50

51 Doves or Hawks? Who Are We?
Share the Viracon story

52 Epilogue “Most healthcare providers choose a life of service. They put themselves in harm’s way to care for others. They expect a lot of themselves as professionals. Yet, they remain fallible human beings, regardless of any oaths to do no harm. They are going to make mistakes and occasionally drift into risky places (see hand hygiene). The future of our nation’s health depends upon our ability to learn from their errors and at-risk behaviors.” David Marx, JD, Whack-a-Mole

53 Thank You John Westphal
Outcome Engineering 2200 W. Spring Creek Parkway Plano, TX

