Human Performance and Patient Safety
Jim McMenemy Winnipeg Regional Health Authority
Outline
- Human Factors definition
- Human Error
- Evolution of Human Error understanding
- Organizations and socio-technical systems
- Vulnerability and countermeasures
Meaning of Human Factors
What do we mean by "Human Factors"?

"Human Factors is concerned to optimize the relationship between people and their activities, by the systematic application of human sciences, integrated within the framework of systems engineering." (ICAO Digest No. 1)

EO#1: Explain the discipline of Human Factors according to ICAO, and briefly describe some of the topics that are covered by the discipline and some that are not.

<Advance the first line of the slide and ask the group> What do we mean by Human Factors? <Discuss and encourage the group to reach the broader definition, which includes more than just individual human limitations> <Advance the second line of the slide>

According to ICAO, "Human Factors is concerned to optimize the relationship between people and their activities, by the systematic application of human sciences, integrated within the framework of systems engineering." Human Factors is about people interacting with technology: it is about people in their working and living environments, and about their relationships with equipment, procedures, and those environments. Just as importantly, it is about their relationships with other people. Its twin objectives are safety and efficiency (ICAO Circular 227).

The field is cross-disciplinary: it draws on operations (across industries), psychology, kinesiology, engineering, sociology, computer science, and other fields. It demands that people work together to understand performance and make the system robust.

The Human Factors courses you have taken in your operational careers (flight ops, ATC, maintenance, etc.) focus on teaching operational personnel how to individually manage individual human factors (stress, fatigue, inattention, etc.). Before the implementation of SMS, we tried to move everyone toward understanding that Human Factors was about organizationally managing individual human factors (shift schedules, communication protocols, etc.).
With our transition to SMS, we now have a much greater understanding of the field as applied to aviation, and we now require organizations to manage all human-technology interactions at the system level (individual, organizational, equipment, procedural, and environmental interactions with the human). We now know that a systems/organizational approach is far more successful than the individual approach; this is how systemic issues and solutions are found. It is critical that we begin to advance this understanding. All TCCA staff need to understand not only the individual human factors faced in day-to-day operational work, but also the human-equipment, human-procedure, human-environment, and human-organization factors; that is, how to manage all human factors issues (human-technology interactions) organizationally.

The discipline of Human Factors is much wider in scope than the aviation industry has traditionally thought, and with the increasing complexity of our aviation system we must expand everyone's definition and understanding of it. ICAO defines Human Factors as the discipline "concerned to optimize the relationship between people and their activities, by the systematic application of human sciences, integrated within the framework of systems engineering" (ICAO Digest No. 1). The International Ergonomics Association sees no difference between the terms "ergonomics" and "human factors", and defines the field as "the scientific discipline concerned with the understanding of the interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and methods to design in order to optimize human well-being and overall system performance."

In simple terms, Human Factors is about optimizing the human-technology relationship to accomplish work. "Technology" is used broadly here: a procedure is a piece of technology.
One of the goals of the discipline is to manage systems well to support human goals while protecting against human limits. For example, we want to understand the more frequent human-technology breakdowns, along with methods for identifying these breakdowns in context, so that appropriate mitigations can be implemented. The context in which interactions break down is critically important: being able to identify and describe the context in which human performance takes place, and to explain how that context contributed, is a critical skill in the practice of Human Factors.

With SMS, our goal is to have industry do this work; however, our inspectors must also understand how this work is done in order to carry out their oversight responsibilities. Inspectors will need to know Human Factors methods (e.g., Reason's model of accident causation), the attributes of good outcomes of those methods (e.g., accident reports), additional language to explain organizational factors and context, and so on.
Human Error

"Knowledge and error flow from the same mental source; only success can tell one from the other." (Ernst Mach, 1905)
Human Error: What is Human Error?

Human Error is a generic term used to describe all those occasions where a planned sequence of mental or physical activities fails to achieve its intended outcome, and where these failures cannot be attributed to outside intervention (Reason, 1990).

EO#3: Define human error, briefly describe the history of human error, explain the Old View and the Bad Apple Theory (the basic attribution error), why it is problematic, and how the New View helps us learn from our errors.

A classification scheme used by ICAO:
- Slips and lapses are errors which result from some failure in the execution of an action sequence, regardless of whether the plan which guided them was adequate to achieve its objective. They are essentially conditioned or automatic responses, with little, if any, conscious decision making.
- Mistakes are failures in the selection of an objective or the means to achieve it, irrespective of whether the actions directed by the scheme run according to plan. They involve deliberate decision-making and evaluation, based on knowledge, experience, and mental models that have worked well in the past.
- Adaptations are deliberate, but not necessarily reprehensible, deviations from those practices deemed necessary (by designers, managers, regulators) to ensure safe operation.
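The three categories above turn on two questions: was the deviation deliberate, and was the plan itself adequate? As a rough illustration only (this sketch is not part of the presentation, and the function and labels are hypothetical), the scheme can be expressed as a small decision tree:

```python
from enum import Enum

class ErrorType(Enum):
    SLIP_OR_LAPSE = "slip/lapse"   # adequate plan, failed execution
    MISTAKE = "mistake"            # inadequate plan, faithfully executed
    ADAPTATION = "adaptation"      # deliberate deviation from prescribed practice

def classify(deliberate_deviation: bool, plan_adequate: bool) -> ErrorType:
    """Rough decision tree over the ICAO/Reason categories described above."""
    if deliberate_deviation:
        # Deliberate departure from prescribed practice, reprehensible or not.
        return ErrorType.ADAPTATION
    if plan_adequate:
        # Right plan, wrong execution: an automatic-response failure.
        return ErrorType.SLIP_OR_LAPSE
    # Wrong plan, carried out as intended.
    return ErrorType.MISTAKE

# e.g. grabbing the wrong vial despite intending the right one:
print(classify(deliberate_deviation=False, plan_adequate=True))
```

The point of the sketch is that slips/lapses and mistakes are distinguished by where the failure lives (execution vs. planning), not by how bad the outcome was.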
Traditional Approach #1

People make mistakes on the job because of:
- Stupidity
- Carelessness
- Complacency
- Incompetence, etc.
Traditional Error Prevention

- Make rules
- Enforce rules
- Punish violators: fire them, suspend them, retrain them, counsel them

"If you follow the rules you cannot have an accident."
Traditional Approach #2: Humanistic

- Accidents happen because of human error
- People do not try to make mistakes
- So they must be broken, defective, deficient...
- Therefore: "fix the people"
Error Prevention by Fixing the People

- Decision-making training
- Be more: vigilant, careful, more, more, more...

But... they weren't broken.
Human Error: Why is the Old View so popular?

Cheap and easy: it is a deceptively simple approach, and it is cheap to implement. The Old View holds that failure is an aberration, a temporary hiccup in an otherwise smoothly performing, safe operation. Nothing more fundamental, or more expensive, needs to be changed.

Saving face: in the aftermath of failure, there can be pressure to protect the public image, to do something immediately to return the system to a "safe" state. Taking out defective practitioners is always a good start to saving face: it tells people that the mishap is not a systemic problem, just a local glitch in an otherwise smooth operation. You are doing something; you are taking action. The fundamental attribution error and the blame cycle are alive and well.

Personal responsibility and the illusion of omnipotence: practitioners in safety-critical systems usually assume great personal responsibility for the outcomes of their actions. They are trained and paid to carry this responsibility. But the flip side of taking this responsibility is the assumption that they have the authority, the power, to match it; that people can simply choose between making errors and not making them, independent of the world around them. In reality, people are not immune to pressures, and organizations would not want them to be (a Coast Guard helicopter pilot who never flies in poor visibility would not be a Coast Guard pilot for long). To err or not to err is not a choice: people's work is subject to, and constrained by, multiple factors. How many times have you cursed your kids, your spouse, or a sibling for doing something without considering the factors that were present at the time? The Old View is a very normal reaction to failure. The problem is that we cannot make progress on safety with this view.
Human Error: the Basic Attribution Error

Basic attribution error: the tendency to attribute behaviour to an enduring quality of the person AND to underestimate the influence of the situation.

The basic attribution error is the psychological description of the Old View. All humans have a tendency, when examining the behaviour of other people, to overestimate the degree to which that behaviour results from permanent characteristics, such as attitude or personality, and to underestimate the influence of the situation.
Human Error: where the Old View falls short (local rationality)

- If your explanation still relies on unmotivated people, you have more work to do.
- You have to assume that nobody comes to work to do a bad job.
- You have to understand why what people did made sense to them at the time.

Local rationality: people are doing reasonable things given their point of view and focus of attention, their knowledge of the situation, and their objectives and the objectives of the larger organization in which they work. People in safety-critical jobs are generally motivated to stay alive and to keep their passengers and customers alive. They do not go out of their way to fly into mountainsides or windshear, to damage equipment, or to install components backwards. In the end, what they are doing makes sense to them at the time; it has to make sense, otherwise they would not be doing it.

So if you want to understand human error, your job is to understand why it made sense to them, because if it made sense to them, it may well make sense to others, which means the problem may show up again and again. You have to assume that people were doing reasonable things given the complexities, dilemmas, trade-offs, and uncertainty that surrounded them. Just finding and highlighting people's mistakes explains nothing; saying what people did not do, or what they should have done, does not explain why they did what they did.
Local Rationality

Humans are the most flexible, adaptable, and valuable part of the system, while at the same time the most vulnerable to influences which can adversely affect performance. Most accidents have been attributed to "human error", or so the statistics say. Do we fix the people, or the system in which the people work? To prevent accidents, we address the causal, contributing, and underlying factors of the system in which people work.
Human Error

"Underneath every simple, obvious story about error, there is a deeper, more complex story..." "Take your pick: blame human error or try to learn from failure..." (Dekker, 2006)
Human Error: the New View of what goes wrong

- Human error is a symptom of trouble deeper inside a system.
- To explain failure, do not try to find where people went wrong.
- Instead, find out how people's assessments and actions made sense at the time, given the circumstances that surrounded them.

The New View was born out of recent insights in the field of Human Factors, specifically the study of human performance in complex systems and normal work.

Sources of error are structural, not personal. If you want to understand human error, you have to dig into the system in which people work; you have to stop looking for people's personal shortcomings.

Errors and accidents are only remotely related. Accidents emerge from the system's complexity, not from its apparent simplicity. That is, accidents do not result from a single human error or a procedural "violation"; it takes many factors, all necessary and only jointly sufficient, to push a system over the edge of failure.

Accidents are not the result of a breakdown of otherwise well-functioning processes. You may think your system is basically safe and that accidents can only happen if somebody does something really stupid or dangerous. Instead, the research shows that accidents are actually structural by-products of a system's normal functioning. What is striking about many mishaps is that people were doing exactly the sorts of things they would usually be doing, the things that usually lead to success and safety. People do what makes sense given the situational indications, operational pressures, and organizational norms existing at the time. Accidents are seldom preceded by bizarre behaviour.

To adopt the New View you must acknowledge that failures are baked into the very nature of your work and organization; that they are symptoms of deeper trouble, or by-products of systemic brittleness in the way you do your business. It means acknowledging that mishaps are the result of everyday influences on everyday decision making, not isolated cases of erratic individuals behaving unrepresentatively. It means finding out why what people did back there and then actually made sense given the organization and operation that surrounded them.
Human Error: culpable vs. blameless

- Culpable (~10%): sabotage, substance abuse, reckless violations, etc.
- Blameless (~90%): system-induced violations, system-induced errors, etc.

Very few people are found culpable and were actually malicious in their intent; most are blameless and were actually trying to do their very best in an imperfect system. With the increasing tendency to charge pilots, mechanics, doctors, and other practitioners with criminal negligence, it is important for us to really understand the influence of the system. To adopt the new approach to human error, we have to understand why it is so easy to believe the old approach, and take conscious steps toward the New View. There are many examples where the general population and the strength of the Old View seriously affected the lives of people who were just trying to do a good job in an imperfect system: the Singapore Six, the Denver nurses trial, Oscar November, etc.
Organizations and Socio-technical Systems
Some system defences:
- Physical design aspects: controls and displays, safety guards, special tools
- Job design elements: sequencing of tasks, procedural compliance, readbacks, documentation of work done
- Adequate resources: equipment, trained personnel
- Company safety management systems: incident reporting, trend analysis, safety audits
- An effective regulatory system: air regulations, safety oversight and enforcement
- National legislation: establishment and organization of a civil aviation administration, aviation laws
- International agreements: ICAO, SARPs, JARs

In order to understand how decision makers' actions or inactions influence safety, it is necessary to introduce a contemporary view of accident causation. As a complex socio-technical system, aviation requires the precise coordination of a large number of human and technical elements to function, and it uses an elaborate array of systemic safety defences, such as those listed above, to protect against human error.

Accidents in such a well-defended system are the product of the confluence of a number of enabling factors, each one essential but not sufficient alone to breach the system's defences. Sometimes, however, it is the complexity of these defences themselves that contributes to an accident.

Another aspect of normal socio-technical system performance not explained by this organizational model is the drift, or migration, of work practices away from prescribed policies and procedures over time. Because these systems are so well defended, with many defences located in different parts of the organization, small changes in practice to deal with local conditions lead to interactive complexity and decoupling of interdependent activities. These drifts from prescribed to practiced activities result from two primary forces: the drive for productivity and the drive for efficiency. One model of socio-technical system performance that goes beyond James Reason's model and addresses this dynamic nature of performance is Rasmussen's risk management framework.
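The claim that enabling factors are "each essential but not sufficient alone" can be illustrated with a toy simulation of layered defences (a sketch of my own, not from the presentation; the layer count and breach probabilities are made up): an accident occurs only when every defence fails on the same exposure, so even modestly reliable independent layers compound into a very rare event.

```python
import random

def accident_occurs(breach_probs, rng):
    """True only if EVERY defence layer is breached on this exposure."""
    return all(rng.random() < p for p in breach_probs)

def estimate_accident_rate(breach_probs, trials=100_000, seed=1):
    """Monte Carlo estimate of the joint-breach (accident) rate."""
    rng = random.Random(seed)
    hits = sum(accident_occurs(breach_probs, rng) for _ in range(trials))
    return hits / trials

# Four hypothetical layers, each independently breached 10% of the time.
# Any single breach is essential but not sufficient; all four together
# are needed, so the accident rate is on the order of 0.1**4.
layers = [0.1, 0.1, 0.1, 0.1]
print(estimate_accident_rate(layers))
```

The same sketch also hints at why drift is dangerous: quietly raising one layer's breach probability leaves day-to-day operations looking unchanged while multiplying the joint failure rate.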
Wiener's "Iron Law"

"...if equipment is designed correctly for human use in the first place, the cost is high, but it is paid only once. If poor design must be compensated for in training departments and operations, the price must be paid every day. And what is worse, with weak, potentially error-inducing designs, one cannot be sure that when the chips are down, the correct responses will be made."

Wiener, Earl L. (1993). Intervention Strategies for the Management of Human Error. NASA Contractor Report, p. 13.
Error & Vulnerability

Attention is a finite resource. Factors that erode it include:
- Overload
- Distraction
- Interruption
- Fatigue (17 hours awake produces impairment comparable to a 0.05% blood alcohol concentration)
- Equipment design
- Team coordination
Summary
- Meaning of Human Factors
- Human Error
- From organizations to socio-technical systems
- Vulnerability & countermeasures
Recommended Reading
- The Human Factor (Kim Vicente)
- The Field Guide to Understanding Human Error (Sidney Dekker)
- Managing the Risks of Organizational Accidents (James Reason)
- Ten Questions About Human Error (Sidney Dekker)