1 INSY 3020/7976/ENH 670 Human Error

2 Human Error Defined
An inappropriate or undesirable human decision or behavior that reduces, or has the potential to reduce, effectiveness, safety, or system performance.
A human action or decision that exceeds system tolerances.
"An action is taken that was 'not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits'" (Senders & Moray, 1991, p. 25, as cited in Proctor & Van Zandt, 1994, p. 43).
Reason's (1992) definition of human error is the most widely accepted: an error is a failure to achieve the intended outcome in a planned sequence of mental or physical activities.

3 Human Error
Operator error: Due entirely to the human operator. You can't eliminate all of these, but a good human factors design will make these virtually impossible.
Design error: Due to poor design.

4-8 Examples (image slides)

9 Human Error
Human Error Probability (HEP): the ratio of errors made to the number of opportunities for error; P(error) = 1 - Human Reliability. A small numeric sketch follows.
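The ratio amounts to one line of arithmetic. A minimal Python sketch, using made-up counts (the error and opportunity numbers are hypothetical, not course data):

```python
# Hypothetical observation: 3 errors in 1,000 opportunities for that error.
errors_observed = 3
opportunities = 1_000

hep = errors_observed / opportunities   # Human Error Probability (errors / opportunities)
human_reliability = 1 - hep             # since P(error) = 1 - Human Reliability

print(f"HEP = {hep:.3f}, human reliability = {human_reliability:.3f}")
# HEP = 0.003, human reliability = 0.997
```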

10 Reliability Analysis
Total system reliability is a function of the reliability of the components.
Component reliability: r = 1 - p, where r = component reliability and p = probability of component failure.
Two kinds of systems:
Serial: components in sequence; every component must work for the system to work.
Parallel: two or more components perform the same function (redundancy).

11 Reliability Analysis
Serial system reliability: R = r1 * r2 * ... * rn. Adding a component will always decrease reliability for a serial system.
Parallel system reliability: R = 1 - [(1 - r1) * (1 - r2) * ... * (1 - rn)]. Adding a component will always increase the reliability of a parallel system. See the sketch below.
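Both formulas translate directly into code. A minimal Python sketch; the component reliability values are made up purely for illustration:

```python
from math import prod

def serial_reliability(reliabilities):
    """System works only if every component works: R = r1 * r2 * ... * rn."""
    return prod(reliabilities)

def parallel_reliability(reliabilities):
    """System works if at least one component works:
    R = 1 - (1 - r1) * (1 - r2) * ... * (1 - rn)."""
    return 1 - prod(1 - r for r in reliabilities)

components = [0.95, 0.90, 0.99]              # hypothetical component reliabilities
print(serial_reliability(components))        # ~0.846  (lower than any single component)
print(parallel_reliability(components))      # ~0.99995 (higher than any single component)
```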

12 Human Reliability
Operator error probability = number of errors / number of opportunities for error.
Human reliability = 1 - operator error probability.
Estimating human reliability with Monte Carlo simulation: describe the task, set up a simulation of the operator, repeat it many times, and estimate reliability from the results (see the sketch below).
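A minimal Monte Carlo sketch of that procedure, assuming an invented three-step task with made-up per-step error probabilities (none of these numbers come from the slides; the code only illustrates the simulate-repeat-count idea):

```python
import random

# Hypothetical per-step error probabilities for a three-step task.
STEP_ERROR_PROBS = {"read gauge": 0.003, "select control": 0.001, "execute action": 0.005}

def simulate_trial():
    """Return True if the simulated operator completes every step without error."""
    return all(random.random() >= p for p in STEP_ERROR_PROBS.values())

def estimate_reliability(n_trials=100_000):
    """Estimate human reliability as the fraction of error-free trials."""
    successes = sum(simulate_trial() for _ in range(n_trials))
    return successes / n_trials

if __name__ == "__main__":
    r = estimate_reliability()
    print(f"Estimated human reliability: {r:.4f}")   # ~0.991 for these made-up numbers
```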

13 Human Reliability The goal of human reliability analyses is to apply the same principles to the human operator that we apply to the machine/device to prevent error that leads to system failure.

14 Human Error: Theories of Accident Causation
Accident-Proneness Theories
Accident proneness: certain individuals are more likely to have accidents than others; supported by statistical data; the underlying assumption is that all workers are exposed to the same job and environmental hazards. A weakness is the confounding of attributes of the individual (person) with those of the job (task/environment).
Accident liability: suggests accident-proneness is limited to specific factors (situational, age, etc.); supported by age data, although age is also confounded with experience.
Job Demand vs. Worker Capability Theories
Accident liability increases when job demands exceed worker capabilities (similar to time-ratio estimates of mental workload).
Adjustment-to-stress theory: more accidents when stress (physical or psychological) exceeds the human capability to cope, e.g., noise, illumination, anxiety, sleep deprivation.
Arousal-alertness theory: more accidents when arousal is too low or too high; arousal differs from stress because stress is harmful by definition.
Psychosocial Theories
Suggest a link with underlying psychological processes related to job satisfaction and motivation; be aware of them.

15 Human Error
Stages of human decision-making at which human error can occur:
1. Activation/detection of system state signal
2. Observation and data collection
3. Identification of system state
4. Interpretation of situation
5. Definition of objectives
6. Evaluation of alternative strategies
7. Procedure selection
8. Procedure execution

16 Information Processing Model
(Wickens et al., 2004) Stages of the model: Perceptual Encoding (sensory registration, perception) -> Central Processing (working memory, long-term memory, decision making and response selection) -> Responding (response execution), all drawing on shared attention resources.
Top-down (knowledge-driven) vs. bottom-up (stimulus-driven) processes; the perceptual and central-processing phases relate to cognition.
Problems with the information-processing view: it de-emphasizes "top-down" processes (humans actively construct and interpret information) and assumes the same processing regardless of content.

17 Human Error Taxonomy (Reason, 1992)
Unsafe Acts
Unintended Action (basic errors): Slip – attentional failures; Lapse – memory failures.
Intended Action: Mistake – rule-based or knowledge-based; Violation – routine violations, exceptional violations, sabotage.
According to Reason, human errors are divided into two major categories: (1) slips, which result from the incorrect execution of a correct action sequence, and (2) mistakes, which result from the correct execution of an incorrect action sequence. In comparison with mistakes, slips have been extensively studied and are better understood (for reviews, see Norman, 1986; Reason, 1992).
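For reference, the same taxonomy can be written out as a small data structure. This is only an organizational sketch of the categories listed above, not an artifact of Reason's own work:

```python
# Reason's (1992) unsafe-act taxonomy, laid out as a nested dict for quick reference.
UNSAFE_ACTS = {
    "unintended action": {
        "slip":  "attentional failure during execution of a correct plan",
        "lapse": "memory failure (step omitted, intention forgotten)",
    },
    "intended action": {
        "mistake":   "rule-based or knowledge-based error: the plan itself is wrong",
        "violation": ["routine", "exceptional", "sabotage"],
    },
}
```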

18 Error Mechanism Categories
Basic Errors
Skill-based: attention failures, memory failures, failures in execution
Perceptual-based: visual, auditory, tactile
Decision Errors
Rule-based: misapplication of a good rule; application of a bad rule
Knowledge-based: inaccurate knowledge of the system; incomplete knowledge of the system

19 Attentional Failures
Intrusion – entering a dangerous area/location
Commission – performing an act incorrectly
Omission – failure to do something
Reversal – trying to stop or undo a task already initiated
Misordering – a task or set of tasks performed in the wrong sequence
Mistiming – failing to perform the action within the time allotted

20 Memory Failures and Rule-based Mistakes
Memory Failures: losing one's place; forgetting intentions.
Rule-based Mistakes
Application of a bad rule: "I'm in a public space in view of many people, therefore I won't be robbed."
Misapplication of a good rule: "A patient on chronic medication became concerned about addiction and therefore deliberately stopped taking the drug for a period each year, even though the drug in question was not addictive."

21 Contributing Factors
Contributing Factors in Accident Causation (CFAC), Sanders and Shaw (1988):
Management (organization/policies)
Environment (physical conditions)
Equipment (design)
Work (task characteristics)
Social/psychological environment (culture)
Worker/coworkers (personal attributes)

22 Typical Errors Associated with New Technologies or Systems
Mode Error – the user thought the system was in one mode when it was actually in another.
Getting Lost – users get lost in display architectures; difficulty in finding the right screen or data set.
Not Coordinating Data Entries – poor coordination between multiple users entering data into the same system.
Overload – system use drains attention resources from other, equally important tasks.
Data Overload – users are forced to sort through a large amount of data produced by the system in order to determine the true nature of the situation.
Not Noticing Changes – system changes or trends are easy to miss when communicated only through digital displays.
Automation Surprises – the automation did something the user did not expect or anticipate.

23 Techniques & Methods for Human Error Identification
Technique for Human Error Rate Prediction (THERP)
Hazard and Operability Study (HAZOP)
Skill, Rule and Knowledge model (SRK)
Systematic Human Error Reduction and Prediction Approach (SHERPA)
Generic Error Modeling System (GEMS)
Potential Human Error Cause Analysis (PHECA)
Murphy Diagrams
Critical Action and Decision Approach (CADA)
Human Reliability Management System (HRMS)
Influence Modeling and Assessment System (IMAS)
Confusion Matrices
Cognitive Environment Simulation (CES)
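To give a flavor of how one of these techniques quantifies error, here is a toy confusion-matrix calculation. Everything in it is invented for illustration: the system states, the counts (imagined simulator observations), and the resulting probabilities. Rows are the actual system state, columns the operator's diagnosis; the off-diagonal mass yields misdiagnosis probabilities.

```python
# Toy confusion matrix: rows = actual system state, columns = operator's diagnosis.
states = ["normal", "leak", "pump failure"]
counts = [
    [180, 15,  5],   # actual state: normal
    [ 10, 85,  5],   # actual state: leak
    [  4,  6, 90],   # actual state: pump failure
]

for state, row in zip(states, counts):
    total = sum(row)
    correct = row[states.index(state)]           # diagonal entry: correct diagnoses
    print(f"P(misdiagnosis | {state}) = {(total - correct) / total:.2f}")
# P(misdiagnosis | normal) = 0.10, leak = 0.15, pump failure = 0.10
```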

24 Murphy Diagrams
Diagrammatic representations of error modes that illustrate the underlying causes associated with cognitive decision-making tasks.
One diagram per decision-making activity: activation/detection of system state signal, observation and data collection, identification of system state, interpretation of situation, definition of objectives, evaluation of alternative strategies, procedure selection, procedure execution.
Each activity's outcome is traced back to proximal sources and then to distal sources.

25 Murphy Diagram Example

26 Typical Investigation Errors Due to "Hindsight Bias"
Counterfactual Reasoning – stating only what users should have done to avoid the mishap; does not explain why users did what they did.
Data Availability/Observability – pointing out data that could have revealed the true nature of the situation; does not explain which data the users used, how they used it, and why they used it.
Micro-Matching Error – matching fragments of people's overall performance against rules and procedures taken from documentation; does not explain why the user did what they did.
Cherry-Picking Error – identifying an over-arching condition in hindsight ("users were in a hurry") based on the outcome, then tracing back through the sequence of events to confirm your conclusion.

27 Human Error Investigations
Suggested Procedures
Do not use the outcome of a sequence of events to assess the quality of the decisions that led up to it (avoid hindsight bias).
Don't mix elements of your own knowledge into those of the users at the time of the mishap.
Don't present your knowledge to the users you investigate; determine what knowledge the users actually used at the time of the mishap.
Recognize that consistencies and certainties about the system are products of your hindsight, not of the users' mindset at the time of the mishap.

28 Human Error Investigations
Suggested Procedures
To understand and evaluate human performance, you must understand how the situation unfolded around the users at the time of the mishap; you must adopt a view from inside the situation as it occurred.
Remember that the point of a human error investigation is to understand why users did what they did, not to judge them for what they did not do.

29 Human Error Investigations
Sources of Data / Information
Third-party and historical sources
Recordings of human performance and process performance
Debriefings of the system users involved in the mishap
The purpose is to help reconstruct the situation surrounding the users at the time of the mishap and to get their point of view on the event.

30 Human Error Investigations
Debriefing & Interviewing Approaches & Techniques:
Have users tell the story from their point of view; do not present them with replays or summaries to "refresh their memory."
Tell the story back to them as an investigator (checks your understanding).
Have users identify critical junctures in the sequence of events: places or short stretches of time where either people or processes contributed critically to the direction of subsequent events or the outcomes that resulted.

31 Human Error Investigations
Debriefing & Interviewing
Have them describe how the world looked to them at each critical juncture:
What cues were observed?
What knowledge was used to deal with the situation?
What expectations did users have about how things were going to develop?
What options did they think they had to influence events?
What other influences helped determine how they interpreted the situation and how they would act?

32 Human Error Investigations
Debriefing & Interviewing

33 Why Are the Holes There in the First Place?
Human error rarely contributes to accidents as a single, distinct, obvious error. Human errors can and do occur regularly; it is the combination of many factors, extrinsic and intrinsic, which, when they coincide, produces tragic results.
Where are the holes? What do they consist of? Why are the holes there in the first place? Why do the holes' sizes and locations change over time? How and why can the holes line up to produce a mishap?

34 Latent Conditions, Active Conditions, and Failed or Absent Defenses
Accident and injury result when latent conditions, active conditions, and failed or absent defenses line up across the layers: Organizational Influences -> Unsafe Supervision -> Preconditions for Unsafe Acts -> Unsafe Acts.
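Tying this back to the earlier reliability arithmetic: if the defensive layers are assumed to fail independently, the chance that the holes line up on a given demand is the product of the individual layer failure probabilities. A toy sketch with invented numbers (the layer names follow the slide; the probabilities are made up):

```python
from math import prod

# Hypothetical probability that each defensive layer fails (has a "hole") on a given demand.
layer_failure_probs = {
    "organizational influences":     0.05,
    "unsafe supervision":            0.05,
    "preconditions for unsafe acts": 0.10,
    "unsafe act occurs":             0.02,
}

# A mishap requires every layer's hole to line up (independence assumed).
p_accident = prod(layer_failure_probs.values())
print(f"P(accident per demand) ~ {p_accident:.1e}")   # ~5.0e-06 for these numbers
```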

35 Generic Approaches to Minimizing Human Error
Personnel Selection: appropriate skills and capabilities to perform the required tasks.
Training: helps ensure appropriate skills; can be expensive and time-consuming; people may revert to original behaviors under stress.
Design: the preferred method; maintainability, displays & controls, feedback (error detection), user expectations; design categories: exclusionary, preventative, and fail-safe.
Methods of changing behavior to improve safety and reduce errors:
Procedural checklists
Training
Feedback regarding desirable work behaviors (accident statistics, recognition)
Contingency reinforcement strategies – a subset of feedback; positive reinforcement of behaviors (recognition, etc.)
Incentive programs – a subset of feedback; bonuses or special privileges for reaching desired safety goals

36 Get Help From System Users
What would have helped you get the right picture of the situation?
Would any specific training, experience, knowledge, procedures, or cooperation from others have helped?
If a key feature of the situation could have been different, what would you have done differently?
Could clearer guidance from your organization help you make better trade-offs between conflicting goals?

