Leveson Chapters 5 & 6
Gus Scheidt
Joel Winstead
November 4, 2002

“Human Error” in 2001: A Space Odyssey
“The 9000 series is the most reliable computer ever made. No 9000 computer has ever made a mistake or distorted information.”
“Well, I don’t think there is any question about it. It can only be attributed to human error. This sort of thing has cropped up before, and it has always been due to human error.”
HAL then proceeds to eliminate the sources of “human error” on the spacecraft.

Alternate Explanations for “Operator Error”
Data biased or incomplete
Positive actions usually not noted
Hindsight is always 20/20
Difficult/impossible to separate operator error from design error
–Premise that operators can overcome every emergency
–Operators often intervene at the limits

Blame the designers?
Designers fail to fully understand system characteristics or anticipate all environmental conditions:
–Difficulty assessing probabilities of rare events
–Bias against considering side effects
–Tendency to overlook contingencies
–Control complexity by concentrating on only a few aspects of the system
–Limited capacity to comprehend complex relationships
This is important to us as software designers.

Human Error & Problem Solving Human Error: behavior that is inconsistent with normal, programmed patterns and that differs from prescribed procedures Exactly what is required to solve problems in unforeseen situations
Human Error & Rasmussen Skill-based: “errors” have a function in maintaining skill Rule-based: process of adapting rules and learning how to apply them requires experiments, some of which will fail Knowledge-based: human errors are the inevitable side effects of human adaptability and creativity
Relationship Between Experimentation and Error
Designers have tools to assist with the experimentation process; operators do not
Examples of experimentation in software:
–Test-bed machines
–New programming libraries
–Exploring graphics tool features
–Tweaking of a production system

Experimentation & Error in Software Differences between software and other industries –Undo –Error from Experimentation usually has less than catastrophic consequences
Lesson for Software Designers
Summary: human error = unsuccessful experiments performed in an “unkind” environment
Our job: create “kind” environments
–i.e., “design for error tolerance”

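As a concrete illustration of “design for error tolerance” (our sketch, not from the original slides), the code below makes every operator action reversible, so an unsuccessful experiment can be undone rather than punished. All class and method names here are hypothetical.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Hypothetical sketch: a "kind" environment lets the operator reverse
    // an unsuccessful experiment instead of suffering its consequences.
    interface ReversibleAction {
        void apply();
        void undo();
    }

    class SetValueAction implements ReversibleAction {
        private final StringBuilder target;  // stands in for any mutable system state
        private final String next;
        private String previous;

        SetValueAction(StringBuilder target, String next) {
            this.target = target;
            this.next = next;
        }

        public void apply() {
            previous = target.toString();    // remember the old state so the action is reversible
            target.setLength(0);
            target.append(next);
        }

        public void undo() {
            target.setLength(0);
            target.append(previous);         // restore the remembered state
        }
    }

    class ErrorTolerantConsole {
        private final Deque<ReversibleAction> history = new ArrayDeque<>();

        void perform(ReversibleAction action) {
            action.apply();
            history.push(action);            // every action stays undoable
        }

        void undoLast() {
            if (!history.isEmpty()) {
                history.pop().undo();        // the operator's experiment is recoverable
            }
        }
    }

    public class UndoDemo {
        public static void main(String[] args) {
            StringBuilder state = new StringBuilder("safe configuration");
            ErrorTolerantConsole console = new ErrorTolerantConsole();

            console.perform(new SetValueAction(state, "experimental configuration"));
            System.out.println(state);       // experimental configuration

            console.undoLast();              // the experiment failed; recover
            System.out.println(state);       // safe configuration
        }
    }

The point is not the pattern itself but the environment it creates: the operator can experiment, observe the result, and retreat without harm.
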
Mental Models and the Role of Human Operators

Mental Models as Abstractions
Without abstraction, humans would be unable to cope with complex systems
Mental models reflect goals and experience
–Designer’s model vs. operator’s model

Mental Models
[Diagram: the designer’s model of the actual system derives from the original design; the operator’s model derives from operational procedures, training, and operational experience; the actual system diverges from both models through manufacturing and construction variances and through evolution and changes over time.]

Model Mismatches as a Source of Errors
Guam accident:
–Pilots assumed the beacon was at the end of the runway
–They then flew into the side of a mountain
Three Mile Island:
–Operators built a mental model assuming no loss of coolant
–They then took actions that made the situation worse

Model Changes in Recovery
Three Mile Island, again:
–Operators eventually realized their model was wrong and took action to correct it
Human operators can change their mental models when conflicts occur
The same ability that causes model-mismatch accidents is also what allows recovery from accidents

Maintaining Mental Models
When designing a system, it is important to help operators maintain a good mental model of it
Operators must participate in the system in order to understand it
Operators must have enough information to realize when their models are wrong

Three Roles
Human as monitor
Human as backup
Human as partner

Human as Monitor
The operator must know what correct behavior is and how to maintain it
The operator is dependent on the information provided
Failures may be silent or masked
Tasks that require little operator activity may result in lowered alertness

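One common way designers keep failures from staying silent (our illustration, not from the slides) is to require components to report health explicitly, so that silence itself becomes something the monitor can see. The Watchdog class and its timeout here are hypothetical.

    import java.time.Duration;
    import java.time.Instant;

    // Hypothetical sketch: instead of assuming a silent component is healthy,
    // a watchdog treats a missing heartbeat as a failure to report.
    public class Watchdog {
        private final Duration timeout;
        private volatile Instant lastHeartbeat = Instant.now();

        Watchdog(Duration timeout) {
            this.timeout = timeout;
        }

        // The monitored component calls this periodically while healthy.
        void heartbeat() {
            lastHeartbeat = Instant.now();
        }

        // The display polls this; a silent failure becomes visible to the
        // operator instead of being masked by an unchanged screen.
        boolean isSuspect() {
            return Duration.between(lastHeartbeat, Instant.now()).compareTo(timeout) > 0;
        }

        public static void main(String[] args) throws InterruptedException {
            Watchdog w = new Watchdog(Duration.ofMillis(100));
            System.out.println(w.isSuspect());  // false: the heartbeat is fresh
            Thread.sleep(200);                  // the component goes silent
            System.out.println(w.isSuspect());  // true: the silence is now visible
        }
    }
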
Human as Backup
When operators must intervene, they must understand the situation
–The operator may lack situational awareness
Over-reliance on automated systems may make operators reluctant to intervene
The system must provide enough information and enough control for the operator to make correct decisions

Human as Partner
Automated systems may be better at repetitive tasks
–But automation may reduce operator awareness
The operator may simply be given the tasks that could not be automated
Partially automated systems might not actually decrease the overall load on the operator

Role of the Human Operator
The role of human operators must be considered in design
The ability of human operators to maintain and correct mental models is important
Human operators should not be treated as automatons that absorb whatever tasks the designers could not automate

Autopilot Systems
The aircraft can fly itself under normal circumstances
When something unusual happens, the pilot must intervene
Accidents occur when the pilot has an incorrect mental model
Human as backup?

Lightweight Analysis Tools
Tools like lint, ESC/Java, and Splint detect possible errors in source code
Such tools are necessarily incomplete, unsound, or both
A human must evaluate whether or not a real problem has been found

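To illustrate the last point (our example, not from the slides), here is the kind of warning such a tool can raise where only a human can judge whether the problem is real. A checker that cannot relate the two containsKey tests may report a possible null dereference that the surrounding logic, read by a person, rules out. The class is hypothetical.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of a "possible null dereference" warning that a
    // human reviewer can recognize as a false positive.
    public class Lookup {
        private final Map<String, String> table = new HashMap<>();

        String describe(String key) {
            String value = null;
            if (table.containsKey(key)) {
                value = table.get(key);
            }
            if (table.containsKey(key)) {
                // A modular checker may warn that 'value' could be null here,
                // because it cannot connect the two containsKey tests. A human
                // who knows the map is used single-threaded and never stores
                // null values can judge the warning to be spurious.
                return value.trim();
            }
            return "absent";
        }
    }

Whether the warning is real depends on facts the tool cannot see, which is exactly why a human stays in the loop.
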
Version Control
CVS can usually merge different versions of a document automatically
CVS cannot always resolve conflicting changes
It provides context so the user can build a mental model of the conflict and resolve it

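When a merge fails, CVS leaves both versions in the working file between conflict markers, roughly of the form below; the file name, revision number, and sentences are invented for illustration:

    <<<<<<< report.txt
    The experiment ran for three hours.
    =======
    The experiment ran for four hours.
    >>>>>>> 1.6

Seeing both versions side by side is the context that lets the user form a model of what each change intended and choose a resolution.
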
Counterpane Systems
Founded by Bruce Schneier of Applied Cryptography fame
Human operators monitor their clients’ networks, looking for suspicious or abnormal activity
Operators are assisted by complex displays at the control center
Operators may take action by changing security policies

Grammar Checking
The computer cannot write a thesis
The computer can assist you as you write by finding potential spelling and grammar errors
The computer is often wrong because it cannot understand the semantic context
–e.g., passive voice is always assumed to be wrong
Is the system monitoring the user, or vice versa?

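A toy checker (ours, not the slides’) makes that semantic blindness concrete: it flags passive voice by pure pattern matching, so it flags appropriate passives and adjectival uses alike. The class name and test sentences are invented.

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Hypothetical sketch: a grammar "checker" that flags any be-verb
    // followed by a word ending in -ed as passive voice, with no idea
    // whether the passive is actually inappropriate.
    public class NaivePassiveChecker {
        private static final Pattern PASSIVE =
            Pattern.compile("\\b(is|are|was|were|been|being)\\s+\\w+ed\\b");

        public static void main(String[] args) {
            String[] sentences = {
                "The committee rejected the proposal.",           // active: not flagged
                "The proposal was rejected by the committee.",    // passive: flagged
                "The samples were contaminated during shipment.", // appropriate passive: flagged anyway
                "The operator was exhausted."                     // adjectival, not passive: flagged anyway
            };
            for (String s : sentences) {
                Matcher m = PASSIVE.matcher(s);
                if (m.find()) {
                    System.out.println("Flagged: \"" + s + "\" (matched '" + m.group() + "')");
                }
            }
        }
    }

It also misses irregular participles (“the valve was shut”), so it errs in both directions, which is exactly why the human ends up monitoring the checker.
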