IT Roles and Responsibilities: How Good is Good Enough? IS 485, Professor Matt Thatcher.


Slide 1: IT Roles and Responsibilities: How Good is Good Enough? (IS 485, Professor Matt Thatcher)

Slide 2: Agenda for Today
- Brief review of the Case of the Killer Robot
- Overview of the Therac-25 accidents
- Discussion of "How Good Is Good Enough?"
  - What are our social responsibilities?

Slide 3: Killer Robot Summary
- The general problems
  - simple programming error
  - inadequate safety engineering and testing
  - poor HCI design
  - lax culture of safety at Silicon Techtronics
- What would change if you replaced one of the characters with an "ethical" person?
  - Would any of these problems have been solved?

Slide 4: Matt's Humble Opinions
- Source of the problems
  - economic incentives
    - time pressures
      - exclusive focus on meeting unrealistic deadlines
      - there was no payoff to the development team based on usability or safety measures
      - valuing stock price over operator safety
    - cut corners → keep your job; challenge decisions → get fired
  - company culture
    - poor communication all along the company hierarchy
    - lots of unproductive, unresolved, and unaddressed conflict
    - inability to consider alternatives
    - choice of the waterfall model instead of the prototyping model as the development methodology
  - inexperience and lack of critical skills in key positions
    - Johnson (hardware guy), Reynolds (data-processing guy), Samuels (no experience in physics)
- Who is most responsible for:
  - setting appropriate economic incentives
  - creating an appropriate culture
  - putting people with the right skills into the right jobs

Slide 5: Therac-25 Accidents (Basis of the Killer Robot Case)
- What was the Therac-25?
  - released in 1983
  - computerized radiation therapy machine used to treat cancer patients
- Who developed it?
  - Atomic Energy of Canada, Ltd. (AECL) and CGR (a French company)
- What were its key advances over its predecessors (the Therac-6 and Therac-20)?
  - move to more complete software-based control
    - faster set-up
    - safety checks were now controlled by software (instead of mechanical interlocks)

Slide 6: Therac-25 Accidents (What Happened?)
- Massively overdosed patients at least 6 times (3 died, 3 were seriously disabled)
  - June 1985: Marietta, GA (Linda Knight, 61)
  - July 1985: Hamilton, Ontario (Donna Gartner, 40)
  - December 1985: Yakima, WA (Janis Tilman)
  - March 1986: Tyler, TX (Isaac Dahl, 33)
  - April 1986: Tyler, TX (Daniel McCarthy)
  - January 1987: Yakima, WA (Anders Engman)

Slide 7: Therac-25 Accidents (Example of Contributing UI Problems)
- The technician set the patient up on the table, went down the hall to start the treatment, and sat down at the terminal:
  - hit "x" to start the process
    - she then realized she had made a mistake, since she needed to treat the patient with the electron beam, not the X-ray beam
  - hit the "Up" arrow
  - selected the "Edit" command
  - hit "e" for electron beam
  - hit "enter" (signifying she was ready to start treatment)
  - the system showed a "beam ready" prompt
  - she hit "b" to turn the beam therapy on
  - the system gave her an error message (Malfunction 54)
  - she overrode the error message
- The UI reported that the machine was in electron mode, but it was actually in a "hybrid" mode → it delivered more than 125 times the normal dose to the patient
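The failure behind Malfunction 54 is usually described as a check-then-act race: the setup routine snapshots the prescribed mode, spends several seconds driving the hardware, and never re-checks the prescription after a fast edit. The sketch below is a minimal Python illustration of that class of bug, not the actual Therac-25 code (which was PDP-11 assembly); every name in it is hypothetical.

```python
# Minimal sketch of a check-then-act race between beam setup and a rapid
# operator edit. Illustrative only; names and structure are hypothetical,
# not taken from the real Therac-25 software.
import threading
import time

class Prescription:
    def __init__(self):
        self.mode = "XRAY"               # what the operator has typed so far
        self.lock = threading.Lock()

def setup_beam(rx, hardware):
    with rx.lock:
        mode_snapshot = rx.mode          # BUG: mode is captured once here...
    time.sleep(2)                        # ...while magnets/turntable are positioned
    # Hardware is configured from the stale snapshot, not the current entry.
    hardware["configured_mode"] = mode_snapshot

def operator_edit(rx):
    time.sleep(0.5)                      # operator notices the typo and fixes it quickly
    with rx.lock:
        rx.mode = "ELECTRON"

hardware = {}
rx = Prescription()
threads = [threading.Thread(target=setup_beam, args=(rx, hardware)),
           threading.Thread(target=operator_edit, args=(rx,))]
for t in threads: t.start()
for t in threads: t.join()

# The screen would show ELECTRON, but the hardware was set up from the stale
# XRAY snapshot: the same kind of display/hardware mismatch behind the
# "hybrid" mode overdose.
print("displayed:", rx.mode, "| hardware:", hardware["configured_mode"])
```

In this sketch the fix would be to re-read the prescription at the moment the hardware settings are committed, or to reject edits until setup completes; the eventual corrective actions for the real machine also reintroduced independent hardware interlocks.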

Slide 8: Therac-25 Accidents (What Were the Problems?)
- The problems
  - simple programming errors
  - inadequate safety engineering
    - software risks were ignored (almost no unit or integration testing at all)
    - operators were told it was impossible to overdose a patient
  - poor HCI design
  - lax culture of safety in the manufacturing company
  - problems were not reported quickly to the manufacturer or the FDA
    - prompted a 1990 federal law (the Safe Medical Devices Act) requiring such reporting
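On the "almost no unit or integration testing" point, even a very small automated test can pin down the safety invariant the accidents violated: the beam may only fire when the selected mode and the turntable position agree. The sketch below assumes a hypothetical beam_is_safe_to_fire helper; it is illustrative only and not part of any real Therac-25 codebase.

```python
# Hypothetical safety invariant plus unit tests for it. Names are
# illustrative, not from the actual Therac-25 software.
def beam_is_safe_to_fire(mode: str, turntable: str) -> bool:
    # Beam fires only when mode and turntable position are consistent.
    return (mode, turntable) in {("XRAY", "TARGET"), ("ELECTRON", "SCAN_MAGNETS")}

def test_mismatched_mode_blocks_beam():
    # The "hybrid" accident state: X-ray power with the electron turntable position.
    assert not beam_is_safe_to_fire("XRAY", "SCAN_MAGNETS")

def test_matched_modes_allow_beam():
    assert beam_is_safe_to_fire("ELECTRON", "SCAN_MAGNETS")
    assert beam_is_safe_to_fire("XRAY", "TARGET")

if __name__ == "__main__":
    test_mismatched_mode_blocks_beam()
    test_matched_modes_allow_beam()
    print("invariant tests passed")
```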

Slide 9: Friendly-Fire Tragedy (Afghanistan)

Slide 10: Critical Observation
- IF it is true that:
  - information technology affects society, AND
  - some choices in computing design are not completely constrained by mathematics, physics, chemistry, etc.
- THEN:
  - designers, implementers, teachers, and managers of technology make choices that affect society

Slide 11: Principal Actors in Software Design
- Software Provider
  - person or organization that creates the software
- Software Buyer
  - person or organization responsible for obtaining the software for its intended use
- Software User
  - person actually using the software
- Society
  - people, other than providers, buyers, or users, who can be affected by the software

Slide 12: Obligations of the Software Provider
- The Provider (itself)
  - profit and a good reputation
- The Buyer
  - help the buyer make an informed decision
  - set testing goals and meet them
  - provide warnings about untested areas of the software
  - inform the user about testing results and shortcomings
  - provide a reasonable warranty on functionality and safety
- The User
  - education/training about the use and limitations of the software
  - provide technical support and maintenance
  - provide reasonable protections
  - provide clear instructions and user manuals
- Biggest responsibility
  - answer the question "How good is good enough?"

Slide 13: Obligations of the Software Buyer
- The Provider
  - respect copyrights and don't steal
  - pay a fair price for the product
  - use the product for the correct purpose
  - don't blame the provider for incorrect use
  - make users available for observations, testing, and evaluation
- The Buyer (itself)
  - learn the limitations of the software
  - perform systems acceptance testing (and audit testing)
- The User
  - ensure proper education/training is provided
  - ensure proper technical support and maintenance are provided
  - represent the user during development (be a communication link between user and developer)
  - make sure users are included in the design process
  - provide reasonable protections (and a safe working environment)
- Biggest responsibility
  - make sure the software is used for its intended purpose (think about biometrics)

Slide 14: Obligations of the Software User
- The Provider
  - respect copyrights and don't steal
  - read the documentation
- The Buyer
  - communicate problems and provide feedback (about training, the UI, functionality, etc.)
  - ensure appropriate use of the software
  - make a good-faith effort to learn the software and its limitations
- The User (herself)
  - help in the training process
- Biggest responsibility
  - ensure that the software continues to perform as intended, and report problems

Slide 15: Final Observation
- Software is treated like other commercial products
- But software differs in many critical respects from many other products
  - serious software errors can remain after rigorous testing because of the logical complexity of software
  - it is difficult to construct uniform software standards that can be subjected to regulation and inspection
  - software affects an increasingly large number of people due to the proliferation and flexibility of computers
  - any group can provide software, since set-up costs are low

Slide 16: Conclusion
- Computer professionals must stop thinking of themselves as technicians
- Just like medical doctors, computer professionals are agents of change in people's lives
- Computer professionals have some ethical responsibility for the changes they are creating in society
- But this responsibility is complicated, because our position as workers in industry diffuses accountability
- We must recognize that part of our job includes larger issues of social responsibility

Slide 17: CIF (Common Industry Format) for Formal User Testing Reports