1 1985 CPSR-MIT Debate. Michael Dertouzos, moderator; David Parnas, against SDI (Joseph Weizenbaum, against); Charles Seitz, for SDI (Danny Cohen, for)

2 Charles Seitz, arguing for

3 Pause for Analysis Sketch Seitz’ argument in premise-conclusion style: Since Premise, and Premise, … Therefore Conclusion. (Hint: identify conclusion first.)

4 Seitz’ Conclusion It is possible to create reliable SDI software.

5 Seitz’ Premises Since A hierarchical architecture seems best, (because more natural, used in nature, understood by military, allows abstraction up levels …)

6 Seitz’ Premises Since A hierarchical architecture seems best, Physical organization should follow logical organization, (simplest choice, natural)

7 Seitz’ Premises Since A hierarchical architecture seems best, Physical organization also hierarchical, Tradeoffs to make software problem tractable are in the choice of system architecture (not in new / radical methods)

8 Seitz’ Premises Since A hierarchical architecture seems best, Physical organization also hierarchical, This makes software problem tractable, Loose coordination allows us to infer system performance (assume stat. independence, …)

9 Seitz’ Argument Since A hierarchical architecture seems best, Physical organization also hierarchical, This makes software problem tractable, And allows system reliability estimate, Therefore – It is possible to create reliable SDI battle management software.
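
To make the reliability-estimate premise concrete, here is a minimal sketch in Python (all figures are assumed for illustration, not taken from the debate) of the inference Seitz’ argument relies on: if loosely coordinated platforms succeed or fail independently, system-level performance can be computed from the measured reliability of small parts, with no full-scale test.

    from math import comb

    # Assumed, illustrative figures (not from the debate).
    p = 0.90   # probability a single platform performs correctly (measured on small parts)
    n = 100    # number of loosely coordinated platforms
    k = 80     # call the system "effective" if at least k platforms work

    # Under statistical independence, the system-level estimate is just a
    # binomial tail sum over the per-platform measurement.
    p_system = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    print(f"P(at least {k} of {n} platforms work) = {p_system:.4f}")

The whole inference stands or falls with the independence assumption, which Parnas’ transparencies later in this deck attack directly.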

10 Pause for Analysis Whose argument is better? Why? Do they start with the same problem definition?

11 David Parnas, Rebuttal

12 Charles Seitz, Rebuttal

13 Pause for Analysis Relevant analogies to SDI? Why / why not? Space shuttle software Telephone system software Nuclear plant software others?

14 Pause for Analysis Outline the most realistic SDI software testing that you can.

15 Pause for Analysis How did you account for … real-world sensor inputs, variable weather conditions, target/decoy appearance, variable attack structure, attacked components failing?

16 Fault Tolerant Software? James Ionson, in “Reliability and Risk,” a CPSR video.

17 Fault Tolerant Software? “It is not error-free code, it is fault-tolerant code. And if another million lines has to be written to ensure fault-tolerance, so be it.” - James Ionson

18 Fault Tolerant Software? Diagram in premise-conclusion form the argument being made by James Ionson. Does the argument make sense? Why / why not?

19 “Star Wars” Today Current SDI-like programs are called “National Missile Defense.” There are some potentially important differences.

20 “Star Wars” Today “One of the remarkable aspects of the evolution of missile defenses is that few policy makers question the fundamental ability … to be effective. Instead they focus on timing, cost, ….” (Mosher, page 39, IEEE Spectrum, 1997)

21 “Star Wars” Today “This is a sharp change from the Reagan years, perhaps because the technology used is closer at hand and the threats are smaller.” (Mosher, page 39, IEEE Spectrum, 1997)

22 “Star Wars” Today Smaller anticipated mission: “protect the U.S. … against an attack by a rogue state using a handful of warheads outfitted with … simple countermeasures.” (Mosher, page 36, IEEE Spectrum, 1997)

23 “Star Wars” Today Smaller anticipated mission: “also provide protection against an accidental launch of a few warheads by Russia or China.” (Mosher, page 36, IEEE Spectrum, 1997)

24 “Star Wars” Today One talked-about version does not use space-based weapons: “… no more than 100 hit-to-kill interceptors based at old ABM site near Grand Forks, ND.” (Mosher, page 37, IEEE Spectrum, 1997)

25 Pause for Analysis How fundamentally does it change Parnas’ argument if the anticipated attack uses fewer and simpler missiles?

26 Parnas’ Argument How are the premises changed? Specifications not known in advance, Realistic testing is not possible, No chance to fix software during use, No foreseeable technology changes this, None are changed “in principle” but overall it seems somehow less impossible.

27 “Star Wars” Testing “In the last 15 years, the U.S. has conducted 20 hit-to-kill intercepts, …. Six intercepts were successful; 13 of those tests were done in the last five years, and among them three succeeded.” (Mosher, page 39, IEEE Spectrum, 1997)

28 “Star Wars” Testing “No real attempts have been made to intercept uncooperative targets – those that make use of clutter, decoys, maneuver, anti-simulation, and other countermeasures.” (Mosher, page 39, IEEE Spectrum, 1997)

29 “Star Wars” Testing “Test … of a powerful laser has been blocked by … bad weather and software problems. … a software problem caused the laser to recycle, or unexpectedly lose power ….” (R. Smith, Washington Post, Oct 8, 1997)

30 Schwartz versus TRW In 1996, ex-TRW engineer Nira Schwartz filed a “False Claims Act” suit, alleging that results of tests to distinguish warheads and decoys were falsified by TRW. (featured on “60 Minutes II” in January 2001)

31 Schwartz versus TRW Schwartz claims that TRW “knowingly made false test plans, test procedures, test reports and presentations to the government … to remain in the program.”

32 Schwartz versus TRW Schwartz claims – “I say to my boss, ‘It is wrong, what we are doing; it is wrong.’ And the next day, I was fired.”

33 Schwartz versus TRW TRW says – “TRW scientists and engineers devoted years to this complex project, while Ms. Schwartz, in her six months with the company … Her understanding … is insufficient to lend any credibility to her allegations.”

34 Schwartz versus TRW DOD criminal investigator says – “absolute, irrefutable, scientific proof that TRW’s discrimination technology does not, cannot, and will not work” … TRW “knowingly covering up.”

35 Schwartz versus TRW DOD panel then said – TRW’s software and sensors are “well designed and work properly” provided that the Pentagon does not have any wrong information about what kind of warheads and decoys an enemy is using.

36 Schwartz versus TRW Lt. General Kadish – “Right now, from what I see, there is no reason to believe that we can’t make this work. But there’s a lot more testing to be done.”

37 Schwartz versus TRW Congressman Curt Weldon, R-PA: “If we don’t build a new aircraft carrier, we have older ones. If we don’t build a new fighter plane, we have older ones. If we don’t build missile defense, we have nothing.” What is the premise-conclusion summary of this argument?

38 Schwartz versus TRW Congressman Curt Weldon, R-PA: On 50 Nobelists’ anti-BMD letter - “I don’t know any of them that’s come to Congress or me. I mean … it’s easy to get anyone to sign a letter. I sign letters all the time.” What is the premise-conclusion summary of this argument?

39 Schwartz versus TRW Congressman Curt Weldon, R-PA: “There were scientists who made the case against Kennedy that it was crazy, we’d never land on the moon. And I characterize Postol now as one of those people.” What is the premise-conclusion summary of this argument?

40 Ethical Issues What are some of the important ethical questions? And what guidance do the codes of ethics give on these questions?

41 Ethical Issues How to interact with colleagues with whom you disagree? When to blow the whistle? Should you accept work on an “impossible” but $$$ project?

42 Dealing with Colleagues AITP Standards of Conduct: “In recognition of my obligation to fellow members and the profession I shall cooperate with others in achieving understanding and in identifying problems.”

43 Dealing with Colleagues Item 5.12 of ACM / IEEE-CS Software Engineering Code: “Those managing or leading software engineers shall not punish anyone for expressing ethical concerns about a project.”

44 Accept Impossible Work? Item 3.2 of ACM / IEEE-CS Software Engineering Code: “Software engineers shall ensure proper and achievable goals and objectives for any project on which they work or propose.”

45 Accept Impossible Work? Item 1.3 of the ACM / IEEE-CS Software Engineering Code: “Software engineers shall accept software only if they have a well founded belief that it is safe, meets specifications, passes appropriate tests, …”

46 Blow the Whistle? AITP Standards of Conduct: In recognition of my obligation to society, I shall never misrepresent or withhold information that is germane to a problem or situation of public concern nor allow any such known information to remain unchallenged.

47 Blow the Whistle? Item 1.4 of ACM / IEEE-CS Software Engineering Code: “Software engineers shall disclose to appropriate persons or authorities any actual or potential danger to the user, the public … that they reasonably believe …”

48 Summary Difficult ethical issues arise in creation of safety-critical software. Trustworthy SDI software is more clearly impossible in retrospect. Modern, smaller SDI-like programs appear more tractable.

49 Thanks to National Science Foundation grant DUE for partial support of this work.

50 Thanks to Computer Professionals for Social Responsibility (CPSR) for permission to distribute digitized video of the debate.

51 Thanks to David Parnas and Chuck Seitz for commenting on a draft of the paper describing this module.

52 Thanks to the Ronald Reagan Presidential Library for help in obtaining the video of Reagan’s 3/23/83 speech.

53 Thanks to Christine Kranenburg, Laura Malave, Melissa Parsons, and Joseph Wujek for technical assistance.

54 The End.

55 Overheads from Parnas’ Presentation The next slides are transcribed versions of (most of) the transparencies in Parnas’ presentation.

56 Why is it important that the software can never be trusted? “We” will make decisions as if it was not there. “They” will make decisions as if it might work.

57 A necessary condition for trustworthy engineering products is validation by: Mathematical analysis, or Exhaustive case analysis, or Prolonged, realistic testing, or a combination of the above

58 Why software is always the unreliable glue in engineering systems: The best mathematical tools require that a system be described by continuous functions. Exhaustive case analysis can only be used when the number of states is small or the design exhibits a repetitive structure.
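
A back-of-the-envelope sketch in Python (rates are assumed, not from the debate) of the second point: unlike a continuous physical system, a program’s behavior at untested inputs cannot be interpolated from nearby tested ones, and the number of distinct cases outruns any test harness almost immediately.

    bits_of_input = 64                 # e.g., just two 32-bit input words
    states = 2 ** bits_of_input        # distinct cases an exhaustive analysis must cover
    cases_per_second = 10 ** 9         # an optimistic, assumed test rate
    seconds_per_year = 3600 * 24 * 365

    years_to_enumerate = states / cases_per_second / seconds_per_year
    print(f"{states:.2e} input states -> about {years_to_enumerate:.0f} years to enumerate")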

59 Why do we have some usable software? Sometimes the requirements allow untrustworthy software. There has been extensive use under actual conditions. Operating conditions are controlled or predictable. “Backup” manual system available when needed.

60 What makes the SDI software much more difficult than other projects? Lack of reliable information on target and decoy characteristics. Distributed computing with unreliable nodes and unreliable channels. Distributed computing with hard real-time deadlines. Physical distribution of redundant real-time data. Hardware failures will not be statistically independent.

61 What makes the SDI software much more difficult than other projects? Redundancy is unusually expensive. Information essential for real-time scheduling will not be reliable. Very limited opportunities for realistic testing. No opportunities for repairing software during use. Expected to be the largest real-time system ever attempted; frequent changes are anticipated.
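
The independence point on these two slides is the one that undercuts the binomial-style estimate sketched after slide 9. A small Monte Carlo sketch in Python (figures assumed, not from Parnas’ presentation) shows how even a modest common-mode failure probability (a single shared software fault, or an unanticipated decoy that defeats every platform at once) dominates the system-level result no matter how reliable the individual parts are.

    import random

    random.seed(0)
    n, p, k = 100, 0.90, 80        # same assumed figures as the earlier sketch
    p_common_mode = 0.05           # assumed chance of a shared-cause failure per engagement

    def engagement_succeeds() -> bool:
        if random.random() < p_common_mode:   # a shared cause defeats every platform at once
            return False
        working = sum(random.random() < p for _ in range(n))
        return working >= k

    trials = 100_000
    estimate = sum(engagement_succeeds() for _ in range(trials)) / trials
    print(f"P(system effective) with correlated failures ~= {estimate:.3f}")
    # compare with ~0.999 under the pure-independence calculation

With independence the estimate was roughly 0.999; a five percent common-mode term caps it near 0.95, and no amount of per-platform redundancy recovers the difference.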

62 Software Espionage and Nuclear Blackmail Fact: Software systems, because of their rigid predetermined behaviors, are easily defeated by people who understand the programs. Fact: Changes in large software systems must be made slowly and carefully, with extensive review and testing.

63 What about new Soft. Eng. techniques? Precise requirement documents. Abstraction/information hiding. Formal specifications. The use of these techniques requires previous experience with similar systems. Co-operating sequential processes requires detailed information for real-time scheduling. Structured programming reduces but does not eliminate errors.

64 What about Artificial Intelligence? AI-1 - Defined as solving hard problems. –Study the problem, not the problem solver. No magic techniques, just good solid program design. AI-2 - Heuristic or Rule Based Programming/Expert Systems. –Study the problem solver, not the problem. –Ad hoc, “cut and dry” programming. –Little basis for confidence.

65 What about new programming languages? No magic. They help if they are simple and well understood. No breakthroughs. The fault lies not in our tools but in ourselves and in the nature of our product.

66 What about automatic programming? Since 1948 a euphemism for programming in a new language?

67 What about program verification? The right problem, but do we have a solution? What’s a big program? Wrong kind of program? How do you verify a model of the earth’s gravitational field? Implicit assumption of perfect arithmetic. What about language semantics?
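
A minimal illustration in Python (not an example from Parnas’ talk) of the perfect-arithmetic point: a property that is provable over the real numbers fails in floating point, so a proof about the mathematical model need not hold for the program that actually runs.

    a, b, c = 1e16, -1e16, 1.0

    # Associativity of addition is a theorem about the reals ...
    print((a + b) + c)    # 1.0
    # ... but not about 64-bit floating point, where 1.0 is smaller than the
    # spacing between representable values near 1e16.
    print(a + (b + c))    # 0.0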

68 Is there a meaningful concept of tolerance for software? The engineering notion of “tolerance” depends on an assumption of continuity. Statistical measures of program quality are limited in their application to situations where individual failures are not important.
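
A short sketch in Python (the function names and the threshold are invented for illustration) of why the notion of tolerance does not transfer: in a continuous physical model a small input error produces a proportionally small output error, while a single branch in a program can flip its output completely.

    def physical_model(load: float) -> float:
        # continuous: a 0.1% error in the input gives a 0.1% error in the output
        return 2.5 * load

    def control_program(altitude_m: float) -> str:
        # discontinuous: one branch, and nearly identical inputs give opposite decisions
        return "INTERCEPT" if altitude_m >= 100_000.0 else "IGNORE"

    print(physical_model(1.000), physical_model(1.001))              # nearly equal outputs
    print(control_program(100_000.0), control_program(99_999.99))    # opposite decisions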

69 Overheads from Seitz’ Presentation The next slides are transcribed versions of (most of) the transparencies in Seitz’ presentation.

70 From “The Strategic Defense Initiative” White House pamphlet dated Jan.: “SDI’s purpose is to identify ways to exploit recent advances in ballistic missile defense technologies that have potential for strengthening our security and that of our Allies. The program is designed to answer a number of fundamental scientific and engineering questions that must be addressed before the promise of these new technologies can be fully assessed. The SDI program will provide to a future president and a future congress the technical knowledge necessary to support a decision in the early 1990’s on whether to develop and deploy advanced defensive systems.”

71 From 1985 “Report to the Congress on the Strategic Defense Initiative” (Section III): “The goal of the SDI is to conduct a program of rigorous research focused on advanced defensive technologies.” “The SDI seeks, therefore, to exploit emerging technologies that may provide options for a broader-based deterrence by turning to a greater reliance on defensive systems.”

72 From 1985 “Report to the Congress on the Strategic Defense Initiative” (Section III): “It should be stressed that the SDI is a research program that seeks to provide the technical knowledge required to support a decision on whether to develop and later deploy these systems. All research efforts will be fully compliant with U.S. treaty obligations.”

73 Weapons –Incapable of causing damage at Earth’s surface –Range 1000 km. –Partial deployment ineffective in boost phase Sensors –Some located in high orbits –Can be passive –Useful in early deployments Battle Management System –Computers and communication

74 Coordination Lowest Level - stereo and “sensor fusion”; Middle Levels - target discrimination, attack and coordination; High Levels - assignment of priorities of targets in midcourse, in order to prevent particular areas from being overwhelmed in terminal defense, or to prevent any single area from accepting too high a concentration for terminal defense; Top Level - command and control decisions.
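
A purely illustrative Python sketch of this level structure (the class names and the data passed between levels are assumptions, not taken from Seitz’ transparencies): each level sees only coarse summaries from the level below, which is what loose coordination and “abstraction up levels” amount to.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SensorNode:                      # lowest level: stereo / sensor fusion
        tracks: List[str] = field(default_factory=list)
        def summary(self) -> int:
            return len(self.tracks)        # pass a count upward, never the raw tracks

    @dataclass
    class RegionalManager:                 # middle levels: discrimination, attack coordination
        sensors: List[SensorNode]
        def load(self) -> int:
            return sum(s.summary() for s in self.sensors)

    @dataclass
    class TopLevelCommand:                 # top level: command and control decisions
        regions: List[RegionalManager]
        def priorities(self) -> List[int]:
            return [r.load() for r in self.regions]   # priorities set from coarse loads only

    cmd = TopLevelCommand([RegionalManager([SensorNode(["t1", "t2"]), SensorNode(["t3"])])])
    print(cmd.priorities())                # [3]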

75 Conclusions of the Panel: “The feasibility of the battle management software and our ability to test, simulate, and modify the system are very sensitive to the choice of system architecture. In particular, the feasibility of the BMS software is much more sensitive to the system architecture than it is to the choice of software engineering technique.”

76 Conclusions of the Panel: “Software technology is developing against what appears today to be relatively inflexible limits in the complexity of systems. The tradeoffs necessary to make the software tractable are in the system architecture.”

77 Conclusions of the Panel: “We must prefer an unconventional system architecture whose programming is within the anticipated limits of software engineering over reliance on radical software development approaches and the risk that we could not develop reliable software at any cost…”

78 Conclusions of the Panel: “One promising class of system architecture for a strategic defense system are those that are less dependent on tight coordination… [because of]… the ability to infer the performance of full-scale deployment by evaluating the performance of small parts of the system.”