Informatics 43: Introduction to Software Engineering
Software Design and Collaboration Laboratory (SDCL), Department of Informatics, UC Irvine (sdcl.ics.uci.edu)


Slide 1: Informatics 43, Introduction to Software Engineering
Lecture 6-2, November 5, 2015
Emily Navarro
Software Design and Collaboration Laboratory (SDCL), Department of Informatics, UC Irvine (sdcl.ics.uci.edu)
Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

Slide 2: Today's Lecture
– Midterm answers
– Failures: a second look
– Quality assurance
– Testing
– Quiz 4 study guide

(Slides 3 and 4 repeat the agenda.)

Slide 5: What Do These Have in Common?
Airbus 320, Toyota, Mariner 1 launch, the AT&T telephone network, Ariane 5, Word 3.0 for Mac, a radiation therapy machine, the NSA, Y2K

Slide 6: They All Failed!
– Airbus 320: http://catless.ncl.ac.uk/Risks/10.02.html#subj1.1
– Toyota: the "unintended" acceleration problem
– Mariner 1 launch: http://catless.ncl.ac.uk/Risks/5.73.html#subj2.1
– AT&T telephone network: a ripple effect from switch to switch left the network down/dark for 2-3 days
– Ariane 5: http://catless.ncl.ac.uk/Risks/18.24.html#subj2.1
– Word 3.0 for Mac: "plagued with bugs"; later replaced for free by Word 3.0.1
– Radiation therapy machine: http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_5.html
– Y2K

Slide 7: Toyota Failure: What Was the Problem?
– Overly complex spaghetti code: untestable and unmaintainable
– Violated standards set by the industry: single-point failures
– No peer code reviews
– The system threw away error codes

Slide 8 repeats the list above and adds the conclusion: Toyota did not follow good quality assurance practices.

Slide 9: Y2K Facts
Bug description:
– Date formats were MM/DD/YY, e.g., 01/01/98, 02/02/99, 03/03/00
– 98 -> 1998, 99 -> 1999
– But does 00 mean 2000 or 1900?
– Does 1999 turn into 19100?
Effects: relatively minor. Cost: $300 billion!
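
To make the bug description concrete, here is a minimal Java sketch, added as an illustration and not part of the original slides; the naiveYear helper is hypothetical:

    // Hypothetical legacy-style conversion: assumes every two-digit year is 19xx.
    public class Y2KDemo {
        static int naiveYear(int twoDigitYear) {
            return 1900 + twoDigitYear; // 98 -> 1998, 99 -> 1999, but 00 -> 1900
        }

        public static void main(String[] args) {
            System.out.println(naiveYear(98)); // 1998, as intended
            System.out.println(naiveYear(0));  // 1900, not 2000: the Y2K fault
            // The "19100" symptom came from string concatenation: APIs such as
            // java.util.Date.getYear() return years since 1900, so in 2000 they return 100.
            int yearsSince1900 = 100;
            System.out.println("19" + yearsSince1900); // prints "19100"
        }
    }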

Slide 10: Impact of Failures
Not just "out there":
– Space shuttle
– Mariner 1
– Ariane 5
But also "at home":
– Your car
– Your call to your mom
– Your wireless network, social network, mobile app
– Your homework
– Your hospital visit
See Peter Neumann's Risks Digest: http://catless.ncl.ac.uk/Risks

Slide 11: Today's Lecture (agenda repeated)

Slide 12: QA Goals: Verification and Validation
Verification ("implement the idea properly"):
– Ensure the software meets its specifications
– Internal consistency
– "Are we building the product right?"
– e.g., testing, inspections, program analysis
Validation ("implement the proper idea"):
– Ensure the software meets the customer's intent
– External consistency
– "Are we building the right product?"
– e.g., usability testing, user feedback

Slide 13: Software Qualities
Correctness, reliability, efficiency, integrity, usability, maintainability, testability, flexibility, portability, reusability, interoperability, performance, etc.

Slide 14: Quality Assurance
All activities designed to measure and improve quality in a product, assuring that each of the software qualities is met:
– Goals set in the requirements specification
– Goals realized in the implementation
Sometimes easy, sometimes difficult (portability versus safety). Sometimes immediate, sometimes delayed (understandability versus evolvability). Sometimes provable, sometimes doubtful (size versus correctness).

Slide 15: An Idealized View of QA
Complete formal specification of the problem to be solved
-> Design, in formal notation
-> Code, in a verifiable language
-> Executable machine code
-> Execution on verified hardware
(each step a correctness-preserving transformation)

Slide 16: A Realistic View of QA
Mixture of formal and informal specifications
-> (manual transformation) Design, in mixed notation
-> (manual transformation) Code, in C++, Java, Ada, …
-> (compilation by a commercial compiler) (Intel Pentium-based) machine code
-> Execution on commercial hardware, running commercial firmware

Slide 17: First Complication
Real needs, the "correct" specification, and the actual specification rarely coincide. No matter how sophisticated the QA process, the problem of creating the initial specification remains.

Slide 18: Second Complication
Correctness! Safety! Security, efficiency, usability, reliability: there are often multiple, sometimes conflicting, qualities to be tested, making QA a challenge.

Slide 19: Third Complication
Sometimes the software system is extremely complicated, making it tremendously difficult to perform QA:
– Complex data communications (electronic fund transfer)
– Distributed processing (web search engine)
– Stringent performance objectives (air traffic control system)
– Complex processing (medical diagnosis system)

Slide 20: Fourth Complication
It is difficult to divide the particular responsibilities involved in performing quality assurance among project management, the development group, and the quality assurance group.

Slide 21: Fourth Complication (continued)
Quality assurance has a negative connotation:
– QA lays out the rules: "You will check in your code every day", "You will comment your code", "You will…"
– QA also uncovers the faults: it raps developers on the knuckles and creates an image of "competition"
– QA is viewed as cumbersome and "heavy": "Just let me code"

Slide 22: Available Techniques
– Formal program verification
– Static analysis of program properties
  – Concurrent programs: deadlock, starvation, fairness
  – Performance: min/max response time
– Code reviews and inspections
– Testing
Most techniques are geared towards verifying correctness.

Slide 23: Which Technique to Use?
There is no "silver bullet" of testing; a mixture of techniques is needed:
– Different approaches are needed for different faults, e.g., testing for race conditions vs. performance issues
– Different approaches are needed at different times, e.g., unit testing vs. usability testing vs. system testing

Slide 24: All Software Has Bugs
"All nontrivial code has defects, and the probability of nontrivial defects increases with code size. The more code you use to solve a problem, the harder it gets for someone else to understand what you did and to maintain your code when you have moved on to write still larger programs."
– "Code Inflation", Holzmann (IEEE Software, 2015)

Slide 25: All Software Has Bugs
Q: What is the required period of failure-free operation for conventional takeoffs and landings of the F-35 Joint Strike Fighter?
A) 6,000 years  B) 600 years  C) 60 years  D) 6 hours

Slide 26 repeats the question and adds: a recent government report stated that this target had not yet been realized!

Slide 27: Some bugs are bizarre…

Slide 28: Some bugs are long-lived…

Slide 29: Some bugs are not bugs at all…

Slide 30: How Do We Know When We Are Done?
We can never find all faults, so we aim to reveal as many faults as possible in a fixed period of time with a fixed budget:
– More faults found and fixed = good
– More bugs found = more bugs not found
Generally the more likely types of defects are the ones caught; unfortunately, these are usually not the ones that cause major failures. Aim to meet the quality requirements established for the project.

Slide 31: How Do We Know When We Are Done?
[Chart: number of problems found per hour over Days 1 through 5]

Slide 32: How Do We Know When We Are Done?
[Same chart] We could stop testing when the problem find rate stabilizes to near zero.

Slide 33: How Do We Know When We Are Done?
[Chart: confidence in the module being tested vs. the number of test cases with correct outputs, approaching 100%, with a "sweet spot" marked]

Slide 34: How Do We Know When We Are Done?
Pepper the code with defects and observe how many of the seeded defects are discovered. Scenario:
– The program is seeded with 10 defects
– After some test cases are executed, 7 of the seeded defects and 45 nonseeded defects are found
– Since 70% of the seeded defects were found (and 30% were not), assume the nonseeded defects follow the same pattern: 45 is 70% of 64, so there are 19 (64 minus 45) defects left to be found
This technique assumes that the nonseeded defects are similar to the seeded ones.
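
The slide's arithmetic generalizes to a simple ratio estimate: if s of S seeded defects are found alongside n nonseeded defects, the estimated total of real defects is n * S / s. A small sketch in Java, added as an illustration and not from the slides:

    // Defect-seeding estimate, using the numbers from the scenario above.
    public class SeedingEstimate {
        public static void main(String[] args) {
            int seeded = 10;      // defects deliberately planted
            int seededFound = 7;  // seeded defects discovered by testing
            int realFound = 45;   // nonseeded (real) defects discovered

            // Assume testing finds seeded and real defects at the same rate:
            // realFound / totalReal is roughly seededFound / seeded.
            double totalReal = realFound * (double) seeded / seededFound; // about 64
            double remaining = totalReal - realFound;                     // about 19

            System.out.printf("Estimated real defects: %.0f, still undiscovered: %.0f%n",
                              totalReal, remaining);
        }
    }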

Slide 35: Today's Lecture (agenda repeated)

Slide 36: Testing: Basic Process
Testing uses a set of techniques to detect and correct errors in a software product. Exercise a module, a collection of modules, or a system:
– Use a predetermined input (a "test case")
– Run the test case
– Capture the actual output
– Compare the actual output to the expected output
– (Lather, rinse, and repeat)
Actual output = expected output: the test case SUCCEEDS.
Actual output ≠ expected output: the test case FAILS (report the failure).
A minimal harness that follows this loop is sketched below.
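
The sketch below is an added Java illustration, not from the slides; it anticipates the multiplier example from slide 47, and a real project would normally use a framework such as JUnit:

    // Run each test case, capture the actual output, and compare it to the expected output.
    public class BasicTestProcess {
        static int multiplier(int a, int b) { return a * b; } // unit under test

        public static void main(String[] args) {
            int[][] cases = { {3, 4, 12}, {0, 5, 0}, {-2, 7, -14} }; // {a, b, expected}
            for (int[] c : cases) {
                int actual = multiplier(c[0], c[1]);
                if (actual == c[2]) {
                    System.out.println("SUCCEEDS: " + c[0] + " * " + c[1] + " = " + actual);
                } else {
                    System.out.println("FAILS: expected " + c[2] + " but got " + actual);
                }
            }
        }
    }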

Slide 37: Testing Terminology
Test case: a group of input values that cause a program to take some defined action, together with an expected output.
Error:
– A human action that produces an incorrect result
– May or may not produce a fault
Fault:
– A condition that may (or may not) cause a failure
– Caused by an error
– A "bug"
Failure:
– The inability of a system to perform a function according to its specifications
– The result of a fault

Slide 38: Error, Fault, Failure
Error (in the programmer's mind) -> fault or defect (in the code) -> failure (in the execution or output)
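
A hypothetical end-to-end example of the chain, added here for concreteness: the programmer's mistaken belief is the error, the off-by-one loop bound it produces is the fault, and the resulting exception at run time is the failure.

    // Error: the programmer believes array indices run from 1 to length.
    // Fault: the resulting off-by-one loop bounds in the code.
    // Failure: an ArrayIndexOutOfBoundsException when the code executes.
    public class ErrorFaultFailure {
        static int sum(int[] values) {
            int total = 0;
            for (int i = 1; i <= values.length; i++) { // fault: should be i = 0; i < values.length
                total += values[i];
            }
            return total;
        }

        public static void main(String[] args) {
            System.out.println(sum(new int[] {1, 2, 3})); // failure: throws at index 3
        }
    }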

Slide 39: Error, Fault, Failure (illustration)

Slide 40: Error, Fault, Failure (illustration)

Slide 41: Testing Goals
Detect, locate, and fix failures, faults, and errors.
Show system correctness, within the limits of optimistic inaccuracy.
Improve confidence that the system performs as specified (verification).
Improve confidence that the system performs as desired (validation).
"Program testing can be used to show the presence of bugs, but never to show their absence." [Dijkstra]

Slide 42: Testing Process Quality Goals
Accurate, complete, thorough, repeatable, systematic.

Slide 43: Who Does the Testing?
– Programmers: unit testing
– Testers: non-programmers
– Users: acceptance testing, alpha testing, beta testing
– (The public at large: bug bounties)

Slide 44: Levels of Testing
Unit testing:
– Testing of a single code unit
– Requires the use of test drivers
Functional/integration testing:
– Testing of the interfaces among integrated units
– Incremental or "big bang"
– Often requires test drivers and test stubs
System/acceptance testing:
– Testing of the complete system for satisfaction of the requirements

Slide 45: Levels of Testing (example)
[Class diagram: an AppointmentSystem containing Appointment (Patient p, Doctor d; setPatient(…), setDoctor(…)), Schedule (AppointmentList list, Patient patient; addNewAppt(…), cancelAppt(…)), PrivatePractice, EMRManagement, and Logger]
– Unit testing: Appointment.setPatient(…)
– Functional/integration testing: Schedule.addNewAppt(…)
– System/acceptance testing: make appointment, cancel appointment, login, logout

Slide 46: Test Tasks
Devise test cases:
– Target specific areas of the system
– Create specific inputs
– Create expected outputs
Choose test cases:
– Not all need to be run all the time (regression testing)
Run test cases:
– Can be labor intensive; an opportunity for automation
All in a systematic, repeatable, and accurate manner.

Slide 47: How to Choose Test Cases (I)
There are usually an infinite number of possible test cases for a given function, and far too many input-output pairs to verify exhaustively, so we must take a small sample. Example: a multiplier that takes two integers as input and produces one integer as output:

    int multiplier(int a, int b) { return a * b; }

Slide 48: How to Choose Test Cases (II)
Practically, testing can select only a very small set of inputs, so our goal should be to choose the best ones. What are the best five test cases for the multiplier? (That is: which five test cases, if they reveal no bugs, give you the most confidence that the multiplier is working correctly?) One candidate set is sketched below.
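
One candidate set, added as a suggestion rather than the official answer: cover a typical case, the zero and identity elements, sign handling, and the overflow boundary of 32-bit int arithmetic.

    // Five candidate test cases for the multiplier, as {a, b, expected} triples.
    public class MultiplierCases {
        static final int[][] CASES = {
            {3, 4, 12},                 // typical positive case
            {0, 12345, 0},              // zero operand
            {1, -7, -7},                // identity element and sign handling
            {-5, -6, 30},               // both operands negative
            {Integer.MAX_VALUE, 2, -2}, // overflow: 32-bit multiplication wraps around
        };
    }

The overflow case also probes the specification itself: does "correct" mean mathematical multiplication or Java's wrapping int semantics?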

Slide 49: How to Choose Test Cases (III)
– Intuition
– The specification (black-box testing): equivalence class partitioning, boundary-value analysis
– The code (white-box testing): path analysis
– Existing test cases (regression testing)
– Faults: error guessing, error-prone analysis

Slide 50: Test Automation
Opportunities: test execution and scaffolding.
Executing test cases:
– The most repetitive, non-creative aspect of the test process
– Design once, execute many times
– Tool support is available
White-box testing can be partially automated; black-box testing requires "formal" specifications to automate.

Slide 51: Scaffolding
A term borrowed from construction and civil engineering: additional code that supports development, but is usually not included or visible in the deployed/shipped code and is not experienced by the end user.
– Test driver: a function or program (a "main") for driving a test
– Test stub: a replacement for the "real code" that is being called by the program
A sketch of both appears below.
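
An added Java sketch of both pieces, reusing names from the appointment example on slide 45; the Logger interface and stub shown here are hypothetical. The driver's main exercises Schedule, while the stub stands in for the real logger.

    // Unit under test: depends on a Logger that we do not want to exercise yet.
    interface Logger { void log(String message); }

    class Schedule {
        private final Logger logger;
        private final java.util.List<String> appointments = new java.util.ArrayList<>();

        Schedule(Logger logger) { this.logger = logger; }

        void addNewAppt(String patient) {
            appointments.add(patient);
            logger.log("added appointment for " + patient);
        }

        int count() { return appointments.size(); }
    }

    // Test stub: replaces the "real code" called by the unit under test.
    class StubLogger implements Logger {
        public void log(String message) { /* deliberately does nothing */ }
    }

    // Test driver: a "main" that drives the test.
    public class ScheduleDriver {
        public static void main(String[] args) {
            Schedule schedule = new Schedule(new StubLogger());
            schedule.addNewAppt("patient-1");
            System.out.println(schedule.count() == 1 ? "SUCCEEDS" : "FAILS");
        }
    }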

Slide 52: Test Drivers/Stubs (illustration)

Slide 53: Test Drivers/Stubs (illustration)

Slide 54: Test Oracles
An oracle provides a mechanism for deciding whether a test case execution succeeds or fails. Oracles are critical to testing and are used in both white-box and black-box testing. They are difficult to automate: they typically rely on humans and human intuition, though formal specifications may help.

Slide 55: Oracle Example: Cosine
Your test execution shows cos(0.5) = 0.87758256189, and you have to decide whether this answer is correct. You need an oracle:
– Draw a triangle and measure the sides
– Look up the cosine of 0.5 in a book
– Compute the value using a Taylor series expansion
– Check the answer with your desk calculator
The Taylor series option is sketched below.
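
The Taylor series option can itself be coded as an automated oracle. An added sketch, using cos x = sum over k >= 0 of (-1)^k x^(2k) / (2k)!:

    // Independent Taylor-series oracle for cosine.
    public class CosineOracle {
        static double taylorCos(double x, int terms) {
            double sum = 0.0;
            double term = 1.0; // the k = 0 term
            for (int k = 0; k < terms; k++) {
                sum += term;
                term *= -x * x / ((2 * k + 1) * (2 * k + 2)); // next term from the previous one
            }
            return sum;
        }

        public static void main(String[] args) {
            double observed = 0.87758256189;    // value produced by the code under test
            double oracle = taylorCos(0.5, 10); // independently computed expectation
            System.out.println(Math.abs(observed - oracle) < 1e-9 ? "SUCCEEDS" : "FAILS");
        }
    }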

Slide 56: Two Overall Approaches to Testing
Black-box testing:
– Specification-based testing
– Test cases are designed, selected, and run based on the specifications
– Scale: tests higher-level system behavior
– Drawback: less systematic
White-box testing:
– Structural testing
– Test cases are designed, selected, and run based on the structure of the code
– Scale: tests the nitty-gritty
– Drawback: needs access to the source

Slide 57: Today's Lecture (agenda repeated)

Slide 58: Quiz 4 Study Guide
User orientation:
– User-centered design methods
– Be familiar with the Nielsen heuristics; memorize at least 3 of them
Testing:
– Validation/verification
– Techniques for "How do we know when we are done?"
– Testing: the basic process
– Error, fault, failure
– Testing goals
– Levels of testing (unit, functional/integration, system/acceptance)
– Oracles
– Test drivers/stubs
– The difference between white-box and black-box testing
Reading:
– The Toyota article

Slide 59: Next Time
Black-box (specification-based) testing

