Slide 1: Informatics 43: Introduction to Software Engineering. Lecture 6-2, November 5, 2015. Emily Navarro. Software Design and Collaboration Laboratory (SDCL), Department of Informatics, UC Irvine (sdcl.ics.uci.edu). Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

Slides 2-4: Today's Lecture
- Midterm answers
- Failures: a second look
- Quality assurance
- Testing
- Quiz 4 study guide

Slide 5: What Do These Have in Common?
- Airbus 320
- Toyota
- Mariner 1 launch
- AT&T telephone network
- Ariane 5
- Word 3.0 for Mac
- Radiation therapy machine
- NSA
- Y2K

Slide 6: They All Failed!
- Airbus 320
- Toyota: "unintended" acceleration problem
- Mariner 1 launch
- AT&T telephone network: ripple effect from switch to switch; network down/dark for 2-3 days
- Ariane 5
- Word 3.0 for Mac: "plagued with bugs," replaced for free later
- Radiation therapy machine
- Y2K

Slide 7: Toyota Failure: What Was the Problem?
- Overly complex spaghetti code: untestable, unmaintainable
- Violated standards set by the industry: single-point failures
- No peer code reviews
- System threw away error codes

Slide 8: Toyota Failure: What Was the Problem? (continued)
- Overly complex spaghetti code: untestable, unmaintainable
- Violated standards set by the industry: single-point failures
- No peer code reviews
- System threw away error codes
- In short: Toyota did not follow good quality assurance practices

Slide 9: Y2K Facts
- Bug description: date formats were MM/DD/YY, e.g., 01/01/98, 02/02/99, 03/03/00
- 98 maps to 1998 and 99 to 1999, but does 00 mean 2000 or 1900? Does 1999 roll over to 19100?
- Effects: relatively minor
- Cost: $300 billion!

Slide 10: Impact of Failures
- Not just "out there": space shuttle, Mariner 1, Ariane 5
- But also "at home": your car, your call to your mom, your wireless network, social network, mobile app, your homework, your hospital visit
- See Peter Neumann's Risks Digest

Slide 11: Today's Lecture
- Midterm answers
- Failures: a second look
- Quality assurance
- Testing
- Quiz 4 study guide

Slide 12: QA Goals: Verification and Validation
- Verification: ensure the software meets its specifications; internal consistency; "Are we building the product right?"; "implement the idea properly"; e.g., testing, inspections, program analysis
- Validation: ensure the software meets the customer's intent; external consistency; "Are we building the right product?"; "implement the proper idea"; e.g., usability testing, user feedback

Slide 13: Software Qualities
Correctness, reliability, efficiency, integrity, usability, maintainability, testability, flexibility, portability, reusability, interoperability, performance, etc.

Slide 14: Quality Assurance
- All activities designed to measure and improve quality in a product
- Assure that each of the software qualities is met: goals set in the requirements specification, goals realized in the implementation
- Sometimes easy, sometimes difficult (portability versus safety)
- Sometimes immediate, sometimes delayed (understandability versus evolvability)
- Sometimes provable, sometimes doubtful (size versus correctness)

Slide 15: An Idealized View of QA
A chain of correctness-preserving transformations: complete formal specification of the problem to be solved -> design, in formal notation -> code, in a verifiable language -> executable machine code -> execution on verified hardware.

Slide 16: A Realistic View of QA
A mixture of formal and informal specifications -> (manual transformation) design, in mixed notation -> (manual transformation) code, in C++, Java, Ada, … -> (compilation by a commercial compiler) Intel Pentium-based machine code -> execution on commercial hardware with commercial firmware.

Slide 17: First Complication
Real needs vs. the "correct" specification vs. the actual specification: no matter how sophisticated the QA process, the problem of creating the initial specification remains.

Slide 18: Second Complication
Correctness! Safety, security, efficiency, usability, reliability: there are often multiple, sometimes conflicting qualities to be tested, making QA a challenge.

Slide 19: Third Complication
Sometimes the software system is extremely complicated, making QA tremendously difficult to perform:
- Complex data communications (electronic fund transfer)
- Distributed processing (web search engine)
- Stringent performance objectives (air traffic control system)
- Complex processing (medical diagnosis system)

Slide 20: Fourth Complication
It is difficult to divide the responsibilities involved in performing quality assurance among project management, the development group, and the quality assurance group.

Slide 21: Fourth Complication (continued)
Quality assurance has a negative connotation:
- It lays out the rules: "You will check in your code every day," "You will comment your code," "You will…"
- It uncovers the faults: taps developers on their fingers, creates an image of "competition"
- It is viewed as cumbersome and "heavy": "Just let me code"

Slide 22: Available Techniques
- Formal program verification
- Static analysis of program properties (concurrent programs: deadlock, starvation, fairness; performance: min/max response time)
- Code reviews and inspections
- Testing
Most techniques are geared toward verifying correctness.

Slide 23: Which Technique to Use?
- There is no "silver bullet" of testing; a mixture of techniques is needed
- Different approaches are needed for different faults (e.g., testing for race conditions vs. performance issues)
- Different approaches are needed at different times (e.g., unit testing vs. usability testing vs. system testing)

Slide 24: All Software Has Bugs
"All nontrivial code has defects, and the probability of nontrivial defects increases with code size. The more code you use to solve a problem, the harder it gets for someone else to understand what you did and to maintain your code when you have moved on to write still larger programs." (Holzmann, "Code Inflation," IEEE Software, 2015)

Slide 25: All Software Has Bugs
Q: What is the required period of failure-free operation for conventional takeoffs and landings of the F35 Joint Strike Fighter? A) 6,000 years; B) 600 years; C) 60 years; D) 6 hours.

Slide 26: All Software Has Bugs (answer)
Q: What is the required period of failure-free operation for conventional takeoffs and landings of the F35 Joint Strike Fighter? A) 6,000 years; B) 600 years; C) 60 years; D) 6 hours. And a recent government report stated that this target had not yet been realized!

Slide 27: Some bugs are bizarre…

Slide 28: Some bugs are long-lived…

Slide 29: Some bugs are not bugs at all…

Slide 30: How Do We Know When We Are Done?
- We can never find all faults
- We aim to reveal as many faults as possible in a fixed period of time with a fixed budget (more faults found and fixed = good; more bugs found = more bugs not found)
- Generally the more likely types of defects are caught; unfortunately, these are usually not the ones that cause major failures
- Aim to meet the quality requirements established for the project

Slide 31: How Do We Know When We Are Done? [Chart: number of problems found per hour over time, Days 1-5.]

Slide 32: How Do We Know When We Are Done? The same chart; we could stop testing when the problem find rate stabilizes to near zero.

Slide 33: How Do We Know When We Are Done? [Chart: confidence in the module being tested vs. percentage of test cases with correct outputs.] Sweet spot?

Slide 34: How Do We Know When We Are Done?
Pepper the code with defects ("defect seeding") and observe how many of the seeded defects are discovered. Scenario:
- The program is seeded with 10 defects
- After some test cases are executed, 7 of the seeded defects and 45 nonseeded defects are found
- Since 70% of the seeded defects were found (and 30% were not), assume the nonseeded defects follow the same pattern: 45 is 70% of about 64, so roughly 19 (64 minus 45) defects are left to be found
- This technique assumes that nonseeded defects are similar to the seeded ones
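The seeding arithmetic on the slide can be sketched as a few lines of code. This is an illustrative helper (the name `estimateRemaining` is not from the course), assuming the slide's rounding to a whole number of defects:

```java
// Hypothetical sketch of the defect-seeding estimate from the slide.
public class SeedingEstimate {
    // Estimate how many nonseeded defects remain, assuming nonseeded
    // defects are found at the same rate as seeded ones.
    static int estimateRemaining(int seeded, int seededFound, int nonseededFound) {
        double findRate = (double) seededFound / seeded;                  // e.g., 7/10 = 0.7
        int estimatedTotal = (int) Math.round(nonseededFound / findRate); // 45/0.7 ≈ 64
        return estimatedTotal - nonseededFound;                           // 64 - 45 = 19
    }

    public static void main(String[] args) {
        System.out.println(estimateRemaining(10, 7, 45)); // prints 19
    }
}
```

Note that the estimate is only as good as the assumption in the last bullet: if the seeded defects are easier (or harder) to find than the naturally occurring ones, the projection is biased.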

Slide 35: Today's Lecture
- Midterm answers
- Failures: a second look
- Quality assurance
- Testing
- Quiz 4 study guide

Slide 36: Testing: Basic Process
Testing uses a set of techniques to detect (and then correct) errors in a software product. Exercise a module, collection of modules, or system:
- Use a predetermined input (a "test case")
- Run the test case
- Capture the actual output
- Compare the actual output to the expected output: if actual output = expected output, the test case SUCCEEDS; if actual output ≠ expected output, the test case FAILS (report the failure)
- Lather, rinse, and repeat
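The run-and-compare loop on the slide can be sketched in a few lines. The doubling "module" and the names here are illustrative stand-ins, not from the course:

```java
// Minimal sketch of the basic test process: predetermined input,
// captured actual output, comparison against expected output.
public class BasicTestProcess {
    static int doubleIt(int x) { return x * 2; }   // the unit under test

    // Run one test case and report whether it succeeds.
    static boolean runTestCase(int input, int expected) {
        int actual = doubleIt(input);              // capture actual output
        return actual == expected;                 // true = SUCCEEDS, false = FAILS
    }

    public static void main(String[] args) {
        System.out.println(runTestCase(3, 6));     // true
        System.out.println(runTestCase(3, 7));     // false: would be reported as a failure
    }
}
```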

Slide 37: Testing Terminology
- Test case: a group of input values that cause a program to take some defined action, together with an expected output
- Error: a human action that produces an incorrect result; may or may not produce a fault
- Fault: a condition that may (or may not) cause a failure; caused by an error; a "bug"
- Failure: the inability of a system to perform a function according to its specifications; the result of a fault

Slide 38: Error, Fault, Failure
Error (in the programmer's mind) -> fault or defect (in the code) -> failure (in execution or output).
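The chain can be made concrete with a tiny invented example (not from the lecture): the programmer's error is thinking "strictly less than" when the intent was "less than or equal"; the fault is the resulting `<` in the code; the failure only shows up for the boundary input.

```java
// Illustrative only: an error produces a fault, which causes a failure
// for some inputs but not others.
public class FaultVsFailure {
    // Intended behavior: return true when x is in [0, 10].
    // The fault: '<' should have been '<='.
    static boolean inRangeBuggy(int x) {
        return x >= 0 && x < 10;   // fault, from the error in the programmer's mind
    }

    public static void main(String[] args) {
        System.out.println(inRangeBuggy(5));   // true: fault present, but no failure
        System.out.println(inRangeBuggy(10));  // false: here the fault causes a failure
    }
}
```

This is also why "fault present" and "failure observed" are distinct: a test suite that never exercises the boundary input would never see this fault fail.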

Slides 39-40: Error, Fault, Failure [illustrations]

Slide 41: Testing Goals
- Detect, locate, and fix failures/faults/errors
- Show system correctness, within the limits of optimistic inaccuracy
- Improve confidence that the system performs as specified (verification)
- Improve confidence that the system performs as desired (validation)
"Program testing can be used to show the presence of bugs, but never to show their absence." [Dijkstra]

Slide 42: Testing Process Quality Goals
Accurate, complete, thorough, repeatable, systematic.

Slide 43: Who Does the Testing?
- Programmers: unit testing
- Testers: non-programmers
- Users: acceptance testing, alpha testing, beta testing
- (The public at large: bug bounties)

Slide 44: Levels of Testing
- Unit testing: testing of a single code unit; requires the use of test drivers
- Functional/integration testing: testing of the interfaces among integrated units (incremental or "big bang"); often requires test drivers and test stubs
- System/acceptance testing: testing of the complete system for satisfaction of the requirements

Slide 45: Levels of Testing (example)
[Class diagram: an AppointmentSystem containing Schedule (with an AppointmentList list, void addNewAppt(…), void cancelAppt(…)), Appointment (with Patient p, Doctor d, void setPatient(…), void setDoctor(…)), plus PrivatePractice, EMRManagement, and Logger components.]
- Unit testing: Appointment.setPatient(…)
- Functional/integration testing: Schedule.addNewAppt(…)
- System/acceptance testing: make appointment, cancel appointment, login, logout
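A unit test at the lowest level of the slide's diagram might look like the sketch below. The lecture shows only the interface of `Appointment`, so the implementation here (and the use of a plain `String` for the patient) is a minimal stand-in invented for illustration:

```java
// Sketch of unit-testing Appointment.setPatient from the slide's diagram.
// The Appointment body and getPatient are assumptions, not course material.
public class AppointmentUnitTest {
    static class Appointment {
        private String patient;                       // slide shows "Patient p"; simplified
        void setPatient(String p) { patient = p; }
        String getPatient() { return patient; }
    }

    public static void main(String[] args) {          // the main acts as the test driver
        Appointment a = new Appointment();
        a.setPatient("Ada");
        System.out.println("Ada".equals(a.getPatient())); // true
    }
}
```

An integration test of `Schedule.addNewAppt(…)` would then exercise `Schedule` together with real `Appointment` objects, and a system test would drive the whole AppointmentSystem through "make appointment" and "cancel appointment" scenarios.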

Slide 46: Test Tasks
- Devise test cases: target specific areas of the system, create specific inputs, create expected outputs
- Choose test cases: not all need to be run all the time (regression testing)
- Run test cases: can be labor-intensive; an opportunity for automation
All in a systematic, repeatable, and accurate manner.

Slide 47: How to Choose Test Cases (I)
There are usually an infinite number of possible test cases for a given function; there are too many input-output pairs to verify exhaustively, so we must take a small sample. Example: a multiplier taking two integers as input and producing one integer as output:
    int multiplier(int a, int b) { return a * b; }

Slide 48: How to Choose Test Cases (II)
Practically, testing can select only a very small set of inputs, so our goal should be to choose the best ones. What are the best five test cases for a multiplier? (That is: which five test cases, if they reveal no bugs, give you the most confidence that the multiplier is working correctly?)

Slide 49: How to Choose Test Cases (III)
- Intuition
- Specification (black-box testing): equivalence class partitioning, boundary-value analysis
- Code (white-box testing): path analysis
- Existing test cases (regression testing)
- Faults: error guessing, error-prone analysis
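One possible answer to slide 48's "best five" question, using the slide's own multiplier: a typical case, the zero equivalence class, negative and mixed-sign classes, and a boundary value that exposes silent `int` overflow in Java. This selection is an illustration, not the course's official answer:

```java
// Five candidate test cases for the multiplier from slide 47.
public class MultiplierTests {
    static int multiplier(int a, int b) { return a * b; }  // code from the slide

    public static void main(String[] args) {
        System.out.println(multiplier(3, 4) == 12);    // typical positive case
        System.out.println(multiplier(0, 99) == 0);    // zero equivalence class
        System.out.println(multiplier(-3, -4) == 12);  // both negative
        System.out.println(multiplier(-3, 4) == -12);  // mixed signs
        // Boundary value: Java int arithmetic wraps around on overflow
        System.out.println(multiplier(Integer.MAX_VALUE, 2) == -2);
    }
}
```

The last case is the interesting one: all five comparisons print true, so the multiplier "passes," yet the overflow result (-2) is almost certainly not what a caller wants, which is exactly why boundary-value analysis pairs inputs with carefully thought-out expected outputs.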

Slide 50: Test Automation
- Opportunities: test execution, scaffolding
- Executing test cases is the most repetitive, non-creative aspect of the test process: design once, execute many times; tool support is available
- White-box testing can be partially automated; black-box testing requires "formal" specifications to automate

Slide 51: Scaffolding
- A term borrowed from construction and civil engineering
- Additional code to support development, usually not included or visible in the deployed/shipped code and not experienced by the end user
- Test driver: a function or program ("main") for driving a test
- Test stub: a replacement for the "real code" being called by the program
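A driver and a stub together might look like the following sketch. The billing scenario, the `InsuranceService` interface, and all names here are hypothetical, invented only to show the two scaffolding roles side by side:

```java
// Sketch: unit-testing a billing calculation with a driver and a stub.
public class ScaffoldingDemo {
    interface InsuranceService {                 // dependency of the unit under test
        double coverageRate(String plan);        // the "real code" we want to avoid calling
    }

    // Unit under test: computes what the patient owes after coverage.
    static double patientOwes(double bill, String plan, InsuranceService svc) {
        return bill * (1.0 - svc.coverageRate(plan));
    }

    public static void main(String[] args) {     // test driver: the "main" that drives the test
        // Test stub: returns a canned 80% coverage rate instead of
        // contacting a real insurance system.
        InsuranceService stub = plan -> 0.8;
        double owed = patientOwes(100.0, "basic", stub);
        System.out.println(Math.abs(owed - 20.0) < 1e-9); // true
    }
}
```

The driver and stub are pure scaffolding: neither ships with the product, and the stub lets the unit be tested before (or without) the real service existing.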

Slides 52-53: Test Drivers/Stubs [illustrations]

Slide 54: Test Oracles
- Provide a mechanism for deciding whether a test case execution succeeds or fails
- Critical to testing: used in both white-box and black-box testing
- Difficult to automate: typically relies on humans and human intuition; formal specifications may help

Slide 55: Oracle Example: Cosine
Your test execution reports a value for cos(0.5). How do you decide whether that answer is correct? You need an oracle:
- Draw a triangle and measure the sides
- Look up the cosine of 0.5 in a book
- Compute the value using a Taylor series expansion
- Check the answer with your desk calculator
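The Taylor-series option from the slide can be automated: compute cos(0.5) independently and compare it to the value under test. In this sketch `Math.cos` stands in for the implementation being judged; the series code is an illustration of the oracle idea, not course material:

```java
// An automated oracle for cosine: an independent Taylor-series computation.
// cos x = sum over n of (-1)^n x^(2n) / (2n)!
public class CosineOracle {
    static double cosTaylor(double x, int terms) {
        double sum = 0.0, term = 1.0;                     // term for n = 0
        for (int n = 0; n < terms; n++) {
            sum += term;
            term *= -x * x / ((2 * n + 1) * (2 * n + 2)); // next term of the series
        }
        return sum;
    }

    public static void main(String[] args) {
        double underTest = Math.cos(0.5);                 // the value being judged
        double oracle = cosTaylor(0.5, 8);                // independent computation
        System.out.println(Math.abs(underTest - oracle) < 1e-9); // true: test passes
    }
}
```

The key property of a good oracle is independence: the series computation shares no code with the implementation under test, so agreement between the two gives real evidence of correctness.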

Slide 56: Two Overall Approaches to Testing
- Black-box testing (specification-based testing): test cases are designed, selected, and run based on the specifications; scale: tests higher-level system behavior; drawback: less systematic
- White-box testing (structural testing): test cases are designed, selected, and run based on the structure of the code; scale: tests the nitty-gritty; drawback: needs access to the source

Slide 57: Today's Lecture
- Midterm answers
- Failures: a second look
- Quality assurance
- Testing
- Quiz 4 study guide

Slide 58: Quiz 4 Study Guide
- User orientation: user-centered design methods; be familiar with the Nielsen heuristics and memorize at least 3 of them
- Testing: validation/verification; techniques for "How do we know when we are done?"; the basic testing process; error, fault, failure; testing goals; levels of testing (unit, functional/integration, system/acceptance); oracles; test drivers/stubs; the difference between white-box and black-box testing
- Reading: the Toyota article

Slide 59: Next Time
Black-box (specification-based) testing.