Department of Informatics, UC Irvine
SDCL: Software Design and Collaboration Laboratory
sdcl.ics.uci.edu

Informatics 43: Introduction to Software Engineering
Lecture 8-2, May 21, 2015
Emily Navarro

Duplication of course material for any commercial purpose without the explicit written permission of the professor is prohibited.

Today's Lecture
- Quality assurance
- Testing

What Do These Have in Common?
- Airbus 320
- Audi 5000
- Mariner 1 launch
- AT&T telephone network
- Ariane 5
- Word 3.0 for Mac
- Radiation therapy machine
- NSA
- Y2K

They All Failed!
- Airbus 320
- Audi 5000 – "unintended" acceleration problem
- Mariner 1 launch
- AT&T telephone network – ripple effect, from switch to switch; network down/dark for 2-3 days
- Ariane 5
- Word 3.0 for Mac – "plagued with bugs"; replaced for free later
- Radiation therapy machine
- Y2K

Y2K Facts
Bug description
- Date formats were MM/DD/YY, e.g., 01/01/98, 02/02/99, 03/03/00
- 98 -> 1998, 99 -> 1999
- But does 00 mean 2000 or 1900?
- Does 1999 turn to 19100?
Effects
- Relatively minor
Cost: $300 billion!
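The two-digit-year ambiguity on this slide can be sketched in a few lines. This is a hypothetical illustration: the function names and the pivot value are invented here, not part of the lecture.

```python
# Hypothetical sketch of the Y2K bug: a two-digit year drops the century.
def parse_year_1900(yy):
    """Naive legacy expansion: always assume the 1900s."""
    return 1900 + yy

# A common remediation: a sliding "pivot" window (pivot value is arbitrary).
def parse_year_windowed(yy, pivot=70):
    """Two-digit years below the pivot map to 20xx, the rest to 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(parse_year_1900(98))     # 1998, as intended
print(parse_year_1900(0))      # 1900, but the user likely meant 2000
print(parse_year_windowed(0))  # 2000
```

The windowed fix only postpones the problem (here until two-digit year 70), which is one reason the remediation effort was so costly.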

Impact of Failures
Not just "out there"
- Space shuttle
- Mariner 1
- Ariane 5
But also "at home"
- Your car
- Your call to your mom
- Your wireless network, social network, mobile app
- Your homework
- Your hospital visit
Peter Neumann's Risks Digest

Verification and Validation
Verification
- Ensure software meets specifications
- Internal consistency
- "Are we building the product right?"
- e.g., testing, inspections, program analysis
Validation
- Ensure software meets customer's intent
- External consistency
- "Are we building the right product?"
- e.g., usability testing, user feedback

Verification and Validation
- Verification: "Implement the idea properly"
- Validation: "Implement the proper idea"

Software Qualities
Correctness, reliability, efficiency, integrity, usability, maintainability, testability, flexibility, portability, reusability, interoperability, performance, etc.

Quality Assurance
All activities designed to measure and improve quality in a product
Assure that each of the software qualities is met
- Goals set in requirements specification
- Goals realized in implementation
Sometimes easy, sometimes difficult
- Portability versus safety
Sometimes immediate, sometimes delayed
- Understandability versus evolvability
Sometimes provable, sometimes doubtful
- Size versus correctness

An Idealized View of QA
Correctness-preserving transformations at every step:
1. Complete formal specification of problem to be solved
2. Design, in formal notation
3. Code, in verifiable language
4. Executable machine code
5. Execution on verified hardware

A Realistic View of QA
1. Mixture of formal and informal specifications
2. Design, in mixed notation (via manual transformation)
3. Code, in C++, Java, Ada, ... (via manual transformation)
4. (Intel Pentium-based) machine code (via compilation by a commercial compiler)
5. Execution on commercial hardware (with commercial firmware)

First Complication
Real needs vs. "correct" specification vs. actual specification
No matter how sophisticated the QA process, the problem of creating the initial specification remains

Second Complication
There are often multiple, sometimes conflicting qualities to be tested, making QA a challenge.

Third Complication
Sometimes the software system is extremely complicated, making it tremendously difficult to perform QA:
- Complex data communications, e.g., an electronic fund transfer system
- Distributed processing, e.g., a web search engine
- Stringent performance objectives, e.g., an air traffic control system
- Complex processing, e.g., a medical diagnosis system

Fourth Complication
It is difficult to divide the particular responsibilities involved in performing quality assurance among project management, the development group, and the quality assurance group.

Fourth Complication
Quality assurance lays out the rules
- You will check in your code every day
- You will comment your code
- You will ...
Quality assurance also uncovers the faults
- Taps developers on their fingers
- Creates an image of "competition"
Quality assurance is viewed as cumbersome, "heavy"
- "Just let me code"
Quality assurance has a negative connotation

Available Techniques
- Formal program verification
- Static analysis of program properties
  - Concurrent programs: deadlock, starvation, fairness
  - Performance: min/max response time
- Code reviews and inspections
- Testing
Most techniques are geared towards verifying correctness

Which Technique to Use?
There is no "silver bullet" of testing
- A mixture of techniques is needed
Different approaches are needed for different faults
- e.g., testing for race conditions vs. performance issues
Different approaches are needed at different times
- e.g., unit testing vs. usability testing vs. system testing

How do we know when we are done?
We can never find all faults
- But we cannot test forever!
We aim to reveal as many faults as possible in a given period of time
- More faults found and fixed = good
- More bugs found = more bugs not found
Aim to meet the quality requirements established for the project

How do we know when we are done?
[Chart: number of problems found per hour, over time, Day 1 through Day 5]
Could stop testing when the problem find rate stabilizes to near zero

How do we know when we are done?
[Chart: confidence in the module being tested (%) vs. number of test cases with correct outputs; "sweet spot?" marked on the curve]

How do we know when we are done?
Can pepper the code with defects and observe how many of the seeded defects are discovered
Scenario:
- The program is seeded with 10 defects
- After some test cases are executed:
  - 7 of the seeded defects are found
  - 45 nonseeded defects are found
- Since 70% of the seeded defects are found (and 30% are not), assume that the nonseeded defects follow the same pattern
- 45 is 70% of 64, so there are 19 (64 - 45) defects left to be found
This technique assumes that nonseeded defects are similar to the seeded ones
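The seeding arithmetic on this slide can be checked with a short script. A minimal sketch: the function name is ours, and `round` papers over the fact that 45/0.7 is not a whole number.

```python
def estimate_remaining_defects(seeded_total, seeded_found, nonseeded_found):
    """Defect-seeding estimate: assume nonseeded defects are found at the
    same rate as the seeded ones."""
    find_rate = seeded_found / seeded_total               # 7 / 10 = 0.7
    estimated_total = round(nonseeded_found / find_rate)  # 45 / 0.7 ~= 64
    return estimated_total - nonseeded_found              # 64 - 45 = 19

print(estimate_remaining_defects(10, 7, 45))  # 19
```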

Reminder: Use the Principles
- Rigor and formality
- Separation of concerns
  - Modularity
  - Abstraction
- Anticipation of change
- Generality
- Incrementality

Testing
Using a set of techniques to detect and correct errors in a software product
Exercise a module, collection of modules, or system
- Use predetermined inputs ("test cases")
- Capture actual outputs
- Compare actual outputs to expected outputs
  - Actual outputs equal to expected outputs -> test case succeeds
  - Actual outputs not equal to expected outputs -> test case fails

Testing Process Model
1. Decide what to test.
2. Select a test case input.
3. Determine the expected output E.
4. Run the system with the test case input.
5. Capture the actual output A.
6. Compare E and A. Different? Inform the programmer.
7. Loop back to 1 or 2, if time permits.
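Steps 3-6 of the process model can be sketched as a small loop. This is a minimal Python sketch; the system under test (`double`) and the test data are invented for illustration.

```python
def run_test_case(system_under_test, test_input, expected_output):
    """Steps 4-6: run the system, capture the actual output, compare E and A."""
    actual_output = system_under_test(test_input)
    if actual_output != expected_output:
        print(f"FAIL: input={test_input!r}, "
              f"expected={expected_output!r}, actual={actual_output!r}")
        return False
    return True

# Hypothetical system under test
def double(x):
    return x * 2

# Steps 1-3 happened up front: we decided to test double() with these cases.
cases = [(2, 4), (0, 0), (-3, -6)]
print(all(run_test_case(double, i, e) for i, e in cases))  # True
```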

V-Model of Development and Testing
Each development activity is paired with a review, and with tests that are developed alongside it and executed later:
- Develop Requirements (Requirements Review) <-> Develop Acceptance Tests (Acceptance Test Review) -> Execute System Tests
- Design (Design Review) <-> Develop Integration Tests (Integration Tests Review) -> Execute Integration Tests
- Code (Code Review) <-> Develop Unit Tests (Unit Tests Review) -> Execute Unit Tests

Spiral Model
Each cycle: risk analysis, then a development activity (rapid prototype, specification, design, and implementation in successive cycles), then verification.

The RUP Model
[Diagram: phases (Inception, Elaboration, Construction, Transition) run across iterations (Preliminary, #1, #2, ..., #n, #n+1, #n+2, ..., #m, #m+1); process workflows (Business Modeling, Requirements, Analysis & Design, Implementation, Test, Deployment) and supporting workflows (Configuration Mgmt, Management, Environment) vary in intensity across the phases]

Extreme Programming

Test-Driven Development

Testing Terminology
Error
- A human action that produces an incorrect result
- May or may not produce a fault
Fault
- A condition that may (or may not) cause a failure
- Caused by an error
- Commonly referred to as a "bug"
Failure
- The inability of a system to perform a function according to its specifications
- Result of a fault
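A tiny, hypothetical example makes the terminology concrete: the same fault may or may not surface as a failure, depending on the input.

```python
# Error (in the programmer's mind): believing integer division is fine here.
# Fault (in the code): using floor division // instead of true division /.
def average(a, b):
    return (a + b) // 2  # the fault lives on this line

print(average(2, 4))  # 3  -- no failure: the fault happens to be masked
print(average(2, 3))  # 2  -- failure: the correct average is 2.5
```

This is also why testing only "lucky" inputs (like 2 and 4 above) can leave a fault undetected.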

Error, Fault, Failure
Error (in programmer's mind) -> Fault or defect (in code) -> Failure (in execution or output)

Testing Goals
- Detect failures/faults/errors
- Locate failures/faults/errors
- Fix failures/faults/errors
- Show system correctness
  - Within the limits of optimistic inaccuracy
- Improve confidence that the system performs as specified (verification)
- Improve confidence that the system performs as desired (validation)
"Program testing can be used to show the presence of bugs, but never to show their absence" [Dijkstra]

Testing Process Quality Goals
- Accurate
- Complete
- Thorough
- Repeatable
- Systematic

Test Planning
- Set a quality goal for the project
- Choose test methodologies and techniques
- Assign resources
- Bring in tools
- Set a schedule

Who Does the Testing?
Programmers
- Unit testing
Testers
- Non-programmers
Users
- Acceptance testing
- Alpha testing
- Beta testing

Levels of Testing
Unit testing
- Testing of a single code unit
- Requires use of test drivers
Functional/integration testing
- Testing of interfaces among integrated units
  - Incremental or "big bang"
- Often requires test drivers and test stubs
System/acceptance testing
- Testing of complete system for satisfaction of requirements

Levels of Testing
[Diagram: a ZotMyHealth app containing LoginManager, CalorieTracker, WorkoutTracker, SleepTracker, and SettingsManager; CalorieTracker holds a MealList (List: mealList, App: connectedApp) with void addMeal(...) and void deleteMeal(...); Meal has String: mealName, int: numCalories, void setNumCalories(...)]
- Unit testing: Meal.setNumCalories(...)
- Functional/integration testing: CalorieTracker.addMeal(Meal m)
- System/acceptance testing: add a meal, delete a workout, login, logout

Test Tasks
Devise test cases
- Target specific areas of the system
- Create specific inputs
- Create expected outputs
Choose test cases
- Not all need to be run all the time (regression testing)
Run test cases
- Can be labor intensive
- Opportunity for automation
All in a systematic, repeatable, and accurate manner
Test case: a group of input values that cause a program to take some defined action, with an expected output

How to Choose Test Cases (I)
There are usually an infinite number of possible test cases for a given function
There are too many input-output pairs to exhaustively verify, so we must take a small sample
Example: multiplier
- Input: two integers
- Output: one integer
int multiplier(int a, int b) { return a * b; }

How to Choose Test Cases (II)
Practically, testing can only select a very small set of inputs
- Our goal should be to choose the best ones
What are the best five test cases for a multiplier?
- (AKA: what five test cases, if they don't reveal any bugs, will give you the most confidence that the multiplier is working correctly?)
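One reasonable answer, sketched in Python below. This particular selection is our judgment call for illustration, not the slide's official answer.

```python
def multiplier(a, b):
    return a * b

# Five candidate cases: typical values, sign handling, the zero element,
# and a magnitude near a fixed-width overflow boundary.
cases = [
    ((3, 4), 12),         # typical positive operands
    ((-3, 4), -12),       # mixed signs
    ((-5, -6), 30),       # both negative
    ((0, 7), 0),          # zero operand
    ((2**30, 2), 2**31),  # large values (overflow boundary in 32-bit ints)
]
print(all(multiplier(a, b) == expected for (a, b), expected in cases))  # True
```

Note that Python integers do not overflow; the last case matters for languages with fixed-width `int`, like the C/Java-style snippet on the previous slide.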

How to Choose Test Cases (III)
- Intuition
- Specification (black-box testing)
  - Equivalence class partitioning
  - Boundary-value analysis
- Code (white-box testing)
  - Path analysis
- Existing test cases (regression testing)
- Faults
  - Error guessing
  - Error-prone analysis

Test Automation
Opportunities
- Test execution
- Scaffolding
Executing test cases
- Most repetitive, non-creative aspect of the test process
- Design once, execute many times
- Tool support available: JUnit for Java, xUnit in general
White-box testing can be partially automated
Black-box testing requires "formal" specifications to automate
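The slide names JUnit for Java; Python's member of the xUnit family is the standard-library `unittest` module. A minimal sketch (the test contents are invented; the runner is invoked programmatically so the suite can be re-executed at will):

```python
import unittest

def multiplier(a, b):
    return a * b

class MultiplierTest(unittest.TestCase):
    def test_typical(self):
        self.assertEqual(multiplier(3, 4), 12)

    def test_zero(self):
        self.assertEqual(multiplier(0, 7), 0)

# Design once, execute many times: the runner re-executes the whole suite.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(MultiplierTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```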

Scaffolding
Term borrowed from construction and civil engineering
Additional code to support development
- Usually not included or visible in the deployed/shipped code
- Not experienced by the end user
Test driver
- A function or program ("main") for driving a test
Test stub
- A replacement for the "real code" that's being called by the program
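A driver and a stub in miniature. All names here are invented for illustration; in practice a stub replaces something expensive or unavailable, such as a network or database dependency.

```python
class WeatherServiceStub:
    """Test stub: stands in for a real, network-backed weather service."""
    def current_temp_f(self, city):
        return 72  # canned answer; no network call, always the same result

def clothing_advice(weather_service, city):
    """Unit under test: depends on a collaborator we replace with the stub."""
    return "coat" if weather_service.current_temp_f(city) < 50 else "t-shirt"

# Test driver: the "main" that feeds inputs and checks outputs.
advice = clothing_advice(WeatherServiceStub(), "Irvine")
print("PASS" if advice == "t-shirt" else "FAIL")
```

Neither the stub nor the driver ships with the product; both exist only so the unit can be exercised in isolation.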

Test Drivers/Stubs

Test Oracles
Provide a mechanism for deciding whether a test case execution succeeds or fails
Critical to testing
- Used in white-box testing
- Used in black-box testing
Difficult to automate
- Typically relies on humans and human intuition
- Formal specifications may help

Oracle Example: Cosine
Your test execution produces a value for cos(0.5). You have to decide whether this answer is correct.
You need an oracle:
- Draw a triangle and measure the sides
- Look up the cosine of 0.5 in a book
- Compute the value using a Taylor series expansion
- Check the answer with your desk calculator
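The Taylor-series option can serve as an executable oracle. A sketch: the tolerance and term count are our choices, and `math.cos` stands in for the code under test.

```python
import math

def cos_oracle(x, terms=12):
    """Independent oracle: cos(x) = sum of (-1)^n * x^(2n) / (2n)!"""
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
               for n in range(terms))

candidate = math.cos(0.5)  # stand-in for the output of the code under test
print(abs(candidate - cos_oracle(0.5)) < 1e-9)  # True: the oracle agrees
```

The oracle must be computed independently of the implementation being tested; otherwise both could share the same fault.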

Two Approaches
Black-box testing
- Specification-based testing
- Test cases designed, selected, and run based on specifications
- Scale: tests the higher-level system behavior
- Drawback: less systematic
White-box testing
- Structural testing
- Test cases designed, selected, and run based on the structure of the code
- Scale: tests the nitty-gritty
- Drawback: needs access to the source

Reminder
Discussion tomorrow
- Bring a laptop or tablet

Next Time
Black-box (Specification-based) Testing