A13. Testing Intro
Data Structures & SE, Computer Science Dept, Virginia Tech, Aug. 2001
© 1995-2001 Barnette ND, McQuain WD

Slide 1: Levels of Verification - The Unreachable Goal: Correctness

Slide 2: Testing and Errors

Relationship between discovered errors and undiscovered errors:
– 40-50% of all development time is spent in the testing process.
– Humans (programmers) are NOT good at testing; the act of testing admits that one has produced code with errors.
– Successful testing means successfully finding errors; a test run that uncovers no errors has, in this sense, failed.

[Chart: the probability that more errors exist increases with the number of errors found to date.]

Reference: "The Art of Software Testing", Myers, Glenford J., John Wiley & Sons, 1979.
"Testing can establish the presence of errors, but never their absence." [Edsger Dijkstra]

Slide 3: Testing Phases

– Regression Testing: fixing errors found during testing and re-executing all previously passed tests.
– Unit Testing: applies module testing techniques (white-box / black-box).
– Integration Testing: checks subsets of the system.
– Acceptance, Function, and System Testing: performed on the entire system.

[Diagram: life-cycle testing. The life-cycle phases (Requirements, Specification, High-Level Design, Low-Level Design, Coding, Integration, Deployment, Maintenance) are paired with the corresponding test levels (Acceptance Test, Function Test, System Test, Integration Test, Unit Test, Regression Test).]

Slide 4: Integration Testing

Bottom-Up Testing
– Unit test (black- and white-box techniques) discovers errors in individual modules.
– Requires coding (and testing) of driver routines.

Top-Down Testing
– The main module and its immediate subordinate routines are tested first.
– Requires coding of stubs to simulate lower-level routines (see the stub sketch below).
– The system is developed as a skeleton.

Sandwich Integration
– A combination of top-down and bottom-up testing.

Big Bang
– No integration testing: modules are developed alone and all modules are connected together at once.
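A minimal sketch of how a stub supports top-down integration, assuming a hypothetical computeInvoiceTotal module whose lower-level computeSalesTax routine is not yet coded; both names are invented for illustration:

```cpp
#include <iostream>

// Hypothetical lower-level routine, not yet implemented.
// The stub simulates it with a fixed, known result so that the
// higher-level module can be integration-tested top-down.
double computeSalesTax(double amount) {          // STUB
    std::cout << "stub: computeSalesTax(" << amount << ") called\n";
    return amount * 0.05;                        // canned 5% answer
}

// Higher-level module under test: this part is real code.
double computeInvoiceTotal(double subtotal) {
    return subtotal + computeSalesTax(subtotal);
}

int main() {
    // Exercise the top-level logic while the real tax routine is missing.
    std::cout << computeInvoiceTotal(100.0) << "\n";   // expect 105
    return 0;
}
```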

Slide 5: Function Testing (System vs. Specifications)

– Checks that the system satisfies its external specification.
– The entire system is viewed as a "black box".
– Techniques: boundary-value analysis, cause-effect graphing (see the boundary-value sketch below).

[Diagram: functional verification testing establishes a level of confidence, short of a proof of correctness; it relates the program to its external specifications, the program requirements, and the user documentation.]
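A minimal boundary-value analysis sketch, assuming a hypothetical isValidPercentage routine whose specification accepts integers 0 through 100; the technique chooses test inputs at, just inside, and just outside each boundary:

```cpp
#include <cassert>

// Hypothetical component: the external spec says valid inputs are 0..100.
bool isValidPercentage(int p) {
    return p >= 0 && p <= 100;
}

int main() {
    // Boundary-value analysis: test at, just inside, and just outside
    // each boundary of the specified input range.
    assert(!isValidPercentage(-1));   // just below lower boundary
    assert( isValidPercentage(0));    // on lower boundary
    assert( isValidPercentage(1));    // just above lower boundary
    assert( isValidPercentage(99));   // just below upper boundary
    assert( isValidPercentage(100));  // on upper boundary
    assert(!isValidPercentage(101));  // just above upper boundary
    return 0;
}
```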

Slide 6: Acceptance Testing (System vs. Users)

– Tests the program against the current needs of the users and its original objectives.
– Usually performed by the end user (customer).
– The contract may require, as part of the acceptance test:
  † performance tests (throughput, statistics collection, ...)
  † stress tests (system limits)
– If performed by the system developers, it may consist of alpha (α) and beta (β) testing.

[Diagram: the program is checked against the program requirements, the external specifications, and the user documentation.]

Slide 7: Testing Experiment

Program
– The program reads 3 integer values from a line.
– The 3 values represent the lengths of the sides of a triangle.
– The program outputs whether the triangle is equilateral, isosceles, or scalene.
– Write a set of test cases which would adequately test this program!

Test Cases
– Valid scalene triangle.
– Valid equilateral triangle.
– Valid isosceles triangle.
– All possible permutations of isosceles triangles (e.g. (3,3,4), (3,4,3), (4,3,3)).
– One side having a zero value.
– One side having a negative value.
– Degenerate triangle, i.e. 1-dimensional (e.g. (1,2,3)).
– All possible permutations of degenerate triangles (e.g. (1,2,3), (3,1,2), (1,3,2)).
– Invalid triangle (e.g. (1,2,4)).
– All possible permutations of invalid triangles.
– All sides = 0.
– Non-integer values.
– Incorrect number of sides.

(A sketch of a test driver for some of these cases follows.)
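A minimal sketch of a driver for some of these test cases, assuming a hypothetical classifyTriangle(a, b, c) function; the implementation shown is only a stand-in so that the cases can be executed, not the program the slide has in mind:

```cpp
#include <cassert>
#include <string>

// Hypothetical implementation of the program under test; a real
// experiment would exercise the student's own program instead.
std::string classifyTriangle(int a, int b, int c) {
    if (a <= 0 || b <= 0 || c <= 0) return "invalid";
    if (a + b <= c || a + c <= b || b + c <= a) return "invalid"; // degenerate or impossible
    if (a == b && b == c) return "equilateral";
    if (a == b || b == c || a == c) return "isosceles";
    return "scalene";
}

int main() {
    // A few of the test cases from the slide, with expected outputs
    // defined in advance (see the testing principles slide).
    assert(classifyTriangle(3, 4, 5)  == "scalene");      // valid scalene
    assert(classifyTriangle(5, 5, 5)  == "equilateral");  // valid equilateral
    assert(classifyTriangle(3, 3, 4)  == "isosceles");    // isosceles, all permutations
    assert(classifyTriangle(3, 4, 3)  == "isosceles");
    assert(classifyTriangle(4, 3, 3)  == "isosceles");
    assert(classifyTriangle(0, 4, 5)  == "invalid");      // zero side
    assert(classifyTriangle(-1, 4, 5) == "invalid");      // negative side
    assert(classifyTriangle(1, 2, 3)  == "invalid");      // degenerate (1-dimensional)
    assert(classifyTriangle(1, 2, 4)  == "invalid");      // violates triangle inequality
    return 0;
}
```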

Slide 8: Exhaustive Testing Example

Practical Limitations
– How long would it take to try all possible inputs (e.g. every value of a single 32-bit integer) at a rate of one test per second?

    2^32 tests * 1 second/test = 2^32 seconds
                               = 2^32 / (60 * 60 * 24 * 365) years
                               > 2^32 / (2^6 * 2^6 * 2^5 * 2^9) years
                               = 2^32 / 2^26 years
                               = 2^6 years = 64 years

  (This is a lower bound; the exact figure is about 136 years. The arithmetic is checked in the sketch below.)
– Exhaustive testing cannot be performed!
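A quick check of the arithmetic (the exact quotient rather than the power-of-two lower bound used on the slide):

```cpp
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t tests          = 1ULL << 32;            // 2^32 inputs
    const std::uint64_t secondsPerYear = 60ULL * 60 * 24 * 365; // 31,536,000
    // At one test per second, exhaustive testing of a single 32-bit
    // input would take roughly this many years:
    std::cout << static_cast<double>(tests) / secondsPerYear << " years\n"; // ~136.2
    return 0;
}
```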

Slide 9: Testing Principles

General Heuristics
– The expected output for each test case should be defined in advance of the actual testing.
– The test output should be thoroughly inspected.
– Test cases must be written for invalid and unexpected input conditions, as well as for valid and expected ones.
– Test cases should be saved and documented for use during the maintenance / modification phase of the life cycle.
– New test cases must be added as new errors are discovered.
– The test cases must be a demanding exercise of the component under test.
– Tests should be carried out by an independent third-party tester; developers should not keep testing to themselves, since testing one's own code is a conflict of interest.
– Testing must be planned as the system is being developed, NOT after coding.

Goal of Testing
– No method (black box, white box, etc.) can detect all errors.
– Errors may be reported due to a testing error rather than a program error.
– A finite number of test cases must be chosen so as to maximize the probability of locating errors.
– Perform testing to ensure that the probability of program/system failure due to undiscovered errors is acceptably small.

Slide 10: Testing Mechanics

Testing components
– Drivers
  † Test harness
– Stubs
  † Scaffold code

[Diagram: a driver feeds test-case inputs to routine X (the component under test) and checks the results against valid test outputs; stubs a, b, c, d stand in for routines required by X but not yet coded.]

(A sketch of a driver is given below.)
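A minimal sketch of a driver acting as a small test harness, assuming a hypothetical median3 component under test and a table of test-case inputs with their expected outputs; none of these names come from the slides:

```cpp
#include <iostream>

// Hypothetical component under test.
int median3(int a, int b, int c) {
    if ((a <= b && b <= c) || (c <= b && b <= a)) return b;
    if ((b <= a && a <= c) || (c <= a && a <= b)) return a;
    return c;
}

// Driver: feeds test-case inputs to the component and compares the
// actual outputs with the expected ("valid") outputs.
int main() {
    struct Case { int a, b, c, expected; };
    const Case cases[] = {
        {1, 2, 3, 2}, {3, 2, 1, 2}, {2, 3, 1, 2}, {5, 5, 1, 5}, {7, 7, 7, 7},
    };
    int failures = 0;
    for (const Case& t : cases) {
        int actual = median3(t.a, t.b, t.c);
        if (actual != t.expected) {
            std::cout << "FAIL: median3(" << t.a << "," << t.b << "," << t.c
                      << ") = " << actual << ", expected " << t.expected << "\n";
            ++failures;
        }
    }
    std::cout << failures << " failing test case(s)\n";
    return failures == 0 ? 0 : 1;
}
```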

Slide 11: White Box Testing

Structural Testing
– Exercises the source code and internal data structures.
– Test cases are derived from analysis of internal module logic and external module specifications.
– Logic coverage (condition/decision testing):
  † Statement coverage
  † Decision coverage
  † Condition coverage
  † Decision/condition coverage
  † Multiple condition coverage
– Path coverage:
  † Control flow testing

Correct I/O relationships are verified using both the functional description and the actual implementation.

Slide 12: White Box: Logic Testing

Logic Coverage
– Statement Coverage
  † Every statement is executed at least once.
– Decision Coverage
  † Each decision is tested for both TRUE and FALSE outcomes.
  † The correctness of the conditions within the decisions is NOT tested.
– Condition Coverage
  † Each condition in a decision takes on all possible outcomes at least once.
  † Does not necessarily test all decision outcomes.
  † Test cases do not take into account how the conditions affect the decisions.
    E.g.: if (x >= 5 && y != 7 && z == 9) { ... }   // at least 3 tests
– Decision/Condition Coverage
  † Satisfies both decision coverage and condition coverage.
  † Does NOT necessarily test all possible combinations of conditions in a decision.
– Multiple Condition Coverage
  † Tests all possible combinations of conditions in a decision.
  † Does not test all possible combinations of decision branches.
    E.g.: if (x >= 5 && y != 7 && z == 9) { ... }   // 2^3 = 8 tests!

(A worked coverage example follows.)
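A sketch of what these coverage levels mean for the slide's decision, using hypothetical test values; the comments note which condition and decision outcomes each call exercises:

```cpp
#include <iostream>

// Decision from the slide: three conditions joined by &&.
const char* check(int x, int y, int z) {
    if (x >= 5 && y != 7 && z == 9)
        return "taken";      // decision TRUE
    return "not taken";      // decision FALSE
}

int main() {
    // Decision coverage: one test where the whole decision is TRUE,
    // one where it is FALSE.
    std::cout << check(5, 0, 9) << "\n";   // TRUE:  conditions (T, T, T)
    std::cout << check(4, 0, 9) << "\n";   // FALSE: first condition F, && short-circuits

    // Multiple condition coverage would instead require all 2^3 = 8
    // input combinations of (x >= 5, y != 7, z == 9), e.g.:
    // (4,7,0) (4,7,9) (4,0,0) (4,0,9) (5,7,0) (5,7,9) (5,0,0) (5,0,9)
    return 0;
}
```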

Slide 13: White Box: Path Testing

Control Flow Graph
– Node: a sequence of statements ending in a branch.
– Arc: a transfer of control.

Path Testing
– Exercise a program by testing all possible execution paths through the code.
– Method:
  1. Enumerate the paths to be tested.
  2. Find the input domain of each path.
  3. Select one or more test cases from each domain.
– Problem: loops (unbounded number of paths).
  [Diagram: a flow graph with nodes A, B, C, where B loops back to itself; paths: ABC, ABBC, AB...BC.]
– Solution:
  † Restrict each loop to N iterations.
  † Select a small number of paths that yield reasonable testing.

Exhaustive Path Testing (impossible)
– The analogue of exhaustive input testing.
– Requires executing every way of going from the top of the graph to the bottom: approximately 100 trillion unique paths even for a small graph containing a loop.
– Assumes all decisions are independent of each other.
– Even then, specification errors could still exist.
– Does not detect missing paths.
– Does not check data-dependent errors.

(A small control-flow example follows.)
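A minimal sketch of the ABC / ABBC / AB...BC situation, assuming a hypothetical sumPositive function: node A is the entry, node B is the loop body, and node C is the exit, so the number of distinct paths grows with the number of loop iterations:

```cpp
#include <iostream>
#include <vector>

// Node A: entry; Node B: loop body (one execution per iteration);
// Node C: exit. Path ABC skips the loop, ABBC iterates once, etc.
int sumPositive(const std::vector<int>& values) {
    int total = 0;                    // A
    for (int v : values) {            // B (repeated once per iteration)
        if (v > 0) total += v;
    }
    return total;                     // C
}

int main() {
    std::cout << sumPositive({}) << "\n";          // path ABC   (0 iterations)
    std::cout << sumPositive({3}) << "\n";         // path ABBC  (1 iteration)
    std::cout << sumPositive({3, -1, 4}) << "\n";  // path ABBBC (3 iterations)
    return 0;
}
```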

Slide 14: Test Path Determination

Independent Path
– Any path that introduces at least one new set of processing statements (nodes), i.e. it must traverse an edge not previously covered.
– Independent paths: [enumerated on the flow graph in the original slide]

Cyclomatic Complexity
– An upper bound on the number of independent paths, i.e. the number of tests that must be executed in order to cover all statements.
– CC = edges - nodes + 2 = E - N + 2  [the slide computes this for its example graph]

(A small worked example follows.)
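A small worked example (not the graph from the original slide, which is not reproduced here): for a hypothetical countEvens routine with one loop decision and one if decision, one way of drawing the flow graph gives E = 7 and N = 6:

```cpp
#include <iostream>

// Hypothetical routine: one loop decision plus one if decision.
int countEvens(const int* a, int n) {
    int evens = 0;
    for (int i = 0; i < n; ++i) {   // decision 1: loop condition
        if (a[i] % 2 == 0)          // decision 2: if condition
            ++evens;
    }
    return evens;
}

int main() {
    // One flow graph for countEvens has E = 7 edges and N = 6 nodes,
    // so CC = E - N + 2 = 7 - 6 + 2 = 3: at most three test cases are
    // needed to cover a basis set of independent paths.
    int data[] = {1, 2, 3, 4};
    std::cout << countEvens(data, 4) << "\n";   // expect 2
    return 0;
}
```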

Slide 15: Path Input Domains

Reverse Path Analysis
– Each path is driven by a subset of the input domain.
– Recreate the test data by 'tracing' the path in reverse, collecting the conditions on the input variables.

[Diagram: a flow graph with nodes A through F; the highlighted path ABDEAF corresponds to one subset of the input domain.]

Slide 16: Testing Reliability

Question: When to stop testing?
Answer: When no more errors exist. Impossible to ascertain. Instead ask:
– (1) How reliable is the set of test cases?  (data domain)
– (2) How reliable is the software being developed?  (time domain)

Time Domain Reliability
– MTBF: mean time between failures
– MTTF: mean time to failure
– MTTR: mean time to repair
– MTBF = MTTF + MTTR
– Availability = MTTF / (MTTF + MTTR) * 100%
– Estimation methods: (1) predictions based on calendar time, (2) predictions based on CPU time.
(A short availability computation follows.)

[Diagram: reliability measures split into the data domain (coverage, mutation analysis, error seeding) and the time domain (Shooman, Jelinski-Moranda, and Musa models).]
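A quick sketch of the MTBF and availability formulas using made-up figures (an MTTF of 98 hours and an MTTR of 2 hours are assumptions for illustration):

```cpp
#include <iostream>

int main() {
    // Assumed figures for illustration only.
    const double mttf = 98.0;   // mean time to failure (hours)
    const double mttr = 2.0;    // mean time to repair (hours)

    const double mtbf         = mttf + mttr;                 // MTBF = MTTF + MTTR
    const double availability = mttf / (mttf + mttr) * 100.0;

    std::cout << "MTBF = " << mtbf << " hours\n";             // 100 hours
    std::cout << "Availability = " << availability << "%\n";  // 98%
    return 0;
}
```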

Slide 17: Mutation Analysis

The purpose of mutation analysis is to test the test suite.
– Mutate the code to determine the adequacy of the test data.
– Determines whether all deliberately introduced (mutant) errors are detected by the original test cases.

[Diagram: an "Original" code listing shown side by side with a "Mutant" listing containing one deliberately introduced error.]

Slide 18: Mutation Analysis Process

Mutation Testing Process
1. Program P is executed for test case set T.
2. If errors occur, test case set T has succeeded; the errors are corrected and retesting continues until no errors are observed with T.
3. The program is mutated, giving P'.
4. Mutant P' is executed for test case set T.
   – If no errors are found, T is inadequate and further testing is required (error seeding): new test cases are added, and step 3 is repeated until all mutations are discovered; the entire process then restarts at step 1 with the new test cases.
   – Otherwise (all mutations are located by T), T is adequate and no further testing is required.

[Flowchart: the program and its mutations are tested against the test cases; if not all mutations are discovered, the test cases are expanded and testing repeats; once all mutations are discovered, testing is complete.]

(A tiny mutant-killing example follows.)
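A minimal sketch of a single mutation, assuming a hypothetical isAdult function; the mutant changes >= to >, and the boundary test case 18 'kills' it by producing a different result than the original:

```cpp
#include <iostream>

// Original program under test.
bool isAdult(int age)       { return age >= 18; }

// Mutant: the relational operator is deliberately changed (>= becomes >).
bool isAdultMutant(int age) { return age > 18; }

int main() {
    // Test case T = {18}: expected output is true.
    // The original passes, the mutant fails, so T detects (kills) this
    // mutant and is, to that extent, adequate.
    int t = 18;
    std::cout << "original: " << isAdult(t)
              << "  mutant: " << isAdultMutant(t) << "\n";   // 1 vs 0
    return 0;
}
```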

Slide 19: Error Seeding

Technique: an estimate of the number of original, undiscovered errors remaining in a system.
1. Intentionally introduce (seed) errors into the source code.
2. Execute the test cases on the source code.
3. Count the number of seeded errors and original (unseeded) errors discovered.
4. Estimate the total number of original errors.

[Graph: error scattergram plotting seeded versus original errors discovered.]
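The slide does not give the estimation formula; the usual one (Mills' error-seeding estimate, shown here with made-up counts) assumes that seeded and original errors are equally likely to be discovered:

```cpp
#include <iostream>

int main() {
    // Made-up counts for illustration.
    const double seededTotal   = 20;  // errors deliberately introduced
    const double seededFound   = 15;  // seeded errors the tests discovered
    const double originalFound = 30;  // original (unseeded) errors discovered

    // If tests find seeded and original errors with the same probability,
    // originalFound / originalTotal ~= seededFound / seededTotal, so:
    const double originalTotal   = originalFound * seededTotal / seededFound; // ~40
    const double remainingErrors = originalTotal - originalFound;             // ~10

    std::cout << "Estimated original errors: " << originalTotal << "\n";
    std::cout << "Estimated undiscovered errors remaining: " << remainingErrors << "\n";
    return 0;
}
```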