Software Testing

Purpose: to ensure that software meets its design specifications, especially at the boundary conditions. "Boundary conditions" are conditions where performance changes from acceptable to unacceptable. –"Boundary conditions" may sometimes instead mean where performance changes from one level (e.g., "summer") to another (e.g., "fall").
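A boundary-condition test can be sketched as follows. The `passes` function and its threshold of 60 are hypothetical, chosen only to illustrate probing values just below, at, and just above the point where behavior changes.

```python
# Hypothetical pass/fail function whose behavior changes at the boundary
# score of 60 (both the function and the threshold are illustrative).
def passes(score):
    """Return True if the score meets the passing threshold of 60."""
    return score >= 60

# Boundary-value tests probe just below, at, and just above the boundary,
# not only values deep inside each region.
assert passes(59) is False   # just below the boundary
assert passes(60) is True    # exactly at the boundary
assert passes(61) is True    # just above the boundary
```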

Simplifying Assumptions Software either fails or it doesn't; there are no "partial failures." Software performance can be measured without ambiguity. (Contrast the ambiguity of: "I'm positive that the criminal was about average height and had either green or blue eyes and may have had a hat or coat, but maybe not.")

Two Types of Errors The software fails, but we think it works (Type 1 error). –The guilty man is acquitted. The software works, but we think it fails (Type 2 error). –The innocent man is convicted.

Two Types of Errors Type 1 and Type 2 errors are complementary: Reducing one may increase the other. Accepting everything eliminates Type 2 error (no good cases are rejected) but maximizes Type 1 error (all bad cases are accepted). Rejecting everything eliminates Type 1 error (no bad cases are accepted) but maximizes Type 2 error (all good cases are rejected).

Two Types of Errors Complete performance testing must address both types of errors. –Testing "out-of-bounds" cases tests for Type 1 error (accepting an incorrect input). –Testing "in-bounds" cases tests for Type 2 error (rejecting a correct input).
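Both directions can be exercised in one test suite. The age validator below is a hypothetical example (not from the slides): in-bounds cases probe for Type 2 errors, out-of-bounds cases probe for Type 1 errors.

```python
# Hypothetical validator: should accept ages 0-120 and reject everything else.
def accepts(age):
    return 0 <= age <= 120

# In-bounds cases test for Type 2 error (a correct input wrongly rejected).
for good in (0, 35, 120):
    assert accepts(good), f"Type 2 error: rejected valid age {good}"

# Out-of-bounds cases test for Type 1 error (an incorrect input wrongly accepted).
for bad in (-1, 121, 500):
    assert not accepts(bad), f"Type 1 error: accepted invalid age {bad}"
```

Note that the first loop alone would say nothing about Type 1 errors, and the second alone nothing about Type 2; both are needed.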

Examples of Type 1 and Type 2 Errors Goal: Select the attractive women from the following set: {Jodie Foster, Gwyneth Paltrow, Tom Hanks, Harrison Ford}. –{Jodie Foster, Gwyneth Paltrow, Tom Hanks} contains a Type 1 error. –{Jodie Foster} contains a Type 2 error. –{Jodie Foster, Tom Hanks} contains both types.

Type 1 and Type 2 Errors [Diagram: a performance scale divided into an "Acceptable Performance" region and an "Unacceptable Performance" region. Accepting a point in the unacceptable region is a Type 1 error; rejecting a point in the acceptable region is a Type 2 error.]

Type 1 and Type 2 Errors [Diagram: a program that should print positive integers and reject negative ones. Accepting a negative integer is a Type 1 error; rejecting a positive integer is a Type 2 error.]

Testing for Both Types of Error Type 1 and Type 2 errors are complementary. –Testing for Type 1 tells nothing about Type 2. –Testing for Type 2 tells nothing about Type 1. Both types must be tested for independently.

Testing Project 2a Required: Test the program to ensure it printed positive numbers and rejected negative ones. Testing with positive numbers tests only for Type 2 error: does the program wrongly reject proper input? Testing with negative numbers tests only for Type 1 error: does the program wrongly accept improper input?
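A minimal sketch of that test plan; the `accept` function here is a stand-in for the project's actual accept/reject logic, which is assumed, not given.

```python
# Stand-in for the project's logic: accept positive numbers, reject negatives.
def accept(n):
    return n > 0

# Positive inputs probe only for Type 2 error (wrong rejection of good input)...
for n in (1, 7, 100):
    assert accept(n), f"Type 2 error: rejected positive number {n}"

# ...negative inputs probe only for Type 1 error (wrong acceptance of bad input).
for n in (-1, -7, -100):
    assert not accept(n), f"Type 1 error: accepted negative number {n}"
```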

What to Test? Goal: every predicate –if, while, for, do-while, switch. Limit: must be observable. –Input and output must be observable; "You can't test what you can't see."
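As a sketch, the hypothetical function below contains two predicates, a `while` and an `if`; a test set covers them only if each predicate is driven both true and false by some case.

```python
# Hypothetical function with two predicates to be covered by tests.
def sum_positive(values):
    """Sum only the positive numbers in the list."""
    total = 0
    i = 0
    while i < len(values):      # predicate 1: loop condition
        if values[i] > 0:       # predicate 2: branch condition
            total += values[i]
        i += 1
    return total

assert sum_positive([]) == 0        # predicate 1 immediately false
assert sum_positive([3, 4]) == 7    # predicate 2 true
assert sum_positive([-5]) == 0      # predicate 2 false
```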

How To Test Develop specific test cases with known answers. –If you don't know the answer beforehand, you're "experimenting," not "testing." Trace program logic to ensure that every predicate is tested at least once. –Analysis of program flow using graph theory can give a minimum set of tests to ensure complete coverage (McCabe cyclomatic complexity). Test at "critical points" (where behavior changes).
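The graph-theory analysis mentioned above can be sketched numerically: McCabe's cyclomatic complexity V(G) = E − N + 2 counts the independent (basis) paths through a flow graph, and hence a minimum number of test cases. The flow graph below is a hypothetical one for a function with one `if` inside one `while`.

```python
def cyclomatic_complexity(edges, nodes):
    """V(G) = E - N + 2 for a single connected flow graph."""
    return len(edges) - len(nodes) + 2

# Hypothetical flow graph: one `if` nested inside one `while`.
nodes = ["entry", "while", "if", "then", "after_if", "exit"]
edges = [("entry", "while"), ("while", "if"), ("if", "then"),
         ("if", "after_if"), ("then", "after_if"),
         ("after_if", "while"), ("while", "exit")]

# Two binary predicates (while, if) give V(G) = 2 + 1 = 3 basis paths.
assert cyclomatic_complexity(edges, nodes) == 3
```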

How To Test Test at "critical points" (where behavior changes). Save the testing procedure, data, and results for reuse after maintenance.