


1 CS4723 Software Validation and Quality Assurance Lecture 02 Overview of Software Testing

2 Approaches to remove bugs
- Testing
  - Feed input to the software and run it to see whether its behavior is as expected
  - Limitations
    - Impossible to cover all cases
    - Test oracles (what is expected) are hard to obtain
- Static checking
  - Identify specific problems (e.g., memory leaks) in the software by scanning the code or all possible paths
  - Limitations
    - Limited problem types
    - False positives

3 Approaches to remove bugs
- Formal proof
  - Formally prove that the program implements the specification
  - Limitations
    - Difficult to obtain a formal specification
    - The proof costs a lot of human effort
- Inspection
  - Manually review the code to detect faults
  - Limitations
    - Hard to evaluate
    - Sometimes hard to make progress

4 The answer is testing. Why?
- "50% of my employees are testers, and the rest spend 50% of their time testing" ---- Bill Gates, 1995
- More reliable than inspection, relatively cheap
  - In the old days, when testing was expensive, inspection was the major answer
- You get what you pay for (linear rewards)
- Compared to the other 3 approaches
  - Inspection, static checking, formal proof

5 Testing: Concepts
- Test case
- Test oracle
- Test suite
- Test script
- Test driver
- Test coverage

6 Testing: Concepts
- Test case
  - An execution of the software with a given list of input values
  - Includes:
    - Input values, sometimes fed in different steps
    - Expected outputs
- Test oracle
  - The expected outputs of the software for a given list of input values
  - A part of each test case
  - The hardest problem in automated testing: the test oracle problem

7 Testing: Concepts: Example
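The original example on this slide is not preserved in the transcript. As a minimal sketch of the concepts, the following shows a test case as input values plus an expected output (the oracle); the function `absolute_value` and the chosen values are invented for illustration, not from the lecture.

```python
def absolute_value(x):
    """Hypothetical function under test."""
    return x if x >= 0 else -x

# Each test case pairs input values with the expected output (the oracle).
test_cases = [
    (5, 5),    # input 5, oracle says output should be 5
    (-3, 3),   # input -3, oracle says output should be 3
    (0, 0),    # boundary input, oracle says output should be 0
]

for input_value, expected in test_cases:
    actual = absolute_value(input_value)
    assert actual == expected, f"{input_value}: got {actual}, expected {expected}"
print("all test cases passed")
```

Note that writing the oracle required knowing the intended behavior in advance; automating that step is the test oracle problem mentioned above.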

8 Testing: Concepts
- Test suite
  - A collection of test cases
  - Usually these test cases share similar pre-conditions and configuration
  - Usually they can be run together in sequence
  - Different test suites serve different purposes
    - Smoke tests, certain platforms, certain features, performance, ...
- Test script
  - A script that runs a sequence of test cases or a test suite automatically

9 Testing: Concepts
- Test driver
  - A software framework that can load a collection of test cases or a test suite
  - It can usually handle the configuration and the comparison between expected and actual outputs
- Test coverage
  - A measurement that evaluates how thoroughly the testing has been done
  - The measure can be based on multiple elements
    - Code
    - Input combinations
    - Specifications
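As one concrete sketch of a suite and a driver, the following uses Python's unittest framework as the test driver: it loads the suite, runs each case, and compares actual against expected outputs. The function under test, `clamp`, is an invented example, not from the lecture.

```python
import unittest

def clamp(x, lo, hi):
    """Hypothetical function under test: restrict x to the range [lo, hi]."""
    return max(lo, min(x, hi))

class ClampSuite(unittest.TestCase):
    # Each test method is one test case; assertEqual encodes the oracle.
    def test_below_range(self):
        self.assertEqual(clamp(-5, 0, 10), 0)

    def test_inside_range(self):
        self.assertEqual(clamp(7, 0, 10), 7)

    def test_above_range(self):
        self.assertEqual(clamp(99, 0, 10), 10)

if __name__ == "__main__":
    # The driver loads the suite and reports each comparison of
    # expected vs. actual outputs.
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampSuite)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

The three cases share the same configuration (range 0..10) and run together in sequence, which is exactly what makes them a suite.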

10 Granularity of Testing: V-model

11 Granularity of testing
- Unit / module testing
  - Tests a single module
- Integration testing
  - Tests the interaction between modules
- System testing
  - Tests the system as a whole; done by developers on test cases
- Acceptance testing
  - Validates the system against user requirements; done by customers with no formal test cases

12 Stages of software testing
- Development-time testing
  - Unit testing, integration testing
- Before-release testing
  - System testing, acceptance testing
- User testing
  - Actual usage -> field bugs & patches

13 Types of testing by how they are designed
- White box testing
  - The tester knows everything about the implementation
  - They know where the bugs are more likely to be
  - They can exercise paths in the code
- Black box testing
  - The testers act like normal users
  - They just try to cover the input space and corner cases

14 Black Box Testing: General Guidelines
- Divide value ranges and cover each part
- Cover boundary values
- Try to reach all error messages
- Try to trigger potential exceptions
  - Feed invalid inputs
    - Wrong formats, too long, too short, empty, ...
- Try combinations of all of the above
- If the input is a sequence, repeat the same inputs and try different inputs many times

15 Black Box Testing Techniques
- Boundary value testing
  - Boundary value analysis
  - Robustness testing
  - Worst case testing
- Equivalence class testing
- Decision table based testing

16 Boundary Value Analysis
- Errors tend to occur near the extreme values of input variables
- Boundary value analysis focuses on the boundary of the input space to identify test cases
- Boundary value analysis selects input variable values at:
  - The minimum
  - Just above the minimum
  - A nominal value
  - Just below the maximum
  - The maximum

17 Example of Boundary Value Analysis
- Assume a program accepting two inputs y1 and y2, such that a < y1 < b and c < y2 < d
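The figure for this example is lost in the transcript, so here is a sketch of the construction in code under the single fault assumption described on the next slide. The concrete bounds (1..10 for y1, 20..30 for y2) are invented for illustration; the lecture leaves them symbolic.

```python
def bva_values(lo, hi):
    """The five BVA values for one variable:
    min, just above min, nominal, just below max, max."""
    return [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]

def bva_test_cases(ranges):
    """Single-fault BVA: hold every variable at its nominal value and let
    one variable at a time take its five boundary values (4n+1 cases)."""
    nominal = [(lo + hi) // 2 for lo, hi in ranges]
    cases = {tuple(nominal)}
    for i, (lo, hi) in enumerate(ranges):
        for v in bva_values(lo, hi):
            case = list(nominal)
            case[i] = v
            cases.add(tuple(case))
    return sorted(cases)

cases = bva_test_cases([(1, 10), (20, 30)])  # ranges for y1 and y2
print(len(cases))  # 4n+1 = 9 cases for n = 2 variables
```

Each case differs from the all-nominal case in at most one variable, which is what keeps the count linear in n.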

18 Single Fault Assumption for Boundary Value Analysis
- Boundary value analysis is also guided by the single fault assumption principle: "Failures rarely occur as the result of the simultaneous occurrence of two (or more) faults"
- Accordingly, boundary value analysis test cases can be obtained by holding all but one variable at their nominal values and letting that one variable assume its extreme values

19 Generalization of Boundary Value Analysis
- Basic boundary value analysis can be generalized in two ways:
  - By the number of variables: (4n + 1) test cases for n variables
  - By the kinds of ranges of the variables: map them to an order
    - Strings
    - Sequences
    - Complex data structures, e.g., trees

20 Application Scenario of Boundary Value Analysis
- Several independent variables that represent bounded physical quantities
- No consideration of the function of the program, nor of the semantic meaning of the variables
- Good news: we can distinguish between physical and other types of variables

21 Robustness Testing
- A simple extension of boundary value analysis
- In addition to the five boundary value analysis values of each variable, we add a value slightly greater than the maximum (max+) and a value slightly less than the minimum (min-)
- The main value of robustness testing is to force attention on exception handling

22 Example of Robustness Testing
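The example figure is lost in the transcript, so here is a sketch of the seven robustness values for one variable (min-, min, min+, nominal, max-, max, max+) driving a function with explicit exception handling. The bounds (0..100) and the function `check_percentage` are invented for illustration.

```python
def robustness_values(lo, hi):
    """BVA's five values plus the two invalid values just outside the range."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def check_percentage(p):
    """Hypothetical function under test with explicit exception handling."""
    if not 0 <= p <= 100:
        raise ValueError(f"percentage out of range: {p}")
    return p

for v in robustness_values(0, 100):
    try:
        check_percentage(v)
        print(f"{v}: accepted")
    except ValueError as e:
        print(f"{v}: rejected ({e})")  # min- and max+ land here
```

The two out-of-range values are precisely what forces attention on the exception-handling path.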

23 Worst Case Testing
- No single fault assumption: errors can happen when more than one variable has an extreme value
- Starting from the n inputs of boundary value analysis, we take the Cartesian product of the five values across the 1, 2, 3, ..., n variables
- This yields 5^n test cases for n input variables
- The more the inputs interact, the more worst case testing matters
  - Input interactions: length & width (interact) vs. length & price (independent)

24 Example of Worst Case Testing
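The example figure is lost in the transcript. The construction itself is short: drop the single fault assumption and take the Cartesian product of the five BVA values across all variables. The bounds below are invented for illustration.

```python
from itertools import product

def bva_values(lo, hi):
    """The five BVA values: min, min+, nominal, max-, max."""
    return [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]

ranges = [(1, 10), (20, 30)]  # two bounded variables, e.g., y1 and y2
cases = list(product(*(bva_values(lo, hi) for lo, hi in ranges)))
print(len(cases))  # 5**2 = 25 test cases
```

The exponential growth (5^n) is why worst case testing is reserved for inputs that genuinely interact.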

25 Equivalence Class Testing
- Divide the value range of an input into a number of subsets
  - The subsets are disjoint
  - The union of the subsets is the whole value range
  - Values within one subset make no difference to the software concerned
- Water temperature in a car: below 212 vs. at or above 212 (°F)
- Normal colors vs. metallic colors

26 Example of Equivalence Class Testing
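The slide's original example is not preserved, so here is an invented one: a grading function whose input range 0..100 splits into four disjoint classes, with one representative value tested per class. The thresholds and the function `letter_grade` are my own, not from the lecture.

```python
def letter_grade(score):
    """Hypothetical function under test: map a 0-100 score to a grade."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 70:
        return "C"
    return "F"

# One representative value per equivalence class.
# The classes are disjoint and their union is the whole range 0..100.
representatives = {
    "F": 35,   # class [0, 69]
    "C": 75,   # class [70, 79]
    "B": 85,   # class [80, 89]
    "A": 95,   # class [90, 100]
}
for expected, score in representatives.items():
    assert letter_grade(score) == expected
print("one representative per class passed")
```

Per the note on the next slide, boundary values such as 69/70 and 89/90 should still be checked in addition to these representatives.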

27 Equivalence Class Testing
- The use of equivalence class testing has two motivations:
  - A sense of complete testing
    - Representing the entire input set provides a form of completeness
  - Avoiding redundancy
    - Disjointness assures a form of non-redundancy
- Note
  - Also check boundaries
  - Combinations of inputs follow the same rule: more interaction -> more combinations

28 Equivalence Classes for Non-numeric Inputs
- Feature extraction
  - For string and structured inputs
  - Split the possible value set by a certain feature
  - Example: String passwd => {contains space}, {no space}
- It is possible to extract multiple features from one input
  - Example: String name => {capitalized first letter}, {not}
    => {contains space}, {not}
    => {length > 10}, {2-10}, {1}, {0}
  - One test case may cover multiple features
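The `name` example above can be sketched as code: a function that maps a string to the class it occupies under each feature. The predicate names are mine; only the features themselves come from the slide.

```python
def features(name: str):
    """Map a string to the equivalence class it belongs to, per feature."""
    if len(name) == 0:
        length_class = "0"
    elif len(name) == 1:
        length_class = "1"
    elif len(name) <= 10:
        length_class = "2-10"
    else:
        length_class = ">10"
    return {
        "capitalized": name[:1].isupper(),
        "contains_space": " " in name,
        "length": length_class,
    }

# One test input covers one class of every feature at once.
print(features("Ada Lovelace"))
```

A small set of inputs chosen to hit every class of every feature gives broad coverage with few test cases.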

29 Decision Table
- Makes it easy to observe that all possible conditions are accounted for
- Decision tables can be used for:
  - Specifying complex program logic
  - Generating test cases with oracles

30 Example of Decision Table: Printer Troubleshooting

  Rule                                    1  2  3  4  5  6  7  8
  Conditions:
  Printer does not print                  Y  Y  Y  Y  N  N  N  N
  A red light is flashing                 Y  Y  N  N  Y  Y  N  N
  Printer is unrecognized                 Y  N  Y  N  Y  N  Y  N
  Actions:
  Check the power cable                         X
  Check the printer-computer cable        X     X
  Ensure printer software is installed    X     X     X     X
  Check/replace ink                       X  X        X  X
  Check for paper jam                        X     X
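The table above can be turned into executable logic, which is also how decision tables generate test cases with oracles: each rule (a combination of condition values) is an input, and its action set is the expected output. The rule-to-action placement follows the common textbook version of this printer example; the original slide image is not preserved, so treat it as a reconstruction.

```python
from itertools import product

def troubleshoot(does_not_print, red_light, unrecognized):
    """Return the recommended actions for one rule of the decision table."""
    actions = []
    if does_not_print and not red_light and unrecognized:
        actions.append("Check the power cable")
    if does_not_print and unrecognized:
        actions.append("Check the printer-computer cable")
    if unrecognized:
        actions.append("Ensure printer software is installed")
    if red_light:
        actions.append("Check/replace ink")
    if does_not_print and not unrecognized:
        actions.append("Check for paper jam")
    return actions

# Enumerate all 8 rules: each (conditions, actions) pair is a test case
# with its oracle.
for rule in product([True, False], repeat=3):
    print(rule, troubleshoot(*rule))
```

Enumerating the Cartesian product of condition values is what makes it easy to see that every combination is accounted for.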

31 Decision Table Usage
- The decision-table model is applicable when:
  - The specification is given or can be converted to a decision table
  - The order in which the predicates are evaluated does not affect the interpretation of the resulting actions
- Note: a decision table need not cover all combinations

32 White Box Testing: General Guidelines
- Try to cover all branches
  - Study the relationship between input values and branch logic
- Test complex modules more
  - Measure the complexity of modules by code size, number of branches and loops, and number of calls and recursions
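As a sketch of the first guideline, the following shows how reading the branch conditions tells the tester which inputs exercise which branch. The function `shipping_cost` and its thresholds are invented for illustration.

```python
def shipping_cost(weight_kg, express):
    """Hypothetical function under test with three branches."""
    if weight_kg > 20:          # branch 1: heavy parcels
        cost = 50
    else:                       # branch 2: normal parcels
        cost = 10
    if express:                 # branch 3: express surcharge
        cost += 15
    return cost

# Inputs chosen by inspecting the branch conditions: each pair is picked
# to drive a specific branch outcome.
assert shipping_cost(25, False) == 50   # branch 1 taken
assert shipping_cost(5, False) == 10    # branch 2 taken
assert shipping_cost(5, True) == 25     # branch 3 taken
print("all branches covered")
```

A black box tester would have to guess the threshold 20; the white box tester reads it off the condition, which is exactly the advantage the slide describes.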

33 White Box Testing: Techniques
- More difficult than black box testing
- Seldom done manually
- Automatic support
  - Symbolic execution
  - Complexity measurement and defect prediction

34 Review: Testing Overview
- Testing is the practical choice: the best affordable approach
- Concepts: test case, test oracle, test suite, test driver, test script, test coverage
- Granularity: unit, integration, system, acceptance
- Types by design principle: black box, white box
- Black box testing: boundary values, equivalence classes, decision tables
- White box testing: branch coverage, complexity

