
1 Software Testing Techniques

2 Introduction
Many aspects to achieving software quality
–Formal reviews (of both the software process and the various stages of development), audits, documentation, etc.
–Unit testing
–Integration testing
–Verification: does the module meet the specifications?
–Validation: does the product meet the requirements?

3 Introduction
Testing is a critical element of Software Quality Assurance
Represents a critical review of specifications, design and coding
Destructive rather than constructive (try to break the system)
Major objective is to find errors, not to show the absence of errors (as distinct from verification and validation)

4 Objectives
Testing is a process of executing a program with the intent of finding an error
A good test case is one that has a high probability of finding an as-yet undiscovered error
A successful test is one that uncovers an as-yet undiscovered error

5 Principles
All tests should be traceable to customer requirements
Tests should be planned long before testing begins
The Pareto principle applies to testing
–Typically, 80% of the errors come from 20% of the modules
Testing should begin "in the small" and progress towards "in the large"
Exhaustive testing is not possible, but
–if time permits, conduct multiple failure mode testing
Test plans must receive independent review

6 Testability
The ease with which a computer program can be tested

7 Characteristics for Testability
Operability – the better it works, the more efficiently it can be tested
–The system has few bugs
–No bugs block the execution of tests
–The product evolves in functional stages

8 Characteristics for Testability
Observability – what you see is what you test
–Distinct output for each input
–System states and variables are visible during execution
–Past system states and variables are visible
–All factors affecting the output are visible
–Incorrect output is easily identified
–Internal errors are automatically detected and reported

9 Characteristics for Testability
Controllability – the better we can control the software, the more testing can be automated
–All possible outputs can be generated through some combination of input
–All code is executable through some combination of input
–Input and output formats are consistent and structured
–All sequences of task interaction can be generated
–Tests can be conveniently specified and reproduced

10 Characteristics for Testability
Decomposability – by controlling the scope of testing, we can isolate problems and perform smarter retesting
–The software system is built from independent modules
–Software modules can be tested independently
–While this is very important, it does not obviate the need for integration testing

11 Characteristics for Testability
Simplicity – the less there is to test, the more quickly we can test it
–Functional simplicity
–Structural simplicity
–Code simplicity

12 Characteristics for Testability
Stability – the fewer the changes, the fewer disruptions to testing
–Changes are infrequent
–Changes are controlled
–Changes do not invalidate existing tests
–The software recovers well from failures

13 Characteristics for Testability
Understandability – the more information we have, the smarter we will test
–The design is well understood
–Dependencies between internal, external and shared components are well understood
–Changes to the design are well communicated
–Technical documentation is instantly accessible, well-organized, specific and accurate

14 Types of Testing
White-Box Testing
–Knowing the internal workings of a product, tests are conducted to ensure that "all gears mesh", i.e. internal operations are performed according to specifications and all internal components have been adequately exercised
Black-Box Testing
–Knowing the specified function that a product has been designed to perform, tests are conducted to demonstrate that each function is fully operational (note: this is still different from validation)

15 White Box Testing
Uses the control structure of the procedural design to derive test cases
Guarantees that all independent paths within a module have been exercised at least once
Exercises all loops at their boundaries and within their operational bounds
Exercises internal data structures to assure their validity – again, at their boundaries and within their operational bounds

16 Control Structure Testing
Attacks the control flow of the program
Provides us with a logical complexity measure of a procedural design
Use this measure as a guide for defining a basis set of execution paths
Test cases derived to exercise the basis set are guaranteed to execute every statement in the program at least once

17 Basis Path Testing
A flow graph is created
–represents the control flow of the program
–each node in the graph represents one or more procedural statements
–any procedural design representation can be translated into a flow graph

18 Flow Graph Notation
(figure of flow graph notation, not reproduced in the transcript)

19 Basis Path Testing (contd.)
Example PDL
procedure sort
1:  do while records remain
        read record;
2:      if record field1 = 0
3:          then process record;
                 store in buffer;
                 increment counter;
4:      elseif record field2 = 0
5:          then reset counter;
6:          else process record;
                 store in file;
7a:         endif
        endif
7b: enddo
8:  end
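For reference, a rough Python transliteration of the PDL above. It is only a sketch: the record layout (a pair of fields) and the "process/store" operations are hypothetical stand-ins chosen so the control flow can actually be executed; numbered comments mark the flow graph nodes.

# Hypothetical sketch of the PDL above; each record is a (field1, field2) pair.
def sort(records):
    counter = 0
    buffer, out_file = [], []
    for record in records:            # node 1: do while records remain; read record
        field1, field2 = record
        if field1 == 0:               # node 2
            buffer.append(record)     # node 3: process record; store in buffer;
            counter += 1              #         increment counter
        elif field2 == 0:             # node 4
            counter = 0               # node 5: reset counter
        else:
            out_file.append(record)   # node 6: process record; store in file
                                      # nodes 7a, 7b: endif; enddo
    return counter, buffer, out_file  # node 8: end

print(sort([(0, 5), (1, 0), (2, 3)]))  # -> (1, [(0, 5)], [(2, 3)])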

20 Basis Path Testing (contd.)
Flow graph for the example PDL (figure): nodes 1, 2, 3, 4, 5, 6, 7a, 7b, 8

21 Basis Path Testing (contd.)
Cyclomatic Complexity
–Quantitative measure of the complexity of a program
–Is the number of independent paths in the basis set of a program
–Upper bound for the number of tests that must be conducted to ensure that all statements have been executed at least once

22 Basis Path Testing (contd.)
Cyclomatic complexity calculation
V(G) = E - N + 2 = P + 1 = number of regions in the graph,
where E = no. of edges, N = no. of nodes, and P = no. of predicate nodes
For the previous example
–Independent paths
path 1: 1 - 8
path 2: 1 - 2 - 3 - 7b - 1 - 8
path 3: 1 - 2 - 4 - 6 - 7a - 7b - 1 - 8
path 4: 1 - 2 - 4 - 5 - 7a - 7b - 1 - 8
–Cyclomatic complexity = 11 - 9 + 2 = 3 + 1 = 4
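A quick way to check the arithmetic, sketched in Python. The edge list is a hand transcription of the example flow graph (derived from the paths listed above), so treat it as illustrative rather than authoritative.

# Edges of the example flow graph: 9 nodes, 11 edges.
edges = [("1", "2"), ("1", "8"), ("2", "3"), ("2", "4"), ("3", "7b"),
         ("4", "5"), ("4", "6"), ("5", "7a"), ("6", "7a"),
         ("7a", "7b"), ("7b", "1")]

nodes = {n for edge in edges for n in edge}
out_degree = {n: 0 for n in nodes}
for src, _ in edges:
    out_degree[src] += 1

E, N = len(edges), len(nodes)
P = sum(1 for n in nodes if out_degree[n] > 1)   # predicate nodes have two out-edges

print("V(G) = E - N + 2 =", E - N + 2)   # 11 - 9 + 2 = 4
print("V(G) = P + 1     =", P + 1)       # 3 + 1 = 4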

23 Basis Path Testing (contd.)
Prepare test cases that will force execution of each independent path in the basis path set
Each test case is executed and compared to expected results

24 Example
(worked example figure, not reproduced in the transcript)

25 Example
(worked example figure, not reproduced in the transcript)

26 Condition Testing
Exercises all the logical conditions in a module
Types of possible errors
–Boolean variable error
–Boolean parenthesis error
–Boolean operator error
–Arithmetic expression error

27 Types of Condition Testing
Branch Testing
–the TRUE and FALSE branches of the condition, and of every simple condition in it, are tested
Domain Testing
–for every Boolean expression of n variables, all 2^n possible tests are required
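As an illustrative sketch (the can_ship function and its condition are made up for this example), branch testing drives the compound condition to both TRUE and FALSE, while domain testing enumerates all 2^n combinations of the n Boolean variables:

import itertools

# Hypothetical condition under test: ship an order if it is paid and in stock,
# or if a manual override flag is set.
def can_ship(paid, in_stock, override):
    return (paid and in_stock) or override

# Branch testing: one case driving the compound condition TRUE and one FALSE.
assert can_ship(True, True, False) is True
assert can_ship(False, True, False) is False

# Domain testing: all 2**n combinations of the n Boolean variables (n = 3 here).
for combo in itertools.product([False, True], repeat=3):
    print(combo, "->", can_ship(*combo))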

28 Data Flow Testing
Assume functions do not modify their arguments or global variables. Then define
–DEF(S) = { X | statement S contains a definition of X }
–USE(S) = { X | statement S contains a use of X }
–Definition-use chain (DU chain): [X, S, S'], where X ∈ DEF(S), X ∈ USE(S'), and the definition of X in S is live at S'
Every DU chain is to be covered at least once
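A small, made-up illustration of DEF/USE sets and DU chains; the function and the statement labels S1..S6 exist only for this sketch, and the listed chains are examples rather than the complete set.

# Hypothetical fragment used to illustrate DEF/USE sets.
def average(values):
    total = 0                 # S1: DEF(S1) = {total}
    count = 0                 # S2: DEF(S2) = {count}
    for v in values:          # S3: DEF(S3) = {v},      USE(S3) = {values}
        total = total + v     # S4: DEF(S4) = {total},  USE(S4) = {total, v}
        count = count + 1     # S5: DEF(S5) = {count},  USE(S5) = {count}
    return total / count      # S6: USE(S6) = {total, count}

# Example DU chains to cover: [total, S1, S4], [total, S4, S6],
# [count, S2, S5], [count, S5, S6], [v, S3, S4].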

29 Kinds of Loops
(figure: simple, nested, concatenated and unstructured loops)

30 Loop Testing
Focus is on the validity of loop constructs
Simple loop (n is the max. no. of allowable passes)
–Skip the loop entirely
–Only one pass through the loop
–Two passes
–m passes, where m < n
–n-1, n, n+1 passes
Nested loop
–Start at the innermost loop
–Conduct simple loop tests for this loop
–Move outwards one loop at a time
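A minimal sketch of the simple-loop schedule. The process_batch function and the maximum pass count n = 100 are hypothetical; the point is the sequence of pass counts exercised.

# Hypothetical loop under test: one pass per item, at most n items allowed.
def process_batch(items, n=100):
    if len(items) > n:
        raise ValueError("batch too large")
    total = 0
    for item in items:
        total += item
    return total

n, m = 100, 37                        # m is an arbitrary value with m < n
for passes in (0, 1, 2, m, n - 1, n):
    assert process_batch([1] * passes) == passes

# n + 1 passes should be rejected by the guard.
try:
    process_batch([1] * (n + 1))
    assert False, "expected ValueError"
except ValueError:
    pass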

31 Loop Testing (contd.)
Concatenated loops
–Multiple simple loop tests if the loops are independent
–Nested loop approach if they are dependent
Unstructured loops
–Should be restructured into a combination of simple and nested loops

32 Black Box Testing
Focus is on the functional requirements of the software
Uncovers errors such as
–Incorrect or missing functions
–Interface errors
–Errors in data structures
–Performance errors
–Initialization and termination errors
Unlike white box testing, this is performed at later stages of testing

33 Graph Based Testing
Identify all objects modeled by the software
Identify the relationships that connect these objects
Create an object-relationship graph with
–nodes
–node weights
–links
–link weights

34 Graph Testing (contd.)
Example graph (word-processor figure, only partially recoverable from the transcript):
–objects: new file (menu select), document window, document text
–relationships: "generates" (menu select to document window, link weight: generation < 1 sec), "allows editing of", "is represented as", "contains"
–attributes of the document window: start dimension, background color, text color

35 Graph Test Generation
Add entry and exit nodes
For an object X, values for all objects in the transitive closure of X must be tested for their impact on X
Test the symmetry of all bidirectional links, e.g., "undo"
Be sure all nodes have a reflexive link, and test it for each node
Test each relationship (the links)
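A sketch of checking link symmetry and reflexivity on a toy object-relationship graph. The objects and relationship names below are hypothetical (loosely echoing the previous slide), not part of the original example.

# Hypothetical object-relationship graph: links[(a, b)] = relationship name.
links = {
    ("document window", "document text"): "allows editing of",
    ("document text", "document window"): "is displayed in",
    ("document window", "document window"): "refreshes",   # reflexive link
    ("document text", "document text"): "undo/redo",       # reflexive link
    ("menu select", "document window"): "generates",
}

objects = {obj for pair in links for obj in pair}

# Symmetry: every bidirectional pair should have a link in both directions.
for (a, b) in list(links):
    if a != b and (b, a) not in links:
        print(f"one-way link to check: {a} -> {b} ({links[(a, b)]})")

# Reflexivity: every object should have a reflexive link to exercise.
for obj in objects:
    if (obj, obj) not in links:
        print(f"missing reflexive link for: {obj}")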

36 Equivalence Partitioning
Input domain divided into classes of data from which test cases are derived
Goal is to design a single test case that uncovers classes of errors, thereby reducing the total number of test cases to be developed
Each class represents a set of valid or invalid states for input conditions

37 Equivalence Partitioning (contd.)
Test case design is based on an evaluation of equivalence classes for an input condition
–range specified: one valid and two invalid equivalence classes
–specific value required: one valid and two invalid equivalence classes
–member of a set specified: one valid and one invalid equivalence class
–boolean: one valid and one invalid equivalence class

38 Equivalence Partitioning (contd.)
Example: automatic banking
–area code: input condition, boolean; input condition, range – [200, 999]
–prefix: input condition, range – > 200, no 0's, < 1000
–suffix: input condition, value – 4 digits
–password: input condition, boolean; input condition, value – 6-character string
–command: input condition, set
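A hedged sketch of turning the area-code range condition into concrete cases; the specific representative values (555, 150, 1200) and the helper function are illustrative only.

# Equivalence classes for the area code range [200, 999]:
# one valid class (inside the range) and two invalid classes (below / above).
def area_code_class(code):
    if 200 <= code <= 999:
        return "valid"
    return "invalid-low" if code < 200 else "invalid-high"

test_cases = {
    555: "valid",          # representative of the valid class
    150: "invalid-low",    # representative of the class below the range
    1200: "invalid-high",  # representative of the class above the range
}
for code, expected in test_cases.items():
    assert area_code_class(code) == expected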

39 Boundary Value Analysis
A greater number of errors tends to occur at the boundaries of the input domain
Select test cases that exercise bounding values
Input condition
–range: test cases are just below the minimum and just above the maximum
–set: test cases are the minimum and maximum values, if ordered
The above guidelines are also applied to output conditions
–e.g., outputs that produce the minimum and maximum values in the output range
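A small helper, assuming a numeric range [lo, hi], that generates the bounds themselves plus the values just outside them; the function name and the reuse of the area-code range are illustrative.

# Boundary values for an integer range [lo, hi]: the bounds, plus the value
# just below the minimum and the value just above the maximum.
def boundary_values(lo, hi):
    return [lo - 1, lo, hi, hi + 1]

# Area-code range from the earlier example: [200, 999].
print(boundary_values(200, 999))   # [199, 200, 999, 1000]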

40 Comparison Testing
Multiple copies of the software are constructed in the case of critical applications
–Example: Shuttle flight control software
Each version is independently built from the specs by different teams
Test cases derived from other black-box testing techniques are provided as input to each version
Outputs are compared and the versions are validated
Not foolproof
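A sketch of the comparison step, assuming two hypothetical, independently built implementations of the same small specification (clamping a value into a range); in practice the versions would come from separate teams.

# Hypothetical independently built versions of the same specification.
def clamp_v1(x, lo, hi):
    return max(lo, min(x, hi))

def clamp_v2(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# Feed the same black-box test cases to every version and compare outputs.
test_inputs = [(-5, 0, 10), (0, 0, 10), (5, 0, 10), (10, 0, 10), (15, 0, 10)]
for args in test_inputs:
    outputs = {clamp_v1(*args), clamp_v2(*args)}
    assert len(outputs) == 1, f"versions disagree on {args}: {outputs}"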

41 Other Cases
GUI testing
–See text for a partial list of things to test
Client/server
–Often distributed – complicates testing
–Additional emphasis on non-interference among clients
Documentation
–Review & inspection (for editorial clarity)
–Live test (uses the documentation in conjunction with the actual program)
Real-time
–Beyond the scope of this course

42 Summary
Destructive rather than constructive
Main goal is to find errors, not to prove the absence of errors
White Box Testing
–Control structure testing
–Condition testing
–Data flow testing
–Loop testing
Black Box Testing – functional requirements
–Graph based testing
–Equivalence partitioning
–Boundary value testing
–Comparison testing

