Software Testing
Reference: Software Engineering, Ian Sommerville, 6th edition, Chapter 20
CMSC 345, Version 4/04

Topics Covered
- The testing process
- Defect testing
  - Black box testing
    - Equivalence partitions
  - White box testing
    - Equivalence partitions
    - Path testing
- Integration testing
  - Top-down, bottom-up, sandwich, thread
  - Interface testing
  - Stress testing
- Object-oriented testing
Testing Goal
- The goal of testing is to discover defects in programs.
- A successful test is one that causes a program to behave in an anomalous way.
- Tests show the presence, not the absence, of defects.
- Test planning should be continuous throughout the software development process.
- Note that testing is the only validation technique for non-functional requirements.
The V-model of Testing
The Testing Process
- Component (unit) testing
  - Testing of individual program components (e.g., functions, methods, or classes)
  - Usually the responsibility of the component developer (except sometimes for critical systems)
- Integration and system testing
  - Testing of groups of components integrated to create a subsystem or the entire system
  - Usually the responsibility of an independent testing team
  - Tests are based on the system specification
- Acceptance testing
  - Run in the presence of the customer, or by the customer
  - Used to validate all system requirements
The Testing Process (con't)
- Test data: inputs that have been devised to conduct the particular test
- Expected output: recorded before the test is conducted
- Actual output: the output actually received
- Pass/fail criteria: criteria that determine whether the test passed or failed when the actual results are compared to the expected results; determined before the test is conducted
Black Box Testing
- An approach to testing in which the program is considered a "black box" (i.e., one cannot "see" inside it)
- Test cases are based on the system specification, not on the internal workings (e.g., algorithms) of the program
- Use equivalence partitions when conducting black box testing
Equivalence Partitioning
- Input data and output results often fall into different classes where all members of a class are related. Examples:
  - positive (or negative) numbers
  - strings with (or without) blanks
- Each such class is an equivalence partition: the program behaves in an equivalent way for every member of the class.
- Test cases should be chosen from each partition, especially at the boundaries.
Equivalence Partitions Example
- The system accepts 4 to 10 inputs, each a 5-digit integer greater than 10,000.
- Partition the system inputs into groups (partitions) that should cause equivalent behavior, including both valid and invalid inputs. If a valid input is a 5-digit integer between 10,000 and 99,999, the equivalence partitions are:
  - inputs less than 10,000
  - inputs between 10,000 and 99,999
  - inputs greater than 99,999
- Choose test cases at the boundaries of these partitions: 9,999; 10,000; 99,999; 100,000
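A minimal sketch of the example above (the validator name `is_valid_input` is illustrative, not from the slides), with one boundary-value test drawn from each edge of the partitions:

```python
def is_valid_input(n):
    """Valid inputs are 5-digit integers between 10,000 and 99,999."""
    return 10_000 <= n <= 99_999

# One test just outside and one just inside each partition boundary.
cases = {
    9_999: False,    # top of the "< 10,000" partition
    10_000: True,    # lower boundary of the valid partition
    99_999: True,    # upper boundary of the valid partition
    100_000: False,  # bottom of the "> 99,999" partition
}

for value, expected in cases.items():
    assert is_valid_input(value) == expected, value
```

Defects cluster at partition edges (off-by-one comparisons such as `<` vs. `<=`), which is why the boundary values are chosen rather than arbitrary members of each class.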
Derivation of Test Cases: Some Practice

procedure Search (Key : ELEM; T : ELEM_ARRAY;
                  Found : in out BOOLEAN; L : in out ELEM_INDEX);
-- Pre-condition: the array has at least one element
-- Post-condition: the element is found and is referenced by L,
-- or the element is not in the array
What Tests Should We Use?
- Equivalence partitions
Search Routine Test Cases
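A sketch in Python (the slides give only an Ada-style spec) of a search meeting the contract, exercised by the partition-derived cases: a single-element vs. multi-element array, and the key present (first, middle, last position) vs. absent:

```python
def search(key, t):
    """Return (found, index); index is meaningful only when found is True."""
    for i, elem in enumerate(t):
        if elem == key:
            return True, i
    return False, -1

# Single-element array: key present / key absent
assert search(17, [17]) == (True, 0)
assert search(0, [17]) == (False, -1)

# Multi-element array: key in first, middle, last position, and absent
assert search(17, [17, 29, 21, 23]) == (True, 0)
assert search(21, [17, 29, 21, 23]) == (True, 2)
assert search(23, [17, 29, 21, 23]) == (True, 3)
assert search(25, [17, 29, 21, 23]) == (False, -1)
```

At this (black box) stage nothing about the implementation is assumed, so the positions within the array are themselves treated as partitions: first, last, and somewhere in the middle.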
White Box Testing
- Sometimes called structural testing
- Derivation of test cases according to program structure (one can "see" inside)
- The objective is to exercise all program statements at least once
- Usually applied to relatively small program units, such as functions or class methods
Binary Search Routine
- Assume that we now know that the search routine is a binary search.
- Any new tests?
Binary Search Test Cases
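A sketch (Python, illustrative) of the binary search now assumed, with the new white-box cases the structure suggests: the array must be sorted, a key at the midpoint is found on the first probe, and keys near either end force the search to narrow left or right:

```python
def binary_search(key, t):
    """Return (found, index) for a sorted list t."""
    low, high = 0, len(t) - 1
    while low <= high:
        mid = (low + high) // 2
        if t[mid] == key:
            return True, mid
        if t[mid] < key:
            low = mid + 1   # key is in the upper half
        else:
            high = mid - 1  # key is in the lower half
    return False, -1

sorted_t = [17, 21, 23, 29, 31, 38]
assert binary_search(23, sorted_t) == (True, 2)    # found at the first midpoint
assert binary_search(17, sorted_t) == (True, 0)    # forces narrowing left
assert binary_search(38, sorted_t) == (True, 5)    # forces narrowing right
assert binary_search(25, sorted_t) == (False, -1)  # absent, inside the range
assert binary_search(50, sorted_t) == (False, -1)  # absent, above the range
```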
Path Testing
- The objective of path testing is to ensure that the set of test cases executes each path through the program at least once.
- The starting point for path testing is a program flow graph.
Binary search flow graph
Independent Paths
- 1, 2, 8, 9
- 1, 2, 3, 8, 9
- 1, 2, 3, 4, 6, 7, 2
- 1, 2, 3, 4, 5, 7, 2
- Test cases should be derived so that all of these paths are executed.
- A dynamic program analyzer (e.g., a coverage tool such as gcov) may be used to check that paths have been executed. (lint, by contrast, is a static analyzer and does not observe execution.)
Cyclomatic Complexity
- The minimum number of tests needed to exercise all independent paths equals the cyclomatic complexity.
- CC = number_edges - number_nodes + 2
- In the absence of goto's, CC = number_decisions + 1
- Although all independent paths are executed, all combinations of paths are not.
- Some paths may be impossible to test.
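A small check of CC = edges - nodes + 2 on the binary search flow graph, whose edge set can be read off the four independent paths listed on the previous slide:

```python
# Edges of the flow graph, taken from the four independent paths above.
edges = {(1, 2), (2, 8), (8, 9), (2, 3), (3, 8),
         (3, 4), (4, 6), (4, 5), (6, 7), (5, 7), (7, 2)}
nodes = {n for e in edges for n in e}

cc = len(edges) - len(nodes) + 2
print(cc)  # 11 edges, 9 nodes -> CC = 4, matching the 4 independent paths
```

The same answer falls out of the decision count: binary search has three decisions (the loop test, the equality test, and the less-than test), so CC = 3 + 1 = 4.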
Integration Testing
- Tests the complete system or subsystems composed of integrated components
- Integration testing is black box testing, with tests derived from the requirements and design specifications.
- The main difficulty is localizing errors.
- Incremental integration testing reduces this problem.
Incremental Integration Testing
Approaches to Integration Testing
- Top-down testing
  - Start with the high-level system and integrate from the top down, replacing individual components with stubs where appropriate
- Bottom-up testing
  - Integrate individual components in levels until the complete system is created
- In practice, most integration testing involves a combination of both strategies:
  - sandwich testing (outside-in)
  - thread testing
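A minimal sketch of the stub idea in top-down testing (the component names `ReportGenerator` and `DatabaseStub` are hypothetical): the high-level component is exercised before its dependency exists, with the missing component replaced by a stub that returns canned data:

```python
class DatabaseStub:
    """Stands in for the not-yet-integrated data-access component."""
    def fetch_totals(self):
        return {"widgets": 3, "gadgets": 5}  # canned test data

class ReportGenerator:
    """High-level component under test."""
    def __init__(self, db):
        self.db = db  # real component or stub, injected by the caller

    def summary(self):
        totals = self.db.fetch_totals()
        return f"items: {sum(totals.values())}"

# Top-down: test the high-level logic now; swap in the real
# database component later without changing ReportGenerator.
report = ReportGenerator(DatabaseStub())
assert report.summary() == "items: 8"
```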
Top-down Testing
Bottom-up Testing
Method Pros
- Top-down
- Bottom-up
- Sandwich
- Thread
Interface Testing
- Interface misuse
  - A calling component calls another component and makes an error in the use of its interface:
    - parameters in the wrong order
    - parameter(s) of the wrong type
    - incorrect number of parameters
- Interface misunderstanding
  - A calling component embeds incorrect assumptions about the behavior of the called component:
    - a binary search function called with an unordered array
    - a wrong flag "number"
    - other violated preconditions
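A quick illustration (Python, illustrative) of interface misunderstanding using the slide's own example: the caller violates binary search's precondition that the array is sorted, and an element that is present is silently missed:

```python
def binary_search(key, t):
    """Precondition (easily misunderstood by callers): t must be sorted."""
    low, high = 0, len(t) - 1
    while low <= high:
        mid = (low + high) // 2
        if t[mid] == key:
            return True
        if t[mid] < key:
            low = mid + 1
        else:
            high = mid - 1
    return False

unordered = [29, 17, 38, 21]                  # precondition violated
assert 21 in unordered                        # the element really is there...
assert binary_search(21, unordered) is False  # ...but the search misses it

# With the precondition honored, the same call succeeds.
assert binary_search(21, sorted(unordered)) is True
```

Note that nothing crashes: the misunderstanding produces a quietly wrong answer, which is exactly why interface tests must probe called components' assumptions.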
Interface Types
- Parameter interfaces
  - Data passed from one procedure to another
    - functions
- Shared memory interfaces
  - A block of memory is shared between procedures
    - global data
- Message passing interfaces
  - Sub-systems request services from other sub-systems
    - OO systems
    - client-server systems
Some Interface Testing Guidelines
- Design tests so that parameters to a called procedure are at the extreme ends of their ranges (boundaries).
- Always test pointer parameters with null pointers.
- Design tests that cause the component to fail (e.g., by violating preconditions).
- In shared memory systems, vary the order in which components are activated.
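A sketch applying the first three guidelines to a hypothetical `average` routine: a boundary parameter (the smallest legal input), a null argument (`None` in Python), and a deliberate precondition violation, all of which must fail cleanly rather than obscurely:

```python
def average(values):
    if values is None:
        raise ValueError("values must not be None")
    if not values:
        raise ValueError("values must be non-empty")
    return sum(values) / len(values)

# Boundary: the smallest legal input (one element)
assert average([42]) == 42.0

# Null pointer / violated precondition: expect a clean, specific failure
for bad in (None, []):
    try:
        average(bad)
    except ValueError:
        pass  # failed cleanly, as the interface promises
    else:
        raise AssertionError(f"expected ValueError for {bad!r}")
```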
Stress Testing
- Exercises the system beyond its maximum design load:
  - exceed string lengths
  - store/manipulate more data than the specification allows
  - load the system with more users than the specification allows
- Stressing the system often causes defects to come to light.
- Systems should not fail catastrophically; stress testing checks for unacceptable loss of service or data.
Object-oriented Testing
- The components to be tested are classes that are instantiated as objects.
- There is no obvious "top" or "bottom" to the system for top-down or bottom-up integration and testing.
- Levels:
  - testing class methods
  - testing the class as a whole
  - testing clusters of cooperating classes
  - testing the complete OO system
Class Testing
- Complete test coverage of a class involves:
  - testing all operations associated with an object
  - setting and interrogating all object attributes
  - exercising the object in all possible states; i.e., every event that causes a state (attribute) change in the object should be tested
- Inheritance makes it more difficult to design class tests, since the information to be tested is not localized.
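A minimal sketch of state-based class testing on a hypothetical `Account` class: every state-changing event is exercised, each attribute is both set and interrogated, and an event that is invalid in the current state is checked too:

```python
class Account:
    def __init__(self):
        self.state = "open"
        self.balance = 0

    def deposit(self, amount):
        if self.state != "open":
            raise RuntimeError("account is closed")
        self.balance += amount

    def close(self):
        self.state = "closed"

# Interrogate the initial state and attributes
acct = Account()
assert (acct.state, acct.balance) == ("open", 0)

acct.deposit(50)                 # event: deposit in state "open"
assert acct.balance == 50

acct.close()                     # event: "open" -> "closed" transition
assert acct.state == "closed"

try:
    acct.deposit(10)             # event invalid in state "closed"
except RuntimeError:
    pass
else:
    raise AssertionError("deposit should fail on a closed account")
```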
Object Integration
- Levels of integration are less distinct in object-oriented systems.
- Cluster testing is concerned with integrating and testing clusters of cooperating objects.
- Identify clusters using knowledge of the operation of the objects and of the system features implemented by those clusters.
Approaches to Cluster Testing
- Use case or scenario testing
  - Testing is based on user (actor) interactions with the system
  - Has the advantage that it tests system features as experienced by users
- Thread testing
  - Tests the system's response to events (e.g., a button click) as processing threads through the system
Scenario-based Testing
- Identify scenarios from use cases and supplement them with sequence diagrams that show the objects involved in the scenario.
Sample Sequence Diagram