Note content copyright © 2004 Ian Sommerville. NU-specific content copyright © 2004 M. E. Kabay. All rights reserved.

Slide 1: Software Testing
IS301 – Software Engineering
Lecture #31 – 2004-11-12
M. E. Kabay, PhD, CISSP
Assoc. Prof. Information Assurance
Division of Business & Management, Norwich University
mailto:mkabay@norwich.edu
V: 802.479.7937

Slide 2: Topics
- Defect Testing
- Integration Testing
- Object-Oriented Testing
- Testing Workbenches

Slide 3: Defect Testing
- Testing programs to establish the presence of system defects

Slide 4: Testing Process
Component testing
- Testing of individual program components
- Usually the responsibility of the component developer (except sometimes for critical systems)
- Tests derived from the developer's experience
Integration testing
- Testing of groups of components integrated to create a system or sub-system
- Responsibility of an independent testing team
- Tests based on the system specification

Slide 5: Testing Phases

Slide 6: Defect Testing
- Goal of defect testing: to discover defects in programs
- A successful defect test is one which causes the program to behave in an anomalous way
- Tests show the presence, not the absence, of defects

Slide 7: Testing Priorities
- Only exhaustive testing can show a program is free from defects; however, exhaustive testing is impossible
- Tests should exercise the system's capabilities rather than its components
- Testing old capabilities is more important than testing new capabilities
- Testing typical situations is more important than testing boundary value cases

Slide 8: Test Data and Test Cases
Test data
- Inputs which have been devised to test the system
Test cases
- Inputs to test the system, plus the predicted outputs from these inputs if the system operates according to its specification

Slide 9: Defect Testing Process

Slide 10: Black-Box Testing
- Program considered as a black box
- Test cases based on the system specification
- Test planning can begin early in the software process

Slide 11: Black-Box Testing

Slide 12: Equivalence Partitioning
- Input data and output results often fall into different classes where all members of a class are related
- Each of these classes is an equivalence partition, where the program behaves in an equivalent way for each class member
- Test cases should be chosen from each partition

Slide 13: Equivalence Partitioning (1)

Slide 14: Equivalence Partitioning (2)
- Partition system inputs and outputs into equivalence sets; e.g., if the input is a 5-digit integer between 10,000 and 99,999, then the equivalence partitions are <10,000, 10,000–99,999, and >99,999
- Choose test cases at the boundaries of these sets: 00000, 09999, 10000, 10001, 99998, 99999, and 100000
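The boundary cases above can be checked directly in code. A minimal Java sketch (the `isValidInput` method is a hypothetical stand-in for the system under test, not from the lecture), exercising each boundary value from the slide:

```java
// Equivalence-partition / boundary-value sketch for a 5-digit input in
// the range 10,000..99,999. isValidInput is a hypothetical component
// under test; the cases array holds the slide's boundary values.
public class BoundaryTest {
    static boolean isValidInput(int n) {
        return n >= 10000 && n <= 99999;
    }

    public static void main(String[] args) {
        // Values just below, at, and just above each partition boundary
        int[] cases =    {0,     9999,  10000, 10001, 99998, 99999, 100000};
        boolean[] want = {false, false, true,  true,  true,  true,  false};
        for (int i = 0; i < cases.length; i++) {
            if (isValidInput(cases[i]) != want[i])
                throw new AssertionError("failed at " + cases[i]);
        }
        System.out.println("all boundary cases pass");
    }
}
```

One test case per partition plus one at each boundary is usually enough; adding more values from the middle of a partition rarely finds new defects.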

Slide 15: Equivalence Partitioning (3)

Slide 16: Search Routine Specification

procedure Search (Key : ELEM ; T : ELEM_ARRAY ;
                  Found : in out BOOLEAN ; L : in out ELEM_INDEX) ;

Pre-condition
-- the array has at least one element
T'FIRST <= T'LAST

Post-condition
-- the element is found and is referenced by L
( Found and T (L) = Key )
or
-- the element is not in the array
( not Found and
  not (exists i, T'FIRST <= i <= T'LAST, T (i) = Key ) )
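The contract above can be sketched in Java; since Java lacks `in out` parameters, this version returns the index when found and -1 otherwise, which encodes the same Found/L information (an assumption of this sketch, not part of the original Ada-style spec):

```java
// Java sketch of the Search contract: returns the index of key in t,
// or -1 when key is absent (standing in for the Found/L out-parameters).
// Pre-condition: t has at least one element.
public class Search {
    static int search(int[] t, int key) {
        for (int i = 0; i < t.length; i++) {
            if (t[i] == key) return i; // Found and t[i] == key
        }
        return -1;                     // not Found: no i with t[i] == key
    }

    public static void main(String[] args) {
        int[] t = {17, 29, 21, 23};
        System.out.println(search(t, 21)); // 2
        System.out.println(search(t, 25)); // -1
    }
}
```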

Slide 17: Search Routine – Input Partitions
- Inputs which conform to the pre-conditions
- Inputs where the pre-condition does not hold
- Inputs where the key element is a member of the array
- Inputs where the key element is not a member of the array

Slide 18: Testing Guidelines (Sequences)
- Test software with sequences which have only a single value
- Use sequences of different sizes in different tests
- Derive tests so that the first, middle, and last elements of the sequence are accessed
- Test with sequences of zero length

Slide 19: Search Routine – Input Partitions

Slide 20: Structural Testing
- Sometimes called white-box testing
- Derivation of test cases according to program structure; knowledge of the program is used to identify additional test cases
- Objective: to exercise all program statements, not all path combinations

Slide 21: White-Box Testing

Slide 22: Binary Search (Java)

Slide 23: Binary Search – Equivalence Partitions
- Pre-conditions satisfied, key element in array
- Pre-conditions satisfied, key element not in array
- Pre-conditions unsatisfied, key element in array
- Pre-conditions unsatisfied, key element not in array
- Input array has a single value
- Input array has an even number of values
- Input array has an odd number of values
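A Java binary search (a sketch in the spirit of slide 22, not necessarily the lecture's exact code) with test inputs drawn from these partitions: key present and absent, single-element array, and even- and odd-length arrays:

```java
// Binary search with test cases chosen from the equivalence partitions:
// key in / not in array, single-element, even-length, odd-length arrays.
// Pre-condition: the array is sorted in ascending order.
public class BinarySearch {
    static int search(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;   // unsigned shift avoids overflow
            if (a[mid] == key) return mid;
            if (a[mid] < key) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;                        // key not in array
    }

    public static void main(String[] args) {
        int[] single = {17};                         // single value
        int[] even   = {17, 21, 23, 29};             // even number of values
        int[] odd    = {9, 16, 18, 30, 31, 41, 45};  // odd number of values
        System.out.println(search(single, 17)); // 0: key in array
        System.out.println(search(even, 25));   // -1: key not in array
        System.out.println(search(odd, 45));    // 6: last element found
    }
}
```

Note that the "pre-conditions unsatisfied" partitions (an unsorted array) cannot be detected by this routine; it simply returns a wrong answer, which is exactly why those partitions are worth testing.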

Slide 24: Binary Search – Equivalence Partitions

Slide 25: Binary Search – Test Cases

Slide 26: Path Testing
- Objective: ensure each path through the program is executed at least once
- Starting point: the flow graph
  - Nodes represent program decisions
  - Arcs represent flow of control
- Statements with conditions are therefore nodes in the flow graph

Slide 27: Program Flow Graphs
- Each branch is shown as a separate path
- Loops are shown by arrows looping back to the loop-condition node
- Basis for computing cyclomatic complexity:
  Cyclomatic complexity = Number of edges − Number of nodes + 2
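The formula is simple arithmetic over the flow graph. A small Java sketch (the 9-node, 11-edge counts are an assumption chosen to match the four independent paths listed on slide 30, not counts given in the lecture):

```java
// Cyclomatic complexity from a flow graph: V(G) = E - N + 2.
public class Cyclomatic {
    static int complexity(int edges, int nodes) {
        return edges - nodes + 2;
    }

    public static void main(String[] args) {
        // Hypothetical binary-search flow graph: 9 nodes, 11 edges
        System.out.println(complexity(11, 9)); // 4 independent paths
        // Degenerate case: straight-line code with no decisions
        System.out.println(complexity(1, 2));  // 1
    }
}
```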

Slide 28: Cyclomatic Complexity
- The number of tests needed to test all control statements equals the cyclomatic complexity
- For a program with only simple (binary) decisions, cyclomatic complexity equals the number of conditions plus 1
- Useful if used with care; does not imply adequacy of testing
- Although all paths are executed, all combinations of paths are not necessarily executed

Slide 29: Binary Search Flow Graph

Slide 30: Independent Paths
- 1, 2, 3, 8, 9
- 1, 2, 3, 4, 6, 7, 2
- 1, 2, 3, 4, 5, 7, 2
- 1, 2, 3, 4, 6, 7, 2, 8, 9
- Test cases should be derived so that all of these paths are executed
- A dynamic program analyzer may be used to check that the paths have been executed

Slide 31: Integration Testing
- Tests complete systems or subsystems composed of integrated components
- Integration testing should be black-box testing, with tests derived from the specification
- The main difficulty is localizing errors; incremental integration testing reduces this problem

Slide 32: Incremental Integration Testing

Slide 33: Approaches to Integration Testing
Top-down testing
- Start with the high-level system and integrate from the top down, replacing individual components by stubs where appropriate
Bottom-up testing
- Integrate individual components in levels until the complete system is created
In practice, most integration involves a combination of these strategies

Slide 34: Top-Down Testing

Slide 35: Bottom-Up Testing

Slide 36: Testing Approaches
Architectural validation
- Top-down integration testing is better at discovering errors in the system architecture
System demonstration
- Top-down integration testing allows a limited demonstration at an early stage in development
Test implementation
- Often easier with bottom-up integration testing
Test observation – what's happening during the test?
- A problem with both approaches: extra code may be required to observe tests (instrumenting the code)

Slide 37: Interface Testing
- Takes place when modules or sub-systems are integrated to create larger systems
- Objective: detect faults due to interface errors or invalid assumptions about interfaces
- Particularly important for object-oriented development, since objects are defined by their interfaces

Slide 38: Interface Testing

Slide 39: Interface Types
Parameter interfaces
- Data passed from one procedure to another
Shared-memory interfaces
- A block of memory is shared between/among procedures
Procedural interfaces
- A sub-system encapsulates a set of procedures to be called by other sub-systems
Message-passing interfaces
- Sub-systems request services from other sub-systems

Slide 40: Interface Errors
Interface misuse
- The calling component makes an error in its use of the interface; e.g., parameters in the wrong order
Interface misunderstanding
- The calling component embeds incorrect assumptions about the behavior of the called component
Timing errors
- The called and calling components operate at different speeds
- Out-of-date information is accessed
- Race conditions

Slide 41: Interface Testing Guidelines
- Set parameters to the called procedure at the extreme ends of their ranges
- Always test pointer parameters with null pointers
- Design tests which cause the component to fail
- Use stress testing in message-passing systems (see next slide)
- In shared-memory systems, vary the order in which components are activated (Why?)
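The first three guidelines can be illustrated in a few lines of Java. A sketch, assuming a hypothetical `summarize` component (not from the lecture): call it with parameters at the extremes of their range and with a null reference, and check that it fails in a controlled way rather than misbehaving later:

```java
// Interface-testing sketch: exercise a called component at extreme
// parameter values and with a null pointer. summarize (mean of an
// array of readings) is a hypothetical component under test.
public class InterfaceTest {
    static double summarize(double[] readings) {
        if (readings == null || readings.length == 0)
            throw new IllegalArgumentException("no readings");
        double sum = 0;
        for (double r : readings) sum += r;
        return sum / readings.length;
    }

    public static void main(String[] args) {
        // Extreme ends of the range: a single reading, a huge value
        System.out.println(summarize(new double[]{42.0}));             // 42.0
        System.out.println(summarize(new double[]{Double.MAX_VALUE})); // no overflow for one element
        // Null-pointer parameter: should fail cleanly at the interface
        try {
            summarize(null);
            System.out.println("BUG: null accepted");
        } catch (IllegalArgumentException expected) {
            System.out.println("null rejected as expected");
        }
    }
}
```

Designing the test so the component is *expected* to fail (the null case) is what distinguishes defect testing from demonstration.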

Slide 42: Stress Testing
- Exercises the system beyond its maximum design load
- Stressing the system often causes defects to come to light
- Stressing the system tests failure behavior
  - Should not fail catastrophically
  - Check for unacceptable loss of service or data
- Distributed systems: watch for non-linear performance degradation

Slide 43: Object-Oriented Testing
- Test object classes instantiated as objects
- Larger grain than individual functions
- Approaches to white-box testing have to be extended
- No obvious "top" to the system for top-down integration and testing

Slide 44: Testing Levels for OOD
- Testing operations associated with objects
- Testing object classes
- Testing clusters of cooperating objects
- Testing the complete OO system

Slide 45: Object-Class Testing
Complete test coverage of a class involves
- Testing all operations associated with an object
- Setting and interrogating all object attributes
- Exercising the object in all possible states
Inheritance
- Makes it more difficult to design object-class tests
- Information to be tested is not localized

Slide 46: Weather Station Object Interface
- Test cases needed for all operations
- Use the state model to identify state transitions for testing
- Examples of testing sequences:
  - Shutdown > Waiting > Shutdown
  - Waiting > Calibrating > Testing > Transmitting > Waiting
  - Waiting > Collecting > Waiting > Summarizing > Transmitting > Waiting
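The state model behind these sequences can be sketched as a transition table and used to validate test sequences mechanically. This is a minimal sketch, not the book's actual weather-station code; the transition set is an assumption reconstructed only from the three example sequences on the slide:

```java
import java.util.*;

// Sketch of a weather-station state model used to check that a testing
// sequence follows legal transitions. Transitions are assumed from the
// slide's example sequences, not taken from the actual system.
public class WeatherStationStates {
    enum State { SHUTDOWN, WAITING, CALIBRATING, TESTING,
                 TRANSMITTING, COLLECTING, SUMMARIZING }

    static final Map<State, Set<State>> TRANSITIONS = new EnumMap<>(State.class);
    static {
        TRANSITIONS.put(State.SHUTDOWN,     EnumSet.of(State.WAITING));
        TRANSITIONS.put(State.WAITING,      EnumSet.of(State.SHUTDOWN, State.CALIBRATING,
                                                       State.COLLECTING, State.SUMMARIZING));
        TRANSITIONS.put(State.CALIBRATING,  EnumSet.of(State.TESTING));
        TRANSITIONS.put(State.TESTING,      EnumSet.of(State.TRANSMITTING));
        TRANSITIONS.put(State.TRANSMITTING, EnumSet.of(State.WAITING));
        TRANSITIONS.put(State.COLLECTING,   EnumSet.of(State.WAITING));
        TRANSITIONS.put(State.SUMMARIZING,  EnumSet.of(State.TRANSMITTING));
    }

    // True if every step in the sequence is a legal transition
    static boolean validSequence(List<State> seq) {
        for (int i = 1; i < seq.size(); i++) {
            if (!TRANSITIONS.getOrDefault(seq.get(i - 1), EnumSet.noneOf(State.class))
                            .contains(seq.get(i))) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // The slide's second testing sequence
        List<State> seq = List.of(State.WAITING, State.CALIBRATING, State.TESTING,
                                  State.TRANSMITTING, State.WAITING);
        System.out.println(validSequence(seq)); // true
    }
}
```

Deriving test sequences from the state model in this way makes it easy to see which transitions a given set of tests leaves unexercised.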

Slide 47: Object Integration
- Levels of integration are less distinct in object-oriented systems
- Cluster testing is concerned with integrating and testing clusters of cooperating objects
- Identify clusters using knowledge of the operation of objects and of the system features implemented by these clusters

Slide 48: Approaches to Cluster Testing
Use-case or scenario testing
- Testing based on user interactions with the system
- Tests system features as experienced by users
Thread testing
- Tests the system's response to events as processing threads through the system
Object interaction testing
- Tests sequences of object interactions
- Stop when an object operation does not call on services from another object

Slide 49: Scenario-Based Testing
- Identify scenarios from use-cases
- Supplement with interaction diagrams showing the objects involved in the scenario
- Consider the scenario in the weather station system where a report is generated (next slide)

Slide 50: Collect Weather Data

Slide 51: Weather Station Testing
- Thread of methods executed: CommsController:request > WeatherStation:report > WeatherData:summarize
- Inputs and outputs: input of a report request with an associated acknowledgement, and final output of the report
- Can be tested by creating raw data and ensuring that it is summarized properly
- Use the same raw data to test the WeatherData object

Slide 52: Testing Workbenches
- Testing is expensive; workbenches provide a range of tools to reduce the time required and the total testing costs
- Most testing workbenches are open systems, because testing needs are organization-specific
- Difficult to integrate with closed design and analysis workbenches

Slide 53: Testing Workbench

Slide 54: Testing Workbench Adaptation
Scripts
- For user interface simulators, and patterns for test data generators
Test outputs
- May have to be prepared manually for comparison
- Special-purpose file comparators may be developed

Slide 55: Required Homework
- Read-recite-review Chapter 23 of Sommerville's text
- Survey-question Chapter 24 for Monday
- Quiz on WEDNESDAY 17 Nov: Chapters 17-21
- Required for Fri 19 Nov 2004, for 35 points:
  - 23.1 & 23.3 (@5)
  - 23.4 & 23.5 (@10)
  - 23.7 (@5)

Slide 56: Optional Homework
- By Mon 29 Nov 2004, for up to 14 extra points, complete any or all of:
  - 23.2, 23.6, 23.8 (@2)
  - 23.9 (@5)
  - 23.10 (@3)

Slide 57: DISCUSSION