
1 Software Testing
IS301 – Software Engineering, Lecture #31
M. E. Kabay, PhD, CISSP
Assoc. Prof. Information Assurance, Division of Business & Management, Norwich University
mailto:mkabay@norwich.edu V: 802.479.7937
Copyright © 2003 M. E. Kabay. All rights reserved.

2 Topics
Defect Testing
Integration Testing
Object-Oriented Testing
Testing Workbenches

3 Defect Testing
Testing programs to establish the presence of system defects

4 Testing Process
Component testing
  Testing of individual program components
  Usually the responsibility of the component developer (except sometimes for critical systems)
  Tests derived from the developer's experience
Integration testing
  Testing of groups of components integrated to create a system or sub-system
  Usually the responsibility of an independent testing team
  Tests based on the system specification

5 Testing Phases

6 Defect Testing
Goal of defect testing: to discover defects in programs
A successful defect test is one which causes the program to behave in an anomalous way
Tests show the presence, not the absence, of defects

7 Testing Priorities
Only exhaustive testing can show a program is free from defects; however, exhaustive testing is impossible
Tests should exercise the system's capabilities rather than its components
Testing old capabilities is more important than testing new capabilities
Testing typical situations is more important than testing boundary value cases

8 Test Data and Test Cases
Test data: inputs which have been devised to test the system
Test cases: inputs to test the system plus the outputs predicted from those inputs if the system operates according to its specification
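One way to make the distinction concrete is to record each test case as a (test data, predicted output) pair and compare the prediction against observed behavior. A minimal Python sketch; the function name and values are illustrative only, not taken from the slides:

```python
# A test case pairs test data (inputs) with the output predicted by the
# specification.  Names and values here are illustrative only.
test_cases = [
    # (test data, predicted output if the system meets its specification)
    (10_000, "accepted"),
    (9_999, "rejected"),
]

def run_test_cases(system_under_test):
    """Report any test case whose actual output deviates from the prediction."""
    for test_input, predicted in test_cases:
        actual = system_under_test(test_input)
        assert actual == predicted, f"{test_input}: expected {predicted!r}, got {actual!r}"
```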

9 Defect Testing Process

10 Black-Box Testing
Program considered as a 'black box'
Test cases based on the system specification
Test planning can begin early in the software process

11 Black-Box Testing

12 Equivalence Partitioning
Input data and output results often fall into different classes in which all members of a class are related
Each of these classes is an equivalence partition: the program behaves in an equivalent way for every member of the class
Test cases should be chosen from each partition

13 Equivalence Partitioning (1)

14 Equivalence Partitioning (2)
Partition system inputs and outputs into 'equivalence sets'; e.g., if the input is a 5-digit integer between 10,000 and 99,999, then the equivalence partitions are <10,000, 10,000-99,999, and >99,999
Choose test cases at the boundaries of these sets: 00000, 09999, 10000, 10001, 99998, 99999, and 100000
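A rough Python sketch of this example; the accepts() validator is a hypothetical stand-in for the system under test, which the slides do not specify:

```python
# Equivalence partitions for "input is a 5-digit integer, 10,000-99,999":
# below the range (<10,000), inside the range, above the range (>99,999).
def accepts(n: int) -> bool:
    return 10_000 <= n <= 99_999

# Test values concentrated at the partition boundaries.
boundary_cases = {
    0:       False,   # well below the range
    9_999:   False,   # just below the lower boundary
    10_000:  True,    # lower boundary
    10_001:  True,    # just above the lower boundary
    99_998:  True,    # just below the upper boundary
    99_999:  True,    # upper boundary
    100_000: False,   # just above the upper boundary
}

for value, expected in boundary_cases.items():
    assert accepts(value) is expected, value
```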

15 Equivalence Partitioning (3)

16 Search Routine Specification
procedure Search (Key: ELEM; T: ELEM_ARRAY; Found: in out BOOLEAN; L: in out ELEM_INDEX);
Pre-condition -- array has at least one element
  T'FIRST <= T'LAST
Post-condition -- element found and referenced by L
  (Found and T(L) = Key)
or -- element not in array
  (not Found and not (exists i, T'FIRST <= i <= T'LAST, T(i) = Key))
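A rough Python transliteration of this contract, as a sketch only: the tuple return value stands in for the in-out parameters Found and L, and an assertion stands in for the pre-condition.

```python
from typing import Sequence, Tuple

def search(key, t: Sequence) -> Tuple[bool, int]:
    """Pre-condition: t has at least one element.
    Post-condition: returns (found, index); if found is True then
    t[index] == key, otherwise key occurs nowhere in t."""
    assert len(t) >= 1, "pre-condition violated: array has no elements"
    for i, elem in enumerate(t):
        if elem == key:
            return True, i
    return False, 0   # index is meaningless when found is False
```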

17 Search Routine - Input Partitions
Inputs which conform to pre-conditions Inputs where pre-condition does not hold Inputs where key element is member of array Inputs where key element not member of array Copyright © 2003 M. E. Kabay All rights reserved.

18 Testing Guidelines (Sequences)
Test software with sequences which have only a single value
Use sequences of different sizes in different tests
Derive tests so that the first, middle, and last elements of the sequence are accessed
Test with sequences of zero length
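Applied to the search() sketch above, these guidelines might produce tests like the following; pytest is assumed, and the module name searching is hypothetical:

```python
import pytest
from searching import search   # hypothetical module holding the sketch above

def test_single_element_sequence():
    assert search(17, [17]) == (True, 0)

def test_first_middle_last_elements_found():
    seq = [3, 9, 27, 81, 243]               # odd length, middle element at index 2
    assert search(3, seq) == (True, 0)      # first
    assert search(27, seq) == (True, 2)     # middle
    assert search(243, seq) == (True, 4)    # last

def test_key_absent_in_larger_sequence():
    found, _ = search(7, [2, 4, 6, 8, 10, 12])   # different size, even length
    assert not found

def test_zero_length_sequence_violates_precondition():
    with pytest.raises(AssertionError):
        search(1, [])
```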

19 Search Routine - Input Partitions

20 Structural Testing
Sometimes called white-box testing
Test cases derived according to the program structure; knowledge of the program is used to identify additional test cases
Objective: exercise all program statements, not all path combinations

21 White-Box Testing

22 Binary Search – Equiv. Partitions
Pre-conditions satisfied, key element in array
Pre-conditions satisfied, key element not in array
Pre-conditions unsatisfied, key element in array
Pre-conditions unsatisfied, key element not in array
Input array has a single value
Input array has an even number of values
Input array has an odd number of values
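A sketch of a binary search with roughly one test value per partition; this is illustrative Python, not the routine from the text, and the pre-condition assumed here is that the input array is sorted:

```python
def binary_search(key, t):
    """Iterative binary search over a sorted sequence; returns (found, index)."""
    low, high = 0, len(t) - 1
    while low <= high:
        mid = (low + high) // 2
        if t[mid] == key:
            return True, mid
        if t[mid] < key:
            low = mid + 1
        else:
            high = mid - 1
    return False, -1

# One test value per partition; for the unsorted (pre-condition unsatisfied)
# cases the result is unspecified, so no outcome is asserted.
cases = [
    (17, [17, 21, 23, 29], True),       # sorted, key in array, even length
    (9, [17, 21, 23, 29, 31], False),   # sorted, key not in array, odd length
    (23, [29, 17, 23], None),           # unsorted, key present: unspecified
    (25, [29, 17, 23], None),           # unsorted, key absent: unspecified
    (17, [17], True),                   # single-value array
]
for key, arr, expected in cases:
    if expected is not None:
        assert binary_search(key, arr)[0] is expected
```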

23 Binary Search – Equiv. Partitions

24 Binary Search – Test Cases

25 Path Testing
Objective: ensure that each path through the program is executed at least once
Starting point: the flow graph
  Nodes represent program decisions
  Arcs represent flow of control
Statements with conditions are therefore nodes in the flow graph

26 Program Flow Graphs
Each branch is shown as a separate path
Loops are shown by arrows looping back to the loop condition node
The flow graph is the basis for computing cyclomatic complexity
Cyclomatic complexity = Number of edges - Number of nodes + 2

27 Cyclomatic Complexity
The number of tests needed to test all control statements equals the cyclomatic complexity
For a program containing only simple (binary) decisions, the cyclomatic complexity equals the number of decisions plus one
Useful if used with care; does not imply adequacy of testing
Although all paths are executed, all combinations of paths are not necessarily executed
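A small illustration, not taken from the slides, showing both ways of arriving at the same number for a function with two binary decisions:

```python
def classify(n: int) -> str:
    """Two binary decisions, so cyclomatic complexity = 2 + 1 = 3."""
    if n < 0:        # decision 1
        return "negative"
    if n == 0:       # decision 2
        return "zero"
    return "positive"

# Flow graph: nodes = {entry, decision 1, "negative", decision 2, "zero",
# "positive", exit} -> 7 nodes and 8 edges, so V(G) = 8 - 7 + 2 = 3.
# Three tests, one per independent path, execute every control statement:
for arg, expected in [(-1, "negative"), (0, "zero"), (5, "positive")]:
    assert classify(arg) == expected
```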

28 Binary Search Flow Graph

29 Independent Paths
1, 2, 3, 8, 9
1, 2, 3, 4, 6, 7, 2
1, 2, 3, 4, 5, 7, 2
1, 2, 3, 4, 6, 7, 2, 8, 9
Test cases should be derived so that all of these paths are executed
A dynamic program analyzer may be used to check that the paths have been executed

30 Integration Testing
Tests complete systems or sub-systems composed of integrated components
Integration testing should be black-box testing, with tests derived from the specification
The main difficulty is localizing errors
Incremental integration testing reduces this problem

31 Incremental Integration Testing

32 Approaches to Integration Testing
Top-down testing
  Start with the high-level system and integrate from the top down, replacing individual components with stubs where appropriate
Bottom-up testing
  Integrate individual components in levels until the complete system is created
In practice, most integration involves a combination of these strategies
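A minimal sketch of the top-down case, assuming a hypothetical weather-report component whose data-collection sub-system has not yet been integrated and is therefore replaced by a stub:

```python
class WeatherDataStub:
    """Stands in for the not-yet-integrated data-collection sub-system."""
    def latest_readings(self):
        return {"temperature": 21.5, "pressure": 1013}   # canned values

class ReportGenerator:
    """High-level component integrated and tested first."""
    def __init__(self, data_source):
        self.data_source = data_source

    def summary(self) -> str:
        r = self.data_source.latest_readings()
        return f"T={r['temperature']}C P={r['pressure']}hPa"

# Integration test of the top-level component against the stub.
assert ReportGenerator(WeatherDataStub()).summary() == "T=21.5C P=1013hPa"
```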

33 Top-Down Testing

34 Bottom-Up Testing

35 Testing Approaches
Architectural validation
  Top-down integration testing is better at discovering errors in the system architecture
System demonstration
  Top-down integration testing allows a limited demonstration at an early stage in development
Test implementation
  Often easier with bottom-up integration testing
Test observation – what's happening during the test?
  A problem with both approaches: extra code may be required to observe tests (instrumenting the code)

36 Interface Testing
Takes place when modules or sub-systems are integrated to create larger systems
Objective: detect faults due to interface errors or invalid assumptions about interfaces
Particularly important for object-oriented development, since objects are defined by their interfaces

37 Interface Testing

38 Interface Types
Parameter interfaces: data passed from one procedure to another
Shared-memory interfaces: a block of memory is shared between/among procedures
Procedural interfaces: a sub-system encapsulates a set of procedures to be called by other sub-systems
Message-passing interfaces: sub-systems request services from other sub-systems

39 Interface Errors
Interface misuse: the calling component makes errors in its use of the interface, e.g., parameters in the wrong order
Interface misunderstanding: the calling component embeds incorrect assumptions about the behavior of the called component
Timing errors: the called and calling components operate at different speeds and out-of-date information is accessed ("race conditions")

40 Interface Testing Guidelines
Set parameters to a called procedure at the extreme ends of their ranges
Always test pointer parameters with null pointers
Design tests which cause the component to fail
Use stress testing in message-passing systems (see next slide)
In shared-memory systems, vary the order in which components are activated – why?
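The first three guidelines might look like this in practice; transfer() is a hypothetical component with a parameter interface, and pytest is assumed:

```python
import pytest

def transfer(amount: int, source, destination):
    """Hypothetical component whose parameter interface is being exercised."""
    if source is None or destination is None:
        raise ValueError("account must not be None")
    if not 0 < amount <= 1_000_000:
        raise ValueError("amount out of range")
    destination.append(amount)
    source.append(-amount)

def test_null_pointer_parameters_rejected():
    with pytest.raises(ValueError):
        transfer(10, None, [])

def test_parameters_at_extreme_ends_of_range():
    src, dst = [], []
    transfer(1_000_000, src, dst)      # upper extreme accepted
    with pytest.raises(ValueError):    # designed to make the component fail
        transfer(0, src, dst)          # just below the lower extreme rejected
```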

41 Stress Testing
Exercises the system beyond its maximum design load
Stressing the system often causes defects to come to light
Stressing the system tests failure behavior
  The system should not fail catastrophically
  Check for unacceptable loss of service or data
Particularly relevant to distributed systems, which can show non-linear performance degradation
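A rough sketch of a stress-test driver following these rules; the system's handle_request() and data_is_consistent() methods and the OverloadError exception are assumptions for illustration, not part of any real library:

```python
import time

class OverloadError(Exception):
    """Raised by the (hypothetical) system when it gracefully refuses a request."""

def stress_test(system, design_max_rps=100, multiplier=3, duration_s=5):
    """Drive the system well beyond its design load; it may refuse or slow
    requests, but it must not fail catastrophically or lose data."""
    served = refused = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        for _ in range(design_max_rps * multiplier):   # well above design load
            try:
                system.handle_request()
                served += 1
            except OverloadError:
                refused += 1                            # graceful refusal is acceptable
    assert system.data_is_consistent(), "unacceptable loss of service or data"
    return served, refused
```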

42 Approaches to Cluster Testing
Use-case or scenario testing: testing based on user interactions with the system; tests system features as experienced by users
Thread testing: tests the system's responses to events as processing threads through the system
Object interaction testing: tests sequences of object interactions; stop when an object operation does not call on services from another object

43 Scenario-Based Testing
Identify scenarios from use-cases and supplement them with interaction diagrams showing the objects involved in each scenario
Consider a scenario in the weather station system where a report is generated (next slide)

44 Testing Workbenches
Testing is expensive; workbenches provide a range of tools to reduce the time required and the total testing costs
Most testing workbenches are open systems because testing needs are organization-specific
They can be difficult to integrate with closed design and analysis workbenches

45 Testing Workbench

46 Testing Workbench Adaptation
Scripts may have to be developed for user-interface simulators, and patterns for test-data generators
Test outputs may have to be prepared manually for comparison
Special-purpose file comparators may be developed

47 Required Homework
Read-recite-review Chapter 23 of Sommerville's text
Survey-question Chapter 24 for Monday
Quiz on WEDNESDAY 17 Nov: Chapters 17-21
Required for Fri 19 Nov 2004, for 35 points: 23.1 & 23.3, 23.4 & 23.5, 23.7

48 Optional Homework
By Mon 29 Nov 2004, for up to 14 extra points, complete any or all of 23.2, 23.6, 23.8, 23.9, 23.10

49 DISCUSSION

