Software Testing Techniques
These courseware materials are to be used in conjunction with Software Engineering: A Practitioner’s Approach, 6/e and are provided with permission by R.S. Pressman & Associates, Inc., copyright © 1996, 2001, 2005
Testing Objectives (SWE 5.5.2)
Testing is the process of executing a program with the intent of finding an error. A good test case is one that has a high probability of finding an as-yet undiscovered error. A successful test is one that uncovers such an error.
Testing Principles
• All tests should be traceable to customer requirements.
• Tests should be planned long before testing begins.
• The Pareto principle applies to software testing: 80% of all errors uncovered during testing will likely be traceable to 20% of all program components.
• Testing should begin "in the small" and progress toward testing "in the large".
• Exhaustive testing is not possible.
Testability
Software testability is simply how easily software can be tested.
Testable Software Characteristics
• Operability: it operates cleanly (the better it works, the more efficiently it can be tested).
• Observability: the results of each test case are readily observed (incorrect output is easily identified).
• Controllability: the degree to which testing can be automated and optimized (the better we can control the software, the more testing can be automated and optimized).
• Decomposability: testing can be targeted (the software system is built from independent modules that can be tested independently).
• Simplicity: reduce complex architecture and logic to simplify tests (the software should exhibit functional simplicity, structural simplicity, and code simplicity).
• Stability: few changes are requested during testing.
• Understandability: of the design (the more information we have, the smarter we will test).
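To make two of these characteristics concrete, here is a minimal sketch (not from the slides, names are hypothetical) of controllability and observability in Python: the clock is injected so a test can control "now", and the result is returned so it is easy to observe.

```python
import datetime
import unittest

def greeting(name, now_fn=datetime.datetime.now):
    # Controllability: the time source is injected instead of hard-wired.
    hour = now_fn().hour
    period = "morning" if hour < 12 else "afternoon"
    # Observability: the result is returned, not hidden in a side effect.
    return f"Good {period}, {name}"

class TestabilityExample(unittest.TestCase):
    def test_morning_greeting_with_controlled_clock(self):
        fixed = lambda: datetime.datetime(2024, 1, 1, 9, 0)  # controlled "now"
        self.assertEqual(greeting("Ada", now_fn=fixed), "Good morning, Ada")

if __name__ == "__main__":
    unittest.main()
```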
What is a "Good" Test?
• A good test has a high probability of finding an error.
• A good test is not redundant.
• A good test should be "best of breed".
• A good test should be neither too simple nor too complex.
Software Testing
Figure: software testing comprises white-box methods, black-box methods, and overall testing strategies.
White-Box Testing (SWE 14.3)
... our goal is to ensure that all statements and conditions have been executed at least once ...
White-Box Testing
Derive test cases that:
• Guarantee that all independent paths within a module have been exercised at least once.
• Exercise all logical decisions on their true and false sides.
• Execute all loops at their boundaries and within their operational bounds.
• Exercise internal data structures to ensure their validity.
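As an illustration of these criteria (an assumption of this write-up, not material from the slides), the hypothetical count_positives function below is exercised so that each decision takes both its true and false sides and the loop runs zero times, once, to completion, and to an early exit.

```python
import unittest

def count_positives(values, limit):
    """Count positive numbers in `values`, stopping after `limit` matches."""
    count = 0
    for v in values:              # loop: exercised at 0, 1, and many iterations
        if v > 0:                 # decision: driven to both true and false
            count += 1
        if count == limit:        # decision: driven to both true and false
            break                 # early-exit path
    return count

class WhiteBoxTests(unittest.TestCase):
    def test_empty_list_skips_loop(self):              # loop executes zero times
        self.assertEqual(count_positives([], 3), 0)

    def test_single_negative_takes_false_branch(self):  # one iteration, false side
        self.assertEqual(count_positives([-1], 3), 0)

    def test_limit_reached_exits_loop_early(self):       # break path exercised
        self.assertEqual(count_positives([1, 2, 3, 4], 2), 2)

    def test_limit_not_reached(self):                     # loop runs to completion
        self.assertEqual(count_positives([1, -2, 3], 5), 2)

if __name__ == "__main__":
    unittest.main()
```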
Black-Box Testing (SWE 14.6)
Figure: tests are derived from the requirements; inputs and events are applied to the system and the resulting output is compared against expected results.
Black-Box Testing
Find errors in the following categories:
• Incorrect or missing functions.
• Interface errors.
• Errors in data structures or external database access.
• Behavior or performance errors.
• Initialization and termination errors.
Black-Box Testing
Tests are designed to answer the following questions:
• How is functional validity tested?
• How are system behavior and performance tested?
• What classes of input will make good test cases?
• Is the system particularly sensitive to certain input values?
• How are the boundaries of a data class isolated?
• What data rates and data volume can the system tolerate?
• What effect will specific combinations of data have on system operation?
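A hedged sketch of the black-box view: the test cases below are derived purely from a hypothetical requirement (passwords shorter than 8 characters are rejected) and never look inside the implementation.

```python
import unittest

# Hypothetical requirement: reject any password shorter than 8 characters,
# accept any password of 8 or more. The tester reasons only from this
# requirement, not from the implementation.
def password_accepted(password: str) -> bool:
    return len(password) >= 8

class BlackBoxTests(unittest.TestCase):
    def test_too_short_is_rejected(self):
        self.assertFalse(password_accepted("abc123"))

    def test_minimum_length_is_accepted(self):
        self.assertTrue(password_accepted("abcd1234"))

    def test_empty_input_is_rejected(self):   # initialization-style edge input
        self.assertFalse(password_accepted(""))

if __name__ == "__main__":
    unittest.main()
```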
Unit (Functional) Testing (SWE 13.3.1)
Figure: the software engineer designs test cases, applies them to the module to be tested, and evaluates the results.
Unit Test Considerations
• The module interface is tested to ensure that information flows properly into and out of the unit.
• Local data structures are examined to ensure that local data maintain their integrity.
• All independent paths are exercised to ensure that all statements are executed at least once.
• The module is tested to ensure it operates properly at boundaries established to limit or restrict processing.
• All error-handling paths are tested.
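The sketch below (built around a hypothetical bounded_average unit, not taken from the text) touches several of these considerations: normal interface flow, the boundary that limits processing, and an error-handling path.

```python
import unittest

# Hypothetical unit under test: compute an average, restricted to at most
# `max_items` values; raises ValueError on empty input (error-handling path).
def bounded_average(values, max_items=100):
    if not values:
        raise ValueError("values must not be empty")
    if len(values) > max_items:
        values = values[:max_items]   # boundary that restricts processing
    return sum(values) / len(values)

class UnitTestConsiderations(unittest.TestCase):
    def test_interface_normal_flow(self):               # information in and out of the unit
        self.assertEqual(bounded_average([2, 4, 6]), 4)

    def test_boundary_exactly_max_items(self):           # boundary condition
        self.assertEqual(bounded_average([1] * 100), 1)

    def test_boundary_above_max_items_is_truncated(self):
        self.assertEqual(bounded_average([1] * 101), 1)

    def test_error_handling_empty_input(self):            # error-handling path
        with self.assertRaises(ValueError):
            bounded_average([])

if __name__ == "__main__":
    unittest.main()
```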
Unit Testing
Figure: test cases are applied to the module to be tested, covering its interface, local data structures, boundary conditions, independent paths, and error-handling paths.
Unit Testing
Test cases should uncover errors such as:
• Comparison of different data types.
• Incorrect logical operators or precedence.
• Expectation of equality when precision errors make equality unlikely.
• Incorrect comparison of variables.
• Improper or nonexistent loop termination.
• Improperly modified loop variables.
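One of these error classes, expecting exact equality despite floating-point precision limits, is easy to demonstrate; the example below is an illustrative assumption rather than material from the slides.

```python
import unittest

def average_of_tenths(n):
    """Sum n copies of 0.1 and divide by n; intended to return exactly 0.1."""
    total = 0.0
    for _ in range(n):
        total += 0.1          # repeated binary addition accumulates rounding error
    return total / n

class PrecisionTests(unittest.TestCase):
    def test_exact_equality_is_a_trap(self):
        # Summing 0.1 ten times does not give exactly 1.0 in binary floating point,
        # so a naive equality check here would report a spurious failure.
        self.assertNotEqual(sum(0.1 for _ in range(10)), 1.0)

    def test_compare_with_tolerance_instead(self):
        self.assertAlmostEqual(average_of_tenths(10), 0.1, places=9)

if __name__ == "__main__":
    unittest.main()
```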
Unit Test Environment
Figure: a driver feeds test cases to the module under test and collects the results; stubs stand in for the modules it calls. The module is exercised across its interface, local data structures, boundary conditions, independent paths, and error-handling paths.
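A possible rendering of this environment in Python: the test class plays the role of the driver, and unittest.mock objects stand in as stubs for a subordinate module that the hypothetical convert function calls.

```python
import unittest
from unittest import mock

# Hypothetical module under test: it calls a subordinate module
# (`fetch_rate`) that may not exist yet during unit testing.
def convert(amount, currency, fetch_rate):
    rate = fetch_rate(currency)          # call into a subordinate module
    if rate <= 0:
        raise ValueError("invalid exchange rate")
    return round(amount * rate, 2)

class DriverWithStubs(unittest.TestCase):
    """The test class acts as the driver; mocks act as stubs."""

    def test_conversion_with_stubbed_rate(self):
        rate_stub = mock.Mock(return_value=1.25)        # stub replaces the real module
        self.assertEqual(convert(10, "EUR", rate_stub), 12.5)
        rate_stub.assert_called_once_with("EUR")        # interface check

    def test_error_path_with_bad_stub_value(self):
        rate_stub = mock.Mock(return_value=0.0)
        with self.assertRaises(ValueError):
            convert(10, "XXX", rate_stub)

if __name__ == "__main__":
    unittest.main()
```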
Integration Testing Strategies (SWE 13.3.2)
Options:
• the "big bang" approach
• an incremental construction strategy
Top-Down Integration
Figure: the top module is tested with stubs standing in for its subordinate modules. Stubs are replaced one at a time, "depth first"; as new modules are integrated, some subset of the tests is re-run.
Bottom-Up Integration
Figure: low-level worker modules are grouped into builds (clusters) and integrated; drivers are replaced one at a time as integration moves up the hierarchy.
System Testing (SWE 13.6)
A series of tests whose primary purpose is to fully exercise the computer-based system. Such tests can be categorized as:
• Recovery testing: forcing the software to fail and verifying that recovery is properly performed.
• Security testing: verifying that the protection mechanisms built into a system will protect it from improper penetration.
• Stress testing: executing the system in a manner that demands resources in abnormal quantity, frequency, or volume.
• Performance testing: testing the run-time performance of software within the context of an integrated system.
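For flavor only, the sketch below approximates stress and performance checks against a hypothetical process_batch function; a real system test would exercise the integrated system and use an agreed performance budget rather than the arbitrary one assumed here.

```python
import time
import unittest

# Hypothetical operation to be exercised; stands in for a system-level workload.
def process_batch(records):
    return [r.upper() for r in records]

class SystemLevelChecks(unittest.TestCase):
    def test_stress_abnormal_volume(self):
        # Stress: demand an abnormally large volume of data in one call.
        records = ["record"] * 1_000_000
        self.assertEqual(len(process_batch(records)), 1_000_000)

    def test_performance_within_budget(self):
        # Performance: measure run time against an assumed budget (illustrative only).
        start = time.perf_counter()
        process_batch(["record"] * 100_000)
        elapsed = time.perf_counter() - start
        self.assertLess(elapsed, 1.0)

if __name__ == "__main__":
    unittest.main()
```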
Equivalence Partitioning (SWE 14.6.2)
Figure: the input domain (user queries, function-key input, mouse picks, prompted data) and the output formats are divided into classes, and test cases are drawn from each class.
Sample Equivalence Classes
Valid data:
• user-supplied commands
• responses to system prompts
• file names
• computational data
• physical parameters
• bounding values
• initiation values
• output data formatting
• responses to error messages
• graphical data (e.g., mouse picks)
Invalid data:
• data outside the bounds of the program
• physically impossible data
• proper value supplied in the wrong place
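A minimal sketch of equivalence partitioning, assuming a hypothetical eligibility rule (integer ages 18 to 65 are valid): one representative test case is drawn from each valid and invalid class.

```python
import unittest

# Hypothetical function under test: accepts an age that must be an integer
# in the range 18..65 inclusive.
def is_eligible(age):
    return isinstance(age, int) and 18 <= age <= 65

class EquivalencePartitionTests(unittest.TestCase):
    # One representative test case per equivalence class.

    def test_valid_class_in_range(self):
        self.assertTrue(is_eligible(40))           # valid partition: 18..65

    def test_invalid_class_below_range(self):
        self.assertFalse(is_eligible(10))          # invalid partition: below 18

    def test_invalid_class_above_range(self):
        self.assertFalse(is_eligible(80))          # invalid partition: above 65

    def test_invalid_class_wrong_type(self):
        self.assertFalse(is_eligible("forty"))     # invalid partition: not an integer

if __name__ == "__main__":
    unittest.main()
```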
Boundary Value Analysis (SWE 14.6.3)
Figure: test cases are chosen at the edges of the input domain (user queries, function-key input, mouse picks, prompted data) and of the output domain (output formats).
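A companion sketch for boundary value analysis, assuming a hypothetical valid range of 1 to 100 for an order quantity: test cases sit directly on each boundary and just outside it.

```python
import unittest

# Hypothetical function under test: a valid order quantity is 1..100 inclusive.
def quantity_accepted(qty):
    return 1 <= qty <= 100

class BoundaryValueTests(unittest.TestCase):
    # Test cases sit on, and just outside, each boundary of the valid range.

    def test_just_below_lower_boundary(self):
        self.assertFalse(quantity_accepted(0))

    def test_on_lower_boundary(self):
        self.assertTrue(quantity_accepted(1))

    def test_on_upper_boundary(self):
        self.assertTrue(quantity_accepted(100))

    def test_just_above_upper_boundary(self):
        self.assertFalse(quantity_accepted(101))

if __name__ == "__main__":
    unittest.main()
```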