Chapter 9: Testing the System, Part 2
Testing
- Unit testing
  - White (glass) box testing
  - Code walkthroughs and inspections
- Integration testing
  - Bottom-up
  - Top-down
  - Sandwich
  - Big bang
- Function testing
- Performance testing
- Acceptance testing
- Installation testing
One-minute quiz
- What is the difference between verification and validation?
One-minute quiz
- What is meant by regression testing?
- Is regression testing used for verification or validation?
Function Testing
- Test cases are derived from the requirements specification document
- Black-box testing
- Performed by independent testers
- Tests both valid and invalid input; the success of a test is determined by the output produced
- Equivalence partitioning
- Boundary values
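The last two bullets can be made concrete with a small sketch. The validation function and the chosen partitions below are illustrative assumptions, not from the slides: we test a hypothetical score validator at each boundary and with one representative value per equivalence class.

```python
# Hypothetical example: equivalence partitioning and boundary-value
# test cases for a function that validates an exam score in 0..100.

def is_valid_score(score: int) -> bool:
    """Return True if score is a valid exam score (0..100 inclusive)."""
    return 0 <= score <= 100

# Equivalence classes: below range (invalid), in range (valid),
# above range (invalid). Boundary values: test at and adjacent to
# each boundary of the valid class.
test_cases = [
    (-1, False),   # just below the lower boundary (invalid class)
    (0, True),     # lower boundary
    (1, True),     # just above the lower boundary
    (50, True),    # representative of the valid class
    (99, True),    # just below the upper boundary
    (100, True),   # upper boundary
    (101, False),  # just above the upper boundary (invalid class)
]

for value, expected in test_cases:
    assert is_valid_score(value) == expected, f"failed for {value}"
```

Note that seven test cases cover all three equivalence classes and both boundaries; adding more values from the middle of the valid class would not exercise any new behavior.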
Performance Testing
- Stress tests
- Volume tests
- Configuration tests
- Compatibility tests
- Regression tests
- Security tests
- Timing tests
- Environmental tests
- Quality tests
- Recovery tests
- Maintenance tests
- Documentation tests
- Human factors (usability) tests
Reliability, Availability, and Maintainability
- Software reliability: the probability of operating without failure under given conditions for a given time interval
- Software availability: the probability of operating successfully according to specification at a given point in time
- Software maintainability: the probability that, for given conditions of use, a maintenance activity can be carried out within a stated time interval using stated procedures and resources
Measuring Reliability, Availability, and Maintainability
- Mean time to failure (MTTF)
- Mean time to repair (MTTR)
- Mean time between failures (MTBF): MTBF = MTTF + MTTR
- Reliability: R = MTTF / (1 + MTTF)
- Availability: A = MTBF / (1 + MTBF)
- Maintainability: M = 1 / (1 + MTTR)
When to Stop Testing
- Fault seeding: deliberately adding known faults to the code to estimate the number of remaining faults
- Exercise: Suppose 50 faults have been seeded in the code. Regression testing identifies 60 faults, 40 of which are the seeded faults. What is the estimate of the number of undiscovered real faults remaining?
Acceptance Tests
- Enable the customers and users to determine whether the built system meets their needs and expectations
- Written, conducted, and evaluated by the customers
Types of Acceptance Tests
- Pilot test: the system is installed on an experimental basis
- Alpha test: an in-house pilot test
- Beta test: a pilot test at the customer's site
- Parallel testing: the new system operates in parallel with the old system
Test Documentation
- Test plan: describes the system and the plan for exercising all functions and characteristics
- Test specification and evaluation: details each test and defines the criteria for evaluating each feature
- Test description: the test data and procedures for each test
- Test report: the results of each test
Test Plan
- Used to organize testing activities
- Guides the scheduling of the programming effort
- Explains the nature and extent of each test
- Documents test input, specific test procedures, and expected outcomes
Defect Tracking Form
(figure: sample defect tracking form, not reproduced here)
Quality Assurance
- Quality control: testing the quality of the program
- Quality assurance: building quality into the program
  - A management-level, proactive process
  - Supported by checklists
Testing Safety-Critical Systems
- Recognize that testing cannot remove all faults or risks
- Assume that every mistake users can make will be made
- Do not assume that low-probability, high-impact events will not happen
- Emphasize requirements definition, testing, code and specification reviews, and configuration control
- Cleanroom testing
Different Levels of Failure Severity
- Catastrophic: causes death or system loss
- Critical: causes severe injury or major system damage
- Marginal: causes minor injury or minor system damage
- Minor: causes no injury or system damage
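Because these severity levels form an ordered scale, a defect-tracking tool might encode them as an ordered enumeration so defects can be sorted and compared during triage. This is a hypothetical sketch, not part of the slides:

```python
# Hypothetical encoding of the slide's failure-severity levels as an
# ordered enum, so severities can be compared when triaging defects.
from enum import IntEnum

class Severity(IntEnum):
    MINOR = 1         # causes no injury or system damage
    MARGINAL = 2      # causes minor injury or minor system damage
    CRITICAL = 3      # causes severe injury or major system damage
    CATASTROPHIC = 4  # causes death or system loss

# Higher values demand earlier attention during triage.
worst_first = sorted(Severity, reverse=True)
```

Using IntEnum (rather than a plain Enum) makes the ordering explicit, so `Severity.CATASTROPHIC > Severity.MARGINAL` works directly.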