Software Testing and Reliability Software Test Process
Aditya P. Mathur, Purdue University
August 12-16 @ Guidant Corporation, Minneapolis/St. Paul, MN
Graduate Assistants: Ramkumar Natarajan, Baskar Sridharan
Last update: August 9, 2002
Learning Objectives

- What is the software life cycle?
- How do the various types of testing mesh with the development cycle?
- What documents are generated for, and as a result of, testing?
- What test metrics need to be collected, and why?
References

Software Testing in the Real World, Edward Kit, Addison-Wesley, 1995.
Life Cycle

The software development cycle is the sequence of steps in which a software system is developed. We shall refer to this sequence simply as the life cycle. There are many models of the life cycle:

- The waterfall model
- The iterative refinement model
- The incremental development model
- The spiral model
- The Unified Process model
Testing and the Life Cycle
It is widely believed that testing is necessary in each phase of the life cycle, regardless of which model is adopted. The key questions with which most process managers struggle are:

- How should the test process be “meshed” with the development cycle?
- What tools and techniques should be used in each phase?
- What data should be collected to evaluate the effectiveness of the test process?
Meshing the Test Process with the Life Cycle
The test process pairs each development activity with a corresponding test activity:

- Requirements specification -> Test the requirements
- Design specification -> Test the design
- Code -> Test the modules
- Integration -> Test the subsystems
- System integration -> Test the system
- System deployment -> Test for acceptance
- System modification -> Run regression tests
Test Tools and Techniques: Requirements Phase
- Inspections and walkthroughs: Ensure completeness, unambiguity, precision, consistency, testability, and traceability. [Rational: Rose and Analyst Studio; I-Logix: Rhapsody]
- Simulation: Use this to improve the understanding of the system to be built; where possible, work with the customer.
- Generate test procedures, plans, and test cases: Develop the procedures, plans, and test cases to be used for testing each requirement; a traceability sketch follows this list. Identify which tools will be used in each phase, who will use them, and what data will be collected. Determine the product release criteria.
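The following is a minimal sketch of a requirement-to-test traceability check, flagging requirements that have no planned test cases. The requirement and test-case identifiers, and the hard-coded mapping, are invented for illustration; in practice the mapping would come from a test-management tool.

```python
# Minimal traceability check: every requirement should map to at least
# one planned test case. IDs and the mapping are hypothetical.
trace = {
    "REQ-1": ["TC-1", "TC-2"],
    "REQ-2": ["TC-3"],
    "REQ-3": [],  # not yet traced to any test
}

untraced = [req for req, tests in trace.items() if not tests]
if untraced:
    print("Requirements with no planned tests:", ", ".join(untraced))
```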
Test Tools and Techniques: Design Phase
- Inspections and walkthroughs. [Rational: Rose]
- Simulation: Check the correctness of scenarios. [I-Logix: Rhapsody]
- Refine test procedures and test cases: Use the artifacts created during design to generate additional test cases and to refine or correct the existing ones.
- Risk analysis: Use the application architecture to perform risk analysis. [Rational: ROOM]
Test Tools and Techniques: Coding Phase
- Inspections and walkthroughs: Ensure that the code matches the design. If it does not, then at least one of the two needs to be modified to achieve consistency. [Rational: Rose]
- Perform unit testing: Test each module thoroughly; a minimal example follows. [CleanScape: TestWise; Rational: Purify and Visual Test; AMC: CodeTest]
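As a sketch of what "test each module thoroughly" looks like in practice, here is a small unit test written with Python's standard unittest module. The function under test (a discount calculation) is hypothetical and exists only to give the tests something to exercise.

```python
import unittest

def discounted_price(price, rate):
    """Hypothetical module under test: apply a discount rate in [0, 1]."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return price * (1 - rate)

class TestDiscountedPrice(unittest.TestCase):
    def test_typical_discount(self):
        self.assertAlmostEqual(discounted_price(100.0, 0.25), 75.0)

    def test_boundary_rates(self):
        self.assertEqual(discounted_price(80.0, 0), 80.0)
        self.assertEqual(discounted_price(80.0, 1), 0.0)

    def test_invalid_rate_rejected(self):
        with self.assertRaises(ValueError):
            discounted_price(80.0, 1.5)

if __name__ == "__main__":
    unittest.main()
```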
Test Tools and Techniques: Integration Phase
Perform integration testing: Test each subsystem thoroughly. [CleanScape: TestWise; Rational: Purify and Visual Test; Applied Microsystems: CodeTest; Windriver: Sniff+]
Test Tools and Techniques: System Integration
Perform system testing: Test the system to ensure correct functionality with respect to the use cases; also perform performance, stress, and robustness tests. [CMU: Ballista; Telcordia: AETG; Applied Microsystems: CodeTest; Windriver: Diab RTA suite and Sniff+] A pairwise-generation sketch follows.
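Tools such as Telcordia's AETG generate small test suites that cover all pairs of parameter values. The sketch below is not AETG's algorithm; it is a naive greedy all-pairs generator that scores every full combination, so it is practical only for small parameter sets. The parameters and values are invented.

```python
import itertools

def pairwise_tests(params):
    """Naive greedy all-pairs test generation (small inputs only)."""
    names = list(params)
    # Every (parameter, value) pair combination that must be covered.
    uncovered = {
        ((a, va), (b, vb))
        for a, b in itertools.combinations(names, 2)
        for va in params[a]
        for vb in params[b]
    }
    tests = []
    while uncovered:
        best, best_gain = None, -1
        # Score every full combination by how many still-uncovered
        # pairs it covers, and keep the best one. Exponential, so this
        # is only a sketch for small configuration spaces.
        for combo in itertools.product(*(params[n] for n in names)):
            candidate = dict(zip(names, combo))
            gain = sum(
                candidate[a] == va and candidate[b] == vb
                for (a, va), (b, vb) in uncovered
            )
            if gain > best_gain:
                best, best_gain = candidate, gain
        uncovered = {
            ((a, va), (b, vb))
            for (a, va), (b, vb) in uncovered
            if not (best[a] == va and best[b] == vb)
        }
        tests.append(best)
    return tests

# Hypothetical configuration parameters for an embedded application.
suite = pairwise_tests({
    "os": ["linux", "vxworks"],
    "cpu": ["arm", "ppc", "x86"],
    "load": ["idle", "peak"],
})
print(len(suite), "tests cover all value pairs")
```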
Test Tools and Techniques: System Changes
Perform regression testing: Test the modified system to ensure that the existing functions still perform as intended. Note that system testing for the added features ought to be performed as well. [CleanScape: TestWise for test minimization] A test-selection sketch follows.
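Test-minimization tools select the subset of existing tests affected by a change. The sketch below shows the idea with a hypothetical coverage map from tests to the functions they exercise; real tools derive this map from instrumentation rather than a hard-coded dict.

```python
def select_regression_tests(coverage, changed):
    """Return only the tests whose coverage intersects the changed code.

    coverage: {test name: set of functions the test exercises}
    changed:  set of functions modified in this release
    """
    return sorted(t for t, funcs in coverage.items() if funcs & changed)

# Hypothetical coverage data for illustration.
coverage = {
    "test_login":  {"authenticate", "hash_password"},
    "test_report": {"render_report", "format_date"},
    "test_signup": {"create_user", "hash_password"},
}
print(select_regression_tests(coverage, changed={"hash_password"}))
# -> ['test_login', 'test_signup']
```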
Test Tools and Techniques: In-target system
Perform system testing: Test the application for all of its functionality, performance, and robustness while the application runs embedded in the target platform. [Applied Microsystems: CodeTest; home-grown test generators; Rational: Purify]
Test Tools and Techniques: Usability testing
Perform usability testing: Have a sample of the potential users test the application and the documents that accompany it. The goal is to ensure that (a) the documents are understandable, consistent, and correct, and (b) the GUI satisfies the requirements of the potential users. Usability testing can begin when the GUI is ready for use, i.e., developed and tested for correctness, even though the remainder of the application code might not be.
Test Tools and Techniques: Security testing
Perform security and safety testing: Security and safety testing must be done when an application is accessible from the “outside” and might compromise the user's privacy or safety. The objective is to demonstrate that privacy and safety cannot be compromised when the assumptions regarding the use of the application are met. Security and safety testing is done with respect to the requirements for security and safety. Traditional testing techniques are useful but might have to be augmented with techniques specific to the application environment.
Responsibilities

- Planning (high level): Test team.
- Unit tests: Developer. (Who checks the developer's work?)
- Integration tests: Developer and/or test team, depending on the complexity of the subsystem to be tested.
- System-level tests: Test team. These tests will generally require communication with the developers; this must be part of the detailed plan built at the beginning of the development cycle.
- Acceptance and usability tests: Test team in collaboration with the end user (or a sample of the end-user set).
Release Criteria

- Units: Code-coverage based. Example: all statements, decisions, and conditions must be covered.
- Integration tests: Code-coverage and function based. Low-level code coverage might not be usable if the integration produces a “large” component.
- System-level tests: System reliability, measured in terms of the number and type of failures detected.
- Acceptance and usability tests: Determined by the user, primarily based on correct functionality, robustness, performance, and usability.

A sketch of a coverage-based release gate follows.
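Below is a minimal sketch of a coverage-based release gate for units, assuming covered/total counts have already been collected by an instrumentation tool; the metric names and thresholds are illustrative, with the 100% thresholds mirroring the example criterion above.

```python
def meets_unit_release_criteria(coverage, thresholds):
    """Check covered/total counts for each metric against its threshold.

    coverage:   {metric: (covered, total)}
    thresholds: {metric: minimum required fraction}
    """
    failures = {}
    for metric, (covered, total) in coverage.items():
        achieved = covered / total if total else 1.0
        if achieved < thresholds.get(metric, 1.0):
            failures[metric] = achieved
    return failures  # an empty dict means the criteria are met

# Illustrative counts; the example criterion demands 100% everywhere.
failures = meets_unit_release_criteria(
    coverage={"statements": (980, 1000),
              "decisions": (240, 240),
              "conditions": (410, 420)},
    thresholds={"statements": 1.0, "decisions": 1.0, "conditions": 1.0},
)
for metric, achieved in failures.items():
    print(f"{metric}: {achieved:.1%} covered, release gate not met")
```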
Standards for Test Documents
- SQAP: Software Quality Assurance Plan, IEEE/ANSI, 1989 [Std 730].
- SVVP: Software Verification and Validation Plan, IEEE/ANSI, 1986 [Std 1012]; one per SQAP.
- VTP: Verification Test Plan; one per verification activity.
- MTP: Master Validation Test Plan, IEEE/ANSI, 1983 [Std 829]; one per SVVP.
- DTP: Detailed Validation Test Plan, IEEE/ANSI, 1983 [Std 829]; one or more per activity.
Standards for Test Documents (contd.)
- TDS: Test Design Specification, IEEE/ANSI, 1983 [Std 829]; one per DTP.
- TCS: Test Case Specification, IEEE/ANSI, 1983 [Std 829]; one or more per TDS/TPS.
- TPS: Test Procedure Specification, IEEE/ANSI, 1983 [Std 829]; one or more per TDS.
- TC: Test Case; one per TCS.
Structure of Standards for Test Documents
The documents form a hierarchy, with the multiplicities given on the previous two slides:

SQAP
  SVVP
    VTP
    MTP
      DTP
        TDS
          TPS
            TCS
              TC
Defect Tracking and Classification
- Defect tracking and classification. [Applied Innovation Management: Bug/Defect Tracking Expert; defectx: defectX]
- Defect estimation: Use inspection data, design metrics, and past data on similar projects to estimate defects.
- Defects open vs. closed.
- Cross-project and cross-version comparison of the number of defects.
- Cost of repairing a defect: the time and manpower resources needed.

A sketch of an open-vs.-closed tally follows.
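As a small illustration of the open-vs.-closed metric, the sketch below tallies a hypothetical defect log; a tracking tool would produce this kind of report from its database.

```python
from collections import Counter

# Hypothetical defect log entries: (id, severity, status).
defects = [
    ("D-101", "major", "closed"),
    ("D-102", "minor", "open"),
    ("D-103", "major", "open"),
    ("D-104", "minor", "closed"),
]

by_status = Counter(status for _, _, status in defects)
print(f"open: {by_status['open']}, closed: {by_status['closed']}")

# Classification: open defects broken down by severity.
open_by_severity = Counter(
    sev for _, sev, status in defects if status == "open"
)
print(dict(open_by_severity))
```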
Why Collect Data?

- It assists with improving the test process; correlations between defects and other process-related metrics, such as code coverage, can be established (see the sketch below).
- It helps improve product quality.
- It is useful in setting up a reward system (no penalties, please!).
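A minimal sketch of the kind of correlation such data enables: computing the Pearson correlation between per-module code coverage and defect counts. The per-module data is invented for illustration, and statistics.correlation requires Python 3.10 or later.

```python
from statistics import correlation  # Pearson's r; Python 3.10+

# Invented per-module data: statement coverage (%) vs. field defects.
coverage = [95, 88, 72, 60, 99, 81]
defects = [1, 2, 5, 8, 0, 3]

r = correlation(coverage, defects)
print(f"coverage vs. defects: r = {r:.2f}")  # strongly negative here
```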
Summary

- The software development cycle
- The test process and the software development cycle
- Assignment of responsibilities
- Standards
- Metrics