1
Basics of Testing
Presented by: Vijay.C.G – Glister Tech
2
Software Reliability
The probability that a software system will not deviate from the required behavior for a specified time under specified conditions
Reliability-enhancing techniques
–Fault avoidance
–Fault detection
–Fault tolerance
3
Fault Avoidance
Development methodologies
–Avoid faults by minimizing their introduction into models and code
Configuration management
–Avoid faults by minimizing undisciplined changes to the models
Verification techniques
–Only useful for simple software projects
Code reviews
4
Fault Detection
Debugging
–Correctness debugging
–Performance debugging
Testing
–Component testing
–Integration testing
–System testing
5
Fault Tolerance
Allows a system to recover in the case of failure
–Atomic transactions
–Modular redundancy
6
What Is Testing?
1. The process of demonstrating that errors are not present
2. The systematic attempt to find errors in a planned way
7
Concepts
Component
Fault
Error
Failure
Test case
Test stub
Test driver
Correction
8
Component
Part of the system that can be isolated for testing
–Function(s)
–Object(s)
–Subsystem(s)
9
Fault
A design or coding mistake that may cause abnormal component behavior
–A.K.A. bug or defect
10
Error A manifestation of a fault during the execution of the system
11
Failure
A deviation between the specification of a component and its behavior
–Caused by one or more errors
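The fault/error/failure chain can be illustrated with a small sketch. The function name `average` and the off-by-one mistake are hypothetical, chosen only to show how a fault in the code becomes an error during execution and then an observable failure:

```python
# Hypothetical average() containing a fault: the loop stops one element
# early, so the last value is never added (an off-by-one fault).
def average(values):
    total = 0
    for i in range(len(values) - 1):  # fault: should be range(len(values))
        total += values[i]
    return total / len(values)

result = average([2, 4, 6])   # executing the fault produces an error:
                              # total ends up 6 instead of 12
expected = 4.0                # the specification: the arithmetic mean
failure = result != expected  # the failure is the observable deviation
```

The fault sits in the source code whether or not it runs; the error exists only during execution; the failure is what a tester can actually observe.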
12
Test Case A set of inputs and expected results that exercises a component
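As a minimal sketch, a test case can be written down as an (input, expected result) pair for one component. The component `is_leap_year` here is an assumed example, not from the slides:

```python
# Hypothetical component under test
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Each tuple is one test case: (input, expected result)
test_cases = [
    (2000, True),   # divisible by 400
    (1900, False),  # divisible by 100 but not by 400
    (2024, True),   # ordinary leap year
    (2023, False),  # common year
]

# Exercising the component: compare actual results against expected ones
results = [is_leap_year(year) == expected for year, expected in test_cases]
```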
13
Test Case (Cont.)
14
Test Stub and Driver Stub provides a partial implementation of components on which the tested component depends Driver provides a partial implementation of a component that depends on the tested component
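A sketch of both roles, under assumed component names: `InventoryService` is the component under test, `DatabaseStub` stands in for a component it depends on, and `ui_driver` stands in for a component that would normally call it:

```python
# Component under test: depends on a Database component (not yet built)
# and is normally called by a UI component (also not yet built).
class InventoryService:
    def __init__(self, db):
        self.db = db

    def in_stock(self, item):
        return self.db.quantity(item) > 0

class DatabaseStub:
    """Test stub: a partial implementation of the component on which
    the tested component depends."""
    def quantity(self, item):
        return {"widget": 3}.get(item, 0)  # canned data instead of a real DB

def ui_driver():
    """Test driver: a partial implementation of the component that
    depends on the tested component."""
    service = InventoryService(DatabaseStub())
    return service.in_stock("widget"), service.in_stock("gadget")
```

The stub answers calls made *by* the tested component; the driver makes calls *into* it.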
15
Test Stub and Driver (Cont.)
16
Correction
A change made to a component in order to repair a fault
–May introduce additional faults
Techniques for handling newly introduced faults
–Problem tracking
–Regression testing
–Rationale maintenance
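Regression testing, one of the techniques above, can be sketched as re-running the existing test suite after a correction; any new failure suggests the correction itself introduced a fault. The names `run_regression` and `corrected_abs` are illustrative assumptions:

```python
# Re-run every previously passing test case against the corrected component;
# return the names of any test cases that now fail.
def run_regression(suite, component):
    return [name for name, test in suite.items() if not test(component)]

def corrected_abs(x):  # hypothetical component after a correction
    return -x if x < 0 else x

suite = {
    "negative input": lambda f: f(-5) == 5,
    "positive input": lambda f: f(5) == 5,
    "zero": lambda f: f(0) == 0,
}

new_failures = run_regression(suite, corrected_abs)  # empty list: no regressions
```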
17
Testing Activities Inspecting components Unit testing Integration testing System testing
18
Inspecting Components
Code reviews find faults in components
Can be formal or informal
Can be more effective than testing, but finds different types of faults
Can be time consuming
19
Inspecting Components (Cont.)
Fagan's Inspection Method
–Overview: the author presents the purpose and scope of the component
–Preparation: reviewers become familiar with the implementation
–Inspection meeting: recommendations are presented
–Rework
–Follow-up
20
Inspecting Components (Cont.)
Active Design Review
–Similar to Fagan's method, but less formal
–Not all reviewers need to be involved
–Reviewers make recommendations during the preparation phase
–There is no inspection meeting: the author meets with reviewers individually, if at all, and reviewers fill out questionnaires
21
Unit Testing Equivalence testing Boundary testing Path testing State testing
22
Equivalence Testing
Black-box technique
Possible inputs are partitioned into equivalence classes
A typical input and an invalid input are selected for each class
Test cases are written using the selected input values
23
Equivalence Testing (Cont.)
Class 1: 31-day months (1, 3, 5, 7, 8, 10, 12)
Class 2: 30-day months (4, 6, 9, 11)
Class 3: February
Class 4: Leap years
Class 5: Non-leap years
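These classes can be exercised with one typical input per class plus one invalid input. The function `days_in_month` is an assumed implementation of the component the slide's partition describes:

```python
# Hypothetical component whose input space the equivalence classes partition
def days_in_month(month, year):
    if month in (1, 3, 5, 7, 8, 10, 12):
        return 31
    if month in (4, 6, 9, 11):
        return 30
    if month == 2:
        leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
        return 29 if leap else 28
    raise ValueError("month must be 1..12")

# One representative input per equivalence class
assert days_in_month(7, 2023) == 31   # class 1: 31-day months
assert days_in_month(4, 2023) == 30   # class 2: 30-day months
assert days_in_month(2, 2024) == 29   # classes 3 and 4: February, leap year
assert days_in_month(2, 2023) == 28   # classes 3 and 5: February, non-leap year

# An invalid input falls outside every class and must be rejected
try:
    days_in_month(13, 2023)
    invalid_rejected = False
except ValueError:
    invalid_rejected = True
```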
24
Boundary Testing
Special case of equivalence testing
Deals with boundary conditions and special cases
Detects "off-by-one" and "fence-post" faults
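A sketch of boundary testing for a hypothetical validator accepting months 1..12: instead of picking a typical value inside each class, pick values at each boundary and just beyond it, which is exactly where off-by-one faults hide:

```python
# Hypothetical validator with boundaries at 1 and 12
def valid_month(m):
    return 1 <= m <= 12

# Test at each boundary and one step past it
boundary_cases = [(0, False), (1, True), (12, True), (13, False)]
boundary_ok = all(valid_month(m) == expected for m, expected in boundary_cases)
```

A faulty `1 < m <= 12` or `1 <= m < 12` would pass typical-value tests but fail these boundary cases.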
25
Path Testing
White-box technique
Every path through the code is exercised
Unable to detect omissions
Does not work well for object-oriented programs
–Polymorphism makes paths dynamic
–Shorter, related functions must be tested together
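As a sketch, path testing writes one test case per path through the control flow so that every branch is taken at least once. The function `grade` and its thresholds are assumptions for illustration:

```python
# Hypothetical function with three paths through its control flow
def grade(score):
    if score < 0 or score > 100:
        return "invalid"   # path 1: out-of-range input
    if score >= 60:
        return "pass"      # path 2: passing score
    return "fail"          # path 3: failing score

# One test case per path exercises every branch of the flow graph
path_results = [grade(-1), grade(75), grade(40)]
```

Note the limitation from the slide: if a required branch is simply missing from the code, no path test can reveal it.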
26
State Testing
Designed to work with object-oriented systems
Compares the resulting state of the system with the expected state
Currently not used in practice
–Lengthy test sequences are required to put objects in the desired state
–Not yet supported by automated testing tools
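A sketch of the idea with an assumed `Account` class: drive the object through a sequence of operations, then compare its resulting state with the expected state after each step:

```python
# Hypothetical stateful component
class Account:
    def __init__(self):
        self.balance = 0
        self.state = "open"

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= min(amount, self.balance)

    def close(self):
        if self.balance == 0:   # closing is only legal at zero balance
            self.state = "closed"

acct = Account()
acct.deposit(50)
acct.close()                    # must be refused: balance is nonzero
state_after_refusal = (acct.balance, acct.state)
acct.withdraw(50)
acct.close()                    # accepted: balance is now zero
final_state = (acct.balance, acct.state)
```

Even this toy object needs a multi-step sequence to reach the state under test, which illustrates the "lengthy test sequences" drawback on the slide.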
27
Integration Testing Big bang testing Bottom-up testing Top-down testing Sandwich testing
28
Big Bang Testing
All components are unit tested first
All components are then integrated and tested as a whole
Requires no stubs or drivers
Difficult to pinpoint the cause of a failure
–Unable to distinguish failures in the interfaces from failures inside the components
29
Bottom-Up Testing Lower level components tested first Lower level components are integrated with components from the next level Process continues until all levels have been integrated
30
Top-Down Testing Upper level components tested first Upper level components are integrated with components from the next level Process continues until all levels have been integrated
31
Sandwich Testing
Combines the bottom-up and top-down testing strategies
The system is divided into three layers
–The top layer is unit tested, and then top-down integration testing commences
–The bottom layer is unit tested, and then bottom-up integration testing commences
Test stubs and drivers are not needed for the top and bottom layers
Does not adequately test the target (middle) layer
32
Sandwich Testing (Cont.)
33
Modified Sandwich Testing
Each layer is unit tested first
–The top layer uses stubs for the target layer
–The target layer uses drivers for the top layer and stubs for the bottom layer
–The bottom layer uses drivers for the target layer
Bottom-up and top-down testing can then reuse the test cases from the individual layer tests
34
Modified Sandwich Testing (Cont.)
35
System Testing Functional testing Performance testing Pilot testing Acceptance testing Installation testing
36
Functional Testing
Tests the functional requirements specified in the RAD (Requirements Analysis Document)
Black-box technique
–Test cases are derived from the use cases
Test cases should exercise both common and exceptional behavior
37
Performance Testing
Tests the design goals specified during system design, and the nonfunctional requirements in the RAD
Types of performance testing
–Stress testing
–Volume testing
–Security testing
–Timing testing
–Recovery testing
38
Pilot Testing
A selected set of users installs and tests the system
Phases
–Alpha test: the system is tested in the development environment
–Beta test: the system is tested in the target environment
39
Acceptance Testing
Benchmark tests
Competitor tests
–The new system is tested against an existing system or a competitor's product
Shadow tests
–Both the new system and the old system are run in parallel and the results are compared
40
Installation Testing Performed in the target environment Repeats the test cases that were run during functional and performance testing
41
Managing Testing Planning Documenting Assigning responsibilities
42
Test Planning
Test cases should be developed early
–As soon as their models become stable
–Tests must be changed when the models change
Parallelize tests whenever possible
–Component tests can all run at the same time
–Integration tests can start once some component tests have passed
43
Documentation
Test Plan
–Documents the scope, approach, resources, and schedule of the testing process
Test Case Specification
–Contains the inputs, drivers, stubs, and expected results
–Documents the test procedure
44
Documentation (Cont.)
46
Test Incident Report
–Records the differences between the actual output and the expected output for each execution of a test case
Test Summary Report
–Lists all failures discovered during testing
47
Assigning Responsibilities
Testers should not be the developers who built the system
Dedicated testing teams are used when quality requirements are stringent
Subsystem teams can test subsystems developed by other teams when quality requirements are less stringent
48
Questions? Please reach out – Glister Tech Family