1 ECE 453 – CS 447 – SE 465 Software Testing & Quality Assurance
Instructor: Kostas Kontogiannis
2 These slides are based on:
– Lecture slides by Ian Sommerville, see http://www.comp.lancs.ac.uk/computing/resources/ser/
– ECE 355 lecture slides by Sagar Naik
– Lecture notes from Bernd Bruegge and Allen H. Dutoit, “Object-Oriented Software Engineering – Using UML, Patterns and Java”
3 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
4 Static and dynamic V&V
[Figure: static verification applies across development artifacts – the requirements specification, the architecture, the detailed design, and implementation V1; dynamic V&V (testing) applies to executable artifacts – the prototype and implementation V2. Special cases: executable specifications and animation of formal specs.]
5 Program testing (©Ian Sommerville 1995)
Testing can reveal the presence of errors, NOT their absence.
– Only exhaustive testing can show a program is free from defects, and exhaustive testing is impossible for all but trivial programs.
A successful test is a test which discovers one or more errors.
Testing should be used in conjunction with static verification.
Run all tests after modifying a system.
6 Testing in the V-Model
[Figure: the V-model pairs each development phase with a test level; tests are written on the way down the V and run on the way up. Requirements ↔ acceptance test (customer; functional, black-box); architectural design ↔ system test; detailed design ↔ integration test; module implementation ↔ unit test (developer; structural, white-box).]
7 Testing stages
Unit testing
– Testing of individual components.
Integration testing
– Testing to expose problems arising from the combination of components.
System testing
– Testing the complete system prior to delivery.
Acceptance testing
– Testing by users to check that the system satisfies requirements; sometimes called alpha testing.
8 Types of testing (©Ian Sommerville 1995)
Statistical testing
– Tests designed to reflect the frequency of user inputs; used for reliability estimation.
– Covered in the section on software reliability.
Defect testing
– Tests designed to discover system defects.
– A successful defect test is one which reveals the presence of defects in a system.
9 Some Terminology
– Failure: a failure is said to occur whenever the external behavior does not conform to the system spec.
– Error: an error is a state of the system which, in the absence of any corrective action, could lead to a failure.
– Fault: an adjudged cause of an error.
10 Some Terminology
[Figure: Fault → Error → Failure. The fault is there in the program; the error is a program state it can lead to; the failure is what is observed externally. Informally, all of these get called “fault,” “bug,” “error,” or “defect.”]
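To make the chain concrete, here is a minimal C sketch (a hypothetical example, not from the slides): the fault is the wrong operator in the source, the error is the incorrect program state it produces, and the failure is the wrong output the user observes.

    #include <stdio.h>

    /* Fault: the defect sitting in the program text -- '+' should be '-'. */
    int subtract(int a, int b) {
        return a + b;               /* the fault */
    }

    int main(void) {
        int r = subtract(5, 3);     /* error: r holds 8, an incorrect program state */
        printf("5 - 3 = %d\n", r);  /* failure: the observed output violates the spec */
        return 0;
    }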
11 Testing Activities
[Figure: unit tests take each piece of subsystem code, checked against the system design document, to a tested subsystem; integration testing combines the tested subsystems into integrated subsystems; functional testing, checked against the requirements analysis document and the user manual, yields a functioning system. All of these tests are performed by the developer.]
12 Testing Activities (continued)
[Figure: performance testing against the global requirements (tests by the developer) turns the functioning system into a validated system; acceptance testing against the client’s understanding of the requirements (tests by the client) yields an accepted system; installation testing in the user environment against the user’s understanding (tests, possibly, by the user) yields a usable system, and finally the system in use.]
13 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
14 Testing and debugging (©Ian Sommerville 1995)
Defect testing and debugging are distinct processes.
– Defect testing is concerned with confirming the presence of errors.
– Debugging is concerned with locating and repairing these errors.
Debugging involves formulating hypotheses about program behaviour, then testing those hypotheses to find the system error.
15 Debugging Activities (©Ian Sommerville 1995)
Locate error & fault → design fault repair → repair fault → re-test program.
16 Testing Activities
Identify: test conditions (“what”) – an item or event to be verified.
Design: how the “what” can be tested – its realization.
Build: test cases (implementation scripts, data).
Execute: run the system.
Compare: the test case outcome against the expected outcome, giving the test result.
17 Testing Activities
Test condition
– What: descriptions of circumstances that could be examined (an event or item).
– Categories: functionality, performance, stress, robustness, …
– Derived using testing techniques (to be discussed); refer to the V-Model.
18 Testing Activities
Design test cases: the details
– Input values
– Expected outcomes:
  things created (output),
  things changed/updated (a database?),
  things deleted,
  timing, …
– Expected outcomes may be
  known, or
  unknown (examine the first actual outcome).
– Environment prerequisites: files, net connection, …
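One common realization of these details is a table-driven test, where each row pairs input values with the expected outcome. A minimal C sketch; the function abs_val and all values below are hypothetical:

    #include <stdio.h>

    /* Hypothetical unit under test. */
    int abs_val(int x) { return x < 0 ? -x : x; }

    /* A designed test case: input values plus the expected outcome. */
    struct test_case { int input; int expected; };

    int main(void) {
        struct test_case cases[] = {
            { -5, 5 },   /* negative input */
            {  0, 0 },   /* boundary value */
            {  7, 7 },   /* positive input */
        };
        int n = sizeof cases / sizeof cases[0];
        for (int i = 0; i < n; i++) {
            int actual = abs_val(cases[i].input);
            printf("case %d: %s\n", i,
                   actual == cases[i].expected ? "pass" : "FAIL");
        }
        return 0;
    }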
19 Testing Activities
Build test cases (implement)
– Implement the preconditions (set up the environment).
– Prepare test scripts (may use test automation tools).
Structure of a test case
– Simple linear: a sequence of (input, expected outcome) pairs, {(I1, EO1), (I2, EO2), …}.
– Tree: one input I branching to alternative expected outcomes EO1, EO2, …
20 Testing Activities
Scripts contain data and instructions for testing:
– Comparison information
– What screen data to capture
– When/where to read input
– Control information:
  repeat a set of inputs,
  make a decision based on output.
– Testing concurrent activities
21 Testing Activities
Compare (test outcomes, expected outcomes)
– Simple/complex (known differences)
– Different types of outcomes:
  variable values (in memory),
  disk-based (textual, non-textual, database, binary),
  screen-based (character, GUI, images),
  others (multimedia, communicating applications).
22 Testing Activities
Compare: actual output == expected output?
– Yes → Pass (assumption: the test case was “instrumented”).
– No → Fail (assumption: no error in the test case or its preconditions).
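For disk-based textual outcomes, the comparison step often reduces to checking the actual output file against a stored expected (“golden”) file. A minimal sketch, assuming the two file names below were produced by an earlier execute step:

    #include <stdio.h>

    /* Compare two files byte by byte; returns 1 on a match. */
    int files_match(const char *actual, const char *expected) {
        FILE *fa = fopen(actual, "rb");
        FILE *fe = fopen(expected, "rb");
        int result = (fa != NULL && fe != NULL);
        while (result) {
            int ca = fgetc(fa), ce = fgetc(fe);
            if (ca != ce) result = 0;     /* first difference: fail */
            else if (ca == EOF) break;    /* both ended together: pass */
        }
        if (fa) fclose(fa);
        if (fe) fclose(fe);
        return result;
    }

    int main(void) {
        puts(files_match("actual.out", "expected.out") ? "Pass" : "Fail");
        return 0;
    }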
23 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
24 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
25 Goodness of test cases
Execution of a test case against a program P:
– covers certain requirements of P;
– covers certain parts of P’s functionality;
– covers certain parts of P’s internal logic.
The idea of coverage guides test case selection.
26 Black-box Testing
Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
– It is almost always impossible to generate all possible inputs (“test cases”).
Goal: reduce the number of test cases by equivalence partitioning:
– Divide the input conditions into equivalence classes.
– Choose test cases for each equivalence class. (Example: if an object is supposed to accept a negative number, testing one negative number is enough; see the sketch below.)
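A minimal sketch of equivalence partitioning in C, assuming a hypothetical unit that must accept scores in the range 0–100: the input domain splits into three classes (below range, in range, above range), and one representative per class suffices:

    #include <stdio.h>

    /* Hypothetical unit under test: accept scores in 0..100. */
    int is_valid_score(int s) { return s >= 0 && s <= 100; }

    int main(void) {
        /* One representative test case per equivalence class. */
        struct { int input; int expected; } cases[] = {
            { -7,  0 },   /* class 1: below range -> reject */
            { 42,  1 },   /* class 2: in range    -> accept */
            { 250, 0 },   /* class 3: above range -> reject */
        };
        for (int i = 0; i < 3; i++)
            printf("score %4d: %s\n", cases[i].input,
                   is_valid_score(cases[i].input) == cases[i].expected
                       ? "pass" : "FAIL");
        return 0;
    }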
27 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
28 White-box Testing
Statement testing (algebraic testing): test single statements (choice of operators in polynomials, etc.).
Loop testing:
– Cause execution of the loop to be skipped completely (exception: repeat loops).
– Cause the loop to be executed exactly once.
– Cause the loop to be executed more than once (see the sketch below).
Path testing:
– Make sure all paths in the program are executed.
Branch testing (conditional testing): make sure that each possible outcome of a condition is tested at least once. For example:

    if (i == TRUE) printf("YES\n"); else printf("NO\n");

Test cases: 1) i = TRUE; 2) i = FALSE.
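Loop testing, sketched on a hypothetical summation routine: the three test inputs make the loop run zero times, exactly once, and more than once:

    #include <stdio.h>

    /* Hypothetical unit under test: sum of the first n elements. */
    int sum(const int *a, int n) {
        int s = 0;
        for (int i = 0; i < n; i++)   /* the loop under test */
            s += a[i];
        return s;
    }

    int main(void) {
        int data[] = { 1, 2, 3 };
        printf("n=0: %d (expect 0)\n", sum(data, 0));  /* loop skipped   */
        printf("n=1: %d (expect 1)\n", sum(data, 1));  /* exactly once   */
        printf("n=3: %d (expect 6)\n", sum(data, 3));  /* more than once */
        return 0;
    }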
29 Code Coverage
Statement coverage
– Elementary statements: assignment, I/O, call.
– Select a test set T such that, by executing P on all cases in T, each statement of P is executed at least once.

    read(x); read(y);
    if x > 0 then write("1"); else write("2");
    if y > 0 then write("3"); else write("4");

– T: {<x = 2, y = 3>, <x = -1, y = -2>} – one case with x > 0, y > 0 and one with x ≤ 0, y ≤ 0 together execute every statement.
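The slide’s fragment, rewritten as runnable C with a driver that executes the two-case test set; together the two calls reach all four write statements:

    #include <stdio.h>

    /* The slide's fragment as a C function. */
    void p(int x, int y) {
        if (x > 0) printf("1"); else printf("2");
        if (y > 0) printf("3"); else printf("4");
        printf("\n");
    }

    int main(void) {
        p(2, 3);    /* executes write("1") and write("3") */
        p(-1, -2);  /* executes write("2") and write("4") */
        return 0;
    }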
30 White-box Testing: Determining the Paths

    FindMean(FILE ScoreFile)
    {
        float SumOfScores = 0.0;
        int NumberOfScores = 0;
        float Mean = 0.0;
        float Score;
        Read(ScoreFile, Score);
        while (!EOF(ScoreFile)) {
            if (Score > 0.0) {
                SumOfScores = SumOfScores + Score;
                NumberOfScores++;
            }
            Read(ScoreFile, Score);
        }
        /* Compute the mean and print the result */
        if (NumberOfScores > 0) {
            Mean = SumOfScores / NumberOfScores;
            printf("The mean score is %f\n", Mean);
        } else
            printf("No scores found in file\n");
    }

[The numbers 1–9 on the slide label the nodes of the control-flow graph built from this code.]
31 Constructing the Logic Flow Diagram
[Figure: the control-flow graph constructed from FindMean, nodes 1–9.]
32 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
33 Unit Testing
Objective: find differences between the specified units and their implementations.
Unit: a component (module, function, class, object, …).
Unit test environment: the unit under test, exercised by a driver, with stubs (dummy modules) in place of the units it depends on; test cases go in, the test result comes out.
Effectiveness: guided by partitioning and code coverage.
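A minimal sketch of this environment, with hypothetical names throughout: the driver is main, and the stub stands in for a module the unit under test depends on but which is not yet available:

    #include <stdio.h>

    /* Stub: a dummy module replacing a real price database;
       it returns a canned answer instead of doing real work. */
    int db_lookup_price(int item_id) {
        (void)item_id;
        return 100;
    }

    /* Unit under test: depends on the stubbed module. */
    int price_with_tax(int item_id) {
        return db_lookup_price(item_id) * 110 / 100;   /* add 10% tax */
    }

    /* Driver: feeds the unit a test case and checks the result. */
    int main(void) {
        int actual = price_with_tax(42);
        printf("price_with_tax(42) = %d (expect 110): %s\n",
               actual, actual == 110 ? "pass" : "FAIL");
        return actual == 110 ? 0 : 1;
    }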
34 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations
35 Integration Testing
Objectives:
– To expose problems arising from the combination of components.
– To quickly obtain a working solution from the components.
Problem areas
– Internal: between components
  Invocation: call / message passing / …
  Parameters: type, number, order, value.
  Invocation return: identity (who?), type, sequence.
– External:
  Interrupts (wrong handler?)
  I/O timing
– Interaction
36 Integration Testing
Types of integration
– Structural
  “Big bang”: integrate everything at once; no error localization.
  Bottom-up: start from the terminal (leaf) modules; each module is exercised by a driver, and tested modules then serve under the next level’s driver.
  Top-down: start from the top module, with stubs in place of lower modules; replacing stubs with real modules as they arrive allows an early demo.
– Behavioral (next slide)
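A minimal sketch of the top-down idea, with hypothetical module names: the top-level module is integrated first against a stub for the lower level, and the real module is linked in a later increment (here selected with a compile-time switch):

    #include <stdio.h>

    #ifdef USE_STUB
    /* Stub for the lower-level module, used in the first increment. */
    int format_report(int total) {
        (void)total;
        printf("[stub report]\n");
        return 0;
    }
    #else
    /* Real lower-level module, linked in a later increment. */
    int format_report(int total) {
        printf("Report: total = %d\n", total);
        return 0;
    }
    #endif

    /* Top-level module under integration test. */
    int run_summary(int a, int b) {
        return format_report(a + b);
    }

    int main(void) {
        return run_summary(3, 4);   /* build with -DUSE_STUB for the first increment */
    }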
37 Integration Testing (Behavioral: Path-Based)
MM-path: an interleaved sequence of module execution paths and messages (e.g., across modules A, B, C).
Module execution path: an entry–exit path within the same module.
Atomic System Function (ASF): port input, …, {MM-paths}, …, port output.
Test cases: exercise ASFs.
38 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
System Test
– Acceptance Test
– Regression Test
Practical Considerations
39 System Testing
Concerned with the application’s externals.
Much more than functional testing:
– Load/stress testing
– Usability testing
– Performance testing
– Resource testing
40 System Testing
Functional testing
– Objective: assess whether the application does what it is supposed to do.
– Basis: the behavioral/functional specification.
– Test case: a sequence of ASFs (a thread).
41 System Testing
Functional testing: coverage
Event-based coverage
– PI1: each port input event occurs.
– PI2: common sequences of port input events occur.
– PI3: each port input event occurs in every relevant data context.
– PI4: for a given context, all possible input events occur.
– PO1: each port output event occurs.
– PO2: each port output event occurs for each cause.
Data-based coverage
– DM1: exercise the cardinality of every relationship.
– DM2: exercise the (functional) dependencies among relationships.
42 System Testing
Stress testing: push the system to its limit and beyond.
– Application (system): user rate, volume.
– Resources: physical + logical; observe the response.
43 System Testing
Performance testing
– Performance as seen by users: delay, throughput.
– Performance as seen by the system owner: memory, CPU, communication.
– Performance may be explicitly specified, or the system may simply be expected to do well; if unspecified, find the limit.
Usability testing
– The human element in system operation: GUI, messages, reports, …
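A minimal sketch of measuring user-visible performance in C with the standard clock() facility; the operation being timed is hypothetical:

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical operation whose delay/throughput is under test. */
    static void operation(void) {
        volatile long s = 0;                 /* volatile: keep the work */
        for (long i = 0; i < 100000; i++)
            s += i;
    }

    int main(void) {
        const int runs = 1000;
        clock_t start = clock();
        for (int i = 0; i < runs; i++)
            operation();
        double total = (double)(clock() - start) / CLOCKS_PER_SEC;
        printf("mean delay: %g s, throughput: %g ops/s\n",
               total / runs, runs / total);
        return 0;
    }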
44 Test Stopping Criteria
– Management: the deadline is met, the budget is exhausted, …
– The desired coverage has been achieved.
– The desired failure-intensity level has been achieved.
45 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
Acceptance Test
– Regression Test
Practical Considerations
46 Acceptance Testing
Purpose: ensure that end users are satisfied.
Basis: user expectations (documented or not).
Environment: the real one.
Performed: for and by end users (in commissioned projects).
Test cases:
– May be reused from the system test.
– Designed by end users.
47 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
Regression Test
Practical Considerations
48 Regression Testing
Whenever a system is modified (fixing a bug, adding functionality, etc.), the entire test suite needs to be rerun
– to make sure that features that already worked are not affected by the change.
Automatic re-testing before checking changes into a code repository.
Incremental testing strategies for big systems.
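A minimal sketch of a regression suite runner, with a hypothetical unit clamp: every case is rerun after each change, and a nonzero exit status can be used to block the check-in:

    #include <stdio.h>

    /* Hypothetical unit whose established behavior must not regress. */
    int clamp(int x, int lo, int hi) {
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }

    int main(void) {
        /* The whole suite is rerun, so features that already
           worked are checked again along with any new fix. */
        struct { int x, lo, hi, expected; } suite[] = {
            {  5, 0, 10,  5 },   /* in range           */
            { -3, 0, 10,  0 },   /* clamped from below */
            { 99, 0, 10, 10 },   /* clamped from above */
        };
        int failures = 0;
        for (int i = 0; i < 3; i++)
            if (clamp(suite[i].x, suite[i].lo, suite[i].hi) != suite[i].expected) {
                printf("regression in case %d\n", i);
                failures++;
            }
        printf("%d failure(s)\n", failures);
        return failures;   /* nonzero exit blocks the check-in */
    }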
49 Comparison of White- & Black-box Testing
White-box testing:
– A potentially infinite number of paths has to be tested.
– White-box testing often tests what is done, instead of what should be done.
– Cannot detect missing use cases.
Black-box testing:
– Potential combinatorial explosion of test cases (valid & invalid data).
– Often not clear whether the selected test cases uncover a particular error.
– Does not discover extraneous use cases (“features”).
Both types of testing are needed.
White-box testing and black-box testing are the extreme ends of a testing continuum. Any choice of test case lies in between and depends on:
– the number of possible logical paths,
– the nature of the input data,
– the amount of computation, and
– the complexity of algorithms and data structures.
50 The 4 Testing Steps
1. Select what has to be measured
– Analysis: completeness of requirements.
– Design: tested for cohesion.
– Implementation: code tests.
2. Decide how the testing is done
– Code inspection.
– Proofs (design by contract).
– Black-box, white-box.
– Select an integration testing strategy (big bang, bottom-up, top-down, sandwich).
3. Develop test cases
– A test case is a set of test data or situations that will be used to exercise the unit (code, module, system) being tested or about the attribute being measured.
4. Create the test oracle
– An oracle contains the predicted results for a set of test cases.
– The test oracle has to be written down before the actual testing takes place.
51 Guidance for Test Case Selection
Use analysis knowledge about functional requirements (black-box testing):
– Use cases.
– Expected input data.
– Invalid input data.
Use design knowledge about system structure, algorithms, and data structures (white-box testing):
– Control structures: test branches, loops, …
– Data structures: test record fields, arrays, …
Use implementation knowledge about algorithms. Examples:
– Force division by zero.
– Use a sequence of test cases for an interrupt handler.
52 Unit-testing Heuristics
1. Create unit tests as soon as the object design is completed:
– Black-box test: test the use cases & functional model.
– White-box test: test the dynamic model.
– Data-structure test: test the object model.
2. Develop the test cases
– Goal: find the minimal number of test cases to cover as many paths as possible.
3. Cross-check the test cases to eliminate duplicates
– Don’t waste your time!
4. Desk-check your source code
– Reduces testing time.
5. Create a test harness
– Test drivers and test stubs are needed for integration testing.
6. Describe the test oracle
– Often the result of the first successfully executed test.
7. Execute the test cases
– Don’t forget regression testing: re-execute test cases every time a change is made.
8. Compare the results of the test with the test oracle
– Automate as much as possible.
53 Overview
Basics of Testing
Testing & Debugging Activities
Testing Strategies
– Black-Box Testing
– White-Box Testing
Testing in the Development Process
– Unit Test
– Integration Test
– System Test
– Acceptance Test
– Regression Test
Practical Considerations