Lecturer: Dr. AJ Bieszczad — Chapter 8

How does software fail?
– Wrong requirement: not what the customer wants
– Missing requirement
– Requirement impossible to implement
– Faulty design
– Faulty code
– Improperly implemented design
Terminology
– Fault identification: what fault caused the failure?
– Fault correction: change the system
– Fault removal: take out the fault
Types of faults
– Algorithmic fault
– Syntax fault
– Computation and precision fault
– Documentation fault
– Stress or overload fault
– Capacity or boundary fault
– Timing or coordination fault
– Throughput or performance fault
– Recovery fault
– Hardware and system software fault
– Standards and procedures fault
Table 8.1. IBM orthogonal defect classification.
Fault type – Meaning
– Function: fault that affects capability, end-user interfaces, product interfaces, interface with hardware architecture, or global data structure
– Interface: fault in interacting with other components or drivers via calls, macros, control blocks, or parameter lists
– Checking: fault in program logic that fails to validate data and values properly before they are used
– Assignment: fault in data structure or code block initialization
– Timing/serialization: fault that involves timing of shared and real-time resources
– Build/package/merge: fault that occurs because of problems in repositories, management changes, or version control
– Documentation: fault that affects publications and maintenance notes
– Algorithm: fault involving efficiency or correctness of algorithm or data structure but not design
HP fault classification (figure)
Fault distribution (HP) (figure)
Testing steps (each stage is checked against a different source of criteria)
– Unit tests (one per component) – design specifications
– Integration test – design specifications
– Function test – system functional requirements
– Performance test – other software requirements
– Acceptance test – customer requirements specifications
– Installation test – user environment
→ SYSTEM IN USE!
Views of test objects
– Closed box or black box
  – the object seen from outside
  – tests developed without knowledge of the internal structure
  – focus on inputs and outputs
– Open box or clear box
  – the object seen from inside
  – tests developed so that the internal structure is tested
  – focus on algorithms, data structures, etc.
– Mixed model
Selection factors:
– the number of possible logical paths
– the nature of the input data
– the amount of computation involved
– the complexity of algorithms
Examining code
– Code walkthrough
  – the developer presents the code and the documentation to a review team
  – the team comments on their correctness
– Code inspection
  – the review team checks the code and documentation against a prepared list of concerns
  – more formal than a walkthrough
Inspections criticize the code, not the developer.
Table 8.2. Typical inspection preparation and meeting times.
Development artifact – Preparation time – Meeting time
– Requirements document: 25 pages per hour – 12 pages per hour
– Functional specification: 45 pages per hour – 15 pages per hour
– Logic specification: 50 pages per hour – 20 pages per hour
– Source code: 150 lines of code per hour – 75 lines of code per hour
– User documents: 35 pages per hour – 20 pages per hour

Table 8.3. Faults found during discovery activities.
Discovery activity – Faults found per thousand lines of code
– Requirements review: 2.5
– Design review: 5.0
– Code inspection: 10.0
– Integration test: 3.0
– Acceptance test: 2.0
Proving code correct
– Formal proof techniques
  + formal understanding of the program
  + mathematical correctness
  - difficult to build a model
  - usually takes more time to prove correctness than to write the code itself
– Symbolic execution
  + reduces the number of cases for the formal proof
  - difficult to build a model
  - may take more time to prove correctness than to write the code itself
– Automated theorem proving
  + extremely convenient
  - tools exist only for toy problems at research centers
  - an ideal prover that can infer program correctness from arbitrary code can never be built (the problem is equivalent to the halting problem of Turing machines, which is unsolvable)
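As a small illustration of what a formal proof works with, the sketch below (not from the slides; the function and the assertions are illustrative) turns a Hoare-style proof outline — precondition, loop invariant, postcondition — into runtime assertions:

```python
def int_sqrt(n):
    """Integer square root: the largest r with r*r <= n.

    A Hoare-style proof outline rendered as runtime assertions.
    """
    assert n >= 0                          # precondition: n is non-negative
    r = 0
    while (r + 1) * (r + 1) <= n:
        assert r * r <= n                  # loop invariant: r never overshoots n
        r += 1
    assert r * r <= n < (r + 1) * (r + 1)  # postcondition: r is exactly the answer
    return r

print(int_sqrt(10))   # 3
print(int_sqrt(16))   # 4
```

A real proof would discharge the invariant and postcondition symbolically for all inputs; the assertions only check them for the inputs actually run, which is the gap between proving and testing.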
Testing program components
– Test point or test case: a particular choice of input data to be used in testing a program
– Test: a finite collection of test cases
Test thoroughness
– Statement testing
– Branch testing
– Path testing
– Definition-use testing
– All-uses testing
– All-predicate-uses/some-computational-uses testing
– All-computational-uses/some-predicate-uses testing
Example: array arrangement (figure)
– Statement testing
– Branch testing
– Path testing
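The array-arrangement figure is not reproduced here, but the idea behind the three coverage levels can be sketched in Python (the function and the test inputs are illustrative, not the slide's actual example):

```python
def arrange(a, b, c):
    """Arrange three values in ascending order via pairwise swaps."""
    if a > b:
        a, b = b, a          # decision 1
    if b > c:
        b, c = c, b          # decision 2
    if a > b:
        a, b = b, a          # decision 3
    return [a, b, c]

# Statement testing: every statement executes at least once;
# the single case (3, 2, 1) triggers all three swaps.
assert arrange(3, 2, 1) == [1, 2, 3]

# Branch testing: each decision must also be taken on its false side;
# (1, 2, 3) skips all three swaps.
assert arrange(1, 2, 3) == [1, 2, 3]

# Path testing: every feasible combination of decision outcomes
# (up to 2**3 = 8 here, though not all are feasible);
# e.g. (2, 1, 3) exercises the true-false-false path.
assert arrange(2, 1, 3) == [1, 2, 3]
```

Note how the required number of test cases grows: one case can cover all statements, branch coverage forces both outcomes of every decision, and path coverage can grow exponentially with the number of decisions.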
Relative strength of test strategies (figure: subsumption hierarchy, strongest at the top)
– All paths
– All definition-use paths
– All uses
– All computational/some predicate uses
– All predicate/some computational uses
– All computational uses
– All predicate uses
– All definitions
– Branch
– Statement
Comparing techniques

Table 8.5. Fault discovery percentages by fault origin.
Discovery technique – Requirements – Design – Coding – Documentation
– Prototyping
– Requirements review
– Design review
– Code inspection
– Unit testing
(percentage values not recoverable from this copy)

Table 8.6. Effectiveness of fault discovery techniques (Jones 1991).
Technique – Requirements faults – Design faults – Code faults – Documentation faults
– Reviews: Fair – Excellent – Excellent – Good
– Prototypes: Good – Fair – Fair – Not applicable
– Testing: Poor – Poor – Good – Fair
– Correctness proofs: Poor – Poor – Fair – Fair
Integration testing
– Bottom-up
– Top-down
– Big-bang
– Sandwich testing
– Modified top-down
– Modified sandwich
Component hierarchy for the examples (figure)

Bottom-up testing (figure)

Top-down testing (figure)

Modified top-down testing (figure)

Big-bang integration (figure)

Sandwich testing (figure)

Modified sandwich testing (figure)

Comparison of integration strategies (figure)

Sync-and-stabilize approach (Microsoft) (figure)

Testing OO vs. testing other systems (figure)

Where is testing OO harder? (figure)
Test planning
– Establish test objectives
– Design test cases
– Write test cases
– Test test cases
– Execute tests
– Evaluate test results
Test plan
– Purpose: to organize testing activities
– Describes the way in which we will show our customers that the software works correctly
– Developed in parallel with system development
Contents:
– test objectives
– methods for test execution at each stage of testing
– tools
– exit criteria
– system integration plan
– source of test data
– detailed list of test cases
Automated testing tools
– Code analysis
  – Static analysis
    – code analyzer
    – structure checker
    – data analyzer
    – sequence checker
  – Dynamic analysis
    – program monitor
– Test execution
  – Capture and replay
  – Stubs and drivers
  – Automated testing environments
– Test case generators
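A minimal sketch of the stubs-and-drivers idea: a driver is throwaway code that calls the unit under test, and a stub is throwaway code standing in for a component that is not yet available. The names below (convert, fetch_rate) are illustrative assumptions, not from the slides:

```python
def convert(amount, fetch_rate):
    """Unit under test: converts an amount using a rate obtained
    from a collaborator that may not exist yet."""
    return round(amount * fetch_rate(), 2)

def fetch_rate_stub():
    """Stub: stands in for the real rate service by returning
    a fixed, predictable value."""
    return 1.25

def driver():
    """Driver: feeds the unit test inputs and checks its outputs."""
    assert convert(10, fetch_rate_stub) == 12.5
    assert convert(0, fetch_rate_stub) == 0.0
    return "ok"

print(driver())   # ok
```

Because the stub's answer is fixed, any failure the driver reports is attributable to the unit under test rather than to its collaborators.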
Sample output from static code analysis (figure)

Probability of finding faults during development (figure)
When to stop testing
– Coverage criteria
– Fault seeding
Estimate of remaining faults (fault seeding):
– S – seeded faults, s – seeded faults discovered, n – indigenous faults discovered, N – estimated actual (indigenous) faults
– N = Sn/s
Two-team testing:
– x – faults found by group 1, y – faults found by group 2, q – discovered by both
– E1 = x/n, E2 = y/n, and q ≤ x, q ≤ y
– also: E1 = q/y (group 1 found q of group 2's y faults)
– so: y = q/E1 and n = y/E2
– therefore: n = q/(E1·E2)
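Both estimates above fit in a few lines (a sketch; the function names and the numbers in the examples are mine, and the two-team formula n = q/(E1·E2) simplifies to x·y/q once E1 = q/y and E2 = q/x are substituted):

```python
def seeded_estimate(S, s, n):
    """Fault seeding: S faults seeded, s of them discovered,
    plus n indigenous (non-seeded) faults discovered.
    Estimated total indigenous faults: N = S*n/s."""
    return S * n / s

def two_team_estimate(x, y, q):
    """Two-team estimate: x faults found by group 1, y by group 2,
    q found by both.  With E1 = q/y and E2 = q/x,
    n = q/(E1*E2) simplifies to x*y/q."""
    return x * y / q

# Seed 20 faults; tests find 10 of them plus 25 indigenous faults:
print(seeded_estimate(20, 10, 25))    # 50.0 estimated indigenous faults

# Group 1 finds 25 faults, group 2 finds 30, and 15 are common:
print(two_team_estimate(25, 30, 15))  # 50.0 estimated total faults
```

The intuition is the same in both: the fraction of "known" faults (seeded, or the other team's) that testing catches is taken as its catch rate for the unknown ones.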
When to stop testing (continued)
– Coverage criteria
– Fault seeding
Confidence in the software (assuming all S seeded faults have been found, and claiming that at most N actual faults exist, n of which have been discovered):
– C = 1 if n > N
– C = S/(S − N + 1) if n ≤ N
– S – seeded faults, n – actual faults discovered, N – claimed actual faults
– (e.g., claiming N = 0 and finding all S = 10 seeded faults gives C = 10/11 ≈ 91%; with S = 49, C = 49/50 = 98%)
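The slide's confidence figures can be reproduced directly (a sketch; the function name is mine):

```python
def confidence(N, S, n):
    """Confidence that at most N indigenous faults exist, after all S
    seeded faults have been found and n indigenous faults discovered:
    C = 1 if n > N, else S/(S - N + 1)."""
    return 1.0 if n > N else S / (S - N + 1)

# Claiming the software is fault-free (N = 0) and finding no
# indigenous faults during testing:
print(round(confidence(0, 10, 0), 2))   # 0.91 -> 91% with 10 seeded faults
print(round(confidence(0, 49, 0), 2))   # 0.98 -> 98% with 49 seeded faults
```

Note the practical cost: pushing confidence from 91% to 98% in the fault-free case requires seeding roughly five times as many faults.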
Classification tree to identify fault-prone components (figure)