Chapter 8: Testing the Programs
Lecturer: Dr. AJ Bieszczad

Presentation transcript:

How does software fail?
– Wrong requirement: not what the customer wants
– Missing requirement
– Requirement impossible to implement
– Faulty design
– Faulty code
– Improperly implemented design

Terminology
– Fault identification: what fault caused the failure?
– Fault correction: change the system
– Fault removal: take out the fault

Types of faults
– Algorithmic fault
– Syntax fault
– Computation and precision fault
– Documentation fault
– Stress or overload fault
– Capacity or boundary fault
– Timing or coordination fault
– Throughput or performance fault
– Recovery fault
– Hardware and system software fault
– Standards and procedures fault

Table 8.1. IBM orthogonal defect classification (fault type: meaning).
– Function: fault that affects capability, end-user interfaces, product interfaces, interface with hardware architecture, or global data structure
– Interface: fault in interacting with other components or drivers via calls, macros, control blocks, or parameter lists
– Checking: fault in program logic that fails to validate data and values properly before they are used
– Assignment: fault in data structure or code block initialization
– Timing/serialization: fault that involves timing of shared and real-time resources
– Build/package/merge: fault that occurs because of problems in repositories, management changes, or version control
– Documentation: fault that affects publications and maintenance notes
– Algorithm: fault involving efficiency or correctness of an algorithm or data structure, but not design

HP fault classification

Fault distribution (HP)

Testing steps
– Unit test each component separately
– Integration test: assemble the components according to the design specifications
– Function test: check the integrated system against the system functional requirements
– Performance test: check against the other software requirements
– Acceptance test: check against the customer requirements specification
– Installation test: run in the user environment
– System in use!

Views of Test Objects
Closed box (black box)
– the object is seen from the outside
– tests are developed without knowledge of the internal structure
– focus on inputs and outputs
Open box (clear box)
– the object is seen from the inside
– tests are developed so that the internal structure is exercised
– focus on algorithms, data structures, etc.
Mixed model
Selection factors:
– the number of possible logical paths
– the nature of the input data
– the amount of computation involved
– the complexity of algorithms
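As a minimal illustration (not from the lecture; the function and its tests are hypothetical), here is a closed-box test suite derived only from the stated input/output behaviour next to a clear-box suite chosen to drive every branch of the code:

```python
# Hypothetical component under test (illustrative only).
def classify_triangle(a: int, b: int, c: int) -> str:
    """Classify a triangle by its side lengths."""
    if a <= 0 or b <= 0 or c <= 0 or a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Closed-box (black-box) view: cases derived from the specification alone,
# focusing on inputs and expected outputs.
black_box_cases = [((3, 4, 5), "scalene"), ((2, 2, 2), "equilateral"), ((1, 2, 9), "invalid")]

# Open-box (clear-box) view: cases chosen after reading the code so that
# every return statement is reached at least once.
clear_box_cases = [((0, 1, 1), "invalid"), ((5, 5, 5), "equilateral"),
                   ((5, 5, 3), "isosceles"), ((4, 5, 6), "scalene")]

for sides, expected in black_box_cases + clear_box_cases:
    assert classify_triangle(*sides) == expected, (sides, expected)
print("all cases pass")
```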

Examining code
Code walkthrough
– the developer presents the code and the documentation to a review team
– the team comments on their correctness
Code inspection
– the review team checks the code and documentation against a prepared list of concerns
– more formal than a walkthrough
Inspections criticize the code, not the developer.

Table 8.2. Typical inspection preparation and meeting times.
– Requirements document: preparation 25 pages per hour, meeting 12 pages per hour
– Functional specification: preparation 45 pages per hour, meeting 15 pages per hour
– Logic specification: preparation 50 pages per hour, meeting 20 pages per hour
– Source code: preparation 150 lines of code per hour, meeting 75 lines of code per hour
– User documents: preparation 35 pages per hour, meeting 20 pages per hour

Table 8.3. Faults found during discovery activities (faults found per thousand lines of code).
– Requirements review: 2.5
– Design review: 5.0
– Code inspection: 10.0
– Integration test: 3.0
– Acceptance test: 2.0

Proving code correct
Formal proof techniques
+ formal understanding of the program
+ mathematical correctness
- difficult to build a model
- usually takes more time to prove correctness than to write the code itself
Symbolic execution
+ reduces the number of cases needed for the formal proof
- difficult to build a model
- may take more time to prove correctness than to write the code itself
Automated theorem proving
+ extremely convenient
- tools exist only for toy problems at research centers
- an ideal prover that can infer program correctness from arbitrary code can never be built (the problem is equivalent to the halting problem for Turing machines, which is unsolvable)
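As a tiny illustration of the formal-proof idea (my own example, not from the slides), a Hoare-triple argument that a two-branch maximum computation meets its postcondition:

```latex
% Hoare triple for:  if x >= y then m := x else m := y
\[
\{\,\mathit{true}\,\}\;\; \textbf{if } x \ge y \textbf{ then } m := x \textbf{ else } m := y \;\;
\{\, m \ge x \wedge m \ge y \wedge (m = x \vee m = y) \,\}
\]
% Obligation for the then-branch: the guard implies the postcondition with m = x.
\[
x \ge y \;\Rightarrow\; \bigl(x \ge x \wedge x \ge y \wedge (x = x \vee x = y)\bigr)
\]
% Obligation for the else-branch: the negated guard implies the postcondition with m = y.
\[
x < y \;\Rightarrow\; \bigl(y \ge x \wedge y \ge y \wedge (y = x \vee y = y)\bigr)
\]
```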

Testing program components
– Test point or test case: a particular choice of input data to be used in testing a program
– Test: a finite collection of test cases
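A sketch of these definitions in code (illustrative only; the function and the values are made up): each (input, expected output) pair is one test case, and the collection as a whole is a test.

```python
# Hypothetical component under test.
def absolute_value(x: int) -> int:
    return x if x >= 0 else -x

# Each tuple is a test point (test case): one particular choice of input
# together with the output we expect for it.
test_cases = [(5, 5), (-5, 5), (0, 0)]

# The finite collection of test cases forms the test.
for given, expected in test_cases:
    actual = absolute_value(given)
    assert actual == expected, f"input {given}: expected {expected}, got {actual}"
print(f"test of {len(test_cases)} cases passed")
```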

Test thoroughness
– Statement testing
– Branch testing
– Path testing
– Definition-use testing
– All-uses testing
– All-predicate-uses/some-computational-uses testing
– All-computational-uses/some-predicate-uses testing
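To make the data-flow criteria concrete, here is a small illustrative function (my example, not the lecture's) annotated with the definition-use pairs for the variable total that definition-use testing must cover:

```python
def sum_positive(values):
    total = 0                  # definition d1 of total
    for v in values:
        if v > 0:
            total = total + v  # use of total on the right, then definition d2
    return total               # use of total (reached from d1 or d2)

# Definition-use pairs to cover:
#   d1 -> "total + v"  first positive element read, e.g. sum_positive([3])
#   d1 -> "return"     no positive element at all, e.g. sum_positive([-1])
#   d2 -> "total + v"  a second positive element, e.g. sum_positive([3, 4])
#   d2 -> "return"     at least one positive element, e.g. sum_positive([3])
assert sum_positive([-1]) == 0
assert sum_positive([3]) == 3
assert sum_positive([3, 4]) == 7
```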

Example: Array arrangement
(The slide steps through statement testing, branch testing, and path testing for a small array-arrangement routine; the code and the chosen test cases did not survive in this transcript.)
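Since the slide's own code is not available here, the sketch below is a stand-in of my own that shows the flavour of the three criteria on a routine arranging a three-element array into non-decreasing order:

```python
def arrange(a):
    """Rearrange a 3-element list into non-decreasing order (illustrative stand-in)."""
    if a[0] > a[1]:
        a[0], a[1] = a[1], a[0]  # branch B1
    if a[1] > a[2]:
        a[1], a[2] = a[2], a[1]  # branch B2
    if a[0] > a[1]:
        a[0], a[1] = a[1], a[0]  # branch B3
    return a

# Statement testing: every statement executes at least once.
#   arrange([3, 2, 1]) alone takes the true side of B1, B2, and B3.
assert arrange([3, 2, 1]) == [1, 2, 3]

# Branch testing: every branch takes both its true and false outcome.
#   Adding arrange([1, 2, 3]) covers the false side of B1, B2, and B3.
assert arrange([1, 2, 3]) == [1, 2, 3]

# Path testing: every feasible combination of branch outcomes is exercised
# (up to 2^3 = 8 combinations here, some of which are infeasible), e.g.:
assert arrange([2, 1, 3]) == [1, 2, 3]   # true, false, false
assert arrange([1, 3, 2]) == [1, 2, 3]   # false, true, false
assert arrange([2, 3, 1]) == [1, 2, 3]   # false, true, true
```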

Relative strength of test strategies
(Subsumption diagram relating the coverage criteria, from all-paths testing at the strongest end down to branch and statement testing at the weakest: all paths, all definition-use paths, all uses, all computational/some predicate uses, all predicate/some computational uses, all definitions, all computational uses, all predicate uses, branch, statement.)

Comparing techniques

Table 8.5. Fault discovery percentages by fault origin (rows: prototyping, requirements review, design review, code inspection, unit testing; columns: requirements, design, coding, documentation; the percentage values did not survive in this transcript).

Table 8.6. Effectiveness of fault discovery techniques (Jones 1991).
– Reviews: requirements faults fair, design faults excellent, code faults excellent, documentation faults good
– Prototypes: requirements faults good, design faults fair, code faults fair, documentation faults not applicable
– Testing: requirements faults poor, design faults poor, code faults good, documentation faults fair
– Correctness proofs: requirements faults poor, design faults poor, code faults fair, documentation faults fair

Integration testing
– Bottom-up
– Top-down
– Big-bang
– Sandwich testing
– Modified top-down
– Modified sandwich
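Before the strategy-by-strategy figures, a minimal sketch (with hypothetical component names) of the scaffolding the strategies rely on: top-down integration replaces not-yet-integrated lower-level components with stubs, while bottom-up integration exercises lower-level components through throwaway drivers.

```python
# Real lower-level component (already unit tested).
def compute_tax(amount: float) -> float:
    return round(amount * 0.08, 2)

# Top-down: the higher-level component is integrated first, so a stub
# stands in for a lower-level component that is not yet available.
def compute_shipping_stub(amount: float) -> float:
    return 5.0  # canned answer, just enough to let the caller be tested

def total_price(amount: float, shipping=compute_shipping_stub) -> float:
    return amount + compute_tax(amount) + shipping(amount)

# Bottom-up: a throwaway driver calls the lower-level component directly
# before any of its real callers exist.
def tax_driver():
    for amount, expected in [(100.0, 8.0), (0.0, 0.0)]:
        assert compute_tax(amount) == expected

tax_driver()
assert total_price(100.0) == 113.0
```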

Component hierarchy for the examples

Bottom-up testing

Top-down testing

Modified top-down testing

Big-bang integration

Sandwich testing

Modified sandwich testing

Comparison of integration strategies

Sync-and-stabilize approach (Microsoft)

Testing OO vs. testing other systems

Where is testing OO harder?

Test planning
– Establish test objectives
– Design test cases
– Write test cases
– Test test cases
– Execute tests
– Evaluate test results

Test plan
Purpose: to organize the testing activities.
The plan describes the way in which we will show our customers that the software works correctly, and it is developed in parallel with system development.
Contents:
– test objectives
– methods for test execution at each stage of testing
– tools
– exit criteria
– system integration plan
– sources of test data
– detailed list of test cases

Automated testing tools
Code analysis
– Static analysis: code analyzer, structure checker, data analyzer, sequence checker
– Dynamic analysis: program monitor
Test execution
– Capture and replay
– Stubs and drivers
– Automated testing environments
Test case generators

Sample output from static code analysis

Probability of finding faults during development

When to stop testing
– Coverage criteria
– Fault seeding
Estimate of remaining faults (fault seeding)
– Seed the code with S faults; suppose testing finds s of the seeded faults and n actual (non-seeded) faults.
– Assuming seeded and actual faults are found at the same rate, the estimated total number of actual faults is N = Sn/s.
Two-team testing
– x = faults found by group 1, y = faults found by group 2, q = faults discovered by both (q <= x, q <= y); n = total number of faults.
– Team effectiveness: E1 = x/n and E2 = y/n.
– Also E1 = q/y (group 1 found q of the y faults found by group 2), so y = q/E1; with n = y/E2 this gives n = q/(E1 E2).
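A small sketch (made-up numbers, illustrative only) of the two estimates just described:

```python
def seeded_estimate(S: int, s: int, n: int) -> float:
    """Estimated total number of actual faults, N = S*n/s, given S seeded faults,
    s of them detected, and n actual faults detected."""
    return S * n / s

def two_team_estimate(x: int, y: int, q: int) -> float:
    """Estimated total number of faults, n = q/(E1*E2) = x*y/q, given x faults
    found by team 1, y found by team 2, and q found by both."""
    return x * y / q

# 25 faults seeded, 20 of them found, plus 40 actual faults found
# -> roughly 50 actual faults estimated in total.
print(seeded_estimate(S=25, s=20, n=40))    # 50.0

# Team 1 finds 25 faults, team 2 finds 30, 15 are common
# -> estimated 50 faults in total (so E1 = 0.5, E2 = 0.6).
print(two_team_estimate(x=25, y=30, q=15))  # 50.0
```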

When to stop testing
– Coverage criteria
– Fault seeding
Confidence in the software (fault seeding)
– Claim that the software contains at most N actual faults, seed it with S faults, and test until all S seeded faults have been found; let n be the number of actual faults discovered along the way.
– C = 1 if n > N; C = S/(S - N + 1) if n <= N.
– Examples with N = 0: seeding S = 10 faults and finding all of them (and no actual faults) gives C = 10/11, about 91%; with S = 49, C = 49/50 = 98%.
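A one-function sketch of the confidence formula, reproducing the two figures quoted on the slide:

```python
def confidence(S: int, N: int, n: int) -> float:
    """Confidence that the software holds at most N actual faults, after all S
    seeded faults have been found and n actual faults were discovered."""
    return 1.0 if n > N else S / (S - N + 1)

print(confidence(S=10, N=0, n=0))  # 0.909... -> about 91%
print(confidence(S=49, N=0, n=0))  # 0.98     -> 98%
```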

Classification tree to identify fault-prone components