COMP 354 Software Engineering I Section BB Summer 2009 Dr Greg Butler

Outline
- Validation and verification: what must be verified?
- Error, fault, failure
- Correct, robust, reliable
- Testing: the aim of testing
- Testing: test plan, test suite, test case, test data
- Test infrastructure
- Reviews, walkthroughs, inspections

Validation and Verification
- Validation: are we building the right product? Validation is the process by which one establishes that a deliverable satisfies the needs of the users.
- Verification: are we building the product right? Verification is the process by which one establishes that a deliverable satisfies another deliverable.

What must be verified?
Every quality of the product (and its deliverables):
- correctness
- performance
- reliability
- robustness
- portability
- maintainability
- user friendliness
- ...
Results may not be yes/no: e.g. the percentage of tests passed.
Results may be subjective: e.g. how portable a system is.

Error, fault, failure
- An error is a mistake made by a person when reasoning.
- A fault is a mistake in the code (or another deliverable) caused by an error.
- A failure is how a fault in the code manifests itself when the system is executed.
Testing reveals failures. Debugging tries to locate the fault (and fix it).
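
A minimal sketch of the distinction, using a hypothetical average function: the programmer's error (forgetting the empty-list case) leaves a fault in the code, and the fault only shows up as a failure when a particular input is executed.

# Hypothetical example: the error is the programmer's reasoning mistake
# (forgetting to handle an empty list); the fault is the resulting code.
def average(values):
    return sum(values) / len(values)   # fault: no guard for an empty list

print(average([2, 4, 6]))              # a right input: prints 4.0
try:
    average([])                        # the fault manifests only on this input
except ZeroDivisionError:
    print("failure: ZeroDivisionError for the empty list")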

Correct, robust, reliable
- Correctness: does the system always give the output described in the requirements when given an input described in the requirements? Every right input produces the expected output.
- Robustness: does the system behave reasonably when given an incorrect (unexpected) input?
- Reliability: how often does the system produce a correct output when given a correct input? It is the probability of correct behaviour, and estimating it requires knowing the distribution of typical usage of the system.
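
A small sketch contrasting the first two qualities, using a hypothetical wrapper around the standard square-root function: correctness concerns the expected inputs, robustness concerns the unexpected ones.

import math

def safe_sqrt(x):
    # Correctness: every right input (a non-negative number) must produce
    # the expected output. Robustness: a wrong input should be handled
    # reasonably rather than crashing or returning a misleading value.
    if not isinstance(x, (int, float)) or x < 0:
        raise ValueError("safe_sqrt expects a non-negative number, got %r" % (x,))
    return math.sqrt(x)

print(safe_sqrt(16))       # correct behaviour on an expected input: 4.0
try:
    safe_sqrt(-1)          # robust behaviour on an unexpected input
except ValueError as e:
    print(e)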

Testing
Testing is verifying the quality of correctness.
- Aim of testing: to discover the presence of errors!!!
- Testing cannot prove correctness, except for very simple systems where you can test every possible input.

Test data, test case, test suite
- Test data: a set of inputs devised to exercise a test; it can sometimes be generated automatically (see structural testing).
- Test case: consists of the purpose of the test in terms of the system requirements it exercises, an input specification (i.e. test data), and a specification of the expected output.
- Test suite: a collection of test cases.
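
A sketch of how these terms map onto executable tests using Python's unittest; the function under test (is_leap_year) and the requirements it exercises are assumptions made up for illustration.

import unittest

def is_leap_year(year):
    # Hypothetical unit under test.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTests(unittest.TestCase):
    # Each method is a test case: a purpose tied to a requirement,
    # an input (the test data), and the expected output.

    def test_century_not_divisible_by_400(self):
        # Purpose: century years are not leap years (assumed requirement).
        self.assertFalse(is_leap_year(1900))   # test data 1900, expected False

    def test_year_divisible_by_400(self):
        # Purpose: years divisible by 400 are leap years (assumed requirement).
        self.assertTrue(is_leap_year(2000))    # test data 2000, expected True

# The test suite is simply the collection of test cases.
suite = unittest.TestLoader().loadTestsFromTestCase(LeapYearTests)
unittest.TextTestRunner().run(suite)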

Test Plan
The major components of a complete test plan are:
- a description of the major phases of testing (e.g. unit testing, module testing, ...)
- objectives (acceptance criteria) for the testing process
- the overall testing schedule and resource allocation (when, who, time and machine resources); the test schedule should allow for slippage
- a description of the relationship between the test plan and other project plans (e.g. the implementation schedule)
- a description of how traceability of test results to system requirements is to be demonstrated
- a description of how test results are recorded (it must be possible to audit the testing process to guarantee that tests have been carried out on the latest versions of the software)
- a description of how the test cases were designed and how the test data was generated
- a description of all the test cases, including all test data

Stages of Testing
- Unit testing: testing an individual unit or basic component of the system, e.g. testing a function SQRT.
- Module testing: testing a module consisting of several units, to check that their interaction and interfaces are OK, e.g. testing that pop and push in a stack provide last-in-first-out behaviour.
- Subsystem testing: testing a subsystem consisting of several modules, to check that module interaction and module interfaces are OK, e.g. an output subsystem writing to a database, keeping an audit trail, and providing rollback services.
- Integration testing: testing the entire system once all the subsystems have been integrated.
- Acceptance testing: testing with real data in order to satisfy the customer that the system meets the requirements.
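
A hedged sketch of the first two stages, reusing the slide's own examples (a SQRT function and a stack module); the Stack class and the test names are assumptions for illustration.

import math
import unittest

class Stack:
    # Minimal stack module, assumed here only so the tests can run.
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()

class StageExamples(unittest.TestCase):
    def test_sqrt_unit(self):
        # Unit test: an individual basic component, e.g. a SQRT function.
        self.assertAlmostEqual(math.sqrt(2.0) ** 2, 2.0)

    def test_stack_module_is_lifo(self):
        # Module test: push and pop interact to give last-in-first-out behaviour.
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)
        self.assertEqual(s.pop(), 1)

if __name__ == "__main__":
    unittest.main()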

Stages of Testing
Remember the V-model.

Testing the Whole System
- Stress (overload) testing: check the capability of the system to perform under 'overloaded' conditions, e.g. full tables, memory, or filesystem; e.g. more simultaneous users than expected.
- Regression testing: testing old features again when new features are added or changes are made, to make sure you fully understood the changes to be made and did not impact parts of the system supposedly isolated from the changes. Regression = degradation in the level of correctness (of old features).
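
A rough sketch of both ideas; the load figure and the rerun command are illustrative assumptions, not part of the course material.

import unittest
from collections import deque

class StressSketch(unittest.TestCase):
    def test_far_more_items_than_expected(self):
        # Stress test: drive the component well past its expected load
        # (1_000_000 is an arbitrary figure chosen for illustration).
        q = deque()
        for i in range(1_000_000):
            q.append(i)
        self.assertEqual(len(q), 1_000_000)

# Regression testing is largely organisational: after every change, rerun
# the whole existing suite (for example `python -m unittest discover`) and
# treat any newly failing old test as a regression, i.e. a degradation in
# the level of correctness of old features.

if __name__ == "__main__":
    unittest.main()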

Code Walkthrough
A code walkthrough is an informal analysis based on "playing the computer" and "walking through" the code. The states of the "computation" are recorded on a piece of paper or a blackboard. A code walkthrough is a cooperative, organized activity with several participants.
Aims of a code walkthrough:
- discovery of errors (not fixing errors)
- stylistic review of the code
- compliance with coding standards
- educating junior programmers

Code Inspection
A code inspection is like a walkthrough, but with a difference in goals.
Aim of code inspection: discovery of commonly made errors. Given a list of commonly made errors, look at the code to see if any of them occur.
A code inspection is not a hand execution of the code.

Process for Code Inspection
Preparation:
- a precise specification of the code to be inspected must be available
- members of the inspection team must be familiar with all relevant standards of the organization
- the code must be up to date and free of syntax errors
- a checklist of likely errors is prepared
Plan/schedule: the people on the team and the code to be inspected.
An overview of the code is presented; then each individual team member reads the code and notes errors.
People: a small number of peers, not management types.
Meeting: short (no more than two hours); well-defined, with a single focus on a section or a small number of related sections of code (no more than … lines of code in total); the aim is the discovery of errors, and the meeting should not stray from this.

Sample Checklist of Common Errors
- use of uninitialized variables
- have all constants been named?
- for each conditional statement, is the condition correct?
- jumps into loops
- incompatible types in an assignment
- nonterminating loops
- for each array, should the lower bound be 0, 1, or something else? should the upper bound be Size or Size-1?
- array indexes out of bounds
- are delimiters '\0' explicitly assigned for character strings?
- improper storage allocation/deallocation
- actual-formal parameter mismatches in procedure calls: correct number of parameters? matching types of each formal-actual parameter pair?
- comparison for equality of floating point values
- are compound statements correctly bracketed?
- if links/pointers are used, are link assignments correct when updating data structures using links?
- have all possible error conditions been taken into account?
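
A short, deliberately faulty sketch showing how two of the checklist items look in practice; the function names are made up for illustration.

import math

def last_element(items):
    # Checklist item: should the upper bound be Size or Size-1?
    # Off-by-one fault: valid indexes run from 0 to len(items) - 1.
    return items[len(items)]       # IndexError: should be items[len(items) - 1]

def prices_match(a, b):
    # Checklist item: comparison for equality of floating point values.
    # 0.1 + 0.2 == 0.3 is False, so exact comparison is fragile.
    return a == b                  # prefer math.isclose(a, b)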

Structural Testing (White-Box)
Structural testing is based on the structure of the control flow graph of the code. Test cases are chosen to cover all possibilities of...
- Statement coverage: select a set T of test cases such that, by executing the program P for each test case, each elementary statement of P is executed at least once.
- Edge coverage: select a set T of test cases such that, by executing P for each test case, each edge of the control flow graph of P is traversed at least once.
- Condition coverage: select a set T of test cases such that, by executing P for each test case, each edge of the control flow graph of P is traversed and all possible values of the constituents of compound conditions are exercised at least once.
- Path coverage: select a set T of test cases such that, by executing P for each test case, each path from the initial to the final node of the control flow graph of P is traversed.
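
A sketch of how the criteria differ on a tiny function with one compound condition; the function and the test values are hypothetical.

def classify(x, y):
    # One compound condition with two constituents, x > 0 and y > 0.
    if x > 0 or y > 0:
        return "positive branch"
    return "other branch"

# Statement coverage: T = {(1, 0), (-1, -1)} executes every statement,
# since it takes both the true branch and the false branch once.
# Edge coverage: the same two cases traverse both edges out of the
# condition node, so no extra cases are needed for this simple graph.
# Condition coverage: T = {(1, -1), (-1, 1), (-1, -1)} additionally makes
# each constituent (x > 0 and y > 0) evaluate to both true and false.
# Path coverage: a single if gives only two paths, but loops and nested
# branches make the number of paths grow very quickly.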

Functional Testing (Black-Box)
Functional testing is based on our view of the program as a function from inputs to outputs, as described in the requirements document. There is no need to look at the code which implements the program. Test cases are designed by equivalence partitioning of the input to identify typical and boundary cases.

Equivalence Partitioning for Test Cases
Equivalence partitioning: partition the domain of inputs into disjoint sets such that inputs in the same set exhibit similar/identical/equivalent properties with respect to the test being performed.
Example: a function which takes a 5-digit number as input
- set 1 = { integers < 10000 }
- set 2 = { integers in the range 10000..99999 }
- set 3 = { integers > 99999 }
Example: a function which takes a linked list as input
- set 1 = { empty list }
- set 2 = { lists of length 1 } (first node = last node problems)
- set 3 = { lists of moderate length }
- set 4 = { very long lists } (might be space problems)
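
A sketch turning the first example's partitions into concrete test data; the validator itself and the chosen representative values are assumptions for illustration.

def accepts_five_digit_number(n):
    # Hypothetical function under test: accept exactly the 5-digit integers.
    return 10000 <= n <= 99999

# One typical value from inside each equivalence class, plus boundary values
# where adjacent classes meet.
test_data = {
    "set 1: integers < 10000":       [512, 9999],              # 9999 is a boundary
    "set 2: integers 10000..99999":  [10000, 54321, 99999],    # both boundaries
    "set 3: integers > 99999":       [100000, 123456],         # 100000 is a boundary
}

for description, values in test_data.items():
    for v in values:
        print(description, v, accepts_five_digit_number(v))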