INSE - Lecture 11: Testing


INSE - Lecture 11: Testing
• Verification testing vs pursuit testing
• Philosophy of testing
• Stages of testing
• Methods of testing
• Design of test data
• Management issues

Verification testing versus pursuit testing
• Definitions

Verification vs pursuit testing
• Verification testing is testing to find out whether a product is “correct” and acceptable.
• Pursuit testing is when we know there is an error and we improvise additional tests to “chase” the error and locate it - it is better called diagnostics.
• Pursuit testing is really part of debugging - covered in the next lecture.
• This lecture is about verification testing.

Philosophy of testing
• Key observations

What tests can and can’t do
• A test can “prove” the existence of a bug.
• A test cannot “prove” the absence of bugs:
  – there can be bugs in a program that the tests just don’t trigger;
  – or a bug might be triggered, but you just don’t spot it in the output.
• So a “good” test will increase our confidence that there are no evident bugs.
  – But what else does “good” mean in that context?

“This product”
• Software products do very diverse things.
• So the tests need to be correspondingly diverse.
• So thinking carried over from the tests of one product is unlikely to transfer well to another product.
• So every new product needs a stage of original thought on how to test this unique product.

Stages of testing
• Test preparation - philosophy, test design, test scheduling
• Component tests - usually find coding problems
• Integration tests - usually find design problems
• User tests - usually find specification & other problems
• Maintenance tests - find introduced problems

Test preparation
• Derives from specification documents and design documents;
• is needed after implementation… so it really needs to be a separate “stream” of the lifecycle, in parallel with implementation.
• It should not be improvised in a hurry after implementation - such tests will have “gaps” in their coverage.

Component tests / Unit tests
• To test a small fragment in isolation we will need a small “main program” for the purpose…
• … we call this a “test harness”.
• The test harness should be designed to (try to) exhibit possible faults.
• Some IDEs permit direct execution without explicit test harnesses.
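A minimal test harness can be sketched as below. The unit under test (`is_leap_year`) and its test cases are hypothetical examples invented for illustration, not from the lecture; the point is the shape of the harness: a small throwaway “main program” that drives the unit and checks each output.

```python
# Hypothetical unit under test: the Gregorian leap-year rule.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def run_harness():
    """A minimal test harness: feed the unit chosen inputs,
    compare each output against the expected value, and report
    failures. Returns the number of failing cases."""
    cases = [(2024, True), (2023, False), (1900, False), (2000, True)]
    failures = [(y, is_leap_year(y), exp)
                for y, exp in cases if is_leap_year(y) != exp]
    for year, got, exp in failures:
        print(f"FAIL: is_leap_year({year}) = {got}, expected {exp}")
    print(f"{len(cases) - len(failures)}/{len(cases)} cases passed")
    return len(failures)

if __name__ == "__main__":
    run_harness()
```

In a unit-testing framework (e.g. Python's `unittest`) the framework itself plays the role of the harness, which is what the slide means by IDE support removing the need for an explicit one.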

Component tests / Module tests
• Ideally, one is testing something which doesn’t have enough bugs for their symptoms to confuse one another…
  – … this suggests an optimum “module size”:
    » e.g. if you average one bug per 250 lines, then keep modules down to (say) 500 lines.
• Again - a test harness (or IDE support) is needed.

Integration / Subsystem tests
• Again - a test harness (or IDE support) is needed.
• It is hard to test a module without having already tested and debugged any modules it needs…
• … but we might “fake” a used module by instead using a “test stub”…
• … so we can, to some extent, achieve top-down testing.
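A test stub can be sketched as follows; the module names, the pricing function, and the fixed exchange rates are all hypothetical, invented to show the pattern. The stub stands in for a lower-level module that is not yet written (or not yet trusted), so the module above it can be tested top-down.

```python
# Test stub for a hypothetical lower-level module. The real version
# would query a live exchange-rate service; the stub returns fixed,
# predictable values so tests of the module above are repeatable.
def get_exchange_rate_stub(currency):
    fixed_rates = {"EUR": 1.2, "USD": 1.3}
    return fixed_rates[currency]

# Higher-level module under test, written against an injected
# rate-lookup so the stub can be swapped for the real module later.
def price_in_currency(price_gbp, currency, rate_lookup=get_exchange_rate_stub):
    return round(price_gbp * rate_lookup(currency), 2)

# Top-down test: exercise the higher-level module while the
# dependency below it exists only as a stub.
assert price_in_currency(10.0, "EUR") == 12.0
assert price_in_currency(10.0, "USD") == 13.0
```

When the real rate-lookup module is later tested and debugged, it replaces the stub without changing the module under test.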

Integration / System tests
• Testing the whole product - i.e. the first test against the (whole) specification since prototyping.
• It’s often very hard to devise comprehensive system tests - especially ones that reflect “live” patterns of use.

User tests / Beta tests
• Give a near-finished version of the product to sample customers…
• … almost a sort of late prototyping;
• … it meets the problem of testing in “real” situations.
• The lack of finish might be:
  – cosmetic;
  – fancier facilities missing;
  – debugging not yet complete.
• Biggest problem: how to collect representative feedback.

User tests / Acceptance tests
• For software “written to order”, these are usually specified in the contract - i.e. what the customer wants to see before agreeing that you’ve met the contract.
• They are therefore usually done by (or with) customer staff, or perhaps by third-party independent specialist testers.

Maintenance tests
• All the usual tests, but many with an extra flavour…
• Regression tests - comparing the results of a test with the results of the same test on a prior version of the software - often to confirm that there has been no change, sometimes to confirm that there have been only the intended changes in the results.
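A regression test can be sketched as re-running old test inputs and diffing the output against saved results from the prior version; the routine and the saved (“golden”) outputs below are hypothetical examples.

```python
# Outputs recorded from the prior version of a hypothetical
# greeting routine, keyed by the test input that produced them.
GOLDEN = {
    "alice": "Hello, Alice!",
    "BOB": "Hello, Bob!",
}

# Current (maintained) version of the routine.
def greet(name):
    return f"Hello, {name.capitalize()}!"

def regression_test():
    """Re-run the prior version's test inputs and report any outputs
    that differ. Each mismatch is either a regression or an intended
    change that must be reviewed and, if accepted, recorded by
    updating GOLDEN."""
    return {inp: greet(inp)
            for inp, old in GOLDEN.items() if greet(inp) != old}

# An empty report means no differences from the prior version.
assert regression_test() == {}
```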

Methods of testing
• Top-down testing?
• Static testing.
• Dynamic testing & design of test data.
• Black-box & white-box.
• Be-bugging.

Top-down testing?
• The “natural” order of testing is bottom-up.
• But using test stubs we can (to some extent) test top-down.

Static test methods
• Walkthroughs of the code
• Compiler checks
• Checks based on tools
  – e.g. cross-referencers
• “Proving” the source code
  – very long proofs => programs to do the proving

Dynamic test methods
• Running the program
  – with carefully-designed test data
  – then carefully checking the output
• Profile-running the program
  – inspect the profile for anomalies
• Running the program under a “dynamic debugger”

Black-box tests
• Design the test data (and harness) to determine how well the product matches its specification.
  – e.g. for some “range” input, try:
    » just in range (both ends?),
    » just out of range (both ends?),
    » a sample well in range, and
    » a sample well out of range.
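The “range input” recipe above can be sketched as boundary-value test data. The percentage-validation specification here is a hypothetical example; the test data is derived from the specification alone, with no knowledge of the implementation, which is what makes it black-box.

```python
# Hypothetical specification: an input "percentage" is valid
# exactly when it lies in the range 0..100 inclusive.
def accept_percentage(p):
    return 0 <= p <= 100

# Black-box test data following the slide's recipe for a range input.
def boundary_cases():
    return [
        (0, True), (100, True),     # just in range, at both ends
        (-1, False), (101, False),  # just out of range, at both ends
        (50, True),                 # a sample well in range
        (999, False),               # a sample well out of range
    ]

# Every case must match the behaviour the specification demands.
assert all(accept_percentage(p) == expected
           for p, expected in boundary_cases())
```

Boundary values are chosen because off-by-one errors in range checks (e.g. `<` written for `<=`) are caught only by inputs just inside or just outside the range.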

White-box tests
• Tests designed using internal knowledge of the design & code;
• attack especially any perceived weak points:
  – profiling to ensure every execution path is tested;
  – adding “print” statements to verify transient values;
  – avoiding re-testing dual uses of re-used code?

The “be-bugging” method
• Deliberately introduce a known number of “typical” bugs into a fairly clean program.
• Set a new team of testers to find bugs in the program.
• Suppose they find (say) 2/3 of the bugs you “sowed” in the program, plus another 10.
• Assume those 10 are 2/3 of the bugs that you didn’t know about.
• Then there are ~5 more unknown bugs to find…?

Design of test data

Design of (dynamic) test data
• The data for test runs should be designed to search every corner of:
  – the code under test (white-box testing);
  – the problem & the specification (black-box testing)
  for all imaginable errors. (Unimaginable, too!)
• For unit & subsystem tests, that usually means designing the test data and the test harness (for the unit/subsystem) together.

All test output needs checking
• (Something often forgotten when designing tests & test data!)
• Design the tests so that the output can easily & reliably be checked:
  – e.g. helpful layout;
  – e.g. not of excessive volume;
  – e.g. “simple” - following some evident pattern that doesn’t need careful thought to check.

Management issues in testing

Things to ensure
• Check that the testing is done!
  – … and done well enough for that product!
  – … and done imaginatively but meticulously!
• Aptitudes and motivation of testing staff?
  – test designers?
  – test doers?
  – checkers of the test output?
• Audit trails of the testing done:
  – test auditors?
  – test documenters?

After this lecture
• Think about the testing you are going to do, and the testing you have done.

© C Lester