Verification and Validation


Verification and Validation Assuring that a software system meets a user's needs

Verification vs validation Verification: "Are we building the product right?" Validation: "Are we building the right product?" … Verification: the software should conform to its specification … Validation: the software should do what the user really requires

Static and dynamic verification Software inspections (static verification and validation) Software testing (dynamic verification) … Concerned with analysis of the static system representation to discover problems (may be supplemented by tool-based document and code analysis) … Concerned with exercising and observing product behaviour - the system is executed with test data and its operational behaviour is observed

Static and dynamic V&V V & V is a whole life-cycle process - it must be applied at each stage in the software process. It has three principal objectives: ensuring that requirements are correct; the discovery of defects in a system; and the assessment of whether or not the system is usable in an operational situation. <POOR DIAGRAM - the point it should make:> Static validation should be done during requirements specification, and probably to some extent during high-level design. Static verification should be done during design and programming. Dynamic verification should be done on prototypes and on programs. Static inspection includes study of requirements documents, design diagrams, and program code. Program inspections can be formal proofs or less formal walkthroughs (a group reads the code looking for errors)

Program testing Can reveal the presence of errors, NOT their absence A successful test is a test which discovers one or more errors The only validation technique for non-functional requirements Should be used in conjunction with static verification and validation to provide full V&V coverage
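To make the "successful test" idea concrete, here is a minimal, invented example (the function, its bug, and the test values are illustrative only, not from the slides): a routine meant to average an array divides by a hard-coded length, and a test with a different length discovers the error.

    #include <assert.h>
    #include <stdio.h>

    /* Intended to return the mean of the first n elements.
       Defective: divides by a hard-coded 10 instead of n. */
    static double average(const int *values, int n) {
        int sum = 0;
        for (int i = 0; i < n; i++)
            sum += values[i];
        return sum / 10.0;          /* bug: should be sum / (double)n */
    }

    int main(void) {
        int data[4] = {2, 4, 6, 8};
        double result = average(data, 4);
        printf("average = %f\n", result);
        /* In the sense used above, this is a successful test:
           it expects 5.0, observes 2.0, and so reveals the defect. */
        assert(result == 5.0);
        return 0;
    }

Note that the same program exercised only with 10-element arrays would pass every test, which is exactly why testing can show the presence of errors but never their absence.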

Types of testing Defect testing Statistical testing Defect testing Tests designed to discover system defects. A successful defect test is one which reveals the presence of defects in a system. – Intentionally try to create hard cases – stress the system Covered in Chapter 20 Statistical testing Tests designed to reflect the frequency of user inputs. Used to get a realistic feel for how things will work in the operational environment - for reliability estimation, or other non-functional issues such as performance. Covered in Chapter 21 (we may skip)
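As a sketch of how statistical testing differs in spirit from defect testing, the fragment below draws test inputs at random according to a made-up operational profile (the input classes and their percentages are assumptions for illustration, not from the slides): inputs are generated in proportion to how often each class occurs in real use, rather than by deliberately picking hard cases.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Hypothetical operational profile: 70% simple queries, 25% updates,
       5% administrative commands. */
    int main(void) {
        const char *classes[] = { "query", "update", "admin" };
        const int cumulative[] = { 70, 95, 100 };   /* cumulative percentages */
        srand((unsigned)time(NULL));

        for (int i = 0; i < 20; i++) {
            int r = rand() % 100;       /* 0..99 */
            int c = 0;
            while (r >= cumulative[c])
                c++;
            /* Each generated input would be fed to the system under test;
               observed failures then feed a reliability estimate. */
            printf("test input %2d: %s\n", i, classes[c]);
        }
        return 0;
    }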

V & V goals Verification and validation should establish confidence that the software is fit for its purpose This does NOT mean completely free of defects Must be good enough for its intended use The type of use will determine the degree of confidence that is needed - depends on the system's purpose, user expectations and marketing environment Software function The level of confidence depends on how critical the software is to an organization – business-wise, safety-wise, etc. User expectations Users may have low expectations of certain kinds of software <A shot at Microsoft?> Marketing environment Getting a product to market early may be more important than finding defects in the program. Users may tolerate more errors in cheap software

Testing and debugging Defect testing and debugging are distinct processes Verification and validation is concerned with establishing the existence of defects in a program Debugging is concerned with locating and repairing these errors Debugging involves formulating hypotheses about program behaviour and then testing these hypotheses to find the system error … it is generally desirable to have an independent group do testing after the developers have tested and declared the system done … it is preferable if the developers debug their own problems – they know their code best. But code should be fully documented in case the developer is no longer around or is tied up in other tasks

The debugging process You know this from your own debugging in courses. But I want to emphasize – it is common for changes that fix one bug to cause other bugs. Re-testing should not only check that the bug has been eliminated, but also that other parts of the program have not been broken in the process. If significant changes have been made, a complete regression test may be done – repeating all tests that were previously done to make sure only the intended changes have occurred (particularly when a patch release is being prepared). Also note, sometimes it is a challenge to recreate the problem. The developer probably found the easy bugs before saying the program was done. Testers (and later customers) may do a particular combination of things that just happens to cause a problem. If they don't remember exactly what they did, then the developer may not be able to recreate the error – and then may be short on clues. Thus, it is important for testers to create test plans, not just so they make sure they hit everything – but also so they know what it is they did (when I was at IBM, it was common for developers to go over and observe the tester's screen as the tester recreated the bug)
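A minimal sketch of the regression-testing idea mentioned above (the test names and their checks are invented placeholders): the whole suite of previously written tests is re-run after every fix, so a change that repairs one bug but breaks something else gets caught.

    #include <stdio.h>

    /* Each regression test returns 1 on pass, 0 on fail.
       Real tests would exercise the system; these are placeholders. */
    static int test_login(void)          { return 1; }
    static int test_fix_for_bug_42(void) { return 1; }  /* added when that bug was fixed */
    static int test_report_totals(void)  { return 1; }

    int main(void) {
        struct { const char *name; int (*run)(void); } tests[] = {
            { "login",          test_login },
            { "fix_for_bug_42", test_fix_for_bug_42 },
            { "report_totals",  test_report_totals },
        };
        int failures = 0;
        /* Re-run every test after each change, not just the test for the new fix. */
        for (unsigned i = 0; i < sizeof tests / sizeof tests[0]; i++) {
            int ok = tests[i].run();
            printf("%-16s %s\n", tests[i].name, ok ? "PASS" : "FAIL");
            if (!ok)
                failures++;
        }
        return failures ? 1 : 0;
    }

Recording tests this way also addresses the reproducibility problem above: the test itself documents exactly what was done to provoke the failure.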

19.1 V & V planning Careful planning is required to get the most out of testing and inspection processes Planning should start early in the development process The plan should identify the balance between static verification and testing Planning should define standards for the testing process ½ the budget may be spent on V & V … I think there are two levels of planning. Here we are talking about high level planning. At a lower level, individual testers can create a detailed plan of what exactly they are going to test – describing product tests

The V-model of development <we saw this same diagram last semester – the take home point is that testing planning starts early, as it is informed by early activities>

The structure of a software test plan The testing process … what phases of testing are to be done – e.g. unit test, subsystem integration test, system integration test, acceptance test Requirements traceability … plan to test all requirements – must know what parts of the system support which requirements in order to do this Tested items Testing schedule … including resource allocation Test recording procedures … to be followed – e.g. a tester might enter faults into a DB that can then be searched by management and developers, to ensure that all problems are addressed Hardware and software requirements … needed for testers – including tools to be used Constraints … affecting the testing process UPDATE as things change

19.2 Software inspections Involve people examining the source representation with the aim of discovering anomalies and defects Do not require execution of a system so may be used before implementation May be applied to any representation of the system (requirements, design, test data, etc.) Very effective technique for discovering errors … frequently this means examining code – getting people to read it carefully (minor?) … e.g. one study found that 60% of errors can be detected using informal program inspections. Cheaper to find errors this way than by extensive program testing

Inspection success Many different defects may be discovered in a single inspection. In testing, one defect may mask another, so several executions are required They reuse domain and programming knowledge - reviewers are likely to have seen the types of error that commonly arise and look out for them

Inspections and testing Inspections and testing are complementary and not opposing verification techniques Inspections can check conformance with a specification but not conformance with the customer’s real requirements Inspections cannot test integration of subsystems Inspections cannot check non-functional characteristics such as performance, usability, etc. … Both should be used during the V & V process Organizations that have not been doing them may be reluctant to do them – because it moves V & V expenses earlier (with promise of later savings). It takes faith on the part of developers and managers to do them

Program inspections Group meeting to carefully review code line by line Intended explicitly for defect DETECTION (not correction) Defects may be logical errors, or non-compliance with standards Invented by IBM in the 1970s … beneficial sometimes if members come from different roles (e.g. QA, not just developers) <can't see users being able to participate – somebody could take on the role of the user> … correction would make meetings go on forever, and waste time … (or anomalies in the code that might indicate an erroneous condition (e.g. an uninitialized variable))

Inspection teams Made up of at least 4 members Author of the code being inspected Inspector who finds errors, omissions and inconsistencies Reader who reads the code to the team Moderator who chairs the meeting and notes discovered errors Other roles are Scribe and Chief moderator … The developer must try not to be defensive … some organizations skip the reader The meeting should be moderated … by somebody other than the developer It is probably better to have a Scribe who records all errors, allowing the moderator to keep the meeting on track, etc.

Inspection pre-conditions A precise specification must be available Team members must be familiar with the organisation's standards Syntactically correct code must be available An error checklist should be prepared Management must accept that inspection will increase costs early in the software process Management must not use inspections for staff appraisal … in order that the group can check whether the program meets the spec … this is not for removing syntax errors – it wastes a lot of people's time when a compiler could do this … otherwise the meeting will not go well – defensive developers, reviewers not raising issues for political reasons (especially if the developer is more senior)

The inspection process Planning done in advance – selecting team, getting a room, ensuring developer is ready, and materials are available System overview presented to inspection team – an introduction given by developer Code and associated documents are distributed to inspection team in advance. They must read it and try to identify bugs – otherwise the meeting will not work – will take too long Inspection takes place and discovered errors are noted – meeting should not last more than 2 hours or people will burn out (lose effectiveness) Modifications are made to repair discovered errors Re-inspection may or may not be required

Inspection checklists A checklist of common errors should be used to drive the inspection The error checklist is programming-language dependent The 'weaker' the type checking, the larger the checklist Examples: initialization, constant naming, loop termination, array bounds, etc.
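The invented C fragment below deliberately contains faults from each of the example checklist categories; none of it comes from the slides, it simply shows what a reviewer armed with such a checklist would be looking for.

    #include <stdio.h>

    #define BUFFER_SIZE 8            /* constant naming: a named constant, not a magic number */

    int main(void) {
        int values[BUFFER_SIZE];
        int total;                   /* initialization fault: total is never set to 0 */

        /* array-bounds fault: "<=" writes one element past the end of values */
        for (int i = 0; i <= BUFFER_SIZE; i++)
            values[i] = i * 2;

        /* loop-termination fault: i steps by 3, so it never equals BUFFER_SIZE
           and the loop never ends (and reads outside the array) */
        for (int i = 0; i != BUFFER_SIZE; i += 3)
            total += values[i];

        printf("total = %d\n", total);
        return 0;
    }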

Inspection checks

Inspection rate 500 statements/hour during overview 125 source statements/hour during individual preparation 90-125 statements/hour can be inspected during the meeting itself Inspection is therefore an expensive process Inspecting 500 lines costs about 40 person-hours of effort
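One plausible breakdown behind the 40 person-hour figure, assuming a four-person inspection team (the slide states only the total): the overview of 500 statements at 500 statements/hour takes about 1 hour for each of the 4 participants (4 person-hours); individual preparation at 125 statements/hour takes about 4 hours each (16 person-hours); and the inspection meetings at roughly 100 statements/hour occupy about 5 hours of the whole team's time (20 person-hours), for a total of 4 + 16 + 20 = 40 person-hours.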

19.3 Automated static analysis Static analysers are software tools for source text processing They parse the program text and try to discover potentially erroneous conditions and bring these to the attention of the V & V team Very effective as an aid to inspections. A supplement to but not a replacement for inspections

Static analysis checks <Some of this the compiler should get – possibly as a warning (but C is very forgiving!) (LINT is a static analyzer for C on UNIX that checks things that a compiler for a more strongly typed language would check)>

Stages of static analysis Control flow analysis. Checks for loops with multiple exit or entry points, finds unreachable code, etc. Data use analysis. Detects uninitialised variables, variables written twice without an intervening use, variables which are declared but never used, etc. Interface analysis. Checks the consistency of routine and procedure declarations and their use. Finds functions that are never used and function results that are never used
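The invented fragment below packs in the kinds of anomalies these first three stages report (none of it is from the slides); every line is legal C, so the program compiles, and it is the analyser's job (or the compiler's optional warnings) to flag the suspicious constructs.

    #include <stdio.h>

    static int unused_helper(int x) {    /* interface analysis: function is never called */
        return x + 1;
    }

    int main(void) {
        int total;                       /* data use: read below before being initialized */
        int spare;                       /* data use: declared but never used */
        int count;

        count = 5;
        count = 10;                      /* data use: written twice, first value never read */

        total += count;                  /* uses the uninitialized value of total */
        printf("total = %d\n", total);
        return 0;

        printf("done\n");                /* control flow: unreachable code after return */
    }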

Stages of static analysis Information flow analysis. Identifies the dependencies of output variables. Does not detect anomalies itself but highlights information for code inspection or review Path analysis. Identifies paths through the program and sets out the statements executed in each path. Again, potentially useful in the review process Both these stages generate vast amounts of information and must be used with care. … basically documentation showing how variables get their values … a particular kind of abstract of the code … could also be used to generate test cases.

Use of static analysis Particularly valuable when a language such as C is used which has weak typing and hence many errors are undetected by the compiler Less cost-effective for languages like Java that have strong type checking and can therefore detect many errors during compilation
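A small invented illustration of the contrast: both marked lines below are legal C and the program compiles, but a lint-style analyser (or a compiler with its warnings turned up) will typically flag them; in Java, the equivalent of the first would not compile at all, because a condition must be a boolean.

    #include <stdio.h>

    int main(void) {
        int ready = 0;
        double ratio = 0.5;

        if (ready = 1)                        /* assignment, not comparison: legal C,
                                                 almost certainly an error */
            printf("ratio is %d\n", ratio);   /* %d given a double: wrong format,
                                                 yet the call still compiles */
        return 0;
    }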

19.4 Cleanroom software development The name is derived from the 'Cleanroom' process in semiconductor fabrication. The philosophy is defect avoidance rather than defect removal Software development process based on: Formal specification Incremental development Structured programming Static verification using correctness arguments Statistical testing to determine program reliability … Inspections replace unit testing … Software specified using a state-transition model … Software broken into parts (increments) which are separately developed and verified. Critical components developed first … limited number of control constructs … generate code from the spec using correctness-preserving transformations … mathematical arguments of correctness, not necessarily code formally proved correct
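As a rough sketch of what "static verification using correctness arguments" can look like at the code level (the routine is invented, and this is an informal argument, not a machine-checked proof), here is a summation function annotated with the invariant a reviewer would argue from:

    #include <stdio.h>

    /* Returns the sum of a[0..n-1].
       Informal correctness argument:
       - Invariant: before each loop test, sum == a[0] + ... + a[i-1].
       - Initially i == 0 and sum == 0, so the invariant holds trivially.
       - Each iteration adds a[i] and then increments i, preserving it.
       - The loop exits when i == n, so sum == a[0] + ... + a[n-1]. */
    static int sum_array(const int *a, int n) {
        int sum = 0;
        for (int i = 0; i < n; i++)
            sum += a[i];
        return sum;
    }

    int main(void) {
        int data[] = { 3, 1, 4, 1, 5 };
        printf("sum = %d\n", sum_array(data, 5));   /* prints 14 */
        return 0;
    }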

The Cleanroom process

Incremental development Change requests result in a new release of the increment. But increments are small, so change requests are isolated (in theory) As new increments are added, the integration is tested

Cleanroom process teams Specification team. Responsible for developing and maintaining the system specification Development team. Responsible for developing and verifying the software. The software is NOT executed or even compiled during this process Certification team. Responsible for developing a set of statistical tests to exercise the software after development. Reliability growth models are used to determine when reliability is acceptable Three teams are used when the Cleanroom process is used for large system development … user-level and formal technical spec … code inspection plus correctness arguments … test cases are developed as the software is being developed

Cleanroom process evaluation Results in IBM have been very impressive with few discovered faults in delivered systems Independent assessment shows that the process is no more expensive than other approaches Fewer errors than in a 'traditional' development process Not clear how this approach can be transferred to an environment with less skilled or less highly motivated engineers Not being quickly adopted by industry