
1 Verification & Validation
IS301 – Software Engineering, Lecture #30
M. E. Kabay, PhD, CISSP, Assoc. Prof. Information Assurance
Division of Business & Management, Norwich University
Copyright © 2004 M. E. Kabay. All rights reserved.

2 Topics
- Verification and validation planning
- Software inspections
- Automated static analysis
- Cleanroom software development

3 Verification and Validation
Assuring that software system meets user's needs

4 Objectives
- To introduce software verification and validation and to discuss the distinction between them
- To describe the program inspection process and its role in V & V
- To explain static analysis as a verification technique
- To describe the Cleanroom software development process

5 Verification vs. Validation
Verification: “Are we building the product right?” Software should conform to its specification.
Validation: “Are we building the right product?” Software should do what user really requires.

6 V & V Process
- Whole life-cycle process: V & V at each stage in software process
- Two principal objectives:
  - Discover defects in system
  - Assess whether system is usable in operational situation

7 Static and Dynamic Verification
- Software inspections: analysis of static system representation (static verification); may be supplemented by tool-based document and code analysis
- Software testing: exercising and observing product behavior (dynamic verification); system executed with test data and its operational behavior observed

8 Static and Dynamic V & V
[Figure: static and dynamic V & V]

9 Program Testing
- Can reveal presence of errors but NOT their absence
- Successful test discovers one or more errors
- Only validation technique for non-functional requirements
- Should be used in conjunction with static verification to provide full V & V coverage

10 Types of Testing
- Defect testing: tests designed to discover system defects; successful defect test reveals presence of defects in system
- Statistical testing: tests designed to reflect frequency of user inputs; used for reliability estimation

11 V & V Goals
- Establish confidence that software is fit for purpose
- Does NOT mean completely free of defects: good enough for its intended use
- Type of use will determine degree of confidence needed

12 V & V Confidence
Depends on system’s purpose, user expectations, and marketing environment:
- Software function: level of confidence depends on how critical software is to organization
- User expectations: users may have low expectations of certain kinds of software
- Marketing environment: getting product to market early may be more important than finding defects in program. Sound familiar?

13 Testing and Debugging
- Defect testing and debugging are distinct processes
- Verification and validation concerned with establishing existence of defects in program
- Debugging concerned with locating and repairing these errors
- Debugging involves formulating hypotheses about program behavior, then testing these hypotheses to find system error(s)

14 Debugging Process
[Figure: the debugging process]

15 V & V Planning
- Careful planning required to get most out of testing and inspection processes
- Start early in development process
- Identify balance between static verification and testing
- Test planning is about defining standards for testing process rather than describing product tests

16 V-Model of Development

17 Structure of Software Test Plan
- Testing process
- Requirements traceability
- Tested items
- Testing schedule
- Test recording procedures
- Hardware and software requirements
- Constraints

18 Software Inspections
- People examine representation of system to find anomalies and defects
- May be applied to any representation of system: requirements, design, test data . . .
- Do not require execution of system, so may be used before implementation
- Very effective technique for discovering errors

19 Inspection Success
- Many different defects may be discovered in single inspection; in testing, one defect may mask another, so several executions are required
- Inspections reuse domain and programming knowledge, so reviewers are likely to have seen types of error that commonly arise

20 Inspections and Testing
- Inspections and testing are complementary, not opposing, verification techniques
- Both should be used during V & V process
- Inspections can check conformance with specification, but not conformance with customer’s real requirements
- Inspections cannot check non-functional characteristics: performance, usability, etc.

21 Program Inspections
- Formalized approach to document reviews
- Intended explicitly for defect DETECTION (not correction)
- Defects may be logical errors, anomalies in code that might indicate erroneous condition (e.g., uninitialized variable), or non-compliance with standards

22 Inspection Pre-Conditions
- Precise specification must be available
- Team members must be familiar with organization standards
- Syntactically correct code must be available
- Error checklist should be prepared
- Management must accept that inspection will increase costs early in software process
- Management must not use inspections for staff evaluations; i.e., finding errors does not necessarily mean programmer is BAD

23 Inspection Process

24 Inspection Procedure
- System overview presented to inspection team
- Code and associated documents distributed to inspection team in advance
- Inspection takes place and discovered errors noted
- Modifications made to repair discovered errors
- Re-inspection may or may not be required

25 Inspection Teams
Made up of at least 4 members:
- Author of code being inspected
- Inspector who finds errors, omissions and inconsistencies
- Reader who reads code to team
- Moderator who chairs meeting and notes discovered errors
Other roles: scribe, chief moderator

26 Inspection Checklists
- Checklist of common errors should be used to drive inspection
- Error checklist is programming-language dependent: ‘weaker’ type checking needs larger checklist
- Examples: initialization, constant naming, loop termination, array bounds, etc.

27 Inspection Checks (1)
Data faults (illustrated in the sketch below):
- Are all program variables initialized before their values are used?
- Have all constants been named?
- Should the lower bound of arrays be 0, 1 or something else?
- Should the upper bound of arrays be equal to the size of the array or (size – 1)?
- If character strings are used, is a delimiter explicitly assigned?
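A minimal, deliberately faulty C fragment (hypothetical, written for this transcript rather than taken from the slides): an inspector applying the data-fault checks above should flag every commented line.

    #include <stdio.h>

    #define TABLE_SIZE 10            /* constant is named: passes that check */

    int main(void)
    {
        int table[TABLE_SIZE];
        int sum;                     /* FAULT: never initialized before use  */
        int i;

        /* FAULT: <= runs one element past the end; the upper bound should
           be (size - 1), i.e. the condition should be i < TABLE_SIZE */
        for (i = 0; i <= TABLE_SIZE; i++)
            sum += table[i];         /* FAULT: table elements are also unset */

        printf("%d\n", sum);
        return 0;
    }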

28 Inspection Checks (2)
Control faults (see the sketch below):
- For each conditional statement, is the condition correct?
- Is each loop certain to terminate?
- Are compound statements correctly bracketed?
- In case statements, are all possible cases accounted for?
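Another hypothetical fragment, seeded with the control faults above; both defects compile without complaint, which is exactly why checklist-driven reading is needed.

    #include <stdio.h>

    int main(void)
    {
        int i = 0;

        /* FAULT: "is each loop certain to terminate?"
           i is never incremented, so this loop spins forever */
        while (i < 3) {
            printf("pass %d\n", i);
            /* i++;  <-- the forgotten statement */
        }

        /* FAULT: "are compound statements correctly bracketed?"
           The indentation claims both calls are guarded; only the first is */
        if (i == 3)
            printf("loop finished\n");
            printf("cleaning up\n");   /* executes unconditionally */

        return 0;
    }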

29 Inspection Checks (3)
Input/output faults:
- Are all input variables used?
- Are all output variables assigned a value before they are output?
Interface faults (see the sketch below):
- Do all function and procedure calls have the correct number of parameters?
- Do formal and actual parameter types match?
- Are the parameters in the right order?
- If components access shared memory (globals), do they have the same model of the shared memory structure?
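Interface faults are easiest to miss in old-style (K&R) C, where the compiler has no prototype to check calls against. This hypothetical fragment (the newest compilers may reject the old-style definition) previews the LINT example on slide 36.

    #include <stdio.h>

    /* Old-style definition: no prototype, so the compiler cannot
       check the number or types of arguments at call sites. */
    void report(code, label)
    int code;
    char *label;
    {
        printf("%s: %d\n", label, code);
    }

    int main(void)
    {
        /* FAULT: wrong number of parameters (label is missing), and the
           actual type (char *) does not match the formal type (int) */
        report("status");
        return 0;
    }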

30 Inspection Checks (4)
Storage management faults (see the sketch below):
- If a linked structure is modified, have all links been correctly reassigned?
- If dynamic storage is used, has space been allocated correctly?
- Is space explicitly deallocated after it is no longer required?
Exception management faults:
- Have all possible error conditions been taken into account?
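By contrast, this small sketch (names are ours, not the slide's) is written to pass the storage-management checks: the link is reassigned before the node is freed, allocation is verified, and space is explicitly released.

    #include <stdlib.h>

    struct node {
        int value;
        struct node *next;
    };

    /* Remove the node after 'prev' from a singly linked list. */
    static void remove_after(struct node *prev)
    {
        struct node *victim = prev->next;
        if (victim == NULL)
            return;
        prev->next = victim->next;   /* reassign the link BEFORE freeing */
        free(victim);                /* explicit deallocation once unused */
    }

    int main(void)
    {
        struct node *a = malloc(sizeof *a);
        struct node *b = malloc(sizeof *b);
        if (a == NULL || b == NULL) {  /* "has space been allocated correctly?" */
            free(a);                   /* free(NULL) is harmless */
            free(b);
            return 1;
        }
        a->value = 1; a->next = b;
        b->value = 2; b->next = NULL;
        remove_after(a);               /* frees b; a->next becomes NULL */
        free(a);
        return 0;
    }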

31 Inspection Rate
Estimating cost of inspection for 500 lines of code:
- 500 statements/hour during overview = 1 hr × 4 people = 4 person-hours
- 125 source statements/hour during individual preparation = 4 hrs × 4 people = 16 person-hours
- ~100 statements/hour can be inspected in meeting with 4 people in team = ~5 hrs × 4 people = ~20 person-hours
- Inspecting 500 lines thus takes ~4 + 16 + 20 = ~40 person-hours; inspection is therefore expensive process
- Estimate programmer salary $80K / 2K hr ≈ $40/hr; multiply by 2 for extended costs = $80/hr
- Therefore cost of 40 person-hours of effort ≈ $3,200
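The same arithmetic as a tiny C program; the rates and salary figures are the slide's estimates, and the variable names are ours.

    #include <stdio.h>

    int main(void)
    {
        const double loc           = 500.0;  /* statements to inspect         */
        const int    team          = 4;      /* people on the inspection team */
        const double overview_rate = 500.0;  /* statements/hour, overview     */
        const double prep_rate     = 125.0;  /* statements/hour, preparation  */
        const double meeting_rate  = 100.0;  /* statements/hour, meeting      */
        const double loaded_cost   = 80.0;   /* $/hr: ~$40 salary, doubled    */

        double hours = loc / overview_rate   /* 1 hour  */
                     + loc / prep_rate       /* 4 hours */
                     + loc / meeting_rate;   /* 5 hours */
        double person_hours = hours * team;  /* 40 person-hours */

        printf("%.0f person-hours, ~$%.0f\n",
               person_hours, person_hours * loaded_cost);   /* 40, $3200 */
        return 0;
    }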

32 Automated Static Analysis
- Static analyzers = software for source-text processing
- Parse program text, try to discover potentially erroneous conditions, report to V & V team
- Effective aid to inspections: supplement to, but not replacement for, inspections

33 Static Analysis Checks

34 Stages of Static Analysis (1)
- Control flow analysis: checks for loops with multiple exit or entry points, finds unreachable code, etc.
- Data use analysis: detects uninitialized variables, variables written twice without intervening assignment, variables which are declared but never used, etc.
- Interface analysis: checks consistency of routine and procedure declarations and their use
A fragment exhibiting several of these anomalies appears below.
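A hypothetical fragment containing anomalies from the first two stages; a tool in the lint family flags each commented line without ever executing the code.

    /* Deliberately anomalous code for static analysis practice. */
    int scale(int key)
    {
        int result;            /* data use: may be read before being set   */
        int unused;            /* data use: declared but never used        */

        if (key < 0)
            return result;     /* reads 'result' while still uninitialized */

        result = key * 2;
        result = key * 3;      /* data use: written twice without an
                                  intervening read                         */
        return result;

        key = 0;               /* control flow: unreachable code           */
    }

    int main(void)
    {
        return scale(2);
    }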

35 Stages of Static Analysis (2)
- Information flow analysis: identifies dependencies of output variables; does not detect anomalies itself but highlights information for code inspection or review
- Path analysis: identifies paths through program and sets out statements executed in each path; again, potentially useful in review process
- Both these stages generate vast amounts of information and must be used with care

36 LINT Static Analysis

    138% more lint_ex.c
    #include <stdio.h>
    printarray (Anarray)
    int Anarray;
    {
        printf("%d", Anarray);
    }
    main ()
    {
        int Anarray[5]; int i; char c;
        printarray (Anarray, i, c);
        printarray (Anarray);
    }
    139% cc lint_ex.c
    140% lint lint_ex.c
    lint_ex.c(10): warning: c may be used before set
    lint_ex.c(10): warning: i may be used before set
    printarray: variable # of args.  lint_ex.c(4) :: lint_ex.c(10)
    printarray, arg. 1 used inconsistently  lint_ex.c(4) :: lint_ex.c(10)
    printarray, arg. 1 used inconsistently  lint_ex.c(4) :: lint_ex.c(11)
    printf returns value which is always ignored

37 Use of Static Analysis
- Particularly valuable for language such as C: weak typing means many errors go undetected by compiler
- Less cost-effective for languages like Java: strong type checking can detect many errors during compilation

38 Cleanroom Software Development
- Name derived from 'Cleanroom' process in semiconductor fabrication
- Philosophy is defect avoidance rather than defect removal
- Software development process based on:
  - Incremental development
  - Formal specification
  - Static verification using correctness arguments
  - Statistical testing to determine program reliability

39 Cleanroom Process
[Figure: the Cleanroom process]

40 Cleanroom Process Characteristics
- Formal specification using state-transition model
- Incremental development
- Structured programming: limited control and abstraction constructs are used
- Static verification using rigorous inspections
- Statistical testing of system (covered in Ch. 21, but not in this course)

41 Incremental Development

42 Formal Specification and Inspections
- State-based model = system specification
- Inspection process checks program against this model
- Programming approach defined so that correspondence between model and system is clear (see the sketch below)
- Mathematical arguments (not proofs) are used to increase confidence in inspection process
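A minimal sketch of the idea, not Cleanroom's own notation: the states, events, and table below are invented for illustration, but they show how a state-transition specification can map one-for-one onto code so that inspectors can compare program and model directly.

    #include <stdio.h>

    enum state { IDLE, RUNNING, DONE };
    enum event { START, FINISH, RESET };

    /* The transition table IS the specification: entry [s][e] answers
       "in state s, when event e occurs, what is the next state?" */
    static const enum state next[3][3] = {
        /*             START     FINISH   RESET */
        /* IDLE    */ { RUNNING, IDLE,    IDLE },
        /* RUNNING */ { RUNNING, DONE,    IDLE },
        /* DONE    */ { DONE,    DONE,    IDLE },
    };

    int main(void)
    {
        enum state s = IDLE;
        enum event script[] = { START, FINISH, RESET };

        for (unsigned i = 0; i < sizeof script / sizeof script[0]; i++) {
            s = next[s][script[i]];          /* step the model        */
            printf("state = %d\n", (int)s);  /* RUNNING, DONE, IDLE   */
        }
        return 0;
    }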

43 Cleanroom Process Teams
- Specification team: develops and maintains system specification
- Development team: develops and verifies software; software NOT executed or even compiled during this process
- Certification team: develops set of statistical tests to exercise software after development; reliability growth models used to determine when reliability is acceptable

44 Cleanroom Process Evaluation
- Results at IBM were impressive: few discovered faults in delivered systems
- Independent assessment: process no more expensive than other approaches; fewer errors than in ‘traditional’ development process
- Not clear how this approach can be transferred to environment with less skilled or less highly motivated engineers

45 Key Points (1)
- Verification and validation are not same thing: verification shows conformance with specification; validation shows that program meets customer’s needs
- Test plans should be drawn up to guide testing process
- Static verification techniques involve examination and analysis of program for error detection

46 Key Points (2)
- Program inspections are very effective in discovering errors: program code is checked by small team to locate software faults
- Static analysis tools can discover program anomalies which may be indication of faults in code
- Cleanroom development process depends on incremental development, static verification and statistical testing

47 Homework
Required by Wed 17 Nov 2004, for 32 points:
- 22.1 – 22.5 (@4)
- 22.9 & (think hard)
Optional by Mon 29 Nov 2004:
- For up to 25 extra points, write detailed answers for any or all of 22.6, 22.7, 22.8

48 DISCUSSION

