Software Verification and Validation
Objectives
- Testing's allegiance is to the customer, not to the development team
- Test ALL documents
- Intent: find defects
  - A good test is one with a high probability of finding an error
  - A successful test is one that uncovers an error
- Prevent defect migration (apply testing early in the life cycle)
- Develop and use test tools
- Use trained, skilled people for testing
People
- Creatively destructive
- Hunt for errors, not for the people who create them
- Trained for testing
- About 25% of development time is spent on testing, but teachers spend only 5% of their time teaching it
Software Verification and Validation
- Verification: Are we building the product right? The set of tasks that ensure the software correctly implements a specific function.
- Validation: Are we building the right product? A different set of tasks that ensure the software built is traceable to customer requirements.
Methods for Verification

|               | Formal proof                    | Inspections     | Walkthrough        | Buddy check        |
|---------------|---------------------------------|-----------------|--------------------|--------------------|
| Formality     | Formal                          | Formal          | Informal           | Very informal      |
| Presenter     | None                            | Not the author  | Anyone             | None               |
| # of people   | Team                            | 3-6             | Larger numbers     | 1 or 2             |
| Preparation   | Yes                             | Yes             | Presenter only     | None               |
| Data/report   | Yes (a proof)                   | Yes             | ?                  | No                 |
| Advantages    | Very effective                  | Effective       | Familiarizes many  | Inexpensive        |
| Disadvantages | Requires trained mathematicians | Short-term cost | Fewer errors found | Fewer errors found |
Verification: Correctness Proofs
- Testing uncovers errors; it cannot prove a program correct
- Manual correctness proofs
  - Use mathematical induction and predicate calculus
  - Feasible only on small programs
  - The proofs themselves can contain errors
- Automated correctness proofs
  - A macro compiler produces a symbolic representation of the software
  - Built on predicate calculus and AI theory
  - Limited to certain types of applications
Strategies and Techniques for Validation
- Well-defined activities
  - Low level: unit testing, integration testing
  - High level: usability testing, functional testing, system testing, acceptance testing
  - Regression testing: test unchanged code AGAIN after a change
- Testware
- Black box: tests derived from requirements and functional specs, with no knowledge of the structure or code
- White box: tests based on the internal code
Test Information Flow
(Flow diagram: the software configuration (specs, code) and the test configuration (plan, cases, expected results) feed into testing; testing produces test results, which evaluation compares against the expected results; errors go to debugging, which produces corrections; error data drives a reliability model that outputs a predicted reliability.)
TESTING STRATEGIES
Software Testing Steps
Unit Testing
- Evaluate the module interface (see the test sketch below)
  - Number of input parameters equals the number of arguments; types match; order is correct; are input-only arguments altered?; global variable definitions are consistent; constraints are passed along (e.g., the max size of an array)
- Logical data structures
  - Improper or inconsistent declarations, erroneous initialization, incorrect variable names, inconsistent data types, underflow, overflow, address exceptions, global data
- File structures
  - File attributes correct, OPEN statements correct, format spec matches the I/O statement, buffer size equals record size, files opened before use, EOF handled, I/O errors handled, textual errors in output
- White box and coverage
- Most common errors are in computations
  - Arithmetic precedence, mixed-mode operations, incorrect initialization, precision inaccuracy, incorrect symbolic representation of an expression
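A minimal sketch of such interface checks written as unit tests. The module under test, `average(values, max_size)`, and its constraint are hypothetical, invented here for illustration:

```python
import unittest

def average(values, max_size=100):
    """Hypothetical module under test: arithmetic mean of a list."""
    if not isinstance(values, list):
        raise TypeError("values must be a list")
    if len(values) > max_size:
        raise ValueError("constraint violated: len(values) > max_size")
    if not values:
        raise ValueError("cannot average an empty list")
    return sum(values) / len(values)

class TestAverageInterface(unittest.TestCase):
    def test_constraint_passed_to_module(self):
        # constraint check: the max size of the array is honored
        with self.assertRaises(ValueError):
            average([1, 2, 3], max_size=2)

    def test_input_only_args_not_altered(self):
        # an input-only argument must not be altered by the call
        data = [1, 2, 3]
        average(data)
        self.assertEqual(data, [1, 2, 3])

    def test_inconsistent_type_rejected(self):
        with self.assertRaises(TypeError):
            average("not a list")

if __name__ == "__main__":
    unittest.main()
```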
More Unit Testing
- Comparisons
  - Of different data types; logical operator or precedence errors; misplaced NOT; expecting equality in the presence of precision error; improper or non-existent loop termination
- Antibugging (see the sketch below)
  - Cleanly terminate processing or reroute when an error occurs (usually incorporated, but not usually tested)
- Error handling
  - The description is intelligible and provides enough information; the error noted equals the error encountered; error handling occurs BEFORE system intervention
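A small sketch of antibugging and intelligible error handling. The `read_config` helper, file name, and fallback are hypothetical, not from the slides:

```python
import sys

def read_config(path):
    try:
        with open(path) as f:
            return f.read()
    except OSError as exc:
        # error noted = error encountered: say what failed, and why
        print(f"config error: cannot read {path!r}: {exc}", file=sys.stderr)
        # reroute: fall back to defaults instead of letting the
        # system intervene with a raw traceback
        return ""

print(read_config("missing.cfg") or "<defaults>")
```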
Integration Testing
- Data can be lost across an interface
- One module can have an inadvertent effect on another
- Imprecision can accumulate
- Global variables
Testing at Different Levels
Bottom-up integration
- Uses drivers at many different levels
- Disadvantage: interface problems appear later in the process
Top-down integration
- Uses stubs
- Test the main logic first, then add modules and retest
- Disadvantage: planned lower-level modules may be impossible to write yet
Either way, the entire package still needs to be validated. (A sketch of stubs and drivers follows.)
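A minimal sketch of the stub/driver distinction; the `convert` and `fetch_rate` names are hypothetical. The stub stands in for an unwritten lower-level module (top-down); the driver is throwaway code that exercises a unit before its real caller exists (bottom-up):

```python
def fetch_rate_stub(currency):
    # stub: replaces a lower-level module that is not written yet,
    # returning a canned answer that is just enough for the caller
    return 1.0

def convert(amount, currency, fetch_rate=fetch_rate_stub):
    # upper-level logic under test first (top-down)
    return amount * fetch_rate(currency)

def driver_for_convert():
    # driver: throwaway code that exercises the unit directly
    assert convert(10, "EUR") == 10.0
    print("convert passed with stubbed rate")

driver_for_convert()
```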
Top-down Approach
Bottom-up Approach
Validation Testing
- Uses black-box tests to demonstrate conformity to requirements
- Deviations can rarely be fixed prior to scheduled completion
- Configuration review of all deliverables, to support maintenance
- Alpha testing: at the developer's site, performed by the customer
- Beta testing: at the customer's site; results are reported back to the developer
System Testing
The software is incorporated into a larger computer-based system.
- Recovery tests: cause the system to fail and verify that recovery is performed within the required time
- Security tests: the tester plays the role of someone who breaks in, attacks the system, and penetrates the database
- Stress tests: abnormal frequency, volume, or quantities
- Performance tests: timings and resource utilization
TESTING TECHNIQUES
Black Box Testing
- Interface testing
  - Unit interfaces and I/O interfaces
- Equivalence partitioning (see the sketch below)
  - Input classes: generate one test point for each input class
  - Output classes: generate one test point that produces a result in each output class
  - Subdivide classes into subclasses (the middle of the range and the two extremes)
  - Error-handling routines / exception output conditions (input beyond the accepted range)
- Boundary value analysis
  - Test at the boundaries of a range
- Functional testing
  - Functions within certain mathematical classes can be distinguished by their values on a small number of points
  - Example: f(x) = y if x > 0; y - 1 if x <= 0
  - Functions of more than one variable
- Random inputs
- Cause-effect graphing
- Comparison testing
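A sketch of equivalence partitioning combined with boundary value analysis. The `classify_age` routine and its 0..120 range are hypothetical, chosen only to give the partitions something concrete to exercise:

```python
VALID_MIN, VALID_MAX = 0, 120

def classify_age(age):
    if not VALID_MIN <= age <= VALID_MAX:
        raise ValueError("age out of accepted range")
    return "minor" if age < 18 else "adult"

# equivalence classes: below range | valid | above range,
# plus the middle and the two extremes of the valid range
test_points = {
    "below range (error class)": -1,
    "lower boundary": VALID_MIN,
    "middle of valid range": 60,
    "upper boundary": VALID_MAX,
    "above range (error class)": 121,
}

for label, age in test_points.items():
    try:
        print(f"{label}: {age} -> {classify_age(age)}")
    except ValueError as exc:
        print(f"{label}: {age} -> rejected ({exc})")
```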
White Box Testing
- Based on the program structure
- Coverage metrics for thoroughness
  - Statement coverage
  - Branch coverage (each condition both true and false)
  - Path coverage (every combination of true/false outcomes)
    - Impractical: too many paths
    - Eliminate infeasible paths; e.g., in
      if y < 0 then x = y - 1 else x = y + 1; if x > 0 then ...
      the then-then path is infeasible (see the runnable version below)
  - Missing paths ("special cases") can hinge on a single input data point (e.g., y = 0)
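The slide's two-if fragment, made runnable to show the infeasible path (and, as it happens, a second one the slide does not mention):

```python
# With y < 0 we get x = y - 1 < 0, so x > 0 can never hold on the same
# run: the then-then path is infeasible. For this fragment the
# else-else path is infeasible too, since y >= 0 forces x = y + 1 > 0.
def fragment(y):
    path = []
    if y < 0:
        x = y - 1
        path.append("then1")
    else:
        x = y + 1
        path.append("else1")
    if x > 0:
        path.append("then2")
    else:
        path.append("else2")
    return path

# two inputs already achieve full branch coverage:
print(fragment(-5))  # ['then1', 'else2']
print(fragment(5))   # ['else1', 'then2']
# but only 2 of the 4 then/else combinations are feasible paths
```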
More... Path Coverage
Coincidental correctness:
  read x; y = x + 2; write y    vs.    read x; y = x * 2; write y
If the input is 2, the two programs cannot be distinguished, so one test point per path is insufficient (demonstrated below).
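The slide's coincidental-correctness example, made runnable:

```python
def correct(x):
    return x + 2

def faulty(x):
    return x * 2

print(correct(2), faulty(2))   # 4 4 -> coincidentally correct at x = 2
print(correct(3), faulty(3))   # 5 6 -> a second point exposes the fault
```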
Paths
A loop executed up to 20 times yields on the order of 10^14 possible paths; at one test per millisecond, exhaustive path testing would take about 3,170 years (arithmetic checked below).
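A quick check of that arithmetic:

```python
paths = 10**14
seconds = paths / 1000                  # one test per millisecond
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:,.0f} years")            # ~3,171 years
```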
More White Box Testing
- Data flow coverage
- Mutation analysis (see the sketch below)
  - Create mutant programs
  - See whether each mutant gives output identical to the original for every test:
    - If yes, the mutant is live
    - If no, the mutant is killed
  - Display the set of live mutants; if all mutants are killed, confidence in the test set is high
  - Example mutations:
    - Replace one constant or variable with another
    - Replace one arithmetic operator with another
    - Similarly for relational and logical operators
    - Delete a statement
    - Change an increment
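A minimal mutation-analysis sketch, using a hypothetical `max_of_two` function and a single relational-operator mutation:

```python
def max_of_two(a, b):
    return a if a > b else b

def mutant(a, b):
    return a if a < b else b   # '>' replaced with '<'

def is_killed(tests):
    # the mutant is killed if any test distinguishes it from the original
    return any(max_of_two(a, b) != mutant(a, b) for a, b in tests)

weak_tests = [(3, 3)]             # equal inputs: the mutant survives
strong_tests = [(1, 2), (2, 1)]

print(is_killed(weak_tests))      # False -> mutant is live; tests too weak
print(is_killed(strong_tests))    # True  -> mutant killed
```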
Tools
- Static analyzers: need no actual inputs; like compilers, they check syntax, unreachable code, undefined references (a toy example follows)
- Code auditors
- Assertion processors
- Test file/data generators
- Test verifiers
- Test harnesses
- Output comparators
- Simulators
- Data flow analyzers
- Symbolic execution systems (verifiers)
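A toy static analyzer in the spirit of the first item, built on Python's standard `ast` module: without executing anything, it flags statements that follow a `return` and are therefore unreachable.

```python
import ast

SOURCE = """
def f(x):
    return x + 1
    print("never runs")   # unreachable
"""

tree = ast.parse(SOURCE)
for node in ast.walk(tree):
    if isinstance(node, ast.FunctionDef):
        body = node.body
        for i, stmt in enumerate(body[:-1]):
            if isinstance(stmt, ast.Return):
                nxt = body[i + 1]
                print(f"unreachable code in {node.name!r} at line {nxt.lineno}")
```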
Reliability
- Based on the error rate
- Based on internal characteristics of the program (complexity, number of operands, number of operators)
- Seed the software with known errors and compare the number of seeded errors detected with the number of actual errors detected; this evaluates the power of the tests (a sketch of the estimate follows)
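A sketch of the seeded-error estimate (the error-seeding model often attributed to Mills). It assumes, as a simplification, that tests detect seeded and real errors at the same rate, so s/S ≈ n/N and N ≈ n·S/s:

```python
def estimated_total_errors(seeded_total, seeded_found, real_found):
    if seeded_found == 0:
        raise ValueError("no seeded errors found; tests too weak to estimate")
    return real_found * seeded_total / seeded_found

# example: 10 errors seeded, tests find 8 of them plus 20 real errors
print(estimated_total_errors(10, 8, 20))   # -> 25.0 estimated real errors
```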