1
Software Verification
Introduction & The Model-Driven Test Design Process
2
Testing in the 21st Century
Today’s software market:
- is much bigger
- is more competitive
- has more users
Embedded control applications (airplanes, air traffic control, spaceships, watches, ovens, remote controllers, PDAs, memory seats, DVD players, garage door openers, cell phones) increase pressure on testers.
Programmers must unit test – with no training, education or tools!
Tests are key to functional requirements – but who builds those tests?
3
Cost of Testing
You’re going to spend at least half of your development budget on testing, whether you want to or not. In the real world, testing is the principal post-design activity. Restricting early testing usually increases cost. Extensive hardware-software integration requires more testing.
4
Why Test?
If you don’t start planning for each test when the functional requirements are formed, you’ll never know why you’re conducting the test. What fact is each test trying to verify?
Program Managers often say: “Testing is too expensive.” Not testing is even more expensive.
5
[Diagram: testing as event generation, instrumentation, and event evaluation. Generated events such as send(42) are captured by instrumentation and evaluated against a specification.]
6
Test Design in Context
Test Design is the process of designing input values that will effectively test software. Test design is one of several activities for testing software. It is the most mathematical and most technically challenging of them.
7
The Sieve program
Is this program correct?

    int P[101];
    int V=1; int N=1; int I;
    main() {
        while (N<101) {
            for (I=1; I<=N; I++)
                if (P[I]=0) { P[I]=V++; N++; }
                else if (V%P[I]==0) I=N+1;
                else I++;
        }
    }
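The answer is no: among other problems, `if (P[I]=0)` uses assignment where the comparison `==` was almost certainly intended (in C the assignment compiles and evaluates to 0, so the branch never runs). What the program appears to intend, collecting primes into the array, can be sketched in Java; this is an assumed reading of the intent, using plain trial division rather than the original's algorithm:

```java
import java.util.ArrayList;
import java.util.List;

public class Primes {
    // What the Sieve program appears to intend (an assumption):
    // collect the first `count` primes, here by simple trial division.
    public static List<Integer> firstPrimes(int count) {
        List<Integer> primes = new ArrayList<>();
        int candidate = 2;
        while (primes.size() < count) {
            boolean divisible = false;
            for (int p : primes) {
                if (p * p > candidate) break;  // no prime factor exceeds sqrt(candidate)
                if (candidate % p == 0) { divisible = true; break; }
            }
            if (!divisible) primes.add(candidate);
            candidate++;
        }
        return primes;
    }

    public static void main(String[] args) {
        System.out.println(firstPrimes(10)); // [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    }
}
```

Note that Java rejects `if (P[I]=0)` at compile time because an `if` condition must be boolean, which removes this whole class of fault.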
8
Types of Test Activities
Testing can be broken up into four general types of activities:
1. Test Design — (a) criteria-based, (b) human-based
2. Test Automation
3. Test Execution
4. Test Evaluation
Each type of activity requires different skills, background knowledge, education and training.
9
1. Test Design – (a) Criteria-Based
Design test values to satisfy coverage criteria or other engineering goals. This is the most technical job in software testing. Test design is analogous to software architecture on the development side.
10
1. Test Design – (b) Human-Based
Design test values based on domain knowledge of the program and human knowledge of testing.
Requires knowledge of the domain, testing, and user interfaces; requires almost no traditional CS.
A background in the domain of the software is essential.
An empirical background is very helpful (biology, psychology, …).
A logic background is very helpful (law, philosophy, math, …).
11
2. Test Automation
Embed test values into executable scripts. This is slightly less technical and requires knowledge of programming, though it is fairly straightforward programming – small pieces and simple algorithms. Programming is out of reach for many domain experts.
12
What is JUnit? An open source Java testing framework used to write and run repeatable automated tests (junit.org). JUnit features include:
- Assertions for testing expected results
- Test fixtures for sharing common test data
- Test suites for easily organizing and running tests
- Graphical and textual test runners
JUnit is widely used in industry. JUnit tests can be run as standalone Java programs (from the command line) or within an IDE such as Eclipse.
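A minimal sketch of the kind of test JUnit supports. This assumes a JUnit 4 jar on the classpath; `Calculator` and the test names are invented for illustration:

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// A tiny class under test plus a JUnit 4 test class for it.
class Calculator {
    static int add(int a, int b) { return a + b; }
}

public class CalculatorTest {
    @Test
    public void addsTwoPositiveNumbers() {
        // assertEquals(expected, actual) fails the test on mismatch
        assertEquals(5, Calculator.add(2, 3));
    }

    @Test
    public void addsANegativeNumber() {
        assertEquals(-1, Calculator.add(2, -3));
    }
}
```

A test runner (command line or IDE) discovers the `@Test` methods, runs each one, and reports any assertion failures.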
13
3. Test Execution 4. Test Evaluation
3. Test Execution: Run tests on the software and record the results. This is easy – and trivial if the tests are well automated.
4. Test Evaluation: Evaluate the results of testing and report to developers. This is much harder than it may seem.
14
Types of Test Activities – Summary
1a. Design (criteria-based): design test values to satisfy engineering goals; requires knowledge of discrete math, programming and testing.
1b. Design (human-based): design test values from domain knowledge and intuition; requires knowledge of the domain, UI and testing.
2. Automation: embed test values into executable scripts; requires knowledge of scripting.
3. Execution: run tests on the software and record the results; requires very little knowledge.
4. Evaluation: evaluate the results of testing and report to developers; requires domain knowledge.
These four general test activities are quite different. It is a poor use of resources to use people inappropriately.
15
Model-Driven Test Design
[Diagram: the model-driven test design process spans two levels. At the design abstraction level, a software artifact is abstracted into a model/structure, from which test requirements and then refined requirements / test specs are derived. At the implementation abstraction level, these become test cases, test scripts, input values, and finally pass/fail test results.]
16
Model-Driven Test Design – Steps
[Diagram: the same process annotated with steps. Domain analysis of the software artifact yields the model/structure; a coverage criterion generates test requirements, which analysis refines into refined requirements / test specs; test cases are built by adding prefix, postfix and expected values, then automated into test scripts, executed to produce pass/fail test results, and evaluated, with feedback flowing back into the model.]
17
Model-Driven Test Design – Activities
[Diagram: the same process partitioned by activity. Test Design covers the model/structure, test requirements, and refined requirements / test specs; Test Automation turns test cases into test scripts with input values; Test Execution produces pass/fail test results; Test Evaluation interprets them.]
18
How to cover the executions?
    IF (A>1) & (B=0) THEN
        X = X/A;
    END;
    IF (A=2) | (X>1) THEN
        X = X+1;
    END;

Choose values for A, B, X. The value of X may change, depending on A and B. What do we want to cover? Paths? Statements? Conditions?
19
Execute every statement at least once
By choosing A=2, B=0, X=3, every statement is executed. But the cases where the conditions are false are never checked!

    IF (A>1) & (B=0) THEN
        X = X/A;
    END;
    IF (A=2) | (X>1) THEN
        X = X+1;
    END;
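The fragment above can be translated into a runnable Java sketch to see this concretely (the method name and the use of short-circuit operators are assumptions, since the original is pseudocode):

```java
public class CoverageDemo {
    // Direct translation of the slide's two-IF fragment
    // (integer division, as in the original).
    static int fragment(int a, int b, int x) {
        if (a > 1 && b == 0) {
            x = x / a;
        }
        if (a == 2 || x > 1) {
            x = x + 1;
        }
        return x;
    }

    public static void main(String[] args) {
        // A=2, B=0, X=3 makes both conditions true: every statement runs.
        System.out.println(fragment(2, 0, 3)); // 3/2 = 1, then 1+1 = 2
        // But the false branches are never exercised by that one test;
        // e.g. A=1 skips both bodies and leaves x unchanged.
        System.out.println(fragment(1, 0, 1)); // 1
    }
}
```

One test can achieve statement coverage here, yet branch coverage needs tests where each condition is also false.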
20
Important Terms
21
Validation & Verification
Validation : The process of evaluating software at the end of software development to ensure compliance with intended usage Verification : The process of determining whether the products of a given phase of the software development process fulfill the requirements established during the previous phase
22
According to Boehm: Verification means “we are building the product right.” Validation means “we are building the right product.”
23
Verification or validation
Example: elevator response Unverifiable (but validatable) spec: ... if a user presses a request button at floor i, an available elevator must arrive at floor i soon... Verifiable spec: ... if a user presses a request button at floor i, an available elevator must arrive at floor i within 30 seconds...
24
Software Faults, Errors & Failures
Software Fault : A static defect in the software Software Failure : External, incorrect behavior with respect to the requirements or other description of the expected behavior Software Error : An incorrect internal state that is the manifestation of some fault Faults in software are equivalent to design mistakes in hardware. They were there at the beginning and do not “appear” when a part wears out.
25
Testing & Debugging Testing : Finding inputs that cause the software to fail Debugging : The process of finding a fault given a failure
26
Static and Dynamic Testing
Static Testing : Testing without executing the program. This includes software inspections and some forms of analyses. Very effective at finding certain kinds of problems – especially “potential” faults, that is, problems that could lead to faults when the program is modified.
Dynamic Testing : Testing by executing the program with real inputs.
27
White-box and Black-box Testing
Black-box testing : Deriving tests from external descriptions of the software, including specifications, requirements, and design.
White-box testing : Deriving tests from the source code internals of the software, specifically including branches, individual conditions, and statements.
Model-based testing : Deriving tests from a model of the software (such as a UML diagram).
28
Stress Testing Tests that are at the limit of the software’s expected input domain Very large numeric values (or very small) Very long strings, empty strings Null references Very large files Many users making requests at the same time Invalid values
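The boundary values listed above can be turned into concrete test inputs. A minimal sketch, assuming a hypothetical function under test that sums an array:

```java
public class StressValues {
    // Hypothetical function under test: sums an array of ints into a long.
    static long sum(int[] values) {
        long total = 0;
        for (int v : values) total += v;
        return total;
    }

    public static void main(String[] args) {
        // Inputs at the edges of the expected input domain:
        System.out.println(sum(new int[0]));                       // empty input: 0
        System.out.println(sum(new int[]{Integer.MAX_VALUE, 1}));  // int-overflow boundary
        System.out.println(sum(new int[]{Integer.MIN_VALUE}));     // most negative value
    }
}
```

Here using `long` for the total makes the overflow case pass; summing into an `int` would silently wrap, which is exactly the kind of defect stress inputs expose.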
29
Top-Down and Bottom-Up Testing
Top-Down Testing : Test the main procedure, then go down through procedures it calls, and so on Bottom-Up Testing : Test the leaves in the tree (procedures that make no calls), and move up to the root. Each procedure is not tested until all of its children have been tested
30
Test Case
Test Case Values : The values that directly satisfy one test requirement.
Expected Results : The result that will be produced when executing the test if the program satisfies its intended behavior.
31
Search routine specification
procedure Search (Key : INTEGER; T : array 1..N of INTEGER; Found : BOOLEAN; L : 1..N);
Pre-condition:
  -- the array has at least one element
  1 <= N
Post-condition:
  -- the element is found and is referenced by L
  (Found and T(L) = Key)
  or
  -- the element is not in the array
  (not Found and not (exists i, 1 <= i <= N, T(i) = Key))
32
Search routine test cases
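The specification suggests obvious test cases: key present (first, middle, last position), key absent, a single-element array, and duplicates. A sketch in Java (0-based indexing rather than the spec's 1..N, and the routine returns the index instead of the out-parameters Found and L):

```java
public class Search {
    // Linear search matching the spec's post-condition:
    // index of key in t, or -1 when the key is not in the array.
    static int search(int[] t, int key) {
        for (int i = 0; i < t.length; i++)
            if (t[i] == key) return i;
        return -1;
    }

    public static void main(String[] args) {
        System.out.println(search(new int[]{17}, 17));             // 0  (single element, found)
        System.out.println(search(new int[]{17}, 0));              // -1 (single element, not found)
        System.out.println(search(new int[]{17, 29, 21, 23}, 21)); // 2  (middle position)
        System.out.println(search(new int[]{21, 23, 29, 21}, 21)); // 0  (duplicate: first match)
    }
}
```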
33
Testing levels
34
Testing Levels Based on Test Process Maturity
Level 0 : There’s no difference between testing and debugging Level 1 : The purpose of testing is to show correctness Level 2 : The purpose of testing is to show that the software doesn’t work Level 3 : The purpose of testing is not to prove anything specific, but to reduce the risk of using the software Level 4 : Testing is a mental discipline that helps all IT professionals develop higher quality software
35
Level 0 Thinking Testing is the same as debugging
Does not distinguish between incorrect behavior and mistakes in the program Does not help develop software that is reliable or safe
36
Level 1 Thinking Purpose is to show correctness
What do we know if there are no failures? Good software or bad tests?
Test engineers have no strict goal, no real stopping rule, and no formal test technique.
Test managers are powerless.
37
Level 2 Thinking Purpose is to show failures Looking for failures is a negative activity This describes most software companies.
38
Level 3 Thinking Testing can only show the presence of failures
Whenever we use software, we incur some risk Risk may be small and consequences unimportant Risk may be great and the consequences catastrophic Testers and developers work together to reduce risk
39
Level 4 Thinking Testing is only one way to increase quality
Test engineers can become technical leaders of the project
40
Testing Models
41
Testing at Different Levels
Acceptance testing: Is the software acceptable to the user?
System testing: Test the overall functionality of the system.
Integration testing: Test how modules interact with each other.
Module testing: Test each class, file, module or component.
Unit testing: Test each unit (method) individually.
[Diagram: main calls Class P; Class A with methods mA1(), mA2(); Class B with methods mB1(), mB2().]
This view obscures underlying similarities.
42
Criteria Based on Structures
Structures : Four ways to model software
1. Graphs
2. Logical Expressions, e.g. (not X or not Y) and A and B
3. Input Domain Characterization, e.g. A: {0, 1, >1}, B: {600, 700, 800}, C: {swe, cs, isa, infs}
4. Syntactic Structures, e.g. if (x > y) z = x - y; else z = 2 * x;
43
1. Graph Coverage – Structural
[Graph with nodes 1–7.] This graph may represent statements & branches, methods & calls, components & signals, or states and transitions.
Node (Statement): cover every node, e.g. 12567, 1357
Edge (Branch): cover every edge, e.g. 12567, 1357, 1257
Path: cover every path, e.g. 12567, 1257, 13567, 1357, …
44
Software Artifact : Java Method
Example Software Artifact : Java Method

    /**
     * Return index of node n at the
     * first position it appears,
     * -1 if it is not present
     */
    public int indexOf (Node n) {
        for (int i=0; i < path.size(); i++)
            if (path.get(i).equals(n))
                return i;
        return -1;
    }

[Control Flow Graph: node 1: i = 0; node 2: i < path.size(); node 3: the if test; node 4: return i; node 5: return -1.]
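Node coverage of this method's control flow graph requires executing both return statements. A self-contained sketch (the slide's `path` field is made a parameter here, and `Node` is replaced by `String`, so the example runs on its own):

```java
import java.util.Arrays;
import java.util.List;

public class PathIndex {
    // Self-contained version of the slide's indexOf method.
    static int indexOf(List<String> path, String n) {
        for (int i = 0; i < path.size(); i++)
            if (path.get(i).equals(n))
                return i;
        return -1;
    }

    public static void main(String[] args) {
        List<String> path = Arrays.asList("a", "b", "c");
        // Two tests together cover every node of the CFG:
        System.out.println(indexOf(path, "b"));  // 1  (covers node 4: return i)
        System.out.println(indexOf(path, "z"));  // -1 (covers node 5: return -1)
    }
}
```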
45
1. Graph - FSM Example Memory Seats in a Lexus ES 300
[State machine diagram. Transitions are labeled [Guard (safety constraint)] | Trigger (input). States: Driver 1 Configuration, Driver 2 Configuration, Modified Configuration, New Configuration.
- Driver 1 ↔ Driver 2: [Ignition = off] | Button1 or Button2
- Driver configuration → Modified Configuration: [Ignition = on] | sideMirrors(), lumbar(), seatBottom(), or seatBack()
- Modified Configuration → New Configuration: Ignition = off
- New Configuration → Driver 1 / Driver 2: [Ignition = on] | Reset AND Button1, or Reset AND Button2]
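The guard/trigger idea in this FSM can be sketched in code. The following is a speculative Java sketch (class, state, and method names are invented for illustration, not the actual Lexus design): a trigger changes state only when its guard on the ignition holds:

```java
public class MemorySeat {
    enum State { DRIVER1, DRIVER2, MODIFIED }

    State state = State.DRIVER1;
    boolean ignitionOn = false;

    // Trigger: Button2. Guard: ignition must be off.
    void pressButton2() {
        if (!ignitionOn && state == State.DRIVER1) state = State.DRIVER2;
    }

    // Trigger: seat-back adjustment. Guard: ignition must be on.
    void seatBack() {
        if (ignitionOn) state = State.MODIFIED;
    }

    public static void main(String[] args) {
        MemorySeat seat = new MemorySeat();
        seat.pressButton2();            // ignition off: guard satisfied, transition fires
        System.out.println(seat.state); // DRIVER2
        seat.seatBack();                // ignition off: guard blocks the transition
        System.out.println(seat.state); // still DRIVER2
    }
}
```

FSM-based tests then cover transitions: each test is an input sequence, and the expected result is the state reached.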
46
2. Logical Expressions, e.g. ( (a > b) or G ) and (x < y)
Logical expressions come from transitions, program decision statements, and software specifications.
47
2. Logic – Active Clause Coverage
( (a > b) or G ) and (x < y)

       (a > b)   G   (x < y)
    1     T      F      T
    2     F      F      T     with these values for G and (x < y), (a > b) determines the value of the predicate
    3     F      T      T
    4     F      F      T     with these values for (a > b) and (x < y), G determines the value of the predicate
    5     T      T      T
    6     T      T      F     with these values for (a > b) and G, (x < y) determines the value of the predicate
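"Determines" has a simple operational reading: a clause is active when flipping just that clause flips the predicate. A small Java sketch of this check for the slide's predicate (clause values passed in as booleans):

```java
public class ActiveClause {
    // The slide's predicate: ((a > b) or G) and (x < y),
    // with each clause already evaluated to a boolean.
    static boolean pred(boolean aGtB, boolean g, boolean xLtY) {
        return (aGtB || g) && xLtY;
    }

    // Does the clause (a > b) determine the predicate, given the other clauses?
    static boolean determines(boolean aGtB, boolean g, boolean xLtY) {
        return pred(aGtB, g, xLtY) != pred(!aGtB, g, xLtY);
    }

    public static void main(String[] args) {
        // With G = F and (x < y) = T, (a > b) is the active clause:
        System.out.println(determines(true, false, true)); // true
        // With G = T the predicate is true either way, so (a > b) is inactive:
        System.out.println(determines(true, true, true));  // false
    }
}
```

Active clause coverage picks test pairs (like rows 1 and 2 above) where the clause under test determines the predicate and takes both values.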
48
3. Input Domain Characterization
Describe the input domain of the software:
- Identify inputs, parameters, or other categorization
- Partition each input into finite sets of representative values
- Choose combinations of values
System level: Number of students {0, 1, >1}; Level of course {600, 700, 800}; Major {swe, cs, isa, infs}
Unit level: Parameters F (int X, int Y); possible values X: {<0, 0, 1, 2, >2}, Y: {10, 20, 30}; tests F(-5, 10), F(0, 20), F(1, 30), F(2, 10), F(5, 20)
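The five unit-level tests above use an "each choice" style of combination: every block value appears in at least one test, without taking the full 5 × 3 cross product. A sketch of generating such tests (the helper name is invented):

```java
import java.util.ArrayList;
import java.util.List;

public class InputPartitions {
    // "Each choice" combination: cycle through the representative values
    // of each parameter until every value has been used at least once.
    static List<String> eachChoice(int[] xs, int[] ys) {
        List<String> tests = new ArrayList<>();
        int n = Math.max(xs.length, ys.length);
        for (int i = 0; i < n; i++)
            tests.add("F(" + xs[i % xs.length] + ", " + ys[i % ys.length] + ")");
        return tests;
    }

    public static void main(String[] args) {
        // One representative per block of X ({<0, 0, 1, 2, >2}) and Y ({10, 20, 30}):
        System.out.println(eachChoice(new int[]{-5, 0, 1, 2, 5}, new int[]{10, 20, 30}));
        // [F(-5, 10), F(0, 20), F(1, 30), F(2, 10), F(5, 20)]
    }
}
```

This reproduces exactly the five tests listed on the slide; stronger criteria (pair-wise, all combinations) would generate more.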
49
4. Syntactic Structures
Based on a grammar, or other syntactic definition. The primary example is mutation testing:
- Induce small changes to the program: mutants
- Find tests that cause the mutant programs to fail: killing mutants
- Failure is defined as different output from the original program
- Check the output of useful tests on the original program
Example program:

    if (x > y)
        z = x - y;
    else
        z = 2 * x;

Mutants (each a separate copy of the program with one small change):

    if (x >= y)     -- relational operator replaced
    z = x + y;      -- arithmetic operator replaced
    z = x - m;      -- variable replaced
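A runnable sketch of killing the first mutant above. The original and the mutant are written as two methods here purely for illustration; real mutation tools generate separate program versions:

```java
public class MutationDemo {
    // Original program from the slide.
    static int original(int x, int y) {
        int z;
        if (x > y) z = x - y;
        else       z = 2 * x;
        return z;
    }

    // Mutant: relational operator replacement, > becomes >=.
    static int mutant(int x, int y) {
        int z;
        if (x >= y) z = x - y;
        else        z = 2 * x;
        return z;
    }

    public static void main(String[] args) {
        // x == y distinguishes the two programs: only the mutant takes the then-branch.
        System.out.println(original(3, 3)); // 6 (else-branch: 2 * 3)
        System.out.println(mutant(3, 3));   // 0 (then-branch: 3 - 3)
        // This test kills the mutant; any test with x > y or x < y would not.
    }
}
```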
50
Coverage Overview
Four structures for modeling software, and what they are applied to:
1. Graphs: applied to source, design, specs, use cases
2. Logic (DNF): applied to source, specs, FSMs
3. Input Space: applied to input models
4. Syntax: applied to source, integration