Presentation on theme: "Contents Introduction Requirements Engineering Project Management"— Presentation transcript:

1 Contents
Introduction, Requirements Engineering, Project Management, Software Design, Detailed Design and Coding, Quality Assurance, Maintenance.
(Lifecycle diagram: Requirements, Requirements Specification, Planning, Analysis, Design, Coding, Testing, Installation, Operation and Maintenance.)
PVK-HT03

2 Quality Assurance
Introduction, Testing, Testing Phases and Approaches, Black-box Testing, White-box Testing, System Testing.

3 What is Quality Assurance?
QA is the combination of planned and systematic activities to ensure the fulfillment of predefined quality standards.
- Constructive vs. analytic approaches to QA
- Qualitative vs. quantitative quality standards
- Measurement: derive qualitative factors from measurable quantitative factors
- Software metrics

4 Approaches to QA
Constructive approaches: use of methods, languages, and tools that ensure the fulfillment of some quality factors.
- Syntax-directed editors
- Type systems
- Transformational programming
- Coding guidelines
- ...
Analytic approaches: use of methods, languages, and tools to observe the current quality level.
- Inspections
- Static analysis tools (e.g., lint)
- Testing

5 Fault vs. Failure
A human error can lead to a fault, which can lead to a failure.
- Different types of faults call for different identification techniques and different testing techniques.
- Fault prevention and fault detection strategies should be based on the expected fault profile.

6 HP's Fault Classification
Fault origin (WHERE?): specification/requirements, design, code, environment/support, documentation, other.
Fault type (WHAT?), by origin:
- Specification/requirements: requirements or specifications, functionality, HW interface, SW interface, user interface, functional description
- Design: (inter-)process communications, data definition, module design, logic description, error checking, standards
- Code: logic, computation, data handling, module interface/implementation, standards
- Environment/support: test HW, test SW, integration SW, development tools
Fault mode (WHY?): missing, unclear, wrong, changed, better way.

7 Fault Profile of a HP Division
See [Pfleeger 98].

8 Testing Phases
Pre-implementation → unit test → integration test → function test → performance test → acceptance test → installation test → SYSTEM IN USE!
Artifacts along the way: component code → tested component → integrated modules → functioning system → verified software → accepted, validated system.
Each phase tests against its own baseline: design specifications (unit and integration test), system functional requirements (function test), other requirements (performance test), customer specification (acceptance test), user environment (installation test).

9 Pre-Implementation Testing
Inspections: evaluation of documents and code prior to technical review or testing.
Walkthroughs: done in teams; examine source code/detailed design.
Reviews: more informal; often done by document owners.
Advantages: effective, high learning effect, distributes system knowledge.
Disadvantages: expensive.

10 Testing vs. “Proving” Correctness
Verification: check the design/code against the requirements. “Are we building the product right?”
Validation: check the product against the expectations of the customer. “Are we building the right product?”
Testing can neither prove that a program is error-free nor that it is correct.
“Testing is the process in which a (probably unfinished) program is executed with the goal to find errors.” [Myers 76]
“Testing can only show the presence of errors, never their absence.” [Dijkstra 69]

11 Goals of Testing
Detect deviations from specifications: debugging, regression testing.
Establish confidence in software: operational testing.
Evaluate properties of software: reliability, performance, memory use/leakage, security, usability.

12 Principles of Software Testing*
- Complete testing is not possible
- Testing is creative and difficult
- A major objective of testing is defect detection
- Testing must be risk-based
- Testing must be planned
- Testing requires independence
* Hetzel, The Complete Guide to Software Testing

13 Test Cases and Test Data
A test case is not the same as test data.
Test data: inputs devised to test the system.
Test case: the situation to test, the inputs to test that situation, and the expected outputs.
Tests must be reproducible.
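The distinction above can be sketched in code. This is a minimal illustration with hypothetical field names, not a prescribed format: a test case bundles the situation, the inputs (the test data), and the expected output, whereas test data alone is only the inputs.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class TestCase:
    situation: str   # which situation is being tested
    inputs: dict     # the test data
    expected: Any    # the expected output

# One reproducible test case: same inputs, same expected output every run.
tc = TestCase("division by zero guard", {"y": 10, "n": 0}, "division by zero")
print(tc.situation, tc.inputs, tc.expected)
```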

14 Fundamental Steps of Software Testing
1. Understand requirements and specifications
2. Create the execution environment
3. Select test cases
4. Execute and evaluate test cases
5. Evaluate test progress

15 Unit Testing
Definition: tests the smallest individually executable code units.
Objective: find faults in the units; assure correct functional behavior of units.
By: usually programmers.
Tools: test driver/harness, coverage evaluator, automatic test generator.
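A test driver/harness, as mentioned in the tools list, can be sketched as follows. The unit under test (`add_contribution`) is a hypothetical example, not from the slides; the driver simply runs each case and reports failures.

```python
def add_contribution(total, amount):
    """Hypothetical unit under test: add a positive contribution to a running total."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    return total + amount

def run_tests():
    """Minimal test driver: execute each test case and count failures."""
    cases = [
        # (total, amount, expected)
        (0.0, 0.01, 0.01),   # smallest valid contribution
        (10.0, 2.5, 12.5),   # typical case
    ]
    failures = 0
    for total, amount, expected in cases:
        got = add_contribution(total, amount)
        if got != expected:
            failures += 1
            print(f"FAIL: add_contribution({total}, {amount}) = {got}, expected {expected}")
    # An invalid amount must raise an error.
    try:
        add_contribution(0.0, 0)
        failures += 1
        print("FAIL: amount 0 was accepted")
    except ValueError:
        pass
    return failures

if __name__ == "__main__":
    print(f"{run_tests()} failure(s)")
```

In practice a framework such as Python's `unittest` or `pytest` plays the driver role; the sketch only shows the mechanism.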

16 Test Methods
Functional testing (black-box): uses function specifications to develop test cases; typically used in system testing.
- Equivalence partitioning
- Border-case analysis
Structural testing (white-box, glass-box): uses code/detailed design to develop test cases; typically used in unit testing.
- Coverage-based testing
- Symbolic execution
- Data flow analysis
- ...
Over time: develop black-box test cases, develop white-box test cases, perform white-box testing, perform black-box testing.

17 Test Preparation
Exhaustive testing is prohibitive because of the combinatorial explosion of test cases, so representative test data (test cases) must be chosen.

for i := 1 to 100 do
  if a = b then X else Y;

i = 1: paths X, Y (2 tests); i = 2: XX, XY, YX, YY (4 tests); i = 3: XXX, XXY, ... (8 tests); ...
Summing the number of tests for i = 1..100 gives 2^101 − 2 > 2.5 × 10^30 tests. With 10^6 tests/sec this would take about 8 × 10^16 years.
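The arithmetic behind these numbers can be checked directly. This is only a sketch of the slide's calculation: the loop body is a two-way branch, and the total sums the test counts over i = 1..100.

```python
# Sum of 2**i for i = 1..100: total paths over all loop bounds in the table.
total_tests = 2 ** 101 - 2
tests_per_second = 10 ** 6
seconds_per_year = 365 * 24 * 3600
years = total_tests / (tests_per_second * seconds_per_year)
print(f"total tests: {total_tests:.2e}")   # ~2.5e+30
print(f"time needed: {years:.1e} years")   # ~8e+16 years
```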

18 How to Choose Test Data
Example 1: both paths must be tested!

if (x + y + z) / 3 = x then
  writeln('x, y, z are equal')
else
  writeln('x, y, z are unequal');

Test case 1: x = 1, y = 2, z = 3
Test case 2: x = y = z = 2

Example 2: how can I know there is a “path”?

if n = 0 then
  writeln('division by zero')
else
  x := y / n;
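Example 1 can be sketched in Python to show why testing both paths with the cases above is not enough: the mean of unequal values can still equal x, so the "equal" branch can be taken for unequal inputs. The function name `classify` is my own; the logic is the slide's.

```python
def classify(x, y, z):
    # Same test as the slide: does the mean of the three values equal x?
    if (x + y + z) / 3 == x:
        return "x, y, z are equal"
    return "x, y, z are unequal"

print(classify(2, 2, 2))  # equal, as expected (slide's test case 2)
print(classify(1, 2, 3))  # unequal, as expected (slide's test case 1)
print(classify(2, 1, 3))  # reports "equal" although the values differ!
```

Both given test cases pass, yet the fault (comparing the mean only against x) survives; good test data must also probe inputs like (2, 1, 3).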

19 Black-box Testing
Test generation without knowledge of the software structure; also called specification-based or functional testing.
- Equivalence partitioning
- Boundary-value analysis
- Error guessing
- ...

20 Equivalence Partitioning
Input and output data can be grouped into classes in which all members behave in a comparable way.
1. Define the classes.
2. Choose representatives: a typical element, borderline cases, inputs causing anomalous behaviour, and outputs which reveal the presence of faults.
(Diagram: input data classes → system → output data classes.)
Example input classes: class 1: x < 25; class 2: 25 <= x <= 100; class 3: x > 100.

21 Equivalence Partitioning Examples
A module keeps a running total of the amount of money received.
Input: AmountOfContribution, specified as a value from $0.01 to $99,999.99.
3 equivalence classes:
- values from $0.01 to $99,999.99 (valid)
- values less than $0.01 (invalid)
- values greater than $99,999.99 (invalid)
A module prints an appropriate response letter.
Input: MemberStatus, specified as having a value from {Regular, Student, Retiree, StudioClub}.
2 equivalence classes:
- values in {Regular, Student, Retiree, StudioClub} (valid)
- all other input (invalid)
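The first example can be sketched in code. The validity predicate and the chosen representatives below are illustrative assumptions; the point is that one representative per equivalence class suffices if the partitioning holds.

```python
def valid_contribution(amount):
    """Valid iff $0.01 <= amount <= $99,999.99 (the slide's specification)."""
    return 0.01 <= amount <= 99999.99

# One representative per equivalence class (hypothetical choices).
representatives = {
    "valid":         50.00,      # typical element of the valid class
    "below_minimum": 0.00,       # invalid: less than $0.01
    "above_maximum": 100000.00,  # invalid: greater than $99,999.99
}
for name, amount in representatives.items():
    print(name, valid_contribution(amount))
```

Boundary-value analysis would additionally pick the borderline cases $0.01 and $99,999.99 themselves.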

22 White-box Testing
Methods based on the internal structure of the code.
Approaches:
- Coverage-based testing: statement coverage, branch coverage, path coverage, data-flow coverage
- Symbolic execution
- Data flow analysis
- ...
(Example flow graphs: "if a then b else c; d" and "while a do b; c".)

23 Statement Coverage
Every statement is executed at least once in some test.
(Flow graph with nodes 1–8.) 2 test cases: 12467; 13567.

24 Branch Coverage
For every decision point in the graph, each branch is chosen at least once.
(Same flow graph.) 2 test cases: 12467; 1358.

25 Condition Coverage
Test all combinations of conditions in boolean expressions at least once.
Why in boolean expressions only?

if X or not (Y and Z) then b;
c := (d + e * f - g) div op(h, i, j);
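Enumerating all condition combinations for the boolean expression above can be sketched directly; with three conditions there are 2^3 = 8 combinations to cover.

```python
from itertools import product

def expr(X, Y, Z):
    # The slide's boolean expression.
    return X or not (Y and Z)

# Condition coverage: exercise every combination of the conditions.
for combo in product([False, True], repeat=3):
    print(combo, "->", expr(*combo))
```

Only one combination (X false, Y and Z both true) makes the expression false, which shows why branch coverage alone can leave most condition combinations untested.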

26 Path Coverage
Assure that every distinct path in the control-flow graph is executed at least once in some test.
(Same flow graph with nodes 1–8.) 4 test cases: 12467; 1358; 1248; 13567.
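Path enumeration for the example can be sketched by depth-first search. The edge set below is my reconstruction from the four listed paths (nodes 7 and 8 are the exits); it is an assumption about the slide's figure, not given explicitly.

```python
# Edges reconstructed from the paths 12467, 1358, 1248, 13567.
edges = {1: [2, 3], 2: [4], 3: [5], 4: [6, 8], 5: [6, 8], 6: [7]}

def all_paths(node, path=(1,)):
    """Depth-first enumeration of all entry-to-exit paths."""
    if node not in edges:                 # no outgoing edges: an exit node
        return [path]
    paths = []
    for succ in edges[node]:
        paths += all_paths(succ, path + (succ,))
    return paths

paths = ["".join(map(str, p)) for p in all_paths(1)]
print(len(paths), "paths:", paths)
```

The search recovers exactly the four test cases of the slide; in real programs with loops the number of distinct paths explodes, which is why full path coverage is rarely achievable.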

29 Test Coverage Example
Statement coverage: 5/10 = 50%. Branch coverage: 2/6 = 33%. Path coverage: 1/4 = 25%.

30 Data-flow Testing
Def-use analysis: match variable definitions (assignments) with their uses.
Example: x = 5; if (x > 0) ... Does the assignment reach the use?
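Finding definitions and uses mechanically can be sketched with Python's `ast` module, which marks each variable occurrence with a `Store` (definition) or `Load` (use) context. The analyzed fragment extends the slide's example with one extra line so a def-use pair is visible.

```python
import ast

src = "x = 5\nif x > 0:\n    y = x + 1\n"
defs, uses = [], []
for node in ast.walk(ast.parse(src)):
    if isinstance(node, ast.Name):
        # Store context = definition (assignment target), Load context = use.
        (defs if isinstance(node.ctx, ast.Store) else uses).append(
            (node.id, node.lineno))
print("defs:", defs)   # x defined on line 1, y on line 3
print("uses:", uses)   # x used on lines 2 and 3
```

A data-flow testing tool would then check, for each def-use pair, whether some test executes a path from the definition to the use.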

31 Data Flow Coverage
All-uses coverage.
(Flow graph with definitions x := 1, x := 2, x := 3, y := 2, r := 4 and uses z := x + y, z := 2*r, z := 2*x − y.)
The red path covers the defs y := 2; r := 4; x := 1. The blue path covers y := 2; x := 3, but does not cover x := 2.

32 Coverage-based Testing
Advantages:
- Systematic way to develop test cases
- Measurable results (the coverage)
- Extensive tool support: flow graph generators, test data generators, bookkeeping, documentation support
Disadvantages:
- Code must be available
- Does not (yet) work well for data-driven programs

33 Integration Testing
Definition: testing two or more units or components.
Objectives: find interface errors; test the functionality of combined units; look for problems with functional threads.
By: developers or testing group.
Tools: interface analysis; call pairs.
Issues: strategy for combining units.

34 Integration Testing: Build Order
How to integrate and test the system:
- Top-down => stubs; more thorough testing of top-level units
- Bottom-up => drivers; more thorough testing of bottom-level units
- Critical units first => stubs and drivers
- Functionality-oriented (threads)
(Bottom-up example: hierarchy A over B, C, D; B over E, F; D over G. Test E, test F, test G; then test B with E and F; test C; test D with G; finally test all.)
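The role of a stub in top-down integration can be sketched as follows. Units `unit_a` and the stubbed-out lower unit are hypothetical examples: the top-level unit is tested against a canned replacement until the real lower-level code is ready.

```python
def stub_lookup(order_id):
    """Stub for a not-yet-integrated lower unit: returns a canned value."""
    return {"order_id": order_id, "status": "shipped"}

def unit_a(order_id, lookup=stub_lookup):
    """Top-level unit under test; the dependency is injected so the stub
    can later be swapped for the real lower-level unit."""
    order = lookup(order_id)
    return f"Order {order['order_id']}: {order['status']}"

print(unit_a(42))   # exercises unit A's logic without the real lower unit
```

In bottom-up integration the roles reverse: the lower units are real and a driver (like the harness on slide 15) calls them.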

35 System Testing
Definition: test the functionality, performance, reliability, and security of the entire system.
By: separate test group.
Objective: find errors in the overall system behavior; establish confidence in system functionality; validate that the system achieves its desired non-functional attributes.
Tools: user simulator, load simulator.

36 Realities of System Testing
Available time for testing is short:
- Compressing development risks introducing problems
- Compressing testing risks missing critical problems
Testers want to start testing early, but system testing requires an available system, and developers resist testing until the system is “ready”.
To optimize the use of existing resources, use risk analysis.

37 Acceptance Testing
Definition: operate the system in the user environment with standard user input scenarios.
By: end users.
Objective: evaluate whether the system meets the customer's criteria; determine whether the customer will accept the system.
Tools: user simulator; customer test scripts and logs from operation of the previous system.

38 Regression Testing
Definition: testing of modified versions of a previously validated system.
By: system or regression test group.
Objective: assure that changes to the system have not introduced new errors.
Tools: regression test base; capture/replay.
Issues: minimal regression suite; test prioritization.

39 Test Automation
Automation has several meanings:
- Test generation: produce test cases by processing specifications, code, or models.
- Test execution: run large numbers of test cases/suites without human intervention.
- Test management: log test cases and results; map tests to requirements and functionality; track test progress and completeness.

40 Issues of Test Automation
Does the payoff from test automation justify the expense and effort of automating test execution?
- Learning to use the automation tool is non-trivial
- Testers become programmers
- All tests, including automated tests, have a finite lifetime
Complete automated execution implies putting the system into the proper state, supplying the inputs, running the test case, collecting the result, and verifying the result.

41 Best Uses of Automated Test Execution
- Load/stress tests: nearly impossible to have 1000 human testers simultaneously accessing a database.
- Regression test suites: tests collected from previous releases, run to check that updates don't break previously correct operation.
- Sanity tests (smoke tests): run after every new system build to check for obvious problems.

42 Test Documentation
- Test plan: describes the system and the plan for exercising all functions and characteristics.
- Test specification and evaluation: details each test and defines the criteria for evaluating each feature.
- Test description: test data and procedures for each test.
- Test analysis report: results of each test.

43 The Key Problems of Software Testing
- Selecting or generating the right test cases.
- Knowing when a system has been tested enough.
- Knowing what has been discovered/demonstrated by execution of a test suite.

