
1 Advanced Software Engineering: Software Testing COMP 3702 Instructor: Anneliese Andrews


3 Thomas Thelin – Software Testing'05 News & Project  News  Guest lecture 10/5 by Tomas Berling  Project  Deadline 10/5: Report to supervisor and peer review group  Project presentations 17/5, 8-12 http://serg.telecom.lth.se/education/SwTest/

4 Lecture  Chapter 9  Metrics  Chapter 16 (& Appendix III)  Process improvement and assessment  TMM, TIM  Presentation technique

5 Test management questions  What is effective?  What is efficient?  When to stop testing?

6 Monitoring

7 Why measure?  Why are metrics important for the development and test process?  Are metrics useful for motivation purposes?  Can metrics improve quality?

8 Measurement basics Basic data:  Time (calendar and staff hours)  Failures / Faults  Size Basic rules:  Feedback to origin  Use the data, or don’t measure it

9 Purpose of metrics  Project monitoring – check the status  Project controlling – corrective actions  Plan new projects  Measure and analyze results  The profit of testing  The cost of testing  The quality of testing  The quality of the test process  Basis for improvement, not only of the test process

10 Monitoring testing  Status  Coverage metrics  Test case metrics  Failure / Fault metrics  How much have we accomplished?  What is the quality status of the software?  Productivity metrics  How productive and efficient are the testers?  Efficiency / Cost metrics  How much time have we spent?  How much money have we spent?  Effectiveness metrics  How effective are the testing techniques in detecting defects?

11 Test metrics: Size/complexity/length What?  Size – LOC, Function Points  Complexity – McCabe  Length – Halstead Why?  Estimate test effort
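As an illustration of the complexity metric above, here is a minimal sketch that approximates McCabe's cyclomatic complexity for Python source by counting decision points; the function name and the sample code are our own illustration, not from the lecture.

```python
import ast

# A rough approximation of McCabe's cyclomatic complexity for Python source:
# 1 + the number of decision points. The graph-based definition is
# V(G) = E - N + 2P; counting branches like this is a common shortcut.
def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = 0
    for node in ast.walk(tree):
        if isinstance(node, (ast.If, ast.IfExp, ast.For, ast.While)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            # every extra operand of and/or adds one short-circuit branch
            decisions += len(node.values) - 1
    return decisions + 1

# Made-up sample function to measure.
src = """
def classify(x):
    if x < 0 and x != -1:
        return "neg"
    for i in range(x):
        if i % 2 == 0:
            print(i)
    return "done"
"""
print(cyclomatic_complexity(src))  # 4 decision points -> complexity 5
```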

12 Test metrics: Efficiency What?  # faults/hour  # faults/test case Why?  Evaluate efficiency of V&V activities

13 Test metrics: Effectiveness What?  % found faults per phase  % missed faults Why?  Evaluate effectiveness of V&V activities
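A small sketch of how "% found faults per phase" and "% missed faults" can be computed; all fault counts are invented illustration data, and the phase names are assumptions.

```python
# Sketch of per-phase effectiveness (defect removal efficiency): for each
# phase, the fraction of faults still present that the phase detected,
# plus the fraction that escaped to the field entirely.
def phase_effectiveness(found_per_phase, escaped_to_field):
    total = sum(found_per_phase.values()) + escaped_to_field
    remaining = total
    result = {}
    for phase, found in found_per_phase.items():
        result[phase] = found / remaining   # share of faults present at phase entry
        remaining -= found
    result["missed (field)"] = escaped_to_field / total
    return result

# Invented counts: 100 faults in total, 10 of which escaped to the field.
found = {"inspection": 40, "unit test": 30, "system test": 20}
print(phase_effectiveness(found, escaped_to_field=10))
```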

14 Test metrics: Coverage What?  % statements covered  % branches covered  % data flow  % requirements  % equivalence classes Why?  Track completeness of testing
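To make "% statements covered" concrete, a toy sketch that traces which lines of one function a set of test inputs executes. Real projects would use a dedicated tool such as coverage.py; the function under test is our own example.

```python
import dis
import sys

# Sketch: % of one function's statements executed by the given test inputs,
# measured with sys.settrace. Only illustrates what the metric means.
def statement_coverage(func, *cases):
    all_lines = {line for _, line in dis.findlinestarts(func.__code__) if line}
    executed = set()

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        for args in cases:
            func(*args)
    finally:
        sys.settrace(None)
    return 100.0 * len(executed & all_lines) / len(all_lines)

def sign(x):                      # made-up function under test
    if x > 0:
        return "pos"
    return "non-pos"

print(statement_coverage(sign, (5,)))          # one branch only: partial coverage
print(statement_coverage(sign, (5,), (-1,)))   # both branches exercised
```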

15 Test metrics: Test execution status What?  # faults/hour  # executed tests  Requirements coverage Why?  Track progress of the test project  Decide stopping criteria

16 Test metrics: Trouble reports What?  # faults/size  repair time  root cause Why?  Monitor quality  Monitor efficiency  Improve

17 Selecting the right metrics  What is the purpose of the collected data?  What kinds of questions can it answer?  Who will use the data?  How is the data used?  When is the data needed, and by whom?  Which forms and tools are used to collect the data?  Who will collect it?  Who will analyse the data?  Who has access to the data?

18 Goal-Question-Metric Paradigm (GQM)  Goals  What is the organization trying to achieve?  The objective of process improvement is to satisfy these goals  Questions  Questions about areas of uncertainty related to the goals  You need process knowledge to derive the questions  Metrics  Measurements to be collected to answer the questions

19 GQM guidelines  Analyze  the detection of design faults using inspection and testing  for the purpose of  evaluation  With respect to their  effectiveness and efficiency  from the point of view of the  managers  in the context of  developers, and in a real application domain

20 Test estimation  Guessing  Authorized deadlines  Previous test effort  In-project experience through incremental planning  Formula-based  COCOMO  Test point analysis  Consensus of experts (Delphi)  Detailed work breakdown of testing tasks  Historical data helps estimation regardless of the method
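A sketch of the formula-based estimation mentioned above, using the basic COCOMO model. The coefficients are the standard ones for "organic" projects; the 30% testing share is an illustrative assumption, not part of the model.

```python
# Basic COCOMO sketch: effort in person-months as a function of estimated
# size in KLOC, with coefficients a = 2.4, b = 1.05 ("organic" projects).
def cocomo_effort_pm(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    return a * kloc ** b

effort = cocomo_effort_pm(32)     # a hypothetical 32 KLOC organic project
test_effort = 0.30 * effort       # assumed (illustrative) fraction spent on testing
print(round(effort, 1), round(test_effort, 1))
```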

21 Test Point Analysis [Diagram: dynamic test points + static test points give the total number of test points, which gives the primary test hours; environmental, skill, and planning and control factors then give the total number of test hours] TPf = FPf * Df * Qd PT = TP * S * E
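The slide's two formulas, TPf = FPf * Df * Qd and PT = TP * S * E, can be sketched directly in code; all numeric inputs below are invented for illustration, and the reading of S and E as skill and environmental factors follows the slide's factor list.

```python
# Test Point Analysis sketch, transcribing the slide's formulas:
#   TPf = FPf * Df * Qd   (dynamic test points for one function)
#   PT  = TP * S * E      (primary test hours from total test points)
def dynamic_test_points(fp: float, d: float, qd: float) -> float:
    return fp * d * qd            # FPf * Df * Qd

def primary_test_hours(tp: float, skill: float, env: float) -> float:
    return tp * skill * env       # TP * S * E

functions = [(120, 1.2, 1.1), (80, 0.9, 1.0)]   # invented (FPf, Df, Qd) per function
dynamic_tp = sum(dynamic_test_points(*f) for f in functions)
static_tp = 16.0                                # invented static test points
total_tp = dynamic_tp + static_tp
print(round(primary_test_hours(total_tp, skill=1.1, env=1.2), 1))
```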

22 Cost of testing  How much does testing cost?  As much as the resources we have!

23 Costs of testing  Pre-run costs: planning, specification, environment setup  Execution costs: execution time (staff and equipment), regression tests  Post-run costs: outcome analysis, documentation

24 Minimizing costs Test performance  Automate execution and evaluation Test maintenance  Add new tests, review effectiveness Testware development  Consider it part of the product

25 When to stop testing?  All coverage goals met (requirements, code,...)  Detection of a specific number of failures  Rate of failure detection has fallen below a specified level  Fault seeding ratios are favourable  Reliability is above a certain value  Estimated number of remaining failures is below the limit  Cost has reached the limit
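The fault-seeding criterion above can be sketched as a capture-recapture estimate: seed S known faults before testing; if testers rediscover s of them while finding n real faults, the estimated total number of real faults is n * S / s. The counts below are invented illustration data.

```python
# Fault-seeding stop-criterion sketch (capture-recapture estimate).
def estimated_total_faults(seeded: int, seeded_found: int, real_found: int) -> float:
    if seeded_found == 0:
        raise ValueError("no seeded faults found; cannot estimate")
    # estimated total real faults = real_found * seeded / seeded_found
    return real_found * seeded / seeded_found

# Invented example: 50 seeded faults, 40 rediscovered, 28 real faults found.
est = estimated_total_faults(seeded=50, seeded_found=40, real_found=28)
print(est, est - 28)  # estimated total real faults, and how many remain latent
```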

26 Most common stop criteria  Time has reached the limit (milestone, release date,...)  All planned tests executed and passed

27 Example [Chart: number of failures per day; number of detected failures vs. number of executed test cases] Interpretation?

28 Example [Chart: number of detected failures vs. number of executed test cases] Interpretation?

29 Example [Chart: number of detected failures vs. number of executed test cases] Interpretation?

30 Process models  Purpose  Improvement  Assessment  Models  (Integrated) Capability maturity model (CMM, CMMI)  Software process improvement and capability determination (SPICE)  ISO 9001, Bootstrap  Test maturity model (TMM)  Test process improvement model (TPI)  Test improvement model (TIM)

31 CMM

32 Test Maturity Model (TMM)  Levels  Maturity goals and sub-goals  Scope, boundaries, accomplishments  Activities, tasks, responsibilities  Assessment model  Maturity goals  Assessment guidelines  Assessment procedure

33 Test Maturity Model

34 Change model [Diagram relating vision, strategies, processes, plans, organization, quality system, models, ability, and application through create/utilize links, alongside architecture, design, software, hardware, and product]

35 Test improvement model – TIM  Key areas  Organization  Planning and monitoring  Test cases  Testware  Review / Inspection  Maturity levels  Initial  Basic  Cost effective  Risk thinking  Optimizing

36 Maturity profile (example)

37 Phases of assessment / improvement TMM  Plan (preparation)  Do (conducting, reporting)  Check (analyzing, action planning)  Act (implementing improvement) TIM  Evaluation – a mirror of the test organization  Present, discuss and reflect – understand the problems  Develop improvement strategy  Implement the improvement strategy  Continual improvement

38 Evaluation  Interview based on a questionnaire  Covers the key areas and maturity levels  “Magic” questions:  What would you like to change?  What would make your situation better?

39 Present, discuss and reflect  Write a report and present it  All questions are discussed  Goal: create a common understanding

40 Develop improvement strategy  Prioritized areas are identified  Improvement puzzle  Support from the organization is very important

41 Implement the improvement strategy  Seminars and workshops  Pilot project – internal mentors  Role play

42 General advice  Identify the real problems before starting an improvement program  “What the customer wants is not always what the customer needs”  Implement “easy changes” first  Involve people  Changes take time!

43 Presentation technique “Why on earth people who have something to say which is worth hearing should not take the slight trouble to learn how to make it heard is one of the strange mysteries of modern life.” Sir Arthur Conan Doyle

44 [Chart: audience attention over the minutes of a talk]

45 Start “My summer holiday…” End “We began by… Then followed… Eventually… In the end we arrived at…”

46 Start The headline technique – in what order does the listener want to know things? End

47 Our senses and memory… [Chart: recall a number of days after the talk, comparing words, pictures, and words + pictures]

48 Presentation technique  The headline technique  Proximity  Recognition  Personification  Consequences

49 This week  Project  Write report  Deadline 10/5: Report to supervisor and peer review group  Deadline 17/5: Comments (peer review) to supervisor and group  Presentations 17/5, 8-12.  Lab  Process simulation  Report of Lab 4 to Carina

