Software Quality Engineering


Software Quality Engineering Lecture # 5

Verification: Static Testing
Informal reviews
Walkthrough
Inspection

Comparison of verification methods

True cost of review
A five-person review costs five person-days. Consider an error that takes one person-day to fix if found during the requirements phase:
Review + fix in requirements: 5 + 1 = 6 person-days
Review + fix in implementation: 5 + 2 = 7 person-days
No review, fix after delivery: 10 person-days
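The comparison above can be sketched as a tiny cost model. The costs are the slide's illustrative numbers, not general constants:

```python
# Illustrative cost model from the slide: a five-person review costs
# 5 person-days (PD); the fix costs 1 PD in requirements, 2 PD in
# implementation, and 10 PD if the error escapes to delivery.
REVIEW_COST_PD = 5

def total_cost_pd(fix_cost_pd, reviewed=True):
    """Total person-days: review effort (if any) plus the fix itself."""
    return (REVIEW_COST_PD if reviewed else 0) + fix_cost_pd

cost_reqs = total_cost_pd(1)                       # review + fix in requirements
cost_impl = total_cost_pd(2)                       # review + fix in implementation
cost_delivery = total_cost_pd(10, reviewed=False)  # no review, fix after delivery
```

Even with the review overhead, catching the error early (6 or 7 PD) beats letting it escape (10 PD).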

Example: A 50 KLOC product takes 60 PM (person-months) to build. The IBM rule of thumb is that inspection adds 15% to the resources required, i.e., an additional 9 PM.

Example (cont.): Data show that it takes 1.58 person-hours (PH) to find a single defect in inspections. There are 1188 PH in 9 PM, so one can expect to find about 752 defects in 9 PM. It typically takes 9 PH to fix a defect found after delivery, so these 752 defects would require 51 PM to fix, almost as long as it took to build the software.

Example (cont.): If the same defects are found earlier by inspection, say no later than the coding stage, each takes only 1 PH to fix. Therefore, with inspections, these 752 defects require 5.7 PM to fix.

Example (cont.): Without inspection: 60 + 51 = 111 PM. With inspection: 60 + 9 + 5.7 = 74.7 PM, a savings over the lifecycle of 32%.
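The whole calculation in the last four slides can be reproduced in a few lines. All figures come from the slides; the PH-per-PM conversion is implied by "1188 PH in 9 PM":

```python
# Reproduce the IBM inspection example (all figures from the slides).
build_pm = 60                  # person-months to build the 50 KLOC product
inspect_pm = 0.15 * build_pm   # inspection adds 15% -> 9 PM
ph_per_pm = 1188 / 9           # "1188 PH in 9 PM" -> 132 person-hours per PM

defects = inspect_pm * ph_per_pm / 1.58   # 1.58 PH per defect found -> ~752

fix_late_pm = defects * 9 / ph_per_pm     # 9 PH/defect after delivery -> ~51 PM
fix_early_pm = defects * 1 / ph_per_pm    # 1 PH/defect during coding -> ~5.7 PM

without_inspection = build_pm + fix_late_pm             # ~111 PM
with_inspection = build_pm + inspect_pm + fix_early_pm  # ~74.7 PM
savings = 1 - with_inspection / without_inspection      # ~0.32
```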

Review Metrics
Preparation effort, Ep: the effort (in person-hours) required to review a work product prior to the actual review meeting
Assessment effort, Ea: the effort (in person-hours) expended during the actual review
Rework effort, Er: the effort (in person-hours) dedicated to correcting the errors uncovered during the review
Work product size, WPS: a measure of the size of the work product reviewed (e.g., the number of UML models, document pages, or lines of code)
Minor errors found, Err_minor: the number of errors found that can be categorized as minor (requiring less than some pre-specified effort to correct)
Major errors found, Err_major: the number of errors found that can be categorized as major (requiring more than some pre-specified effort to correct)

Review Metrics
The total review effort and the total number of errors discovered are defined as:
E_review = Ep + Ea + Er
Err_tot = Err_minor + Err_major
Defect density represents the errors found per unit of work product reviewed:
Defect density = Err_tot / WPS
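The metric definitions above translate directly into code. A minimal sketch; the function names mirror the slide's symbols:

```python
# Review metrics from the slides: total review effort and defect density.
def review_effort(ep, ea, er):
    """E_review = Ep + Ea + Er, all in person-hours."""
    return ep + ea + er

def defect_density(err_minor, err_major, wps):
    """Errors found per unit of work product (e.g., per page)."""
    return (err_minor + err_major) / wps

# A 20-page document with 10 minor and 2 major errors found in review:
density = defect_density(10, 2, 20)   # 0.6 errors per page
effort = review_effort(4, 6, 8)       # 18 person-hours in total
```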

Example: If past history indicates that the average defect density for a requirements model is 0.6 errors per page, and a new requirements model is 32 pages long, a rough estimate suggests that your team will find about 19 or 20 errors during the review of the document. If you find only 6 errors, either you have done an extremely good job developing the requirements model or your review approach was not thorough enough.

Example: The effort required to correct a minor model error (immediately after the review) was found to be 4 person-hours; the effort required for a major requirements error was found to be 18 person-hours. Examining the review data collected, you find that minor errors occur about 6 times more frequently than major errors. Therefore, you can estimate that the average effort to find and correct a requirements error during review is about 6 person-hours ((6 x 4 + 18) / 7). Requirements-related errors uncovered during testing require an average of 45 person-hours to find and correct. Using these averages:
Effort saved per error = E_testing - E_review = 45 - 6 = 39 person-hours per error
Since 22 errors were found during the review of the requirements model, a saving of about 858 person-hours of testing effort would be achieved. And that is just for requirements-related errors.
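Recomputing the averages above makes the arithmetic explicit (all numbers from the slide):

```python
# Recompute the effort-saved estimate with the slide's numbers.
minor_ph, major_ph = 4, 18     # PH to correct a minor / major error in review
minor_per_major = 6            # minor errors ~6x as frequent as major ones

# Weighted average effort per error found and fixed in review: 42/7 = 6 PH
e_review = (minor_per_major * minor_ph + major_ph) / (minor_per_major + 1)

e_testing = 45                           # PH per error found during testing
saved_per_error = e_testing - e_review   # 39 PH saved per error
total_saved = 22 * saved_per_error       # 858 PH for the 22 errors found
```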

Conducting the review
Review process: entry criteria, planning, preparation, review meeting, rework, follow-up, exit criteria

Entry criteria
Work product completed
Independent of other work products (any it depends on must also be completed)
Reviewers selected
Reviewers trained
In case of re-review, previous review comments resolved

Planning
Intended goal
Review type
Review team
Roles and responsibilities

Preparation
Work product distributed
Meeting scheduled

Review meeting
Introduction (participants and objectives of the review)
Work product presented
Concerns and issues raised (validity of review comments determined)
Review log sent to all participants

Rework
Review comments analyzed
Effort to fix each comment estimated
Need for re-review determined
Status of comments updated

Follow-up Defect resolution and status update

Exit criteria Goal satisfied Defects tracked to closure
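As a minimal sketch (stage names follow the slides; the comment-tracking structure is an assumption), the exit criterion "defects tracked to closure" can be expressed as:

```python
# Review workflow stages (from the slides) and a check for the exit
# criterion that every review comment has been tracked to closure.
STAGES = ["entry", "planning", "preparation", "review meeting",
          "rework", "follow-up", "exit"]

def can_exit(comments):
    """Exit criteria met only when every comment/defect is closed."""
    return all(c["status"] == "closed" for c in comments)

comments = [{"id": 1, "status": "closed"},
            {"id": 2, "status": "open"}]
can_exit(comments)   # False: comment 2 is still open
```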

Review Checklist (Requirements)
Requirements are hard to get right! A review checklist for an SRS covers: correctness, ambiguity, completeness, consistency, verifiability, modifiability, traceability, and feasibility.

Correctness Every requirement stated in the SRS should correctly represent an expectation from the proposed software. We have no standards, guidelines, or tools to ensure the correctness of an SRS. If the expectation is that the software should respond to all button presses within 2 seconds, but the SRS states that 'the software shall respond to all button presses within 20 seconds', then that requirement is incorrectly documented.

Ambiguity A stated requirement may be ambiguous. If a requirement conveys more than one meaning, it is a serious problem: every requirement must have a single interpretation. One test is to give a portion of the SRS document (containing one or two requirements) to 10 people and ask for their interpretations; if we get more than one interpretation, the requirement(s) may be ambiguous. Hence, a requirement statement should be short, explicit, precise and clear. This is difficult to achieve in natural languages (like English), which are inherently ambiguous, so the checklist should focus on ambiguous words and list potential ambiguity indicators.
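Such a checklist of ambiguity indicators can even be mechanized as a first-pass filter. The word list below is purely illustrative, not taken from any standard:

```python
# Hypothetical ambiguity scan: flag indicator words in a requirement.
AMBIGUITY_INDICATORS = {"usually", "fast", "adequate", "user-friendly",
                        "good", "excellent", "some", "several", "etc"}

def flag_ambiguous(requirement):
    """Return the ambiguity-indicator words found in the requirement."""
    words = requirement.lower().replace(",", " ").replace(".", " ").split()
    return sorted(set(words) & AMBIGUITY_INDICATORS)

flag_ambiguous("The system shall usually respond fast.")  # ['fast', 'usually']
```

A scan like this only flags candidates; a human reviewer still decides whether each flagged statement truly has more than one interpretation.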

Completeness The SRS document should contain all significant functional and non-functional requirements. It should also include forms (external interfaces) with validity checks, constraints, and attributes, and full labels and references for all figures, tables, diagrams, etc. The completeness of the SRS document must be checked thoroughly against a checklist.

Consistency Consistency is maintained if the stated requirements do not conflict with other requirements within the SRS document. For example, the overall description of the SRS may state that the passing percentage in 'result management software' is 50, while elsewhere it is given as 40. Or one section may say the semester mark sheet will be issued to colleges while another says it will be issued directly to students. Such inconsistencies should be avoided, and the checklist should be designed to find them.

Verifiability The SRS document is verifiable if, and only if, every requirement stated in it is verifiable. Non-verifiable requirements use phrases like 'good interfaces', 'excellent response time', 'usually', 'well', etc. Such phrases should not be used.

Modifiability The SRS document should accommodate modifications without disturbing its structure and style, so that changes can be made easily, completely and consistently while retaining the framework. Modifiability is a very important characteristic because requirements change frequently; if we handle change properly, it can have a very positive impact on the quality of the SRS document.

Traceability The SRS document is traceable if the origin of each requirement is clear; traceability also aids future development. It can help to structure the document and should be reflected in the design of the checklist.

Feasibility Some requirements may not be feasible to implement for technical reasons or for lack of resources. Such requirements should be identified and removed from the SRS document. A checklist may also help to find infeasible requirements.

References
Software Engineering: A Practitioner's Approach, Roger S. Pressman, 8th edition
Software Testing, Yogesh Singh