CS 350, slide set 6 M. Overstreet Old Dominion University Spring 2005

Reading PSP text, ch. 15, 16, 17, 18, 19, 20

Topics Projecting defects Economics of defect removal Design defects Quality

Defect rates (from 38 developers taking the PSP course) [chart: defects/KLOC vs. years of programming experience]

Lessons from the data Very high defect rates for practicing software developers: 100 defects/KLOC (or more) before PSP training, and still 50 defects/KLOC after PSP training. Training in tracking defects is far more effective than experience alone at reducing defect levels.

Defect estimation Defect density (Dd) is defects per 1000 lines of code. If you have data on i previous programs, including the size and number of defects for each, then Dd(plan) = 1000(D1 + ... + Di)/(N1 + ... + Ni), where Dj is the defect count and Nj the size (LOC) of program j. New entries on the Project Plan form: Plan Defects/KLOC, Plan Defects Injected (for each phase), Plan Defects Removed (for each phase). The text has a completely worked-out example.
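A minimal sketch of this calculation, assuming you keep each prior program's size and defect count as a list of pairs; the function name and the data are made up for illustration:

```python
def planned_defect_density(history):
    """Estimate planned defects/KLOC from prior programs.

    history: list of (loc, total_defects) pairs, one per previous program.
    """
    total_loc = sum(loc for loc, _ in history)
    total_defects = sum(defects for _, defects in history)
    return 1000.0 * total_defects / total_loc

# Hypothetical data: three earlier programs (LOC, defects found).
history = [(120, 11), (260, 19), (310, 24)]
dd_plan = planned_defect_density(history)          # defects per KLOC
print(f"Planned defect density: {dd_plan:.1f} defects/KLOC")

# Planned defects for a new program with an estimated size of 450 LOC.
plan_defects = dd_plan * 450 / 1000.0
print(f"Planned defects: {plan_defects:.1f}")
```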

Comments Filling out tedious forms and counting defects won't improve code quality. If you want to reduce your own defect levels, then these techniques can likely help. If you don't care, the forms won't help.

Defect problems New systems are bigger and more complex Text example: early laser printer SW had 20 KLOC; new has 1,000 KLOC 10 years ago cars had 0 KLOC; now several KLOC

Defect metrics We've already talked about defects/KLOC as a measure of code quality. Yield measures the percentage of defects found by a removal method (such as testing or code review). As such, yield is an indication of the effectiveness of a defect removal activity.

Computing yield Yield can be computed for both reviews and testing. For code review: Yield = 100 (# defects found before compile) / (# defects injected before compile). For testing: Yield = 100 (# defects found during testing) / (# defects present before testing). Problem: how do we know the number of defects present before compile or testing?
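A small sketch of both yield formulas; the counts are assumed to come from your defect log, and the variable names are illustrative only:

```python
def yield_percent(defects_found, defects_present):
    """Yield = 100 * found / present, for any defect-removal step."""
    if defects_present == 0:
        return 100.0           # nothing to find counts as a perfect yield
    return 100.0 * defects_found / defects_present

# Hypothetical counts from a defect log.
review_yield = yield_percent(defects_found=14, defects_present=20)  # found before compile / injected before compile
test_yield = yield_percent(defects_found=4, defects_present=6)      # found in test / present entering test
print(f"Review yield: {review_yield:.0f}%  Test yield: {test_yield:.0f}%")
```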

Defect-injection rates [chart: defects/hour vs. program number]

Defect removal rates [chart: defects/hour vs. program number]

Typical numbers for small, class-size programs Not using PSP: inject about 100/hr; find 50/hr in compile; find 40/hr in testing. Using PSP: inject about 50/hr; with code reviews, remove 60% to 70% before the first compile. For a 500 LOC program, PSP saves about 7 hours overall (more for review, less for debugging). Think scaling: for a 10,000 LOC program, the same numbers mean a 140 hr savings.

PSP effects on compile and test time [chart: % of total effort vs. program number] Point: PSP reduces compile and test time

Computing Defects/Hour Goal: measure the efficiency of defect removal efforts at different points. Simple computation if: you have accurate defect counts in each development phase, and you have accurate measures of the amount of time spent in each phase. A simple division for each phase: Def./hr = (# defects found) / (hours in phase)
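A sketch of the per-phase division, assuming a time log (hours per phase) and a defect log (defects removed per phase); the phase names and numbers are just examples:

```python
# Hypothetical per-phase data: (hours spent, defects found and removed).
phases = {
    "design review": (1.5, 6),
    "code review":   (2.0, 9),
    "compile":       (0.5, 4),
    "test":          (3.0, 5),
}

for phase, (hours, defects) in phases.items():
    rate = defects / hours if hours else 0.0
    print(f"{phase:14s} {rate:5.1f} defects/hour")
```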

Suggestions for improving defect removal rates Focus on yield first. Humphrey suggests targeting a yield of 70% or better. Do code reviews before the 1st compile. Goal: 0 compile errors on the 1st compile. Review error logs for recurring problems; if appropriate, revise your checklists. Humphrey (from Brown): "one definition of insanity is doing the same thing over and over and expecting a different result"

Reduce defect injection rates Record all defects. Produce more complete designs. Use better methods for developing designs, code, or whatever. Use better tools. My experience is that it's hard to get students to use tools; the same can be true of practicing programmers.

Design defects It is sometimes tricky to separate design defects from code defects; there is no really clear line between design and code. Lots of tools exist to convert what in the past was a design (pictures) into source code (at least in skeletal form). The amount of detail I put in a design depends, in part, on how knowledgeable I think the programmer is likely to be. I want to leave out as much as possible, but sometimes the programmer needs guidance and sometimes the programmer needs freedom instead.

Let's simplify Humphrey suggests treating defect types 10 through 40 as "code-like" errors and types 50 through 90 as "design-like" errors.

Better designs come with experience (maybe) Start with high-level designs, then refine. Experienced developers know whether high-level design components are feasible; inexperienced developers can produce designs with components that won't work at all. Recall NASA's design of the n-version tester. Clearly (at least to an experienced developer) the design solved several critical problems: failed versions and rounding error. No additional details are required to see this.

Testing As software gets more complex, it gets harder to test. IRI (an NSF-funded project for interactive distance education) is a distributed system. Testing is hard because much depends on the order in which code fragments run. With 10 parts, there are 10! possible orders; 10! = 3.6 million. IRI may have 20 components that run in parallel. 20! = ?
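The combinatorial point is easy to check; this small snippet simply evaluates the factorials mentioned above:

```python
from math import factorial

print(factorial(10))   # 3,628,800 -- roughly the "3.6 million" orderings for 10 parts
print(factorial(20))   # 2,432,902,008,176,640,000 -- about 2.4 * 10**18 orderings for 20 parts
```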

Defect removal yields
Method              Approx. yield %
Code review         70-80
Code inspection     50-70
Compile             50
Unit test           40-50
Integration test    45
Requirements test   45
Algorithm test      8
Note: This data may not be typical!

Compile & test data [chart: test defects vs. compile defects]

Lessons (based on limited data): Lots of compile errors → lots of test errors → lots of errors delivered to the customer. Testing is less effective with large programs, but code reviews still work (why?)

Basic belief of CMM The quality of the product depends on the quality of the process. Use metrics to measure the quality of the process. Productivity? LOC/hour. Code? Defects/KLOC. Efficiency? Defects/hour (low for injection, high for removal). If you think you have a better process, then you need data comparing it with other processes.

Comment on Humphrey's data Humphrey believes in data for making decisions about what works. Industry will rarely release these types of data: productivity, error rates, yield rates.

Towards quality code Build the best possible units you can. Do careful reviews. Inspect requirements to help ensure the unit does all it is supposed to do, and does exactly what it is supposed to do. Do careful code inspections (we will talk about these later; inspection is a group activity). If the unit is small enough, do exhaustive unit testing after the inspection. Do thorough integration testing, which should be done by a group different from the code developers. Do comprehensive system testing.

Costs of quality Failure costs must be balanced against the additional costs of building quality code. Quality costs: time for code reviews; time for code inspections; time for developing test plans, test data, test drivers, and testing; building prototypes to assist in making key design decisions (you can sometimes measure prototype behavior); building prototypes to better understand system (or compiler or whatever) behavior.

PSP cost of quality (a little artificial) Appraisal COQ is the sum of all review times, expressed as a percentage of total development time. Failure COQ includes all time spent compiling and testing, also expressed as a percentage of total development time.
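A small sketch of the two COQ percentages, assuming total development time and per-phase times come from your time log; all the numbers are invented:

```python
def coq_percent(phase_hours, total_hours):
    """A phase's cost of quality as a percentage of total development time."""
    return 100.0 * phase_hours / total_hours

total = 12.0                              # total development hours (hypothetical)
appraisal = coq_percent(2.5, total)       # all review time
failure = coq_percent(0.4 + 1.2, total)   # compile time + test time
print(f"Appraisal COQ: {appraisal:.0f}%  Failure COQ: {failure:.0f}%")
```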

A new process metric: A/FR A/FR stands for "appraisal/failure ratio". A/FR = (appraisal COQ)/(failure COQ). It can also be computed as (code review time)/(compile time + test time). When A/FR is less than 1, testing is finding lots of faults. Goal: A/FR should be larger than 2. How can you achieve this goal? On a new project, predict testing time based on your previous data, then spend more time doing review!
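A minimal sketch of the A/FR computation using the time-based form of the formula; the hours are illustrative time-log entries:

```python
def a_fr(review_hours, compile_hours, test_hours):
    """Appraisal/failure ratio: review time over compile plus test time."""
    failure = compile_hours + test_hours
    return review_hours / failure if failure else float("inf")

# Hypothetical time-log entries (hours).
ratio = a_fr(review_hours=2.5, compile_hours=0.4, test_hours=1.2)
print(f"A/FR = {ratio:.2f}")   # below 1 suggests too little review; the PSP goal is > 2
```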

A/FR data [chart: test defects/KLOC vs. appraisal/failure cost ratio]

Exam 1 Available at the class web site on Thursday. Complete by the deadline and return to cmo. You must use it to submit! Due the following Tuesday by midnight! Late exams are not accepted. Designed to take about an hour to complete.