Identifying and Using a Project’s Key Subprocess Metrics Jeff S. Holmes BTS Fort Worth.


1 Identifying and Using a Project’s Key Subprocess Metrics Jeff S. Holmes BTS Fort Worth

2 Everyone Loves a Hero

3 Heroes Come Through! Firefighter – saves a baby from a burning house. Police officer – catches the bad guy. Athlete – hits the game-winning home run.

4 Software Engineering Heroes! All night coding! Debugging over the weekend! THIS SHOULD NOT BE THE NORM!

5 How To Minimize “Fire Drills”? Preventative Actions  Proper wiring can prevent fires.  Locking your car can prevent theft.  Don’t get behind in the ball game.  Understand project status earlier.

6 Metrics, metrics, everywhere…

7 But What is Really Important? Customer Wants –Functionality –Zero Defects –On Time What software metrics map to these? How can we optimize these outputs?

8 BTS Fort Worth Approach Selected DMAIC to Improve Process Identified Project with Two Years of Data Performed Statistical Analysis Conducted Pilot Currently in “Control” Phase

9 DMAIC : Define Identify “what is important” BTS FW Monitors  Productivity (KLOC/Hour) *  Quality (Post Release Defects/KLOC)  Schedule Adherence These are BTS FW “Big X’s”

10 DMAIC : Measure The "Simple" View [Diagram: Requirements and Resources flow into the Software Development Life Cycle, producing "Perfect Software!"]

11 DMAIC : More Details [Diagram: the life cycle expanded. Requirements plus Resources yield Perfect Requirements; Perfect Requirements plus Resources yield a Perfect Design and Perfect Models; Perfect Design and Models plus Resources yield Perfect Code; Perfect Code plus Resources, through Test, yields Perfect Software!]

12 DMAIC : Subprocess Identification BTS FW Identified Following Subprocesses  Planning Phase  Requirements Phase  Design Phase  Code Phase  Test Phase  Release Phase  Code Inspections

13 DMAIC : Measured Data BTS FW Uses Following Data:  # Requirements  # Developers on the project (Resources)  % Time in Planning  % Time in Requirements  % Time in Design  % Time in Code  % Time in Test  % Time in Release  Requirements Churn  Actual Size (KLOC)  Avg Defect Detection Rate (DDR) in Code Inspection

14 DMAIC : Data Sources BTS FW Data  DOORS for Requirements  Project Plans for # Developers and % Times  ClearCase for Code Size  Inspection Database for DDR

15 DMAIC : BTS Subprocess Metrics [Diagram: the subprocesses (Planning, Requirements, Design, Code, Test, Release, Code Inspections) annotated with their metrics: Time in Phase for each phase, plus Req Count, Req Churn, Resources, KLOC, and DDR.]

16 DMAIC : BTS Subprocess Metrics [Diagram: the same subprocess metrics, with question marks asking which of them map to the Big X's: Productivity (KLOC/Hour), Quality (PR Defects/KLOC), and Schedule Adherence.]

17 DMAIC : BTS FW Analysis Project Data: 8 releases since 2002, similar work, "stable" team. Used stepwise linear regression to identify statistically significant factors and develop prediction formulas for the "Big X's".

18 CAUTION !! The following slides contain statistics that could be hazardous to your health! Persons who suffer from narcolepsy or “statisticitis” should consider leaving the room.

19 DMAIC : Stepwise Linear Regression Describes the relationship between one 'predicted' variable and several 'predictor' variables. Goal: get the simplest equation with the best predictive power for Productivity (KLOC/Hour) and Quality (Post Release Defects/KLOC).

20 DMAIC : Standard Least Squares Model accounts for 99.82% of variance.

21 DMAIC : Significant Effects Most significant effects: % Time in Req, Average DDR, and the interaction between % Time in Code and Average DDR (p < 0.05 is significant).

22 DMAIC : Standard Least Squares Model accounts for 90.62% of variance.

23 DMAIC : Significant Effects Most significant effects: % Time in Requirements, and the interaction between % Time in Requirements and Requirements Churn (p < 0.05 is significant).
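The p < 0.05 screen applied on these slides is just a filter over the model's effects. A minimal sketch, with made-up effect names and p-values:

```python
# Hypothetical p-values for a few regression effects; keep those below alpha.
# Names and values are illustrative, not the project's actual model output.
ALPHA = 0.05
p_values = {
    "pct_time_in_requirements": 0.003,
    "pct_req_x_req_churn": 0.021,   # interaction term
    "num_developers": 0.41,         # not significant at the 0.05 level
}
significant = sorted(name for name, p in p_values.items() if p < ALPHA)
print(significant)
```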

24 DMAIC : Statistically Significant # Requirements # Developers on the project % Time in Planning % Time in Requirements % Time in Design % Time in Code % Time in Test % Time in Release Requirements Churn Actual Size (KLOC) Avg Defect Detection Rate (DDR) in Code Inspection

25 DMAIC : Key Subprocess Metrics [Diagram: the subprocess metrics again, now with the statistically significant ones linked to the Big X's: Productivity (KLOC/Hour), Quality (PR Defects/KLOC), and Schedule Adherence.]

26 DMAIC : Variation Analysis Prediction formulas generated to identify good and bad variance and the most significant factors. NOTE: The prediction formulas use all effects from the models, not just the significant ones; % Planning was added into the formula.
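As a sketch of how such a prediction formula supports variation analysis, here is a hypothetical linear model of the shape a regression would produce. The coefficients, intercept, and input values are invented for illustration; only the signs follow the factor effects reported on these slides (more planning and requirements time reduce defects, more churn increases them).

```python
# Hypothetical quality prediction formula (PR defects/KLOC). Coefficients
# are illustrative assumptions; signs follow the reported factor effects.
def predict_defects_per_kloc(pct_planning, pct_req, req_churn):
    return 2.0 - 0.04 * pct_planning - 0.05 * pct_req + 0.15 * req_churn

# Variation analysis: compare plan-of-record inputs against a drifted plan.
baseline = predict_defects_per_kloc(pct_planning=18, pct_req=15, req_churn=1.76)
drifted = predict_defects_per_kloc(pct_planning=10, pct_req=15, req_churn=1.76)
print(f"{drifted - baseline:+.2f} predicted defects/KLOC from reduced planning")
```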

27 DMAIC : Factor Weighting
Effect on each output of a 10% deviation in each subprocess metric:

                % Plan   % Req   % Code   Req Churn   Avg DDR
Productivity      0        3%     31%       0           9%
Quality          14%      40%      0        6%          0
Cycle Time        0        3%     24%       0           9%

28 DMAIC : Factors' Effects

Metric            LOC/Hr Effect               PR Defects/KLOC Effect
% Planning Time   N/A                         More time = fewer defects/KLOC
% Req Time        More time = less KLOC/Hr    Less time = more defects/KLOC
% Code Time       More time = less KLOC/Hr    N/A
Code Insp. DDR    Higher DDR = less KLOC/Hr   N/A
2SR Req. Churn    N/A                         Less churn = fewer defects/KLOC

29 DMAIC : BTS FW Limits
The green limit indicates the direction a metric can deviate from the average and still give desired results; red indicates the direction of undesired results.

Metric            Average   Std Dev   Lower Limit   Upper Limit
% Planning Time    18%        8%        10%           26%
% Req Time         15%        8%         7%           23%
% Code Time        21%       12%        11%           33%
Code Insp. DDR      1.89      0.5        1.39          2.39
2SR Req. Churn      1.76      1.89       0             3.65
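The limits in the table appear to be roughly the average plus or minus one standard deviation, with the lower limit clamped at zero (visible in the requirements-churn row). That rule, inferred from the table's numbers rather than stated on the slide, can be sketched as:

```python
# Inferred limit rule: average +/- one standard deviation, lower bound
# clamped at zero. Read off the table's numbers, not a documented formula.
def control_limits(avg, std_dev):
    return max(0.0, avg - std_dev), avg + std_dev

print(control_limits(1.89, 0.5))   # Code Insp. DDR
print(control_limits(1.76, 1.89))  # 2SR Req. Churn (lower limit clamps to 0)
```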

30 Subprocess Metrics Notes Initial data left much to be desired. Despite the poor data, the analysis identified: which metrics and processes are significant, prediction formulas based on the project's data, insight into the factors' effects, and limits for monitoring the factors.

31 DMAIC : Pilot Confirmation Used the prediction formulas on other projects, comparing predicted LOC/Hr against actuals using historical data from 5 projects. Unable to compare predicted quality versus actual: these projects have not been in the field long enough for CRUD to stabilize. Interesting results found on predicted LOC/Hr.

32 DMAIC : Predicted vs. Actual LOC/Hr Projects A, B, and C had huge deviations. Projects D and E were within 20%.

33 DMAIC : Improve Performance? So what? How do you use this information? Does Project Management have confidence in this analysis?

34 DMAIC : Applying Analysis More emphasis on statistically significant activities, resulting in: Increased Productivity, On-Time Delivery, Desired Functionality Delivered, Improved Quality.

35 DMAIC : Agile Processes BTS FW Adopted Agile Practices. Iterative Development: prioritizes requirements, negates requirements churn. Pair Programming: optimizes coding and inspection time. Minimal Documentation: moves effort away from activities that are not statistically significant.

36 DMAIC : Agile Pilot Results Productivity 0.00291 KLOC/Hr 20% improvement from 0.002399 Inspection Defect Detection Rate 1.18 Defects/Hr Detected 48% improvement from 0.8 Quality 0 Post Release Defects!

37 DMAIC : Agile Pilot Results Customer Wants Functionality – All functionality delivered Zero Defects – No customer found defects On Time – Product delivered 6 months early!

38 DMAIC : Agile Monitoring Monitor Iterations Not Phases Refactoring Subprocess Monitoring Two Agile projects in-work now

39 DMAIC : Agile Monitoring Monitoring LOC per week Defects caught per week by inspection Defects caught per week by test Time spent per week Ratio of new work to correction work.
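A weekly snapshot of the quantities listed above might look like the record below; the field names and numbers are assumptions for illustration, not actual project data.

```python
# Illustrative weekly monitoring record for one agile iteration.
# All values are made up for the sketch.
week = {
    "loc_added": 1200,
    "inspection_defects": 9,
    "test_defects": 4,
    "new_work_hours": 120,
    "correction_hours": 40,
}
# Ratio of new work to correction work, one of the monitored signals.
ratio = week["new_work_hours"] / week["correction_hours"]
# A derived signal: defects caught per KLOC added this week.
defects_per_kloc = (week["inspection_defects"] + week["test_defects"]) / (week["loc_added"] / 1000)
print(round(ratio, 1), round(defects_per_kloc, 1))
```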

40 DMAIC : Agile Metrics

41 Summary Save your "heroes" for real crises. Understand subprocesses. Monitor subprocesses. Seek to optimize key subprocesses.

42 Recommendations Examine current project data; it could prove to be very valuable! Improve data capture on important data. Use the data as a guideline, but experience can never be discounted.

43 THANK YOU! Jeff S. Holmes Principal Staff Software Engineer Motorola Six Sigma Black Belt Fort Worth BTS Development Team Fort Worth, TX 817-245-7053 J.Holmes@Motorola.com

