Quantitatively Measured Process Improvements at Northrop Grumman IT
Craig Hollenbach, Northrop Grumman IT

Agenda
• Northrop Grumman IT Overview
• 2002 SCAMPI Appraisal
• Sample Project Data
– Inventory Tracking System (ITS)
– AIT
– JPATS
– SIGS
• Conclusions

Northrop Grumman Overview
• Northrop Grumman: $26 B in revenue; 120,000 employees; 50 states; 25 countries
• Information Technology (IT) Sector: $4 B in sales; 22,000 employees; 48 states; 15 countries
• Defense Enterprise Solutions (DES) Business Unit: $548 M in sales; 2,900 employees; 23 states; 3 countries
• DES provides enterprise-wide technology solutions to the Defense marketplace
• Major Applications:

DES Maturity Pedigree
[diagram: maturity lineage of the legacy organizations (Logicon LIS, Litton TASC, Logicon LTS (L5), Litton PRC, Logicon LAT (L3), Logicon LISS (L3), LDES (L5), plus ENABLER, LIEB, and SPII) feeding into the DES CMMI rating, with some ratings transferred to other units]

2002 CMMI Approach
• Background
– Kent's quote about problems at the beginning of 2002
• Personnel & Teams
– PA Process Owners
– DES Organizational Units (e.g., EPG, training, procurement)
– High Maturity Process Area Teams, composed of project representatives (L4WG, L5WG, MO, DPWG, TCMSIG)
• Approach
– DES Organizational Improvements: CMMI process gap analysis; built umbrella processes for legacy organizations
– DES Project Improvements: assigned support reps to assist project personnel; project representatives participated on high maturity process area teams

2002 SCAMPI Appraisal
• A SCAMPI appraisal led by independent SEI-certified appraisers in December 2002 determined that DES achieved:
– CMMI-SE/SW maturity level 5
– CMMI-SE/SW capability level 5 in PMC, IPM, TS, and VER
– SW-CMM maturity level 5
• DES works with other IT Business Units to transfer our process improvement experience throughout the sector

Inventory Tracking System (ITS)

Inventory Tracking System
• Project description: USAF/AFMC/MSG Inventory Tracking System (ITS) Modernization
• A 3.5-year, $11M firm-fixed-price project with a development staff of approximately 15 members
• Development team uses the SEI Personal Software Process (PSP)
• Implemented CMMI Level 5 quantitative management processes to dramatically improve the cost, schedule, and delivered quality of the software
• Currently preparing for the 1st contractual customer-driven test cycle
• Contractual quality goal: deliver no known severity 1-3 defects (1 = Critical, 2 = Urgent, 3 = Routine)

ITS Critical + Urgent Defect Density
Quantitative Management Plan goal: ≤ 5 defects/KLOC (Critical + Urgent), measured at peer reviews by build (KLOC = thousand lines of code)
[chart: Critical + Urgent peer review defect density by build]
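As an illustration of the measure being charted, here is a minimal sketch of a defects-per-KLOC check against the ≤ 5/KLOC goal; the per-build counts and sizes are hypothetical, not the actual ITS data:

```python
# Hypothetical per-build data: (critical + urgent peer review defects, KLOC inspected).
# Illustrative numbers only; not the actual ITS measurements.
builds = {
    "Build 1": (33, 5.0),
    "Build 2": (24, 6.0),
    "Build 3": (14, 7.0),
}

GOAL = 5.0  # Quantitative Management Plan goal: <= 5 critical+urgent defects per KLOC

for name, (defects, kloc) in builds.items():
    density = defects / kloc
    status = "meets goal" if density <= GOAL else "exceeds goal"
    print(f"{name}: {density:.1f} defects/KLOC ({status})")
```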

Peer Review Defect Density (Critical + Urgent, cont.)
[chart annotated with DP Cycles 1, 2, and 3]

Cost Variance by Build
[chart annotated with DP Cycles 1, 2, and 3]

Schedule Variance by Build
[chart annotated with DP Cycles 1, 2, and 3]

Return on Investment – Construction Phase
• Hours invested: 124
– Team training: 48
– Conducting DP (Defect Prevention) cycles: 76
• Defects avoided: if the defect density had remained at 6.6 defects/KLOC (Build 1), we would have injected 110 more defects
• Hours saved: at an estimated cost of 15 hours per defect, this equals 1,650 hours
• Return
– Hours: 1650 / 124 = 1330%
– Customer satisfaction: priceless! "The contractor has always provided products and services with fewer defects than industry standards. Most have been provided with no defects. Personnel have been used that show a complete understanding of their subject area and are able to convey this information in a highly professional manner."
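The ROI arithmetic on this slide can be reproduced directly; a minimal sketch using the slide's own figures:

```python
# Figures from the slide.
training_hours = 48
dp_cycle_hours = 76
invested = training_hours + dp_cycle_hours   # 124 hours

defects_avoided = 110      # vs. the Build 1 density of 6.6 defects/KLOC
hours_per_defect = 15
saved = defects_avoided * hours_per_defect   # 1650 hours

roi = saved / invested     # ~13.3, i.e. ~1330%
print(f"Invested {invested} h, saved {saved} h, ROI = {roi:.0%}")
```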

Inventory Tracking System – Test Phase
• Need to understand the total defect density in peer review during the Construction Phase in order to relate it to the defect density in the Test Phase
• DP cycles had an effect on total defect density as well:
– Build 1 total defect density = 21.6 defects/KLOC
– Build 5 total defect density = 13 defects/KLOC
• Total defect density for the Construction Phase = 19 defects/KLOC
• Total defect density for testing to date = 4.5 defects/KLOC, a reduction of more than 4x from the Construction Phase

Inventory Tracking System – Test Phase
Management goals in test are being exceeded!
• Critical/Urgent defects: DDt Unit ≤ 0.5/KLOC; DDt ≤ 0.25/KLOC
• Total defects: DDt Unit ≤ 2/KLOC; DDt ≤ 1/KLOC (all internal integration cycles)
[chart: ITS test defects by test cycle, actual defect density per KLOC, with the DP cycle marked]

Inventory Tracking System – Test Phase
Quantitative Management Plan goal: DDs (Defect Discovery) ≤ 1/KLOC. When the total defect discovery rate falls under 1 defect per KLOC per month, the project manager and test lead have enough confidence to stop the test cycle.
[chart: Defect Discovery Rayleigh Curve, all test defects; Feb 03 prediction = (value missing) defects, Feb 03 actual = 15 defects, 0.14 defects per KLOC/month]
[chart: Defect Discovery Rayleigh Curve, Build .75 test defects; prediction = 1.44 defects, actual = 1 defect (last week of May)]
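For readers unfamiliar with the technique, here is a minimal sketch of a Rayleigh defect-discovery model and the stop-test criterion described above; the parameters (total expected defects, peak month, code size) are hypothetical, not the actual ITS model values:

```python
import math

# Hypothetical Rayleigh defect-discovery model parameters; illustrative only.
K = 260.0     # total defects expected over the test phase (assumed)
T_PEAK = 2.0  # month at which the discovery rate peaks (assumed)
KLOC = 30.0   # assumed code size

def discovery_rate(t):
    """Rayleigh defect-discovery rate (defects/month) at month t."""
    b = 1.0 / (2.0 * T_PEAK ** 2)
    return 2.0 * K * b * t * math.exp(-b * t * t)

# Stop-test criterion from the slide: total discovery rate < 1 defect/KLOC/month.
for month in range(1, 13):
    rate_per_kloc = discovery_rate(month) / KLOC
    if rate_per_kloc < 1.0:
        print(f"Month {month}: {rate_per_kloc:.2f} defects/KLOC/month -> stop test cycle")
        break
    print(f"Month {month}: {rate_per_kloc:.2f} defects/KLOC/month -> keep testing")
```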

Results
• CMMI quantitative management and defect prevention cycles have a huge return on investment in the Construction Phase. Specific results from the first coding cycle to the fifth:
– Critical/Urgent defect density reduced by 68%
– Cost variance improved from -59% to +39%
– Schedule variance improved from +26% to +49%
• This return has a significant effect on the Test Phase
– Where most projects have their highest defect detection rate in test, ITS has its lowest
– Latent defect analysis estimates a delivered defect density of between 0.35 and 0.7 defects/KLOC (a total of 20 to 40 defects)
• Understanding the quality of the product allows for better management decisions and results in highly satisfied customers

Automated Identification Technology (AIT)

AIT Document Defect Data QM
• Began collecting data in Feb '02 as part of the DP cycle
– Process improvement techniques identified per DP cycle
– Use of CM-controlled templates for documents, enforced for authors
– Type classification identified for defects: technical/non-technical
– Documentation Input Defect Report checklist completed: identify the number of pages in each document; identify document types; identify defects as technical/non-technical
– Management and personnel awareness of data collection and its purposes
• Six months of data showed the defect rate varying from 4.9% to 11.6%, with one higher outlier
• Process improvement implementation
– Resulted in re-evaluation of the upper and lower control limits (a sketch of the limit calculation follows below)
– Increased personnel and management focus on the data
– Data for the last 12 months shows the defect rate varying from 0.4% to 5.9%
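As an illustration of the limit re-evaluation described above, a sketch using simple mean ± 3-sigma limits on hypothetical monthly defect rates (the slide does not state which control-chart method AIT actually used):

```python
import statistics

# Hypothetical monthly document defect rates (%) from successive DP cycles;
# illustrative only, not the actual AIT data.
rates = [4.9, 6.2, 8.1, 5.5, 11.6, 7.3]   # first six months

# Individuals-chart style limits: mean +/- 3 * standard deviation.
mean = statistics.fmean(rates)
sigma = statistics.stdev(rates)
ucl, lcl = mean + 3 * sigma, max(0.0, mean - 3 * sigma)
print(f"Baseline: mean={mean:.1f}%  UCL={ucl:.1f}%  LCL={lcl:.1f}%")

# After process improvements, recompute the limits over the most recent window,
# as the slide's "re-evaluation of upper and lower limits" implies.
recent = [0.4, 2.1, 3.8, 1.7, 5.9, 2.6]
mean2 = statistics.fmean(recent)
sigma2 = statistics.stdev(recent)
print(f"Re-evaluated: mean={mean2:.1f}%  UCL={mean2 + 3*sigma2:.1f}%  "
      f"LCL={max(0.0, mean2 - 3*sigma2):.1f}%")
```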

Document Defect Data, Feb '02 – Jul '03
[chart: document defect rates over time]

JPATS TIMS

JPATS Challenge
• JPATS Build 1.05 (June 2002)
– The first build released after progressing from the "Development" phase to the "Maintenance" phase of the program
– Builds 1.01 – 1.04 were internal builds, not released for customer verification
– Used "development"-style processes for fixing STRs
– Failed 9 of 30 (30%) of the on-site STR verification tests with customer witnesses
– NOTE: STRs Fixed vs. STRs Accepted is a measure that is quantitatively tracked by the JPATS program
• As a result…
– Kicked off JPATS DP Cycle #1
– GOAL: reduce STR verification failures to < 5% for JPATS builds 1.06 and 1.07

DP Cycle Findings
• Root causes included
– Lack of "maintenance"-style processes (e.g., streamlined for dealing with many (~30-130) STRs per build)
– Lack of "maintenance"-style build planning and tracking
• ~40 countermeasures identified
– Many top-tier countermeasures focused on improving/updating the STR build processes
– Most of these were approved for action by the sponsor

DP Cycle Improvements
• Actions: JPATS updated/developed the following processes specifically for the Contractor Logistics Support (CLS = maintenance) phase
– BP 100, CLS Software Build Process
– BP 200, Define a Build
– BP 300, Plan and Track a Build
– BP 400, Develop a Build (includes UT 100, Unit Test Procedure; PR 100, Peer Review Procedure (Code); RT 100, Regression Test Procedure)
– BP 500, Deploy a Build
• JPATS developed a build planning and tracking matrix called the "STR Big Board" to track all the elements required by the process, per STR, across all STRs

DP Cycle Effectiveness
• Build 1.06 (July 2002)
– 0% STR verification failure rate (0/11)
– Goal of < 5% verification failure rate was met
• Build 1.07 (Oct 2002)
– 2.7% STR verification failure rate (3/113)
– Goal of < 5% verification failure rate was still met
• Subsequent builds have continued to perform well; see the next slide showing the current JPATS QM measure for STR verification (and the sketch of the measure below)
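A minimal sketch of the STR verification failure-rate measure, using the build figures reported on these slides:

```python
# JPATS STR verification measure: failed / attempted, checked against the < 5% goal.
# Build figures are from the slides.
builds = [
    ("1.05", 9, 30),    # before DP Cycle #1
    ("1.06", 0, 11),    # after DP Cycle #1
    ("1.07", 3, 113),
]

GOAL = 0.05  # < 5% verification failure rate

for name, failed, attempted in builds:
    rate = failed / attempted
    status = "goal met" if rate < GOAL else "goal missed"
    print(f"Build {name}: {failed}/{attempted} = {rate:.1%} ({status})")
```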

DP Cycle Effectiveness
[chart: STR verification failure rate over builds, annotated "Problem Occurred", "DP Cycle", and "Subsequent Results"]

Synthetic Imagery Graphical System (SIGS)

SIGS Schedule Performance
• Goal: SPI (X-bar) of 85% in the 1st third of each period of performance (PoP), 90% in the 2nd third, and 95% in the last third
• Actual: 92.1% over multiple PoPs; 88.3% at the end of the 1st PoP; 88.7% at the end of the 2nd; and 96.8% at the end of the last PoP
• Highlights: Cost Performance (CPI) was 96.8% over the same period; the award fee average was 99% over the same period

SIGS Schedule Performance (Cont'd)
• Situation: an O&M project undertaking a major redesign of the system over multiple years using new technology
• At the beginning: unfamiliar technology meant that schedule estimates had large uncertainty, since no historical data was available to support the basis of estimate
• Process changes: introducing Earned Value (EV) tracking combined with statistical process control (SPC) techniques allowed better monitoring of progress against the plans and identification of special causes of variation (see the sketch below)
• Improvement: by closely tracking the actual effort required to complete earlier activities, we were able to feed that data back into the estimates for later activities and thus produce schedules with less uncertainty
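A minimal sketch of EV tracking combined with simple SPC limits, as described above; the period values are hypothetical, and the slide does not specify the exact control chart SIGS used:

```python
import statistics

# SPI = EV / PV and CPI = EV / AC are the standard earned value formulas
# (the slides report SPI/CPI as percentages). Values below are illustrative.
periods = [
    # (planned value, earned value, actual cost) per reporting period
    (100.0, 88.0, 95.0),
    (100.0, 91.0, 98.0),
    (100.0, 94.0, 96.0),
    (100.0, 97.0, 99.0),
]

spis = [ev / pv for pv, ev, ac in periods]
cpis = [ev / ac for pv, ev, ac in periods]

# Simple control limits on SPI (mean +/- 3 sigma) to flag special causes
# of variation, in the spirit of the SPC monitoring described above.
mean, sigma = statistics.fmean(spis), statistics.stdev(spis)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

for i, spi in enumerate(spis, 1):
    flag = "  <-- special cause?" if not (lcl <= spi <= ucl) else ""
    print(f"Period {i}: SPI={spi:.1%} CPI={cpis[i-1]:.1%}{flag}")
print(f"SPI control limits: LCL={lcl:.1%} UCL={ucl:.1%}")
```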

Conclusions