1 Joseph Houser, Dan Milano, Paul Solomon
Earned Value Management as a Measure of "Quality Technical Accomplishment"
January 2009

2 Most descriptions of EVM include a measure of technical progress
Objective: Review the current expectations of EVM to provide management a measure of quality technical accomplishments and progress
- Current environment
- Review OMB and DoD policy and guides
- Review the ANSI/EIA 748 EVM Standard
- Identify gaps
- Next steps

3 Earned Value Management has matured over the past 40+ years with several success stories
- 1967 – DoD Cost/Schedule Control Systems Criteria
- 1996 – OMB Circular A-11, Part 3
- 1997 – DoD Earned Value Management Systems Criteria
- 1998 – ANSI/EIA-748 EVM Standard (commercial)
- 2002 – OMB Circular A-11, Part 7 (requires 748 compliance – all Agencies)
- 2002 – ANSI/EIA-748-A
- 2004 – NDIA EVMS Intent Guide
- 2005 – PMI EVMS Practice Standard
- 2006 – ANSI/EIA-748 (update to recognize program-level EVM)
EVM has matured over the years, and the Government accepts and endorses the ANSI/EIA 748 EVM Standard

4 MATURE PM PROCESSES AND PRACTICES USING EVM IMPROVE BUSINESS MEASURES
[Four charts correlate program management capability (Minimal Capability, Marginal Performer, Qualified Participant, Best in Class, Composite World Class) with business measures: company return on sales (±15% around the mid-point), program CPARS rating (worst to best), program award fee capture (±8% around the mid-point), and 3-year average company win rate (±30% around the mid-point). Source: DCMC Conference, March 2000]

5 Improved cost and schedule control processes and practices do not have to increase PM costs
[Chart correlates program management capability (Minimal Capability through Composite World Class) with program office FTEs as a percentage of total program FTEs (±30% around the mid-point). FTE – Full Time Equivalent. Program office functions counted: program manager(s), deputy program manager(s), financial manager(s)/financial analyst(s), scheduler(s)/planner(s), configuration and data manager(s), chief engineer(s)/chief technical specialists, IPT or functional team leads, risk focal point(s), subcontract management, administrative support, and other program office functions]

6 FAA "Cost of EVM" study indicated that programs with mature EVM incur lower PM costs

7 The use of EVM has several success stories, with the Government and industry striving to add more

8 Most EVM training includes integration of technical / schedule / cost
"All programs have an element of risk requiring effective and integrated cost / schedule management processes."
[Diagram: a triangle linking Technical (superior technical solution), Schedule (quick delivery), and Cost (low cost), with risk management at the center]

9 OMB requires Quality measurement during Acquisition of Capital Assets
Circular No. A-11, Section 300, Planning, Budgeting, Acquisition and Management of Capital Assets
Section 300-5, performance-based acquisition management:
- Based on EVMS standard
- Measure progress towards milestones: cost, capability to meet specified requirements, timeliness, quality

10 PMI PMBOK® Guide recognizes Product Scope and quality/technical parameters
10.5.1.1 Project Management Plan – PMB:
- Typically integrates scope, schedule, and cost parameters of a project
- May also include technical and quality parameters
5. Project Scope Management, 2 elements:
- Product scope: the features and functions that characterize a product, service, or result
- Project scope: the work that needs to be accomplished to deliver a product, service, or result with the specified features and functions
It can be argued that project management plans should always include technical and quality parameters

11 GAO Expects EVM to measure technical progress
GAO Cost Guide:
- "Reliable EVM data usually indicate monthly how well a program is performing in terms of cost, schedule, and technical matters."
- "A WBS is an essential part of EVM cost, schedule, and technical monitoring, because it provides a consistent framework from which to measure actual progress."
- "The benefits of using EVM are singularly dependent on the data from the EVM system. Organizations must be able to evaluate the quality of an EVM system in order to determine the extent to which the cost, schedule, and technical performance data can be relied on for program management purposes."
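The monthly cost and schedule monitoring the GAO describes can be sketched with the basic EVM variance formulas applied per WBS element. This is a minimal illustration with hypothetical WBS elements and budget figures, not an excerpt from the GAO guide.

```python
# Hypothetical monthly EVM snapshot per WBS element (values in $K).
# CV = BCWP - ACWP (cost variance); SV = BCWP - BCWS (schedule variance).
wbs_data = {
    "1.1 Air Vehicle": {"BCWS": 500.0, "BCWP": 450.0, "ACWP": 480.0},
    "1.2 Software":    {"BCWS": 300.0, "BCWP": 310.0, "ACWP": 290.0},
}

def variances(elem):
    """Return (cost variance, schedule variance) for one WBS element."""
    cv = elem["BCWP"] - elem["ACWP"]
    sv = elem["BCWP"] - elem["BCWS"]
    return (cv, sv)

# Roll the monthly status up element by element, as a WBS-keyed report.
for name, elem in wbs_data.items():
    cv, sv = variances(elem)
    print(f"{name}: CV={cv:+.1f}K SV={sv:+.1f}K")
```

A negative CV or SV flags an element that is over cost or behind schedule for the month; the WBS keys give the "consistent framework" the quote refers to.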

12 Management expectation for EVM to include measures of quality technical progress is reasonable
- GAO Cost Guide
- PMI PMBOK 10.5.1.1 Project Management Plan
- PMI PMBOK 5. Project Scope Management

13 OSD: "the left bar chart illustrates the fact that roughly half of our key Earned Value data fields are empty, for a variety of reasons"
Funding data quality: acceptable for critical measurements and decision-making. EVM data quality: unacceptable for critical measurements and decision-making.
The pie charts show the relative amount of missing (null) data. The bar charts with gradient-colored backgrounds show the statistical, empirically based analysis: unacceptable (red), satisfactory (yellow), and acceptable standards of data integrity relative to the nine core attributes (covering both Acquisition Program Baseline and Earned Value Management metrics) used to assess the overall health of portfolio performance measurements. The bar-chart area with the green gradient background (at the top) means that funding, programmatic, and EVM data in this area should be acceptable for critical decision-making. The bar charts also show, by color legend, each Service's performance in terms of data integrity.

14 OSD: "the left bar chart illustrates the fact that roughly half of our key Earned Value data fields are empty, for a variety of reasons"
Data quality and data integrity go hand in hand; thus this example stresses data integrity: the left bar chart illustrates that roughly half of the key Earned Value data fields are empty, for a variety of reasons. The right bar chart indicates "not reported" frequency, by category and percentage of counts, for each of five key EVM metrics (ACWP, BAC, BCWP, BCWS, PMEAC).
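A data quality screen like the one behind these charts can be sketched as a null-field scan over the five key metrics the slide names. The contract records and values below are hypothetical, used only to show the shape of the check.

```python
# Hypothetical contract performance records; None marks a field that was
# not reported, mirroring the "empty data fields" finding in the OSD charts.
records = [
    {"ACWP": 120.0, "BAC": 900.0, "BCWP": 110.0, "BCWS": 130.0, "PMEAC": None},
    {"ACWP": None,  "BAC": 900.0, "BCWP": None,  "BCWS": 130.0, "PMEAC": 950.0},
    {"ACWP": 125.0, "BAC": None,  "BCWP": 115.0, "BCWS": None,  "PMEAC": None},
]

# The five key EVM metrics called out on the slide.
KEY_FIELDS = ("ACWP", "BAC", "BCWP", "BCWS", "PMEAC")

def missing_rates(recs):
    """Percent of records in which each key EVM field was not reported."""
    n = len(recs)
    return {f: 100.0 * sum(r[f] is None for r in recs) / n for f in KEY_FIELDS}

rates = missing_rates(records)
```

Fields with a high missing rate would land in the "unacceptable for critical decision-making" band of the chart; a real screen would also validate ranges and cross-field consistency, not just nulls.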

15 A recent GAO report included a poor-quality-data finding on a major procurement
March 2008: DCMA determined that the data were of poor quality and issued a report stating that they are deficient to the point where the government is not obtaining useful program performance data to manage risks.

16 The EVM community needs to conduct a root cause analysis with corrective action to regain our customers' confidence
GAO, March 2008: DCMA determined that the data were of poor quality and issued a report stating that they are deficient to the point where the government is not obtaining useful program performance data to manage risks.
DCMA has significantly increased oversight with the intent to improve the usefulness of EVM to management

17 Let's summarize: EVM works, with numerous success stories
- Some implementations go beyond the ANSI 32 guidelines and have integrated quality and technical parameters
- Integration of scope, schedule, cost, quality, and technical measures is desired by our stakeholders using EVM data
- EVM data integrity is a major issue with OSD

18 DoD Policy and Guides Specify Technical Performance
- DoDI, Operation of the Defense Acquisition System (POL)
- Defense Acquisition Guidebook (DAG)
- Systems Engineering Plan (SEP) Preparation Guide
- WBS Handbook, MIL-HDBK-881A (WBS)
- Integrated Master Plan & Integrated Master Schedule Preparation & Use Guide (IMP/IMS)
- Guide for Integrating SE into DoD Acquisition Contracts (Integ SE)

19 DoD Policy and Guides: Common Elements
- Integrated plans: DAG, WBS, IMP/IMS, SEP
- Technical Performance Measures (TPM): WBS, SEP, IMP/IMS, EVM
- Technical reviews: event-driven timing, success criteria, assess technical maturity
- Integ SE Guide: include technical baselines in IMP/IMS; during IBR, review correlation of TPMs, IMP, IMS, and EVM

20 ANSI does not require technical or quality parameter measurement, only the "quantity of work accomplished"
ANSI/EIA 748 EVM Standard, Paragraph 3.8 – Performance Measurement:
"Earned value is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes. Earned value is a value added metric that is computed on the basis of the resources assigned to the completed work scope as budget."
Examples:
- If a test is complete (design meets the requirements), then it is acceptable to claim 100% earned value of the planned scope for "test"
- If software design, code, and test are complete, then it is acceptable to claim 100% earned value of the planned scope for "SW Development"
ANSI does not require links or interfaces to quality or technical parameter measurement processes
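The ANSI 748 notion of earned value as a direct measure of the "quantity of work accomplished" can be illustrated with a milestone-style (0/100) sketch: budget is earned only when a work package's scope is complete, and no quality or technical check enters the computation. The work packages and budgets below are hypothetical.

```python
# Hypothetical work packages with assigned budgets (values in $K).
# Under the ANSI 748 wording, value is earned from the budget assigned to
# the completed work scope; quality is controlled by other processes.
work_packages = [
    {"name": "SW design", "budget": 200.0, "complete": True},
    {"name": "SW code",   "budget": 300.0, "complete": True},
    {"name": "SW test",   "budget": 100.0, "complete": False},
]

def earned_value(packages):
    """BCWP as ANSI describes it: sum of budgets for completed scope only."""
    return sum(wp["budget"] for wp in packages if wp["complete"])

bcwp = earned_value(work_packages)  # 500.0 of the 600.0 total planned
```

Note what is absent: nothing in this computation asks whether the completed design or code actually meets its requirements, which is exactly the gap the presentation is highlighting.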

21 ANSI recognizes technical performance goals (but does not require them), and there are no references to quality parameters
ANSI/EIA 748 EVM Standard, Paragraph 2.2 Planning, Scheduling, and Budgeting:
a) Schedule the authorized work in a manner which describes the sequence of work and identifies significant task interdependencies required to meet the requirements of the program.
b) Identify physical products, milestones, technical performance goals, or other indicators that will be used to measure progress.
Note: Technical performance goals are acceptable, but not required

22 Let's summarize OMB, OSD policies and ANSI/EIA 748 as related to EVM, quality, and technical performance
OMB Guide:
- Measure capability to meet requirements
- Measure quality
OSD policies and guides:
- Integrate WBS, IMP/IMS, and EVM with TPMs
- Include success criteria of technical reviews in the IMS
- Assess technical maturity
ANSI/EIA 748 EVM Standard:
- EVM limited to "quantity of work accomplished"
- Technical performance goals recognized, but not required
- Quality performance is not referenced
- Quality and technical content are specifically referenced as being "controlled by other processes"

23 Is it possible to include quality and technical parameters with EVM?
FAA has incorporated standard program milestones with exit criteria that represent:
- Quality and technical parameters
- Decision authority to verify acceptable quality and technical performance progress

24 The FAA technical milestones (required by AMS policy) are defined with clear exit criteria and decision authority

25 The FAA AMS technical milestones are used by multiple processes to align common performance measures
- FID
- Earned Value Management
- OMB Exhibit 300 Report
- FAA Annual Performance Goals
- Systems Engineering
- FAA Functional WBS Mapping
- FAA Product-Oriented WBS
- Schedule Baseline
- Cost Baseline

26 FAA EVM summarizes the work and activities required to achieve the FAA AMS Standard Program Milestones

27 The FAA implementation of ANSI/EIA 748 includes clear and well-understood quality and technical parameters

28 Examples of integrating technical performance with EVM
Milestone success criteria met to claim 100% EV. For CDR, the design solution meets:
- Allocated performance requirements
- Functional performance requirements
- Interface requirements
Interim milestones with planned values for TPMs:
- Weight does not exceed 300 lb. at (date)
- 90% of software functional "shalls" met by (date)
Base EV on 2 measures:
- Completion of enabling work products (drawings, code)
- Meeting product requirements (as documented in the technical baseline)
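The "two measures" idea above can be sketched by splitting a work package's budget between completion of enabling work products and requirements verified against the technical baseline. The 50/50 weighting, the counts, and the budget below are hypothetical assumptions for illustration, not values from the slide.

```python
# Sketch of earning value from two measures, per the slide: enabling work
# products (drawings, code) plus product requirements met. The 50/50 split
# is an assumed weighting, not part of any standard.
def earned_value(budget, products_done, products_total,
                 reqs_met, reqs_total, product_weight=0.5):
    """Split a work package's budget between work-product completion
    and requirements verified against the technical baseline."""
    product_ev = product_weight * budget * products_done / products_total
    req_ev = (1.0 - product_weight) * budget * reqs_met / reqs_total
    return product_ev + req_ev

# 8 of 10 drawings released, but only 6 of 10 allocated requirements
# verified: EV is held below a simple work-product percent-complete.
ev = earned_value(1000.0, 8, 10, 6, 10)  # 400.0 + 300.0 = 700.0
```

Compared with the plain quantity-of-work computation, this variant withholds earned value until requirements are actually verified, which is how the technical baseline gets a direct voice in the EVM numbers.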

29 Some of the possible ANSI revisions are:

