1
PROGRAM SUCCESS – A NEW WAY TO PREDICT IT
John Higbee DAU 25 August 2003 (INTRODUCTION – PRIOR TO SHOWING SLIDE 1) GOOD MORNING (INTRODUCE SELF/ORG/TEAM MEMBERS). THANKS FOR HOSTING US HERE TODAY. WE’RE HERE TO WALK YOU THROUGH AN ASA(ALT) INITIATIVE DESIGNED TO: IMPROVE THE ARMY’S ABILITY TO ACCURATELY ASSESS A PROGRAM’S PROBABILITY OF SUCCESS, AND CLEARLY/CONCISELY REPRESENT THAT SUCCESS PROBABILITY TO ARMY LEADERSHIP
2
STARTING POINT Tasking From ASA(ALT) Claude Bolton (March 2002)
Despite Using All the Metrics Commonly Employed to Measure Cost, Schedule, Performance and Program Risk, There are Still Too Many Surprises (Poorly Performing/Failing Programs) Being Briefed “Real Time” to Army Senior Leadership DAU (with Industry Representatives) was Asked to: Identify a Comprehensive Method to Better Determine the Probability of Program Success Recommend a Concise “Program Success” Briefing Format for Use by Army Leadership
SECRETARY BOLTON DIRECTED THIS EFFORT AS A RESULT OF HIS CONCERN THAT, ALTHOUGH ARMY PROGRAMS WERE HEAVILY “MEASURED AND REPORTED”, HE REPEATEDLY SAW “BROKEN PROGRAMS” COME UP FOR HIS REVIEW, OR THAT OF USD(AT&L). HE ASKED DEFENSE ACQUISITION UNIVERSITY TO WORK WITH HIS STAFF TO DEVELOP: AN ACCURATE, COMPREHENSIVE METHOD OF ASSESSING A PROGRAM’S PROBABILITY OF SUCCESS, AND A PROCESS/BRIEFING PACKAGE THAT WOULD ALLOW THIS ASSESSMENT TO BE CLEARLY AND CONCISELY CONVEYED TO ARMY LEADERSHIP AS QUICKLY AS POSSIBLE ONCE DEVELOPED. DAU RECEIVED THIS TASKING LAST SPRING – DEVELOPED THE CONCEPT AND BRIEFED IT IN AUGUST AND OCTOBER 2002 TO SECRETARY BOLTON – AND AT HIS DIRECTION, INITIATED THE PILOT EFFORT. THIS INITIATIVE IS BEING PILOTED ON TWO ARMY PROGRAMS AT FT. MONMOUTH, WITH A DECISION ON IMPLEMENTATION PLANNED FOR LATE SPRING. IF IMPLEMENTED, THIS PROCESS WILL BE INCORPORATED INTO THE ARMY’S ACQUISITION INFORMATION MANAGEMENT (AIM) APPLICATION AS A SELECTABLE OPTION – ALLOWING PROGRAM MANAGERS TO ENTER DATA USED IN MULTIPLE REPORTS ONLY ONCE. IN PARALLEL, REQUIRED REPORTS WILL BE EXAMINED WITH THE OBJECTIVE OF ELIMINATING THOSE THAT ARE REDUNDANT.
3
PROCESS PREMISE Current Classical Internal Factors for Cost, Schedule, Performance and Risk (Largely Within the Control of the Program Manager) Provide an Important Part of Program Success Picture – But NOT the WHOLE Picture Program Success also Depends on External Factors (Largely Not Within the PM’s Control, but That the PM Can Influence By Informing/Using Service/OSD Senior Leadership) Accurate Assessment of Program Success Requires a Holistic Combination of Internal and External Factors Internal: Requirements, Resources, and Execution External: Fit in the Vision, and Advocacy Develop An Assessment Model/Process Using Selected Metrics For Each Factor - Providing an Accurate “Program Pulse Check” Avoiding The “Bury In Data” Technique DAU ASSEMBLED A TEAM WITH EXPERIENCE AS GOVERNMENT AND INDUSTRY PROGRAM MANAGERS TO ATTACK SECRETARY BOLTON’S CHALLENGE. AS THE TEAM EXPLORED PROGRAM PERFORMANCE (BOTH SUCCESSFUL AND OTHERWISE) IT BECAME APPARENT THAT ULTIMATE PROGRAMMATIC SUCCESS DEPENDED ON MORE THAN JUST SUCCESSFUL MANAGEMENT OF COST, PERFORMANCE AND SCHEDULE RISK. THE TEAM SAW MULTIPLE EXAMPLES OF SUCCESSFULLY MANAGED PROGRAMS THAT LOST RESOURCES AND PRIORITY, AS WELL AS PROGRAMS IN TROUBLE THAT SUSTAINED (AND EVEN GAINED) RESOURCES AND PRIORITY. THE COMMON FACTOR IN THESE CASES WAS THE STRENGTH OF FACTORS EXTERNAL TO THE PROGRAM (SUCH AS HOW WELL THE PROGRAM FIT IN THE CAPABILITY VISION OF THE PARENT SERVICE AND DOD; AND THE STRENGTH OF PROGRAM ADVOCACY ON THE PART OF THE DECISION MAKERS CONTROLLING RESOURCES AND PROGRAM PRIORITY). THESE ARE FACTORS THAT HAVE NOT BEEN FORMALLY/OVERTLY INCLUDED IN MOST PROGRAM ASSESSMENT SCHEMES. THE TEAM’S ANALYSIS OF THE CHALLENGE AND ITS ROOT CAUSES PRODUCED THE FOLLOWING CONCLUSIONS: Current Classical Internal Factors for Cost, Schedule, Performance and Risk (The Metrics Universally Used by the Acquisition Leadership, and Largely Within the Control of the Program Manager) Provide an Important Part of the Program Success Picture – But NOT the WHOLE Picture Program Success also Depends on External Factors (Factors Largely Not Within the PM’s Control, but that the PM Can Influence By Informing/Using Service/OSD Senior Leadership) A COMPREHENSIVE ASSESSMENT OF PROGRAM SUCCESS REQUIRES A HOLISTIC COMBINATION OF INTERNAL AND EXTERNAL FACTORS Internal: Requirements, Resources, and Execution External: Fit in the Vision, and Advocacy WE THEN SELECTED LEVEL 2 METRICS (METRICS THAT IN AGGREGATE CREATE THE LEVEL 1 FACTOR ASSESSMENT) TO PROVIDE A “PULSE CHECK” ON THE PROGRAM AS EFFICIENTLY AS POSSIBLE (AVOID THE “BURY IN DATA” TECHNIQUE)
4
BRIEFING PREMISE Significant Challenge – Develop a Briefing Format That Conveyed Program Assessment Process Results Concisely/Effectively Was Consistent Across Army Acquisition Selected Briefing Format: Uses A Summary Display Organized Similarly to a Work Breakdown Structure Program Success (Level 0); Factors (Level 1); Metrics (Level 2) Relies On Information Keyed With Colors And Symbols, Rather Than Dense Word/Number Slides Easier To Absorb Minimizes Number of Slides More Efficient Use Of Leadership’s Time HAVING ESTABLISHED THE ASSESSMENT MODEL, THE NEXT STEP WAS TO CREATE A PROCESS AND A BRIEFING FORMAT THAT WOULD ALLOW ARMY ACQUISITION LEADERSHIP TO RAPIDLY/EFFICIENTLY UNDERSTAND THE STATE OF THE PROGRAM. TO DO THIS, WE DECIDED TO: LIMIT THE NUMBER OF SLIDES (MORE EFFICIENT USE OF THE LEADERSHIP’S TIME) RELY ON PICTURES, RATHER THAN DENSE WORD/NUMBER SLIDES (PICTURES ARE EASIER TO ABSORB)
5
PROGRAM SUCCESS PROBABILITY SUMMARY
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy Program Success (2) Program “Smart Charts”
INTERNAL FACTORS/METRICS:
Program Requirements (3): Program Parameter Status (3); Program Scope Evolution
Program Resources: Budget; Manning; Contractor Health (2)
Program Execution: Contract Earned Value Metrics (3); Contractor Performance (2); Fixed Price Performance (3); Program Risk Assessment (5); Sustainability Risk Assessment (3); Testing Status (2); Technical Maturity (3)
EXTERNAL FACTORS/METRICS:
Program “Fit” in Capability Vision (2): DoD Vision (2) – Transformation (2), Interoperability (3), Joint (3); Army Vision (4) – Current Force (4), Stryker Force, Future Force, Other
Program Advocacy: OSD (2); Joint Staff (2); War Fighter (4); Army Secretariat; Congressional; Industry (3); International (3)
Legends: Colors: G: On Track, No/Minor Issues; Y: On Track, Significant Issues; R: Off Track, Major Issues; Gray: Not Rated/Not Applicable. Trends: Up Arrow: Situation Improving; (number): Situation Stable (for # Reporting Periods); Down Arrow: Situation Deteriorating
Program Life Cycle Phase: ___________
THE TEAM SPENT A CONSIDERABLE AMOUNT OF TIME EVALUATING CANDIDATE LEVEL 2 METRICS, THEN DETERMINING UNDER WHICH LEVEL 1 FACTOR THEY SHOULD BE PLACED. ONCE THIS WAS DONE, THE WORK BREAKDOWN STRUCTURE (WBS) FORMAT WAS SELECTED AS THE DESIGN FOR THE SUMMARY (OR “WINDSHIELD”) CHART: ALL THE LEVEL 1 FACTORS AND LEVEL 2 METRICS ARE DISPLAYED. EACH FACTOR/METRIC HAS BOTH A STATUS (INDICATED BY COLOR – RED, YELLOW OR GREEN) AND A TREND (INDICATED BY ARROW OR THE NUMBER OF REPORTING PERIODS IT HAS REMAINED STABLE). EACH LEVEL 2 METRIC HAS A CHART DISCUSSING ITS STATUS AND TREND – HOWEVER, THE PROCESS IS DESIGNED TO ALLOW THE LEADERSHIP TO QUICKLY FOCUS ON SPECIFIC AREAS OF INTEREST BELOW THE SUMMARY CHART (USUALLY, LEVEL 1 FACTORS AND/OR LEVEL 2 METRICS THAT SHOW RED, OR YELLOW WITH A “DOWN ARROW”) – THUS ALLOWING THE TYPICAL PROGRAM TO BE PRESENTED IN LESS THAN FIVE SLIDES. WE’LL NOW PROCEED TO DISCUSS THE LEVEL 2 METRICS CHARTS
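The windshield chart is effectively a three-level work breakdown: Program Success (Level 0), five Level 1 factors, and their Level 2 metrics, each carrying a color status and a trend. A minimal sketch of that structure follows (Python; the maximum point values come from the metric-calculation notes on the later slides, and the class and field names are illustrative assumptions, not an official schema):

```python
# Minimal sketch of the Level 0/1/2 "windshield" structure described in the
# briefing. Names and defaults are illustrative; max_points values come from
# the metric-calculation notes on the later slides.
from dataclasses import dataclass, field

@dataclass
class Metric:                    # Level 2
    name: str
    max_points: float
    score: float = 0.0           # current assessed value, 0..max_points
    trend: str = "(1)"           # up/down arrow, or "(n)" = stable n periods

@dataclass
class Factor:                    # Level 1
    name: str
    metrics: list = field(default_factory=list)

    def score(self) -> float:    # Level 1 value = sum of its Level 2 metrics
        return sum(m.score for m in self.metrics)

requirements = Factor("Program Requirements", [
    Metric("Program Parameter Status", 10),
    Metric("Program Scope Evolution", 10),
])
resources = Factor("Program Resources", [
    Metric("Budget", 14), Metric("Manning", 3), Metric("Contractor Health", 3),
])
print(requirements.name, sum(m.max_points for m in requirements.metrics))  # 20
```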
6
2d Gen FLIR UNCLASSIFIED MAJ Ron Jacobs, SAAL-SA, COM 703-604-7018
February 4, 2002 UNCLASSIFIED Mr. Greg Wade, DAPR-FDM, COM
Program Description – Mission: Provides enhanced capability to fight during periods of reduced visibility. Also provides multiple platforms the capability to "see farther than they can shoot" and "see the same battle space".
Characteristics/Description: Common "B kit" for LRAS3, Abrams and Bradley; unique "A kit" for each sight; FOV: Narrow 2 degrees x 3.6 degrees, Wide 7.5 degrees x 13.3 degrees
Capability/Improvements: > range over 1st Gen FLIR; +55% target detection; +70% target recognition; +150% target identification
Special Features: Digital output; adapts to different platforms; 2X + 4X electronic zoom; improved displays
Contractors: Raytheon, Dallas, TX – manufactures LRAS3, CITV, B Kit and related spares (FY02 $63.6M + TBD); DRS, Palm Bay, FL – manufactures LRAS3, TIS, B Kit and related spares (FY02 $37.2M + TBD)
Program Funding / Current Status:
• COST: RDTE $208.9M; PROC $1,609.0M
• SCHEDULE: On schedule
• TECHNICAL: Meeting technical requirements
• FIELDING: ABRAMS 1CD (3QFY02); BRADLEY 1CD (2QFY02); LRAS3 4ID(M) (3QFY02), 1ST IBCT (4QFY02), 1CD (1QFY03)
• FUNDING (FY02 / FY03): RDTE $0.8M / $0.0M (LRAS3 FBCB2 Interface); PROC $168.6M / $133.5M (B Kits and Sights for Abrams, Bradley and LRAS3); QTY 620 / 445
• ISSUES: None
• TRANSFORMATION CAMPAIGN PLAN: This program supports the Legacy-to-Objective transition path of the Transformation Campaign Plan (TCP).
(SHOW CHARTS FOUR AND FIVE) ON THE WINDSHIELD CHART, YOU’LL NOTE A ‘CHART BUBBLE’ NAMED “SMART CHARTS”. THESE CHARTS, ALREADY AVAILABLE AS AN “AIM VIEW”, ARE STAGED TO ALLOW THE PERSON BEING BRIEFED TO QUICKLY REVIEW THE BASIC PROGRAM INFORMATION IF HE OR SHE IS NOT FAMILIAR WITH THE PROGRAM ALREADY (IF A REFRESHER IS NOT NEEDED, THESE CHARTS WOULD BE SKIPPED)
7
2d Gen FLIR UNCLASSIFIED Congressional / OSD Issues 2nd Gen FLIR
Mr. Greg Wade, DAPR-FDM, COM February 4, 2002 MAJ Ron Jacobs, SAAL-SA, COM
Congressional / OSD Issues – 2nd Gen FLIR: None.
Requirements and Unit Costs – FY02 Congressional Track: No language (SASC, SAC, or Conference).
8
Program Acronym ACAT XX REQUIREMENTS - PROGRAM PARAMETER STATUS
PEO XXX Program Acronym ACAT XX REQUIREMENTS - PROGRAM PARAMETER STATUS COL, PM Date of Review: dd mmm yy
(EXAMPLES) Parameters shown as diamonds positioned along Threshold–Objective bars: Combat Capability; C4I Interoperability (Strategic, Theater, Force Coord., Force Control, Fire Control); Cost; Manning (Non-KPP); Sustained Speed; Endurance. Position diamond along bar to best show where each item is in terms of its threshold - objective range. Status as of Last Brief (mm/yy – e.g. “01/03”)
(SHOW SLIDE 6) WE’LL START WITH THE LEVEL 1 REQUIREMENTS FACTOR. THIS FACTOR HAS TWO LEVEL 2 METRICS – THE FIRST OF WHICH IS PROGRAM PERFORMANCE STATUS. THIS METRIC IS DESIGNED TO EVALUATE THE PROGRAM STATUS IN MEETING THE PERFORMANCE LEVELS MANDATED BY THE WARFIGHTERS. A SERIES OF VERNIER BARS IS PROVIDED, WITH THRESHOLD AND OBJECTIVE LEVELS INDICATED BY LINES ACROSS THE VERNIERS. A COLORED DIAMOND IS PLACED ON THE VERNIER AT THE POINT IN THE THRESHOLD-OBJECTIVE “TRADE SPACE” THAT REPRESENTS THE CURRENT STATE OF THAT PERFORMANCE PARAMETER: RED IF BELOW THRESHOLD; YELLOW IF ON THRESHOLD; GREEN IF ABOVE THRESHOLD; BLUE IF AT OR ABOVE THE OBJECTIVE. POSITION IS SET BY LATEST TEST RESULTS; OR, IF NO TEST RESULTS ARE AVAILABLE, LATEST MODELLING/SIMULATION RESULTS IN THE AREA. PERFORMANCE PARAMETERS ARE SELECTABLE AT THE DISCRETION OF THE PM: WILL USUALLY CONTAIN ALL KEY PERFORMANCE PARAMETERS; CAN INCLUDE NON-KPPS IF PM BELIEVES IT IMPORTANT TO INCLUDE THEM; WHATEVER IS SELECTED SHOULD BE KEPT TO ONE CHART. UNCOLORED DIAMONDS (LABELED WITH MONTH/YEAR OF ASSESSMENT) CAN BE INCLUDED TO PROVIDE A TREND HISTORY FOR EACH PARAMETER
Program Performance Status Metric Calculation (maximum value is 10 points): Green (8 to 10): Performance Requirements are clearly understood; are well-managed by warfighter and are being well-realized by Program Manager. All KPP/selected non-KPP threshold values are met by latest testing results (or latest analysis if testing has not occurred) Yellow (6 to <8): Requirements are understood but are in flux (emergent changes from warfighter); warfighter management and/or PM execution of requirements has created some impact to original requirements set (set descope, or modification to original Objective/threshold values has/is occurring). One or more KPP/selected non-KPPs are below threshold values in pre-Operational Assessment testing (or analysis if OA testing has not occurred) Red (<6): “Killer Blow”, OR Requirements flux/”creep” has resulted in significant real-time changes to program plan requiring program rebaselining/restructure; One or more KPP/selected non-KPPs are below threshold values as evaluated during OA/OPEVAL testing
Comments: Y Predictive Y(3) Historical
9
REQUIREMENTS - PROGRAM SCOPE EVOLUTION
Program Acronym ACAT XX PEO XXX REQUIREMENTS - PROGRAM SCOPE EVOLUTION COL, PM Date of Review: dd mmm yy
Columns: Requirement | Funded Pgm (Budgeted/Obl) | Schedule, CE to FUE (Used / Planned)
Original ORD (date): $#.#B / NA | NA / 120 Months
Current ORD (date): $#.#B / $#.#B | 170 / 210 Months
Requirement trend: Stable / Increased / Descoped
THE SECOND LEVEL 2 METRIC UNDER REQUIREMENTS IS PROGRAM SCOPE EVOLUTION. THIS METRIC IS DESIGNED TO ILLUSTRATE THE DEGREE OF PROGRAM RISK INHERENT IN OVERALL PROGRAM SCOPE GROWTH, FROM THE TIME (PRE-PROGRAM INITIATION) WHERE PROGRAM SCOPE WAS FIRST DETERMINED, TO THE PRESENT. A PROGRAM RISK FACTOR/METRIC CAN GENERALLY BE CONSIDERED AS THE RECIPROCAL VALUE OF THE CORRESPONDING PROGRAM SUCCESS FACTOR/METRIC. TO DO THIS, IT’S IMPORTANT TO NOTE THAT THE “ORIGINAL” DATA FOR COST AND SCHEDULE COMES, NOT FROM THE INITIAL ACQUISITION PROGRAM BASELINE (APB), BUT FROM THE STUDIES (ANALYSES OF ALTERNATIVES, CAIG AND/OR ICE TEAMS) THAT WERE DONE TO BOUND THE PROGRAM PRIOR TO PROGRAM INITIATION. THE REQUIREMENTS DATA FOR THE “ORIGINAL” LINE WILL BE TAKEN FROM THE ORIGINAL ORD. DATA FOR THE “CURRENT” LINE CAN BE TAKEN FROM MAPR/SMARTCHART DATA. UNDER THE “ORD” ENTRY, INDICATE HOW THE REQUIREMENTS SET HAS CHANGED SINCE THE REQUIREMENT WAS INITIALLY APPROVED (GROWN, DESCOPED, OR STABLE)
Program Scope Evolution metric calculation (maximum value is 10 points) Green (8 to 10): Program is being executed as originally scoped in “clean sheet of paper” analyses (original scoping of program accurate; minor changes (< 5-10% of original values) only since program initiation) Yellow (6 to <8): Program is executing with some changes from “clean sheet of paper” scopings: requirements, cost, and/or schedule have been changed (by > 10% of original values, in the direction of higher risk) without corresponding adjustment/infusion of resources to mitigate risk Red (<6): Program is executing with significant changes from “clean sheet of paper” scopings: requirements, cost, and/or schedule have been changed (by >15% of original values, in the direction of higher risk), without corresponding adjustment/infusion of resources to mitigate risk, OR Program is in an APB breach/Nunn-McCurdy breach status which has not been resolved.
Level 1 Reqm’ts Factor Calculation (max value 20 points) = value (ORD KPP compliance) + value (prog scope evolution): Green (16 to 20) Yellow (12 to <16) Red (<12, or “Killer Blow” in either Level 2 subfactor) The two Level 2 metrics are weighted equally (each has a maximum value of 10 points)
Comments: Y Predictive Y Historical
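Read literally, the Level 1 Requirements roll-up is just the sum of its two equally weighted Level 2 metrics, banded at 16 and 12 points with a killer-blow override. A hedged sketch (function and argument names are illustrative):

```python
def requirements_factor(param_status_pts: float, scope_evolution_pts: float,
                        killer_blow: bool = False) -> tuple[float, str]:
    """Level 1 Requirements factor (max 20): Program Parameter Status (0-10)
    plus Program Scope Evolution (0-10), banded per the briefing."""
    total = param_status_pts + scope_evolution_pts
    if killer_blow or total < 12:
        return total, "RED"
    if total < 16:
        return total, "YELLOW"
    return total, "GREEN"

# Example: parameter status scored 7 (yellow band), scope evolution scored 8
print(requirements_factor(7, 8))   # -> (15, 'YELLOW')
```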
10
Program Acronym ACAT XX Date of Review: dd mmm yy
RESOURCES - BUDGET PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
Army Goals (Obl/Exp), First Year / Second Year / Third Year – RDT&E,A: %/58%, %/91%; OP,A: %/ , %/ , %/---; OM,A
Budget table (rows: RDT&E,A; OPA; APA; WPA; O&M,A; MILCON – columns: FY01 OBL/EXP, FY02, FY03 OBL/EXP, FY04 through FY09; overall SUFF R/Y/G rating at right; example cell: Xx%/yy%; OPA: N/A)
WE’LL NOW PROCEED TO THE SECOND OF THE LEVEL 1 FACTORS: RESOURCES. AFTER CONSIDERATION OF MANY DIFFERENT LEVEL 2 METRICS FOR THE RESOURCES FACTOR, WE DECIDED TO USE THREE – THE STATUS OF THE BUDGET; THE STAFFING/HEALTH OF THE GOVERNMENT PROGRAM OFFICE; AND THE STAFFING/HEALTH OF THE CONTRACTOR PROGRAM TEAM. THE BUDGET METRIC IS DESIGNED TO SHOW THE DEGREE OF RISK INHERENT IN THE CURRENT STATE OF THE BUDGET (BOTH IN CURRENT EXECUTION, AND LOOKING FORWARD THROUGH THE FYDP). IT IS SIMILAR IN MOST RESPECTS TO TYPICAL BUDGET STATUS CHARTS USED IN PROGRAM REVIEWS. IT INCLUDES A SECTION FOR FUNDS “CURRENT FOR OBLIGATION AND EXPENDITURE”, AS WELL AS A SECTION FOR THE PROGRAM FYDP CONTAINED IN THE LATEST PRESIDENT’S BUDGET. WHERE THIS METRIC DEPARTS FROM THE TYPICAL PROGRAM CHART IS IN THE EVALUATION OF BUDGET SUFFICIENCY FOR EACH PROGRAM APPROPRIATION. SUFFICIENCY IS DEFINED AS THE DEGREE TO WHICH THE AMOUNT AND PHASING OF EACH APPROPRIATION FOR A PROGRAM RETIRES PROGRAMMATIC RISK. HIGH SUFFICIENCY EQUATES TO LOW BUDGETARY RISK, AND VICE VERSA. THIS EVALUATION IS REPRESENTED BY A COLOR CODE BEHIND EACH BUDGET NUMBER IN THE TABLE (GREEN FOR HIGH SUFFICIENCY; YELLOW FOR MODERATE SUFFICIENCY; RED FOR LOW). A COLUMN FOR EACH APPROPRIATION’S OVERALL SUFFICIENCY VALUE IS PROVIDED ON THE RIGHT HAND SIDE OF THE CHART.
Budget metric calculation (maximum value is 14 points) Green (11 to 14): Budget is sufficient to allow approved program to be executed with low risk. No more than one overall sufficiency “yellow” rating across all appropriations, across FYDP Yellow (8 to <11): Budget is sufficient to allow program to be executed with moderate risk. No more than two overall sufficiency “yellow” ratings across all appropriations, across FYDP Red (<8, or killer blow): Budget is insufficient to allow program to be executed without high risk. Three or more overall sufficiency “yellow” ratings and/or one or more overall sufficiency “red” ratings across all appropriations, across FYDP
G Predictive G Historical Comments:
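The budget color is driven by counts of the per-appropriation overall sufficiency ratings across the FYDP. A sketch of that counting rule as stated (the translation of each color band into a specific point value within the 0-14 range is left out, since the briefing gives only the band edges):

```python
def budget_metric_color(sufficiency_ratings: list[str],
                        killer_blow: bool = False) -> str:
    """Budget Level 2 metric color (max 14 points). Each entry is the overall
    sufficiency rating ("GREEN"/"YELLOW"/"RED") of one appropriation across
    the FYDP."""
    yellows = sufficiency_ratings.count("YELLOW")
    reds = sufficiency_ratings.count("RED")
    if killer_blow or reds >= 1 or yellows >= 3:
        return "RED"        # briefing band: <8 points
    if yellows == 2:
        return "YELLOW"     # briefing band: 8 to <11 points
    return "GREEN"          # briefing band: 11 to 14 points

print(budget_metric_color(["GREEN", "YELLOW", "GREEN", "GREEN"]))  # -> 'GREEN'
```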
11
Program Acronym ACAT XX Date of Review: dd mmm yy
PEO XXX Program Acronym ACAT XX RESOURCES - MANNING COL, PM Date of Review: dd mmm yy
PERSONNEL RESOURCES ARE CRITICAL TO THE ABILITY OF ANY PROGRAM TO EXECUTE ITS RESPONSIBILITIES… THIS IS WHY THE PERSONNEL STATUS OF THE GOVERNMENTAL AND CONTRACTOR PARTS OF THE PROGRAM TEAM WAS SELECTED AS THE REMAINING LEVEL TWO METRICS FOR RESOURCES. GOVERNMENTAL MANNING IS INTENDED TO SHOW SEVERAL KEY ASPECTS OF PROGRAM OFFICE STAFFING STATUS: CIVILIAN, MILITARY, MATRIXED AND CSS ASSET STATUSES ARE SHOWN SEPARATELY BUT ON THE SAME CHART, AS BUNDLED STOVEPIPES. THE CURRENT (RIGHTMOST) AND PREVIOUS FIVE STATUS BARS SHOULD BE KEPT AND DISPLAYED. ALL BILLETS BELONGING TO THE PROGRAM OFFICE SHOULD BE ACCOUNTED FOR ACROSS THE CATEGORIES. IF ANY CATEGORY(IES) HAVE VACANT BUT FUNDED/AUTHORIZED BILLETS ABOVE THE CURRENT MANNED COUNT, THE PM CAN (IF DESIRED) INDICATE THIS BY AN UN-COLORED EXTENSION OF THE COLUMN TO THE LEVEL OF THE TOTAL (MANNED + VACANT BILLETS). SPECIFIC PERSONNEL ISSUES IMPACTING THE PROGRAM’S ABILITY TO SUCCESSFULLY EXECUTE THE PROGRAM (I.E. WHAT KEY SPECIALTIES ARE MISSING; WHAT KEY BILLETS ARE UNFILLED/ABOUT TO BE VACATED) SHOULD BE HIGHLIGHTED IN THE COMMENTS SECTION BENEATH THE GRAPHIC.
Manning and Qualification metric calculation (maximum value is 3 points) Green (2 to 3): 90% or above of all Program Office authorized/funded billets are filled; 90% (or more) of all DAWIA-qualified billets are filled with personnel possessing at least the required qualification level. CSS personnel funding levels below Congressionally-mandated limits Yellow (1 to <2): 80% - 89% of all Program Office authorized/funded billets are filled; 80% - 89% of all DAWIA-qualified billets are filled with personnel possessing at least the required qualification level. CSS personnel funding levels at or below Congressionally-mandated limits Red (<1): <80% of all Program Office authorized/funded billets are filled; <80% of all DAWIA-qualified billets are filled with personnel possessing at least the required qualification level. CSS personnel funding levels above Congressionally-mandated limits
(Chart columns: OCT 00, MAR 01, OCT 01, MAR 02, OCT 02, MAR 03)
Comments: What Key Billets are Vacant? DPM Billet Still Vacant (Estimate Fill in Two Months); Lead Software Engineer (Emergent Loss) – Tech Director Filling In; Need S/W Experienced GS-14 ASAP. Is the Program Office Adequately Staffed? Yes (except as noted above) G Predictive G Historical
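The manning metric comes down to two fill percentages plus the CSS funding check. A simplified sketch under the stated thresholds (the strictly-below versus at-or-below distinction on CSS funding is collapsed into one flag here):

```python
def manning_metric_color(billets_filled_pct: float, dawia_qualified_pct: float,
                         css_over_limit: bool = False) -> str:
    """Government manning/qualification Level 2 metric (max 3 points).
    css_over_limit: CSS personnel funding above Congressionally mandated limits."""
    if css_over_limit or billets_filled_pct < 80 or dawia_qualified_pct < 80:
        return "RED"        # <1 point
    if billets_filled_pct >= 90 and dawia_qualified_pct >= 90:
        return "GREEN"      # 2 to 3 points
    return "YELLOW"         # 1 to <2 points

print(manning_metric_color(92, 85))   # -> 'YELLOW'
```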
12
RESOURCES – CONTRACTOR HEALTH
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
Corporate Indicators – Company/Group Metrics: Current Stock P/E Ratio; Last Stock Dividends Declared/Passed; Industrial Base Status (Only Player? One of __ Viable Competitors?); Market Share in Program Area, and Trend (over last Five Years); Significant Events (Mergers/Acquisitions/“Distractors”)
Program Indicators – Program-Specific Metrics: “Program Fit” in Company/Group; Program ROI (if available); Key Players, Phone Numbers, and their Experience; Program Manning/Issues; Contractor Facilities/Issues; Key Skills Certification Status (e.g. ISO 9000/CMM Level); PM Evaluation of Contractor Commitment to Program (High, Med, or Low)
IN ORDER TO EFFECTIVELY PARTNER WITH INDUSTRY, THE PROGRAM MANAGER HAS TO UNDERSTAND WHAT IS TRULY IMPORTANT TO HIS INDUSTRY PARTNER. THIS METRIC (CONTRACTOR HEALTH) PROVIDES AN EVALUATION OF THE STATE OF THE CONTRACTOR’S BUSINESS, AND HIS TEAM, TO THE PM, THE PEO AND THE SAE. THE METRIC IS BROKEN INTO TWO AREAS. THE FIRST, CORPORATE INDICATORS, IDENTIFIES SOME OF THE MORE IMPORTANT METRICS (PRICE TO EARNINGS RATIO; HISTORY OF DIVIDENDS) THE COMMERCIAL WORLD USES TO EVALUATE CONTRACTOR HEALTH. OTHERS, LIKE BACKLOG, COULD BE SUBSTITUTED AT THE PM’S DISCRETION. ADDITIONALLY, THE COMPANY’S STATUS IN THE DEFENSE INDUSTRIAL BASE FOR THE PARTICULAR PROGRAM AREA, AND ANY SIGNIFICANT EVENTS WITH COMPANY-WIDE IMPACT, ARE DISCUSSED. FINANCIAL DATA FOR THE CONTRACTOR CAN BE FOUND ON THE SEC WEBSITE (START AT , AND SEARCH USING THE COMPANY’S NAME – ONCE THE REPORTS ARE CALLED UP, ACCESS THE COMPANY’S REPORT 10-K (PART II, SECTION 6) – THIS WILL PROVIDE THE P/E RATIO AND DIVIDEND DATA (AND A HOST OF OTHER USEFUL FINANCIAL METRICS)). THE SECOND, PROGRAM INDICATORS, SPEAKS SPECIFICALLY TO THE ASSIGNED PROGRAM/PROJECT TEAM. THIS PORTION OF THE METRIC PROVIDES AN EVALUATION OF HOW WELL THE CONTRACTOR HAS SET UP THE PROGRAM TEAM EXECUTING THE PROGRAM, ALONG WITH ANY SIGNIFICANT ISSUES AND CHALLENGES FACED BY THE CONTRACTOR. DATA FOR THIS CHART SHOULD BE DEVELOPED IN CONJUNCTION WITH THE CONTRACTOR AND THE ASSIGNED DCMA ORGANIZATION – BUT THE PM HAS RESPONSIBILITY (AND FINAL SAY) ON THE EVALUATION COLOR AND TREND DIRECTION ASSIGNED. DATA FOR PRIME AND KEY SUBCONTRACTORS SHOULD BE PROVIDED.
OVERALL RATING SHOULD REFLECT THE AGGREGATE STATE OF THE CONTRACTOR/SUBCONTRACTOR TEAM Resources and Health - Contractor metric calculation (maximum value is 3 points) Green (2 to 3): No significant corporate / group issues affecting program; program is aligned with core business of business unit; program is properly staffed (team and key personnel); contractor facilities have no significant issues; corporate management demonstrates high commitment to program Yellow (1 to <2): Some corporate / group issues affecting program; program is peripheral to core business of business unit; program has some manning issues (team and/or key personnel) which are affecting program execution; contractor facilities have some issues affecting program execution; corporate management demonstrates moderate commitment to program Red (<1): Major corporate / group issues affecting program; program is not aligned with core business of business unit; program has significant manning issues (team and/or key personnel) which impede program execution; contractor facilities have major issues which impede program execution; corporate management demonstrates low commitment to program Level 1 Resources Factor Calculation (max value 20 points) = value (Budget) + value (gov’t manning/qual) + value (Ktr health): Green (16 to 20); Yellow (12 to <16); Red (<12 or “Killer Blow” in any Level 2 subfactor) The three Level 2 factors are weighted as follows: Budget (max value 14); Gov’t manning and qual (max value 3); Ktr health (max value 3) Y Predictive Y(2) Historical
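Because the three Resources metrics carry unequal maxima (Budget 14, government manning 3, contractor health 3), the Level 1 roll-up is a straight sum against the 16/12 bands. A sketch as stated:

```python
def resources_factor(budget_pts: float, manning_pts: float,
                     ktr_health_pts: float, killer_blow: bool = False):
    """Level 1 Resources factor (max 20 points): Budget (14) + Gov't
    manning/qual (3) + Contractor health (3), banded per the briefing."""
    total = budget_pts + manning_pts + ktr_health_pts
    if killer_blow or total < 12:
        return total, "RED"
    if total < 16:
        return total, "YELLOW"
    return total, "GREEN"

print(resources_factor(11, 2, 2))   # -> (15, 'YELLOW')
```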
13
EXECUTION – CONTRACT EARNED VALUE METRICS [give short contract title]
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy Axxxxx-YY-Cxxxx Contractor Name [Prime or Significant Sub]
[Chart: earned value “bulls-eye” plot of CPI (y-axis) versus SPI (x-axis), centered on (1.0, 1.0) with the “Ops Box” cornered at (0.95, 0.95) and (1.1, 1.1); monthly operating points from 04/99 through 03/02 joined by arrows; quadrant labels “Behind Schedule and Underspent” / “Ahead of Schedule and Underspent”; margin scales for total calendar schedule and % spent; annotations include TCPI(EAC) = 0.76, CV = $2.0M, SV = $2.9M, EAC $110M, TAB $100M, BAC $90M, and the PM’s projected performance at completion for CPI and duration]
THE THIRD LEVEL 1 FACTOR IS PROGRAMMATIC EXECUTION, FOCUSING ON CONTRACT, TECHNICAL, AND RISK MANAGEMENT EFFORTS. LEVEL 2 METRICS FOR EXECUTION FOCUS IN THESE AREAS. THE FIRST LEVEL 2 METRIC IN THIS AREA LAYS OUT COST-PLUS CONTRACT PERFORMANCE FROM AN EARNED VALUE PERSPECTIVE. IT DEPARTS FROM TYPICAL DEPICTIONS OF PROGRAM EV DATA TO CREATE A “BULLS-EYE VIEW” OF PROGRAM EV PERFORMANCE AROUND A DESIRED EV “OPS POINT”. A CONTRACT PERFORMANCE CHART SHOULD BE CONSTRUCTED FOR EACH OF THE MAJOR DEVELOPMENTAL CONTRACTS SUPERVISED BY THE PROGRAM OFFICE. THE CHART IS CENTERED AT AN EV VALUE (SPI/CPI) OF (1.0, 1.0). ADDITIONALLY, AN “OPS BOX” HAS BEEN ESTABLISHED (SPI AND CPI WITHIN A RANGE OF 0.95 TO 1.1). IF THE PROGRAM EV OPERATING POINT IS WITHIN THE OPS BOX, PROGRAM EV STATE IS CONSIDERED TO BE SATISFACTORY. CONTRACT EV OPERATING POINTS FOR THE HISTORY OF THE CONTRACT ARE PLOTTED BY (SPI, CPI). THE LAST FIVE OPS POINTS AND THE CURRENT OPS POINT ARE BOLDED AND JOINED BY ARROWS TO SHOW THE “DIRECTION OF MOTION” OF CONTRACT EV. IF THE PROGRAM EV OP POINT PLOTS OUTSIDE THE “OPS BOX”, THE REASON SHOULD BE SUMMARIZED IN A TEXT CALLOUT BOX LINKED TO THE OP POINT. SCHEDULE AND BUDGET EXPENDITURES SHOULD BE PLOTTED IN THE APPROPRIATE MARGIN BARS OUTSIDE THE MAIN SPI/CPI GRAPH. OTHER SIGNIFICANT CONTRACT/EV DATA (E.G. EAC, BAC, BCWS) SHOULD BE PROVIDED AROUND THE PERIMETER OF THE “BULLS-EYE” CHART
DEFINITIONS: Terms used in this chart (e.g. CPI, SPI, EAC, ACWP, etc) are standard EVMS terms. “OPS Box” - the box offset around the SPI/CPI point of (1.0, 1.0). Program SPI/CPI points within this box (SPI values between 0.95 and 1.1; CPI values between 0.95 and 1.1) indicate satisfactory program EVMS status from the ASA(ALT) perspective. SPI/CPI points for the last six EV reports should be plotted, and joined with arrows showing the “direction of motion” of SPI/CPI. Data points are plotted using SPI/CPI as the x,y coordinates for each point. Earlier points may be left on the plot (gray toned) to display EV history as desired. The “Ops Box” (the offset box around the SPI/CPI “origin” point) is considered the normal operating region for Army programs.
If SPI/CPI plots outside (particularly to the left of and/or below the “Ops Box”), the reason(s) should be discussed in the brief (using a text callout box like the white box above)
Contract Performance metric calculation (maximum value is 2 points) Green (2): Value of most recent SPI/CPI point lies in the “ops box” (inner box) or the GREEN zone of the chart Yellow (1): Value of most recent SPI/CPI point lies in YELLOW zone of the chart which is outside of the “ops box” on the chart Red (0): Value of most recent SPI/CPI point lies in the RED zone of the chart
[Chart residue: SPI/CPI axis scales from 0.82 to 1.18, operating-point dates 07/01 through 05/02, ACWP, EV % Spent, Total Spent, and the “Behind/Ahead of Schedule and Overspent” quadrant labels; KTR’s EAC: $104M]
Y Predictive Y(3) Historical
Date of Last Award Fee: MMM YY; Date of Next Award Fee: MMM YY; Date of Last Rebaselining: JAN02; Number of Rebaselinings: 1; Date of Next Rebaselining: MMM YY
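The ops-box test itself is mechanical: the latest (SPI, CPI) point is checked against the 0.95-1.1 box. A sketch; note that only the inner box is defined numerically in the briefing, so the wider band used here to separate yellow from red is purely an illustrative assumption:

```python
def ev_ops_point_color(spi: float, cpi: float) -> str:
    """Classify the most recent earned-value operating point against the
    ASA(ALT) 'Ops Box' (SPI and CPI both between 0.95 and 1.1)."""
    if 0.95 <= spi <= 1.1 and 0.95 <= cpi <= 1.1:
        return "GREEN"    # inside the ops box: metric value 2
    # ASSUMPTION: the chart's yellow/red zone boundaries are not given
    # numerically in the briefing; a 0.90-1.18 band is used for illustration.
    if 0.90 <= spi <= 1.18 and 0.90 <= cpi <= 1.18:
        return "YELLOW"   # metric value 1
    return "RED"          # metric value 0

print(ev_ops_point_color(0.98, 1.02))   # -> 'GREEN'
print(ev_ops_point_color(0.92, 0.97))   # -> 'YELLOW'
```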
14
EXECUTION – CONTRACTOR PERFORMANCE
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy Contractor Performance Assessment (Drawn From CPARS/PPIMS, etc) Last Evaluation (Provide Summary of Evaluation Last Provided to Contractor, Along with PM evaluation of Current Status) Highlight Successes as Well as Areas of Concern Performance Trend (over the Contract Period of Performance) Award/Incentive Fee History Summary of Actual Award/Incentive Fees Provided to Contractor If Different than Specified in Fee Plan, Discuss Reasons/Actions Indicated from the Situation Are Fee Awards Consistent with Contractor Performance Assessments? THE SECOND LEVEL 2 EXECUTION METRIC PROVIDES THE TRACK RECORD OF THE CONTRACTOR ON DEVELOPMENTAL, COST PLUS-TYPE CONTRACT VEHICLES, BY LOOKING AT THE PPIMS HISTORY FOR THE CONTRACT(S) IN QUESTION, AND THE HISTORY OF AWARD FEE INCREMENTS PROVIDED TO THE CONTRACTOR (AS COMPARED TO THE AMOUNTS SPECIFIED IN THE AWARD FEE PLAN) PPIMS/AF/IF metric calculation (maximum value is 2 points) Green (2): Average value (coloration) of factors a. – e. above is GREEN or above (with no factor RED), and Contractor is at 80% (or above) of possible award fee for duration of contract to date Yellow (1): Average value (coloration) of factors a. – e. above is YELLOW to GREEN (with no more than one factor RED), and/or Contractor is at 50% (or above) of possible award fee for duration of contract to date Red (0): Average value (coloration) of factors a. – e. above is RED to YELLOW; or two or more factors RED; or Contractor is below 50% of possible award fee for duration of contract to date G Predictive G(2) Historical
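Taken at face value, this metric combines the average coloration of the evaluation factors with the share of possible award fee earned. A sketch (the numeric scale placed under the colors is an assumption; the briefing states the rule qualitatively):

```python
COLOR_VALUE = {"RED": 0, "YELLOW": 1, "GREEN": 2, "GOLD": 3, "BLUE": 4}  # assumed scale

def contractor_performance_metric(factor_colors: list[str],
                                  award_fee_pct: float) -> int:
    """Contractor performance (CPARS/award-fee) Level 2 metric, 0-2 points.
    factor_colors: colorations of evaluation factors a.-e.;
    award_fee_pct: share of possible award fee earned to date."""
    avg = sum(COLOR_VALUE[c] for c in factor_colors) / len(factor_colors)
    reds = factor_colors.count("RED")
    if avg >= COLOR_VALUE["GREEN"] and reds == 0 and award_fee_pct >= 80:
        return 2    # GREEN
    if avg >= COLOR_VALUE["YELLOW"] and reds <= 1 and award_fee_pct >= 50:
        return 1    # YELLOW
    return 0        # RED

print(contractor_performance_metric(["GREEN", "GREEN", "YELLOW", "GOLD", "GREEN"], 85))  # -> 2
```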
15
EXECUTION – FIXED PRICE PERFORMANCE
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy DCMA Plant Rep Evaluation Major Issues Delivery Profile Graphic (Plan vs Actual) Progress Payment Status FIXED PRICE CONTRACTS REQUIRE THEIR OWN EVALUATION SCHEME. EARNED VALUE METRICS, WHILE KEY TO MANAGING COST-PLUS CONTRACTS, ARE NOT USEFUL IN EVALUATING FIXED PRICE VEHICLES. THEREFORE THE LEVEL 2 METRIC FOR FIXED PRICE CONTRACTS INCLUDES: A DCMA PLANT REP EVALUATION – THE DCMA REPRESENTATIVE FOR THE PLANT PRODUCING THE ITEM SHOULD PROVIDE HIS INPUT ON OVERALL CONTRACTOR PERFORMANCE, IDENTIFYING ANY PARTICULARLY SUPERIOR PERFORMANCE AND/OR ANY ONGOING/EMERGENT PROBLEMS, ALONG WITH HIS ASSESSMENT OF ROOT CAUSES AND POTENTIAL SOLUTIONS. A PRODUCTION/DELIVERY PROFILE GRAPHIC – THIS GRAPHIC IS A STANDARD GRAPHIC DISPLAYING BOTH THE CLIN-DEFINED PLANNED PRODUCTION/DELIVERY SCHEDULE PROFILE, AND THE “ACTUALS” PRODUCTION/DELIVERY DATA A PROGRESS PAYMENTS STATUS – TO DISCUSS THE ACTUAL STATUS OF PROGRESS PAYMENTS ON THE SPECIFIC CONTRACT, ALONG WITH REASONS FOR LESS-THAN-PLANNED PAYMENTS (IF APPLICABLE) A FIXED PRICE PERFORMANCE SLIDE SHOULD BE CONSTRUCTED FOR EACH MAJOR FIXED PRICE CONTRACT SUPERVISED BY THE PROGRAM OFFICE. Contract Performance (fixed price) metric calculation (maximum value is 2 points) Green (2): Actual Production/delivery profile is ahead or on contract schedule; no DCMA Plant Rep issues; progress payments are on schedule per the contract Yellow (1): Actual Production/delivery profile is less than 10% behind contract schedule; DCMA Plant Rep issues are minor and being resolved; progress payments are less than 10% behind schedule per the contract Red (0): Actual Production/delivery profile is 10% or greater behind contract schedule; DCMA Plant Rep issues are major; progress payments are 10% or greater behind schedule per the contract G Predictive G(3) Historical
16
EXECUTION - PROGRAM RISK ASSESSMENT
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
[Chart: standard 5x5 “risk square” with Likelihood (1 Low – 5 High) on the y-axis and Consequence (1 – 5) on the x-axis; each plotted issue is linked to a callout box: “A brief description of Issue # 1 and rationale for its rating. Approach to remedy/mitigation” (and similarly for Issues # 2 through # 6), with trend markers shown as arrows or (#)]
THE FOURTH LEVEL 2 METRIC IN THE EXECUTION FACTOR IS THE RISK ASSESSMENT COVERING ALL INTERNAL FACTORS (REQUIREMENTS, RESOURCES AND EXECUTION). IT IS DESIGNED TO PROVIDE A CONCISE, ONE PAGE SUMMARY OF THE KEY RISKS IDENTIFIED BY THE PM. IT USES A STANDARD “RISK SQUARE” DISPLAY (WITH CONSEQUENCE OF THE RISK ON THE X-AXIS, AND LIKELIHOOD OF THE RISK ON THE Y-AXIS). COLORATION OF THE INDIVIDUAL SQUARES CORRESPONDS TO THE RISK DEFINITIONS (LOW, MEDIUM OR HIGH) ASSIGNED TO EACH SQUARE OF THE 25-SQUARE DISPLAY. INDIVIDUAL SIGNIFICANT RISKS ARE PLOTTED IN THE RISK SQUARE BY THE (CONSEQUENCE, LIKELIHOOD) X/Y COORDINATES ASSIGNED. CALL-OUT TEXT BOXES ARE USED TO PROVIDE A SHORT SUMMARY OF THE PARTICULAR RISK IDENTIFIED, ALONG WITH THE TREND OF THE RISK (UP ARROW: RISK IMPROVING; DOWN ARROW: RISK DETERIORATING; (#): RISK LEVEL STEADY FOR # REPORTING PERIODS)
DEFINITIONS: Risk Assessment: Each issue which might affect the success of the program (technical, schedule, fiscal, etc) needs to be identified and assessed as to likelihood and consequences (performance or financial) of occurrence. The following is a rough key to scoring: Likelihood: (1) Negligible - One can reasonably assume no occurrence (<10%) (2) Unlikely - Occurrence possible but less than likely (10-40%) (3) Likely - Significant chance of occurrence (40-65%) (4) Highly Probable - Very high chance of occurrence (65-90%) (5) Near Certainty - Assume and anticipate occurrence (>90%) Consequences: (1) Marginal - Remedy will cause disruption to the program (2) Significant - Shorts a significant mission need (3) Serious - Shorts a critical mission need but expect no breach (4) Very Serious - Potentially fails a KPP or OPEVAL (5) Catastrophic - Jeopardizes an exit criterion of current Phase
If the assessment is done formally by a standing advisory board (good program management) then please list the members and their affiliations. Each issue box should contain a brief statement of intended approach. Presenter should be prepared for more detailed discussion on these issues and alternative courses of action
Reqm’t/Resources/Execution Risk Assessment metric calculation (maximum value is 8 points) Green (6 to 8): Value (coloration) of majority of program risk issues lies in GREEN zone of the risk chart (no more than two risk issues in YELLOW zone; zero areas in RED zone) Yellow (4 to <6): Value (coloration) of majority of program risk issues lies in YELLOW zone of the risk chart (no more than three risk issues in YELLOW zone and/or one risk issue in RED zone) Red (<4, or killer blow in sub-factor): Value (coloration) of majority of risk issues lies in YELLOW to RED zone of the risk chart (two or more risk issues in RED zone)
Trends: Up Arrow: Situation Improving (#): Situation Stable (for # Reporting Periods) Down Arrow: Situation Deteriorating
Y Predictive Y(5) Historical
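The roll-up then counts issues by zone of the risk square. The briefing colors each of the 25 squares but does not list that assignment, so the zone function below uses a common likelihood-plus-consequence convention purely as an illustrative assumption; the counting rule follows the stated calculation:

```python
def risk_zone(likelihood: int, consequence: int) -> str:
    """ASSUMED coloring of the 5x5 risk square (the briefing assigns a color
    to each square but does not list the assignment)."""
    score = likelihood + consequence          # ranges 2..10
    if score <= 4:
        return "GREEN"
    if score <= 7:
        return "YELLOW"
    return "RED"

def risk_assessment_metric(risks: list[tuple[int, int]],
                           killer_blow: bool = False) -> str:
    """Reqm'ts/Resources/Execution risk metric (max 8 points): color follows
    the counts of issues in the yellow and red zones, per the briefing."""
    zones = [risk_zone(l, c) for l, c in risks]
    yellows, reds = zones.count("YELLOW"), zones.count("RED")
    if killer_blow or reds >= 2:
        return "RED"
    if yellows <= 2 and reds == 0:
        return "GREEN"
    return "YELLOW"   # up to three yellow issues and/or one red issue

print(risk_assessment_metric([(2, 3), (4, 4), (1, 2)]))  # -> 'YELLOW'
```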
17
EXECUTION – SUSTAINABILITY RISK ASSESSMENT Program Acronym ACAT XX
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
[Chart: 5x5 risk square (Likelihood 1–5 on the y-axis, Consequence 1–5 on the x-axis; Low/Medium/High Risk zones) with numbered sustainability areas 1–7 plotted as data points and callout boxes for the higher-risk areas, e.g. “RISK # 4 / # 5 / # 6: Brief description of Issue and rationale for its rating. Approach to remedy/mitigation.”]
AFTER REVIEW OF THE LEVEL 1 FACTORS AND LEVEL 2 METRICS PROPOSED FOR THIS PROCESS, MR. BOLTON (THE ARMY SAE) DIRECTED THAT AN ADDITIONAL LEVEL 2 METRIC BE INCLUDED – THE SPECIFIC ASSESSMENT OF PROGRAM RISKS IN THE SUSTAINABILITY AREA. THIS METRIC CALLS OUT THE MAJOR AREAS IN SUSTAINABILITY (WHICH INCLUDE, BUT ARE NOT LIMITED TO, THE MAJOR ELEMENTS IN THE PROGRAM’S LOGISTICS SUPPORT ANALYSIS) TO CREATE THE METRIC EVALUATION. A DATA POINT FOR EACH MAJOR SUSTAINABILITY AREA IS PLOTTED, ALONG WITH A BRIEF DESCRIPTION/MITIGATION PLAN FOR THOSE AREAS EVALUATED AS BEING IN THE HIGH (RED) AND MEDIUM (YELLOW) RISK AREAS OF THE RISK SQUARE. SUSTAINABILITY PLANNING AREAS INCLUDE, BUT ARE NOT LIMITED TO: SUPPLY SUPPORT, TRAINING (INCLUDING TRAINING EQUIPMENT), SUPPORT EQUIPMENT (INCLUDING TEST EQUIPMENT), FACILITIES AND PUBLICATIONS. THE PROGRAM’S OVERALL SUSTAINABILITY ASSESSMENT IS PLOTTED WITH A TRIANGLE. CONSIDER PROGRAM RELIABILITY (MEAN TIME BETWEEN FAILURES) AND MAINTAINABILITY (MAINTENANCE MAN HOURS PER OPERATING HOUR) IN POSITIONING THE TRIANGLE. IF THE PROGRAM IS NOT ON TRACK TO ACHIEVE RELIABILITY AND MAINTAINABILITY TARGETS, ESPECIALLY RELIABILITY, SUSTAINABILITY SUPPORT WILL BE NEGATIVELY IMPACTED.
DEFINITIONS: Risk Assessment: Each issue which might affect the success of the program (technical, schedule, fiscal, etc) needs to be identified and assessed as to likelihood and consequences (performance or financial) of occurrence. The following is a rough key to scoring: Likelihood: (1) Negligible - One can reasonably assume no occurrence (<10%) (2) Unlikely - Occurrence possible but less than likely (10-40%) (3) Likely - Significant chance of occurrence (40-65%) (4) Highly Probable - Very high chance of occurrence (65-90%) (5) Near Certainty - Assume and anticipate occurrence (>90%) Consequences: (1) Marginal - Remedy will cause disruption to the program (2) Significant - Shorts a significant mission need (3) Serious - Shorts a critical mission need but expect no breach (4) Very Serious - Potentially fails a KPP or OPEVAL (5) Catastrophic - Jeopardizes an exit criterion of current Phase
If the assessment is done formally by a standing advisory board (good program management) then please list the members and their affiliations. Each issue box should contain a brief statement of intended approach. Presenter should be prepared for more detailed discussion on these issues and alternative courses of action
Sustainability Risk Assessment metric calculation (maximum value is 2 points) Green (2): Value (coloration) of majority of program risk issues lies in GREEN zone of the risk chart (no more than two risk issues in YELLOW zone) Yellow (1): Value (coloration) of majority of program risk issues lies in YELLOW zone of the risk chart (no more than three risk issues in YELLOW zone and/or one risk issue in RED zone) Red (0): Value (coloration) of majority of risk issues lies in YELLOW to RED zone of the risk chart (two or more risk issues in RED zone)
Sustainability Areas (examples) – (triangle): Overall Assessment; 1: Training; 2: Support Equipment; 3: Publications; 4: Facilities; 5: Maintenance Concept; 6: Supply Support; 7: MTBF/Ao/Reliability. Y Predictive Y(3) Historical
18
EXECUTION – TESTING STATUS
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy Contractor Testing (e.g. Qualification, Integration) - Status (R/Y/G) Major Points/Issues Developmental Testing – Status (R/Y/G) Operational Testing – Status (R/Y/G) Follow-On Operational Testing – Status (R/Y/G) Special Testing – Status (R/Y/G) (Could Include LFT&E, Interoperability Testing (JITC), Etc.) TEMP Status Other (DOT&E Annual Report to Congress, etc – As Necessary) TESTING IS A KEY LEVEL 2 EXECUTION METRIC FOR ANY PROGRAM, BOTH AS AN INDICATOR OF PRODUCT CAPABILITY, AND AS A PREREQUISITE FOR MILESTONE APPROVALS AND BUDGET RELEASE. THIS METRIC SUMMARIZES THE TESTING STATUS OF THE PROGRAM, ALONG WITH IDENTIFYING ANY SIGNIFICANT TESTING FOR THE ACQUISITION LEADERSHIP. TESTING TERMS REFER TO THE STANDARD PROGRAMMATIC TESTING PHASES AS USED BY DOT&E AND THE SERVICES. For the testing phases: GREEN: Testing on/ahead of schedule per the TEMP/contractor plan; no significant problems exist (significant problems include such items as KPPs falling below threshold values; serious Reliability/Maintainability/ Availability issues; contractor first article/integration failures, etc) YELLOW: Testing behind schedule per the TEMP/contractor plan but not seriously impacting (i.e., to the APB breach level or creating serious budgetary impact) the program plan; significant problems exist but are resolvable within normal programmatic execution at the PM/PEO level RED: Testing significantly behind schedule per the TEMP/contractor plan and seriously impacting (i.e. APB breach or serious budgetary impact) the program plan; significant problems exist that are not resolvable within normal programmatic execution at the PM/PEO level. Testing Performance metric calculation (maximum value is 2 points) Green (2): Current Testing Phase is GREEN Yellow (1): Current Testing Phase is YELLOW Red (0): Killer Blow (e.g. adverse testing phase report from service testing component/DOT&E), or Current Testing Phase is RED G Predictive G(2) Historical
19
EXECUTION – TECHNICAL MATURITY
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
CRITICAL TECHNOLOGY MATURITY (columns: Critical Technology, Description/Issue, TRL, G/Y/R)
PROGRAM DESIGN MATURITY – ENGINEERING DRAWINGS (G/Y/R): Percentage of drawings approved/released for use; Issues
PROGRAM INTEGRATION/PRODUCTION FACTORS (columns: Integration/Production Factor, Description/Issue, IRL/PRL, G/Y/R)
PROGRAM PRODUCTION MATURITY – KEY PRODUCTION PROCESSES (G/Y/R): Percentage of key prod. proc. under stat. process control
THE FINAL LEVEL 2 METRIC IN THE EXECUTION FACTOR IS TECHNICAL MATURITY. ANALYSES OF MULTIPLE MAJOR PROGRAMS HAVE SHOWN THAT THE LEVEL OF TECHNICAL MATURITY POSSESSED BY A PROGRAM AT KEY STAGES OF PROGRAM CONCEPTION, DEVELOPMENT AND PRODUCTION IS AN EXCELLENT PREDICTOR OF WHETHER OR NOT THE PROGRAM WILL MEET ESTABLISHED COST AND SCHEDULE GOALS. THE GAO HAS DONE AN EXTENSIVE STUDY OF BEST PRACTICES IN ACQUISITION PROGRAMS AND HAS FURTHER DEFINED METRICS IN SUPPORT OF THIS PRINCIPLE (CRITICAL TECHNICAL MATURITY ISSUES AT KEY PROGRAM “FLOW POINTS”):
EARLY IN THE PROGRAM: (From Concept Exploration to program initiation (Milestone B)), a match is achieved between the user’s needs and the developer’s technical resources: Metric: the set of Critical Technologies (CT) has been (a) identified; and (b) what percentage of this set is at Technology Readiness Level 8 or above. GREEN: %; YELLOW: <80%; RED: <60%
DURING THE DESIGN/EARLY PRODUCTION PHASE: (From Program Initiation to Critical Design Review for the Program), the goal is achieving product design stability: Metric: The Program Manager has the option of using either: what percentage of the total number of required program Engineering Drawings have been approved and released for use; or, the percentage of program integration factors/program production factors which are at an Integration Readiness Level/Production Readiness Level of 8 or higher (If integration/production factors are the selected metric, use TRL chart format to display)
DURING RATE PRODUCTION: At Milestone C (or at stable Low Rate Production), the product has demonstrated that it can be produced within cost, schedule and quality targets: Metric: the key production processes have been (a) identified and (b) what percentage of them are under statistical process control. GREEN: %; YELLOW: <80%; RED: <60%
Technical Maturity factor calculation (maximum value is 2 points) Green (2): Metric for current stage of program is GREEN Yellow (1): Metric for current stage of program is YELLOW Red (0): Metric for current stage of program is RED (NOTE: Subtract 1 point from above total for each previous stage metric (if applicable) that is not at a GREEN level)
Level 1 Execution factor calculation (max value 20 points) = sum of all 7 Execution metrics. Allocated Values: Reqm’ts/Resources/Execution Risk Assessment metric (8 points max); the remaining six metrics are of equal value (each 2 points max) GREEN: (16 to 20); YELLOW: (10 to <16); RED: (<10)
G Predictive G(2) Historical
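Two pieces of this slide reduce directly to arithmetic: the stage-appropriate maturity percentage and the Execution roll-up. In the sketch below the GREEN threshold for the maturity percentage, which is elided on the slide, is assumed to be the complement of the stated YELLOW band (80% or above); everything else follows the stated rules:

```python
def technical_maturity_points(pct_at_readiness_level_8: float) -> int:
    """Technical maturity Level 2 metric for the current program stage
    (0-2 points). GREEN threshold of >=80% is an assumption (the slide
    states only YELLOW: <80% and RED: <60%)."""
    if pct_at_readiness_level_8 >= 80:
        return 2   # GREEN
    if pct_at_readiness_level_8 >= 60:
        return 1   # YELLOW
    return 0       # RED

def execution_factor(metric_points: list[float], killer_blow: bool = False):
    """Level 1 Execution factor: sum of the seven Level 2 execution metrics
    (risk assessment max 8, the other six max 2 each; 20 points total)."""
    total = sum(metric_points)
    if killer_blow or total < 10:
        return total, "RED"
    if total < 16:
        return total, "YELLOW"
    return total, "GREEN"

print(execution_factor([2, 2, 2, 2, 6, 1, 2]))   # -> (17, 'GREEN')
```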
20
PROGRAM “FIT” IN CAPABILITY VISION
PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
AREA (Examples) / STATUS / TREND: DoD Vision – G (2); Transformation – G (2); Interoperability – Y (3); Joint – G (3); Army Vision – Y (4); Current Force – Y (4); Stryker Force – Y; Future Force – (N/A) (N/A); Other – (N/A) (N/A); Overall – Y (2)
THE FIRST OF THE TWO EXTERNAL LEVEL 1 FACTORS IS FIT WITHIN THE CAPABILITY VISION. HOW WELL A PROGRAM IS SUPPORTED IN THE LARGER SERVICE AND OSD ARENAS IS IN LARGE PART DETERMINED BY HOW WELL ITS PRODUCT SUPPORTS THE SPECIFIC CAPABILITY VISION(S) IT IS DESIGNED TO MEET. NOTE THAT BOTH THE SERVICE AND THE OSD VISIONS ARE ADDRESSED IN THIS FACTOR… SINCE OSD HAS STRONGLY ASSERTED ITS PREROGATIVES IN THIS AREA (TO INCLUDE PROGRAM TRUNCATION/TERMINATION FOR PROGRAMS WHICH IGNORED THE OSD VISION IN FAVOR OF CONCENTRATING ON THE SERVICE VISION). LEVEL 2 METRICS THAT SUPPORT THE LEVEL 1 FACTOR INCLUDE: WITHIN THE DOD VISION: TRANSFORMATION: THE EXTENT TO WHICH THE PROGRAM POSSESSES THE TRANSFORMATIONAL ATTRIBUTES (E.G. PRECISION, LETHALITY, STREAMLINED/COMMON LOGISTICS, ETC) SPECIFIED BY OSD LEADERSHIP; INTEROPERABILITY: THE EXTENT TO WHICH THE PROGRAM COMPLIES WITH/HAS EMBEDDED WITHIN IT THE ARCHITECTURAL/ENGINEERING CHARACTERISTICS (E.G. COMPLIANCE WITH THE GLOBAL INFORMATION GRID (GIG)/INFORMATION DISSEMINATION MANAGEMENT (IDM) CRDS, DII, OPEN ARCHITECTURE PROTOCOLS) WHICH WOULD ALLOW IT TO INTEROPERATE ACROSS SERVICES (JOINT FORCES)/WITHIN COALITIONS; JOINTNESS: THE EXTENT TO WHICH THE PROGRAM IS USABLE BY OTHER SERVICES, JOINT OPERATIONS, AND COALITIONS WITHOUT UNIQUE SUPPORT ARRANGEMENTS BEING MADE BY THOSE USERS. WITHIN THE HQDA VISION: IS THE PROGRAM A PART OF THE CURRENT, STRYKER, OR FUTURE FORCE (OR MORE THAN ONE OF THOSE FORCES)? IF CURRENT OR STRYKER FORCE, IS IT SCHEDULED FOR NEAR TERM OR MID TERM PHASEOUT? IF INITIAL OR FULL OPERATIONAL CAPABILITY DATES ARE ESTABLISHED, WILL THE PROGRAM MEET THOSE DATES (AND IF NOT, WHAT ARE THE REASONS)?
DoD Vision (7.5 points max): Transformation, Interoperability and Jointness metrics are equally weighted (2.5 points apiece): GREEN (5 to 7.5): program is transformational, compliant with DoD interoperability guidance/standards, and interoperable by other services, joint forces, and coalitions on a “come as you are” basis YELLOW (3 to <5): Significant deficiencies in at least one of the three above areas RED (<3): Significant deficiencies in at least two of the three above areas TREND: Appropriate arrow, or, if unchanged, indicated # of reporting periods unchanged
HQDA Vision (7.5 points max): Points assigned by how well program supports the Force(s) it plays within: GREEN (5 to 7.5): program is a planned key/core supporter of its Force(s) and is on track to provide planned capability on schedule YELLOW (3 to <5): program is a secondary/peripheral supporter of its Force(s) or is a key/core supporter and is encountering problems impacting its ability to provide planned capability on schedule RED (<3): Killer blow; OR program is encountering problems which will prevent it from providing planned capability on schedule TREND: Appropriate arrow, or, if unchanged, indicated # of reporting periods unchanged
Level 1 “Fit in the Vision” factor calculation = value (DoD vision) + value (Army vision) = 15 points max GREEN: (10 to 15); YELLOW: (6 to <10); RED: “Killer Blow”, or <6 points
Y Predictive Y(2) Historical
21
Program Acronym ACAT XX Date of Review: dd mmm yy
PROGRAM ADVOCACY PEO XXX Program Acronym ACAT XX COL, PM Date of Review: dd mmm yy
AREA (Examples) / STATUS / TREND: OSD – Y (2) (Major point); Joint Staff – Y (2); War Fighter – Y (4); Army Secretariat – G; Congressional – Y; Industry – G (3) (Major Point); International – G (3); Overall – Y
THE FINAL LEVEL 1 FACTOR IS ADVOCACY. ADVOCACY IS DEFINED AS ACTUAL, TANGIBLE SUPPORT FOR A PROGRAM ON THE PART OF A SENIOR ADVOCATE IN A POSITION TO AFFECT THE PRIORITY OR THE LEVEL OF RESOURCES RECEIVED BY A PROGRAM. (AN ADVOCATE IS DEFINED AS AN ELECTED OR APPOINTED GOVERNMENTAL OFFICIAL; A FLAG OFFICER; OR A CAREER SES IN A LEADERSHIP POSITION WITHIN AN ADVOCACY GROUP). ADVOCATES/ADVOCACY GROUPS INCLUDE, BUT ARE NOT LIMITED TO, THE FOLLOWING: OSD: FLAG/SES LEVEL DECISION MAKERS IN OSD ORGANIZATIONS (E.G. USD(AT&L); ASD (C3I); DIRECTOR, PA&E; DIRECTOR, DOT&E; ASD (COMPTROLLER); USECAF (FOR SPACE PROGRAMS)) JOINT STAFF: FLAG/SES LEVEL IN JOINT STAFF (PARTICULARLY JRB, JRP AND JROC PROCESSES) WARFIGHTER: FLAG/SES LEVEL IN SERVICE AND JOINT WARFIGHTING COMMANDS, CSA STAFF ARMY SECRETARIAT: SES/FLAG INCUMBENTS AT DASA LEVEL AND ABOVE CONGRESSIONAL: SENATORS/MEMBERS OF CONGRESS/PROFESSIONAL STAFF OF THE FOUR COMMITTEES (HASC/SASC/HAC/SAC) INDUSTRY: SENIOR EXECUTIVES OF INVOLVED CORPORATIONS INTERNATIONAL (AS APPLICABLE): SENIOR GOVERNMENTAL DECISION MAKERS / EXECUTIVES OF FOREIGN INDUSTRY PARTNERS. WEIGHTING OF THESE METRICS IS AS FOLLOWS: WARFIGHTER ADVOCACY IS MOST IMPORTANT. ADVOCACY AT THE CONGRESSIONAL/JOINT STAFF LEVEL IS AT THE NEXT LOWER LEVEL OF IMPORTANCE; ALL OTHER ADVOCACIES ARE LESS IMPORTANT THAN CONGRESSIONAL/JOINT STAFF ADVOCACIES. TWO IMPORTANT POINTS: (1) PER ASA(ALT) DIRECTION: THIS SLIDE IS THE PEO’S SLIDE – INPUTS WILL BE PROVIDED BY THE PM, BUT THE FINAL EVALUATION IS THE PEO’S POSITION. (2) R/Y/G EVALUATIONS SHOULD BE BASED ON STATEMENTS/DOCUMENTS/DECISIONS THAT ARE “MATTERS OF RECORD” (VOICE OVER BY PEO WHILE BRIEFING CAN PROVIDE AMPLIFYING/SUPPORTING DATA THAT DOES NOT MEET THAT CRITERION)
GREEN (0.8 – 1.0 of allocated point value): Strong support for program demonstrated (e.g. plus up or protection of program budget; acceleration of program; public statements specifically identifying program in favorable light) YELLOW (0.6 to <0.8 of allocated point value): No position on program taken; no actions (positive or negative) taken on program budget RED (below 0.6 of allocated point value): Killer blow by any advocacy party; Negative support for program demonstrated (e.g. program repeatedly used as a “bill payer” for other, higher priority efforts; program “stringout” (length of buy increased while yearly quantities dropped)); negative statement/decisions/actions on program by decision makers TREND: Appropriate arrow, or if unchanged, indicated # of reporting periods
Level 1 Advocacy factor calculation (max value 25 points) = sum of all level 2 Advocacy metrics Allocated Values: Warfighter (9 points max); Congressional/Joint Staff (each 5 points); all others are 2 points each (unless an international program, in which case international and industry together are 2 points) GREEN: (20 to 25); YELLOW: (15 to <20); RED: (<15)
Y Predictive Y Historical
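With the stated allocations (Warfighter 9, Congressional and Joint Staff 5 each, the rest 2 each), the Advocacy factor is a weighted sum banded at 20 and 15 points. A sketch; expressing each area's rating as a fraction of its allocated value is an illustrative convenience, and the weights table assumes a non-international program:

```python
# Allocated point values per advocacy area, as stated in the briefing
# (on international programs, industry and international together share 2 points).
ADVOCACY_WEIGHTS = {
    "Warfighter": 9, "Congressional": 5, "Joint Staff": 5,
    "OSD": 2, "Army Secretariat": 2, "Industry": 2,
}

def advocacy_factor(area_fractions: dict[str, float], killer_blow: bool = False):
    """Level 1 Advocacy factor (max 25 points). area_fractions maps each area
    to the fraction of its allocated value earned (e.g. 0.8-1.0 = GREEN per
    the briefing's per-area rule)."""
    total = sum(ADVOCACY_WEIGHTS[a] * f for a, f in area_fractions.items())
    if killer_blow or total < 15:
        return total, "RED"
    if total < 20:
        return total, "YELLOW"
    return total, "GREEN"

print(advocacy_factor({"Warfighter": 0.9, "Congressional": 0.7, "Joint Staff": 0.7,
                       "OSD": 0.7, "Army Secretariat": 0.8, "Industry": 0.9}))
# -> approximately (19.9, 'YELLOW')
```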
22
Program Acronym ACAT XX Date of Review: dd mmm yy
PEO XXX Program Acronym ACAT XX FINDINGS / ACTIONS COL, PM Date of Review: dd mmm yy Comments/Recap – PM’s “Closer Slide” THIS SLIDE PROVIDES FINDINGS AND ACTIONS. IT CONTAINS THE GRAPHIC FOR ALL LEVEL 1 FACTORS, AND SPACE FOR THE PM/PEO RECAP OF IMPORTANT PROGRAM ISSUES/CONCLUSIONS/RECOMMENDATIONS. AGAIN, THE INTENT WOULD BE TO BRIEF ONLY THE SUMMARY SLIDE (SLIDE 1), THE FINDINGS/ACTIONS SLIDE, AND THOSE FACTORS/METRICS THAT MET CRITERIA AS A SIGNIFICANT ISSUE (I.E. LEVEL 2 FACTORS SHOWING YELLOW AND A “DOWN ARROW”, OR RED) – THUS PROVIDING CRITICAL INFORMATION EFFICIENTLY TO THE PM AND THE SERVICE ACQUISITION LEADERSHIP. AGAIN, THIS INITIATIVE IS DESIGNED TO: IMPROVE THE ARMY’S ABILITY TO ACCURATELY ASSESS A PROGRAM’S PROBABILITY OF SUCCESS, AND CLEARLY/CONCISELY REPRESENT THAT SUCCESS PROBABILITY TO ARMY LEADERSHIP BY PROVIDING AN ACCURATE, COMPREHENSIVE METHOD OF ASSESSING A PROGRAM’S PROBABILITY OF SUCCESS, AND A PROCESS/BRIEFING PACKAGE THAT WOULD ALLOW THIS ASSESSMENT TO BE CLEARLY AND CONCISELY CONVEYED TO ARMY LEADERSHIP AS QUICKLY AS POSSIBLE ONCE DEVELOPED
23
STATUS/FUTURE PLANS Status
Multiple Acquisition Staffs (Navy, Air Force, USD(AT&L), NSA, and MDA) Have Requested the Product and are Reviewing/Considering It for Use Multiple DoD and Industry Program Managers (including the F/A-22 Program Manager) have Adopted It as an Assessment/Reporting Tool GAO, MITRE and IDA have Requested/Received Copies of the Tool for Their Use OCT 2002 – ASA(ALT) Briefed on Effort; Expressed Intent to Implement Program Success Factors Across Army DEC 2002 – Program Success Factors Pilot Commences in Two Army Programs (ACS; Phoenix) JULY 2003 – Army Decides to Phase-Implement Program Success Factors Across Army Acquisition (Fall 2003); Automation Effort Begins on Army AIM System SUBJECT TO QUESTIONS YOU MAY HAVE, THIS CONCLUDES MY BRIEFING. THANKS FOR YOUR ATTENTION.
24
BACKUP SLIDES
25
QUANTIFICATION PROCESS
First Three Factors (Requirements, Resources, and Execution) Represent How the Program is Operating Nominally 60% in Aggregate Last Two Factors (Fit in Strategic Vision and Advocacy) Represent Whether or Not the Program Should/Could be Pursued Nominally 40% in Aggregate First Three Factors (in Aggregate) Have “Greater Effect” on Program Success than Last Two Factors, but NOT a “Much Greater Effect” THESE “LEVEL 1” FACTORS ARE COMMON ACROSS ALL PROGRAMS. WEIGHTING THESE FACTORS TO CREATE AN ACCURATE ASSESSMENT OF PROGRAM SUCCESS PROBABILITY, HOWEVER, DEPENDS ON BOTH THE TYPE OF PROGRAM (WEAPON, PLATFORM, IT), AND WHERE IT IS IN THE PROGRAM LIFE CYCLE. FOR THE PILOT, WE SET THE WEIGHTING OF THE LEVEL 1 FACTORS TO REPRESENT A VERY COMMON SITUATION: A WEAPON/PLATFORM PROGRAM FINISHING DEVELOPMENT AND IN INITIAL (LOW RATE) PRODUCTION. FOR THAT SITUATION: THE INTERNAL FACTORS (REQUIREMENTS, RESOURCES, AND EXECUTION) ARE EACH WEIGHTED AT 20%, FOR A TOTAL WEIGHT OF 60% IN THE EXTERNAL FACTORS, FIT IN THE VISION IS WEIGHTED AT 15%, AND ADVOCACY AT 25%, FOR A TOTAL OF 40%
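Because the Level 1 maxima chosen for the pilot (20/20/20/15/25) already sum to 100, the Level 0 roll-up under this weighting is simply the sum of the factor scores. A minimal sketch:

```python
def program_success_score(requirements: float, resources: float, execution: float,
                          fit_in_vision: float, advocacy: float) -> float:
    """Level 0 Program Success score for the pilot weighting. The Level 1
    factor maxima (20/20/20/15/25) already sum to 100, so the overall
    probability-of-success score is simply their sum."""
    return requirements + resources + execution + fit_in_vision + advocacy

print(program_success_score(16, 15, 17, 11, 19))   # -> 78
```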
26
PROBABILITY OF PROGRAM SUCCESS “BANDS”
Green (80 to 100) Program is On Track for Providing Originally-Scoped Warfighting Capability Within Budgeted Cost and Approved Schedule Issues are Minor in Nature Yellow (60 to <80) Program is On Track for Providing Acceptable Warfighting Capability With Acceptable Deviations from Budgeted Cost and Approved Schedule Issues May Be Major but are Solvable within Normal Acquisition Processes Red (< 60, or Existing “Killer Blows” in Level 2 Metrics) Program is OFF Track Acceptable Warfighting Capability will NOT be Provided, or Will ONLY be Provided with Unacceptable Deviations from Budgeted Cost and Approved Schedule Issues are Major and NOT Solvable within Normal Acquisition Processes (e.g. Program Restructure Required)
27
“KILLER BLOW” “Killer Blow” at the Sub-Factor (Level II) Level
Action Taken By A Decision Maker In The Chain Of Command (Or An “Advocacy” Player) Resulting In Program Non-Executability Until Remedied For Example: Zeroing Of Program Budget By Congressional Committee/Conference Results In Immediate “Red” Coloration Of Associated Level 2, Level 1 And Overall PS Metrics Until Remedied
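Combining the band definitions on the previous slide with the killer-blow rule gives a small banding function; the killer-blow flag forces RED regardless of the numeric score, mirroring the propagation described above. A sketch:

```python
def program_success_band(score: float, killer_blow: bool = False) -> str:
    """Map the 0-100 Program Success score to its color band. A 'killer blow'
    at any Level 2 metric forces RED at Level 2, Level 1 and overall until
    remedied."""
    if killer_blow or score < 60:
        return "RED"
    if score < 80:
        return "YELLOW"
    return "GREEN"

print(program_success_band(78))                     # -> 'YELLOW'
print(program_success_band(85, killer_blow=True))   # -> 'RED'
```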
28
Program Acronym ACAT XX Date of Review: dd mmm yy
PEO XXX Program Acronym ACAT XX EXECUTION – TECHNICAL MATURITY COL, PM Date of Review: dd mmm yy (Timeline labels: Program Initiation, CDR, Milestone C) THE FINAL LEVEL 2 METRIC IN THE EXECUTION FACTOR IS TECHNICAL MATURITY. ANALYSES OF MULTIPLE MAJOR PROGRAMS HAVE SHOWN THAT THE LEVEL OF TECHNICAL MATURITY POSSESSED BY A PROGRAM AT KEY STAGES OF PROGRAM CONCEPTION, DEVELOPMENT AND PRODUCTION IS AN EXCELLENT PREDICTOR OF WHETHER OR NOT THE PROGRAM WILL MEET ESTABLISHED COST AND SCHEDULE GOALS. THE GAO HAS DONE AN EXTENSIVE STUDY OF BEST PRACTICES IN ACQUISITION PROGRAMS AND HAS FURTHER DEFINED METRICS IN SUPPORT OF THIS PRINCIPLE (CRITICAL TECHNICAL MATURITY ISSUES AT KEY PROGRAM “FLOW POINTS”): EARLY IN THE PROGRAM: (From Concept Exploration to program initiation (Milestone B)), a match is achieved between the user’s needs and the developer’s technical resources: Metric: the set of Critical Technologies (CT) has been (a) identified; and (b) what percentage of this set is at Technology Readiness Level 8 or above. GREEN: %; YELLOW: <80%; RED: <60% DURING THE DESIGN/EARLY PRODUCTION PHASE: (From Program Initiation to Critical Design Review for the Program), the goal is achieving product design stability: Metric: The Program Manager has the option of using either: what percentage of the total number of required program Engineering Drawings have been approved and released for use; or, the percentage of program integration factors/program production factors which are at an Integration Readiness Level/Production Readiness Level of 8 or higher (If integration/production factors are the selected metric, use TRL chart format to display) DURING RATE PRODUCTION: At Milestone C (or at stable Low Rate Production), the product has demonstrated that it can be produced within cost, schedule and quality targets: Metric: the key production processes have been (a) identified and (b) what percentage of them are under statistical process control. GREEN: %; YELLOW: <80%; RED: <60% Technical Maturity factor calculation (maximum value is 2 points) Green (2): Metric for current stage of program is GREEN Yellow (1): Metric for current stage of program is YELLOW Red (0): Metric for current stage of program is RED (NOTE: Subtract 1 point from above total for each previous stage metric (if applicable) that is not at a GREEN level) Level 1 Execution factor calculation (max value 20 points) = sum of all 7 Execution metrics. Allocated Values: Reqm’ts/Resources/Execution Risk Assessment metric (8 points max); the remaining six metrics are of equal value (each 2 points max) GREEN: (16 to 20); YELLOW: (10 to <16); RED: (<10) Y Predictive Y(3) Historical
29
EXECUTION – CONTRACTOR PERFORMANCE
PEO XXX Program Acronym ACAT XX EXECUTION – CONTRACTOR PERFORMANCE COL, PM Date of Review: dd mmm yy THE SECOND LEVEL 2 EXECUTION METRIC PROVIDES THE TRACK RECORD OF THE CONTRACTOR ON DEVELOPMENTAL, COST PLUS-TYPE CONTRACT VEHICLES, BY LOOKING AT THE CONTRACTOR PERFORMANCE ASSESSMENT REPORTING SYSTEM (CPARS) RATING HISTORY FOR THE CONTRACT(S) IN QUESTION, AND THE HISTORY OF AWARD FEE INCREMENTS PROVIDED TO THE CONTRACTOR (AS COMPARED TO THE AMOUNTS SPECIFIED IN THE AWARD FEE PLAN). TRAINING/INFORMATION ON CPARS CAN BE FOUND AT . THIS INFORMATION PROVIDES THE PM’S FORMAL EVALUATION OF THE CONTRACTOR’S PERFORMANCE ON HIS PROGRAM, AS WELL AS THE RESULT (PERCENTAGE OF AWARD FEE PROVIDED TO THE CONTRACTOR). ONE CHART WILL BE PREPARED FOR EACH PROGRAM CONTRACT, AS APPLICABLE, WHICH WILL COVER ALL CPARS AND IPARS REPORTS THROUGH THE FULL PERIOD OF PERFORMANCE FOR THE CONTRACT
Color Code (more info on slides 25 – 31 of the CPARS Training Brief at the above site):
BLUE (exceptional): Contract Reqm’ts – exceeds many; Contract Problems – few, minor; Corrective Action – highly effective
GOLD (very good): Contract Reqm’ts – exceeds some; Contract Problems – some, minor; Corrective Action – effective
GREEN (satisfactory): Contract Reqm’ts – meets; Contract Problems – more, minor; Corrective Action – satisfactory
YELLOW (marginal): Contract Reqm’ts – some not met; Contract Problems – serious; Corrective Action – marginally effective
RED (unsat): Contract Reqm’ts – most not met; Contract Problems – very serious; Corrective Action – ineffective
CPAR/IPAR/AF/IF metric calculation (maximum value is 2 points) Green (2): Average value (coloration) of factors a. – e. above is GREEN or above (with no factor RED), and Contractor is at 80% (or above) of possible award fee for duration of contract to date Yellow (1): Average value (coloration) of factors a. – e. above is YELLOW to GREEN (with no more than one factor RED), and/or Contractor is at 50% (or above) of possible award fee for duration of contract to date Red (0): Average value (coloration) of factors a. – e. above is RED to YELLOW; or two or more factors RED; or Contractor is below 50% of possible award fee for duration of contract to date Y Predictive Y(2) Historical