Presentation transcript: GENERAL GUIDANCE ON CONDUCT OF TECHNICAL REVIEWS

10-1 GENERAL GUIDANCE ON CONDUCT OF TECHNICAL REVIEWS
Selection of reviews should be based on the complexity of the program and the phase of development (or modification).
Participants (type and number): include tasking and performing activity personnel responsible for the area or item being reviewed, key representatives for lower-level items, and other personnel who have a stake in the specific objectives of the review.
Objectives:
1. Demonstrate progress
2. Ensure issues have been resolved
3. Verify the expected maturity
4. Confirm that risks are acceptable

10-2 FOCUS IN MAJOR REVIEWS
In general:
- Confirm that issues addressed during previous reviews have been satisfactorily resolved.
- Address in detail only areas requiring close scrutiny. Proper integration and management of interim, subsystem, and functional reviews (as planned in the SEMP) should make a detailed total evaluation unnecessary.
- Take the opportunity to modify program emphasis for the next phase based on risk levels.

10-3 TYPES OF REVIEWS
Major: Demonstrate that risk levels are acceptable and provide an opportunity to modify program emphasis for the next phase/effort. (Examples: ASR, SRR, SFR, PDR, CDR, SVR, PCA)
Functional: System-wide review conducted by involved disciplines to address progress and issues associated with one functional area. (Examples: Development, Training, Support, Verification, Manufacturing, Disposal, etc.)
Subsystem: Multidisciplinary reviews to assess progress in defining and satisfying subsystem requirements.
Interim: Reviews held between major reviews that cover the spectrum of system, segments, and configuration items.

10-4 Integrated Technical Reviews (Cont.)
Major Technical Reviews include:
- Alternative Systems Review (ASR): The ASR forms the basis for determining which system concepts should be continued as candidate development programs. It provides the data for recommendations to eliminate potential system concepts.
- System Requirements Review (SRR): The SRR forms the basis for determining, early in System Development and Demonstration, whether customer requirements are understood. It should confirm the progress and direction of technology verifications and demonstrations, and convergence on viable system requirements.

10-5 Integrated Technical Reviews (Cont.)
Major Technical Reviews include:
- System Functional Review (SFR): The SFR (formerly called the SDR) forms the basis to establish and verify an appropriate set of functional and performance requirements for the complete system.
- Preliminary Design Review (PDR): The PDR forms the basis for determining whether a preliminary physical architecture and design approach for a configuration item, or an aggregation of configuration items (including software), is ready to start detailed design. A configuration item development spec must be complete and ready for use in detailed design.

10-6 Integrated Technical Reviews (Cont.)
Major Technical Reviews include:
- Critical Design Review (CDR): The CDR confirms the detailed design for each configuration item (including software) and each aggregation of configuration items, and that requirements are satisfied.
- System Verification Review (SVR): The SVR demonstrates that the total system (people, products, and processes) satisfies functional and allocated configuration documentation requirements, and confirms readiness for production, support, training, deployment, operations, and disposal. The Production Readiness Review (PRR) is a part of the SVR.

10-7 Customary System Level Review Accomplishments
SRR - Review system functional and performance requirements.
SDR/SFR - Review allocation of system requirements to configuration item (CI) aggregates. Assess maturity of the system spec and CI aggregates.
PDR - Review design progress for each CI and aggregate of CIs:
1. Review predicted performance
2. Review physical and functional interfaces
3. Verify integration of specialty engineering requirements
4. Evaluate risk resolution

10-8 Customary System Level Review Accomplishments (Cont.)
CDR - Review maturity of design for each CI and aggregate of CIs:
1. Verify that functional and performance requirements of CI specs are satisfied.
2. Verify that engineering specialty requirements have been satisfied.
3. Verify compatibility between CIs and hardware, software, facilities, and personnel.
4. Review preliminary product specs.
5. Assess producibility risks.

10-9 Customary System Level Review Accomplishments (Cont.)
SVR - Confirms readiness to enter full rate production:
1. Verify work on all CIs - all functions and performance levels met.
2. FCA conducted for CIs.
See page 107 of Systems Engineering Fundamentals for additional accomplishments.

10-10 Systems Engineering Master Schedule (SEMS)
Purpose: Contractual agreement between contractor and customer on the critical tasks required for completing each major program milestone, and definition of the criteria for successful completion of those tasks.

10-11 Role of SEMS in Acquisition Planning
The SEMS outlines the integration of all major tasks necessary to design, develop, test, produce, deploy and support the system.
(Chart: a balanced process/product to satisfy user needs, spanning support, training system, and prime mission equipment, balanced across performance, producibility, reliability, cost, schedule, and risk management.)

10-12 SEMS Key Tasks
- Typical tasks: Develop Test Plans; Develop Software Plans.
- Tasks define interim steps to developing and producing the system.
- Tasks should include verification criteria.

10-13 SEMS Accomplishment Guidelines
Typical accomplishments for sample tasks:
- Test Plan Complete
- Software Development Plans Complete
Accomplishments should be easily measurable. Example of a difficult-to-measure accomplishment:
- Test Plan 85% Complete

10-14 SEMS Sample Accomplishment
Milestone 3 - Test Readiness Review (TRR)
Accomplishments:
3.1 Complete Software Test Procedures
3.2 Complete System Integration Test Procedures
3.3 Complete All Major Subsystem TRRs
3.4 Complete All Action Items from Milestone 2
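The TRR milestone above is, in effect, structured data: the event is achieved only when every numbered criterion is demonstrably complete. A minimal sketch of that idea in Python follows; the class and field names are illustrative, not a standard SEMS format.

```python
# Minimal sketch (not a standard SEMS format): a SEMS event is achieved
# only when every accomplishment criterion is demonstrably complete.
from dataclasses import dataclass, field

@dataclass
class Accomplishment:
    number: str
    description: str
    complete: bool = False

@dataclass
class SemsEvent:
    number: int
    title: str
    accomplishments: list[Accomplishment] = field(default_factory=list)

    def achieved(self) -> bool:
        # Event achievement is binary: all criteria met, or not achieved.
        return all(a.complete for a in self.accomplishments)

trr = SemsEvent(3, "Test Readiness Review (TRR)", [
    Accomplishment("3.1", "Complete Software Test Procedures"),
    Accomplishment("3.2", "Complete System Integration Test Procedures"),
    Accomplishment("3.3", "Complete All Major Subsystem TRRs"),
    Accomplishment("3.4", "Complete All Action Items from Milestone 2"),
])
trr.accomplishments[0].complete = True
print(trr.achieved())  # False: three criteria remain open
```

Because each criterion is binary and easily measurable (per the guideline on slide 10-13), the achievement check reduces to a simple all-complete test.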


10-17 Section L - Instructions, Conditions, and Notices to Offerors
Systems Engineering Master Schedule (SEMS), Volume 3, Attachment 1

10-18 Section L - Instructions, Conditions, and Notices to Offerors (Cont.)
3.0 SEMS PROGRAM EVENTS
Listed below are the Major Program Events and their scheduled times of achievement. Each event shall be considered achieved upon successful demonstration of criteria accomplishments. Criteria for each event are defined in Section 4.0.
Event Number / Event Title / Event Date
1. System Hardware Preliminary Design Review (PDR) - 5 MACA
2. System Software PDR - 7 MACA
3. ILS Management Team (ILSMT) Meeting - 9 MACA
4. System Hardware Critical Design Review (CDR) - 11 MACA
5. System Software CDR - 13 MACA
6. System Test Readiness Review - 22 MACA
7. Ground Durability/Qualification Test Release - 24 MACA

10-19 Section L - Instructions, Conditions, and Notices to Offerors (Cont.)
Event Number / Event Title / Event Date
8. (Reserved)
9. RF-4C #1 Development Flight Test Release - 29 MACA
10. RF-4C DT&E/IOT&E Start - 29 MACA
11. UARS #1 Development Flight Test Release - 29 MACA
12. UARS DT&E/IOT&E Start - 29 MACA
13. F/A-18 DT&E/IOT&E Start - 29 MACA
14. FSD Maintainability Demonstration - 34 MACA
15. Production Readiness Review - 35 MACA
16. System Functional Configuration Audit (FCA) - 36 MACA
17. RF-4C DT&E/IOT&E Complete - 38 MACA
18. UARS DT&E/IOT&E Complete - 38 MACA

10-20 Section L - Instructions, Conditions, and Notices to Offerors (Cont.)
Event Number / Event Title / Event Date
19. F/A-18 DT&E/IOT&E Complete - 38 MACA
20. Formal Qualification Review (FQR) - 38 MACA
21. Milestone IIIA - 39 MACA
22. System Physical Configuration Audit (PCA) - 57 MACA
23. LRIF Maintainability Evaluation - 57 MACA
24. Milestone IIIB - 66 MACA
25. F-14D/TARPS DT&E/IOT&E Start - 72 MACA
26. F-14D/TARPS DT&E/IOT&E Complete - 81 MACA
27. Initial Operational Capability (IOC) - See Appendix
28. Delivery - Per Table 4-2, As Required
*MACA: Months After Contract Award
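All event dates above are given in MACA, so converting the schedule to calendar dates is a single month-offset computation from contract award. A minimal sketch, using a made-up award date and ignoring day-of-month detail:

```python
# Sketch: convert SEMS event dates in MACA (Months After Contract Award)
# into calendar dates. The award date is a made-up example.
from datetime import date

def maca_to_date(award: date, months: int) -> date:
    # Roll the month forward, clamping the day to the 1st for simplicity.
    total = award.month - 1 + months
    return date(award.year + total // 12, total % 12 + 1, 1)

award = date(2024, 1, 15)  # hypothetical contract award
events = {"System Hardware PDR": 5, "System Software CDR": 13,
          "Formal Qualification Review (FQR)": 38}
for title, maca in events.items():
    print(f"{title}: {maca} MACA -> {maca_to_date(award, maca)}")
```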


10-29 EXERCISE #2
Exercise: Develop the portion of a Systems Engineering Master Schedule (SEMS) for the AJS design and development tasks. Assume the system-level schedule for Peace Whey includes an SRR, SFR, PDR, CDR, and PRR. Your SEMS should identify a list of AJS program reviews with accomplishment criteria.
Part 1: Referring to the system-level Peace Whey reviews, identify and list the major reviews which need to be supported by inputs from the AJS design and development tasks. Based upon this list, determine a set of reviews for the AJS development effort required to support the Peace Whey program.
Part 2: Identify the accomplishment criteria for one AJS review.
You will have 20 minutes to complete this exercise.

10-30 IMP/SEMS & IMS/SEDS Relationships
(Chart: at the program office/contracts level, the Integrated Master Plan (IMP) corresponds to the System Engineering Master Schedule (SEMS), and the Integrated Master Schedule (IMS) corresponds to the System Engineering Detailed Schedule (SEDS).)

10-31 IMP/IMS Development Steps
1. Define Events, Accomplishments (RFP "Shalls"), and Criteria.
2. Define a Product-Oriented Management Structure (Spec Tree, WBS, and IPT Organization).
3. Build a Top-Down, Layered IMP/IMS with Activity Numbers.
(Chart: a system product hierarchy with a spec tree, WBS, and IPTs at each level, and a table mapping Product, Event No., Accomplishment, Criteria, and IMS Task numbers such as "A NN".)

10-32 Integrated Master Plan (IMP) and Integrated Master Schedule (IMS) Terms
Product - Hardware, Software, Facilities, Data, or Materials
Event - Decision Point at End of a Major Project Activity (e.g., CDR)
Accomplishment - Desired Result at a Specified Event
Criteria - Measure of Meeting an Accomplishment
Task - Specific Activity to Complete a Criterion
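These five terms form a strict hierarchy: an Event is achieved through Accomplishments, each Accomplishment is measured by Criteria, and each Criterion is completed by IMS Tasks. A minimal sketch of that layered structure, with an illustrative numbering scheme loosely following the "A NN" convention on the development-steps chart (all names and numbers are made up):

```python
# Sketch of the layered IMP/IMS structure implied by the terms above:
# an Event has Accomplishments, each measured by Criteria, and each
# criterion is satisfied by IMS Tasks. All identifiers are illustrative.
from dataclasses import dataclass, field

@dataclass
class Task:
    number: str           # e.g., "A0101" - event letter + criterion + task
    description: str

@dataclass
class Criterion:
    number: str           # e.g., "A01"
    measure: str
    tasks: list[Task] = field(default_factory=list)

@dataclass
class Accomplishment:
    description: str
    criteria: list[Criterion] = field(default_factory=list)

@dataclass
class Event:
    letter: str           # e.g., "A" for the first IMP event
    name: str
    accomplishments: list[Accomplishment] = field(default_factory=list)

cdr = Event("A", "Critical Design Review (CDR)", [
    Accomplishment("Detailed design complete", [
        Criterion("A01", "All CI product specs reviewed", [
            Task("A0101", "Draft CI product specifications"),
        ]),
    ]),
])
print(cdr.name, "->", cdr.accomplishments[0].criteria[0].tasks[0].number)
```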

10-33 Top Level Task Relationships
(Chart: requirements development (primary and derived) feeds requirements for design and for integrated test; hardware design, software design and implementation, and suppliers produce the product definition; parts production, inventory control, and assembly lead to weapon system integration and testing of parts, assemblies, and the delivered product, with feedback to requirements.)

10-34 Network Chart Example
(Chart: a network layout of tasks A through K with durations - A: 4 weeks, B: 2, C: 1, D: 6, E: 3, F: 8, G: 4, H: 2, I: 4, J: 5, K: 8 - converted to a milestone chart on a 2-to-14-week timeline.)
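Converting such a network to a milestone chart amounts to a forward-pass critical-path computation. The sketch below uses the chart's task durations, but the precedence links did not survive extraction, so the dependency graph here is assumed purely for illustration:

```python
# Minimal critical-path (CPM) forward pass over the network-chart tasks.
# Durations come from the chart; the precedence links are ASSUMED for
# illustration, since the original network layout did not survive.
durations = {"A": 4, "B": 2, "C": 1, "D": 6, "E": 3, "F": 8,
             "G": 4, "H": 2, "I": 4, "J": 5, "K": 8}
preds = {"A": [], "B": [], "C": ["A"], "D": ["A"], "E": ["B"],
         "F": ["B"], "G": ["C"], "H": ["D"], "I": ["E", "F"],
         "J": ["G", "H"], "K": ["I", "J"]}  # assumed network

finish: dict[str, int] = {}

def earliest_finish(task: str) -> int:
    # Earliest finish = own duration + latest earliest-finish of predecessors.
    if task not in finish:
        finish[task] = durations[task] + max(
            (earliest_finish(p) for p in preds[task]), default=0)
    return finish[task]

project_end = max(earliest_finish(t) for t in durations)
print("Project duration:", project_end, "weeks")
```

With these assumed links the forward pass yields a 25-week project; the actual network would of course give a different end date.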

10-35 Integrated Master Schedule (IMS) and Network Schedules
The IMS shows dates for major tasks and milestones:
- Usually presented as a Gantt (bar) chart
- Shows tasks corresponding to entry/exit criteria
- Must include all IMP milestones
Network schedules are essential to the IMS:
- Help determine if IMS dates and IMP events are achievable
- Critical-Path Method (CPM) schedule software is essential
- Essentials: tasks, relationships, durations, resources

10-36 IMP/IMS Summary
- The negotiated IMP will be placed on contract; changes will require a contract change proposal.
- The IMS shall support the IMP. IMS updates shall be approved prior to implementation of changes. The IMS is time-based.
- The customer may provide a government IMP in the RFP. The government IMP should provide detailed information for the next acquisition phase to identify the specific events, accomplishments, and criteria necessary to satisfy planned and required technical exit criteria.
- The IMP is a key control element of the SE process. The IMP can be used as the basis for quantitative requirements for award fees.

10-37 Exercise #3
Exercise: Referring to the SEMS from Exercise #2, develop a System Engineering Detailed Schedule (SEDS) for the AJS design and development tasks. The AJS Safety of Flight unit must be available in month 26 and the first production aircraft in month 36. Be prepared to report your results to the class. You will have 20 minutes for this exercise.
Hint: Use the building block tasks on the next page.

10-38 Electronic Warfare

10-39 Backup

10-40 Subsystem Review Formality
(Chart: subsystem reviews progress from informal, through limited formality, to formal, with issues rolled up to formal system-level reviews.)

10-41 Integrated Technical Reviews (Cont.)
Technical reviews are a series of systems engineering activities by which the technical progress of a program is demonstrated relative to its technical or contractual requirements. They are conducted at logical transition points in the development effort to reduce risk by assessing progress against SEMS requirements and by identifying and correcting problems/issues arising from the work completed, before the program is disrupted or delayed. They provide a method for the contractor and government to determine whether the development of a system and/or configuration item and its documentation has met contract requirements.

10-42 Planning Guidance from DoDD 5000.1 (Defense Acquisition), Part 2.B.3: Acquisition Strategies, Exit Criteria, and Risk Management
"Event driven acquisition strategies and program plans must be based on rigorous, objective assessments of a program's status and the plans for managing risk during the next phase and the remainder of the program. The acquisition strategy and associated contracting activities must explicitly link milestone decision reviews to events and demonstrated accomplishments in development, testing and initial production. The acquisition strategy must reflect the interrelationships and schedule of acquisition phases and events based on logical sequence of demonstrated accomplishments not on fiscal or calendar expediency."

10-43 Integrated Technical Reviews
The government assesses status, maturity, risk, and verification by conducting reviews that will:
- Be adequately planned prior to the initiation of technical work.
- Have event-driven entry and exit criteria, with defined accomplishments and success metrics, via the SEMS.
- Be conducted incrementally (as appropriate) to measure progress towards achieving major accomplishments and to obviate unexpected issues resulting from incomplete or inconclusive development activities.
- Be conducted by multidisciplinary teams to ensure that all the system's functions are addressed and all system elements are integrated and balanced across the system.

10-44 DoD Risk Management Policies and Procedures

10-45 Risk Assessment and Management (DoD 5000.1)
"Program Managers and other acquisition managers shall continually assess program risks. Risks must be well understood, and risk management approaches developed, before decision authorities can authorize a program to proceed into the next phase of the acquisition process. To assess and manage risk Program Managers and other acquisition managers shall use a variety of techniques, including technology demonstrations, prototyping, and test and evaluation. Risk management encompasses identification, mitigation, and continuous tracking, and control procedures that feed back through the program assessment process to decision authorities. To ensure an equitable and sensible allocation of risk between government and industry, Program Managers and other acquisition managers shall develop a contracting approach appropriate to the type of system being acquired."

10-46 Cost, Schedule, and Performance Risk Management (DoD 5000.2-R)
"The Program Manager shall establish a risk management program for each acquisition program to identify and control performance, cost, and schedule risks. The risk management program shall identify and track risk drivers, define risk abatement plans, and perform periodic assessments to determine how risks have changed. Risk reduction measures shall be included in cost-performance tradeoffs, where applicable. The risk management program shall plan for back-ups in risk areas and identify design requirements where performance increase is small relative to cost, schedule, and performance risk. The acquisition strategy shall include identification of the risk areas of the program and a discussion of how the Program Manager intends to manage those risks."

10-47 Risk Management Structure (DoD Risk Management Study)
(Chart: Risk Management comprises Risk Planning, Risk Assessment, Risk Handling, and Risk Monitoring, supported by Risk Documentation & Communications; Risk Assessment divides into Risk Identification and Risk Analysis.)

10-48 Definitions
Risk - A measure of the potential inability to achieve objectives; it has two components (probability and consequences).
Risk Management - The act or practice of controlling risk:
+ Identifying and tracking risk drivers
+ Defining risk mitigation plans
+ Performing periodic risk assessments

10-49 Risk Planning
The process has two segments:
- Implementing a comprehensive and active strategy to continuously identify, mitigate, and track program risks: who does it, what they do, when they do it, and how risk is shared.
- Documenting the risk elements of program activities.
How do I get there from here?

10-50 Risk Assessment
The process of identifying and analyzing program risks to increase the chances of meeting performance, schedule, and cost objectives. Two segments:
- Risk Identification
- Risk Analysis

10-51 Risk Identification
The process of specifying, describing, and documenting program risks and their sensitivities to other risks:
- Internal
- External

10-52 Risk Analysis
The process of evaluating program risks for their impacts on performance, cost, and schedule objectives. The process includes assessing each risk's:
- Probability of occurrence, and
- Consequences of failure to mitigate the risk.

10-53 Risk Handling
The process that identifies, evaluates, selects, and implements risk handling options:
- to set risk at acceptable levels,
- given program constraints.
Typical risk handling strategies include:
- Replan to eliminate the identified risk
- Avoid risk by changing requirements
- Transfer the risk
- Control the risk through active steps
- Assume the risk without special efforts

10-54 Risk Monitoring
The process that systematically tracks and evaluates the performance of risk mitigation actions:
- against established metrics throughout the acquisition* process, and
- develops further risk handling options as appropriate.
* Acquisition includes any procurement from government or contractor sources within all phases, from early research through logistics, operations, support, and disposal.

10-55 Risk Management Process
(Chart: program requirements feed risk assessment (identify, analyze, quantify risk; evaluate subcontractor risks) and impact analysis (performance, cost, schedule); risk handling options (avoid, transfer, assume, control, information gathering) are evaluated; risk is then managed by reviewing indicators and taking abatement actions.)

10-56 Assess Risks
(Chart: starting from program requirements - establish the approach, develop the team, and create risk management tools; assess risks by identifying risk areas and analyzing risk; evaluate subcontractor risks and risk handling options; establish cost, schedule, and performance impacts; manage risks.)

10-57 Risk Identification
The WBS is normally used to organize and ensure completeness of the risk identification effort. Identification is generally performed at the 3rd or 4th level of the WBS.
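Organizing identification by WBS element is straightforward to mirror in data. A sketch, with made-up WBS numbers and the guidance-system example from slide 10-61 below as one entry:

```python
# Sketch: risks keyed by the WBS element (3rd/4th level) where they were
# identified. The WBS numbers and risk entries are illustrative.
from collections import defaultdict

risks_by_wbs: dict[str, list[str]] = defaultdict(list)
risks_by_wbs["1.1.3"].append(
    "Guidance System: new CHIPLEAP chipset - producibility and thermal concerns")
risks_by_wbs["1.2.1"].append("Airframe: new composite layup process unproven")

def wbs_level(wbs: str) -> int:
    # Level = number of dot-separated indenture fields in the WBS number.
    return len(wbs.split("."))

for wbs, risks in sorted(risks_by_wbs.items()):
    print(f"WBS {wbs} (level {wbs_level(wbs)}):")
    for r in risks:
        print("  -", r)
```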


10-60 Risk Categories and Consequences
Risk Categories:
- Requirements - Are the necessary requirements (operational or design) fully defined? Is the basis for the requirements stable (e.g., no expected threat change)?
- Technology - Is the available technology proven in previous use?
- Engineering - How much new design is needed to achieve requirements?
- Manufacturing - Are the required manufacturing processes, facilities, and sources of materials known and available?
- Support - Are the required support resources defined and available?
- Management - Are the processes, resources, and experience available to successfully perform this program?
Risk Consequences:
- Performance - Can the item meet its requirements (operational, support, and manufacturing)?
- Cost - Can the item be developed and operated within the funding allocated to it?
- Schedule - Can the item be developed and deployed within the time allocated to it?

10-61 Sample Risk Identification
WBS 113 - Guidance System
Issues: New design - uses new chipset from the CHIPLEAP program. Some concern exists on producibility and thermal characteristics.
Assumptions: Megalith Corporation will design the guidance system. They were a participant in the CHIPLEAP effort.

10-62 Organization of Risk Assessment
Additional risk categories, such as customer satisfaction or customer expectation, can be developed for identification purposes. Risk assessment templates (shown on later charts) for subcategories can be developed and maintained by organizations.


10-64 (Chart; Ref. - Best Practices)

10-65 Technical Risk Assessment (Ref. - Best Practices)
(Chart: risk template areas grouped by category, converging on the Product.)
Funding: Money Phasing
Design: Design Ref Mission Profile, Design Requirements, Trade Studies, Design Policy, Design Process, Design Analysis, Parts and Materials Selection, Software Design, Computer Aided Design, Design For Testing, Built-In Test, Configuration Control, Design Reviews, Design Release
Test: Integrated Test, Failure Reporting System, Uniform Test Report, Software Test, Design Limit, Life, Test Analyze And Fix, Field Feedback
Production: Manufacturing Plan, Quality MFG Process, Piece Part Control, Subcontractor Control, Defect Control, Tool Planning, Special Test Equipment, Computer-Aided MFG, Manufacturing Screening
Facilities: Modernization, Factory Improvements, Productivity Center
Logistics: Logistics Support Analysis, Manpower And Personnel, Support and Test Equipment, Training Material And Equipment, Spares, Technical Manuals
Management: Manufacturing Strategy, Personnel Requirements, Data Requirements, Production Breaks, Transition Plan

10-66 (Chart; Ref. - Best Practices)

10-67 (Chart; Ref. - AFMC 63-101)


10-69 (Chart; Ref. - AFMC 63-101)

10-70 Traditional Risk Analysis
(Chart: probability (likelihood) is assigned on a 0-to-1 scale and consequences are estimated for performance (from potential degradation to system requirement not achieved), cost (from element increase <10% to element increase >50% or system increase >40%), and schedule (element increase); each risk is plotted as a point.)

10-71 Traditional Risk Analysis (Cont.)
(Chart: the same probability-versus-consequence plot, with risks binned into the ratings below.)
High Risk - Severe disruption expected to performance, cost, and/or schedule even with risk mitigation plans in place.
Moderate Risk - Expected disruption to performance, cost, and/or schedule can be overcome by implementing risk mitigation plans.
Low Risk - Little disruption expected to performance, cost, and/or schedule.
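A common way to operationalize these ratings is to score each risk as probability times consequence and bin the score. The numeric thresholds below are assumptions for illustration; the chart does not give exact cut lines:

```python
# Sketch: score a risk as probability x consequence and bin it into the
# Low / Moderate / High ratings defined above. The numeric thresholds
# are assumptions; the deck's chart gives no exact cut lines.
def rate_risk(probability: float, consequence: float) -> str:
    """probability and consequence are each normalized to 0..1."""
    score = probability * consequence
    if score >= 0.5:
        return "High"      # severe disruption expected even with mitigation
    if score >= 0.2:
        return "Moderate"  # disruption can be overcome by mitigation plans
    return "Low"           # little disruption expected

print(rate_risk(0.9, 0.8))  # High
print(rate_risk(0.5, 0.5))  # Moderate
print(rate_risk(0.2, 0.3))  # Low
```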

10-72 Weaknesses of the Traditional Risk Analysis Process
1. Roll-up of risks characterized as high, moderate, or low at the 3rd or 4th WBS level is difficult. Example: are 10 low risks and 1 high risk among WBS Level 4 elements expressed as low, moderate, or high at the parent WBS element at Level 3?
2. Characterizing a risk as high, moderate, or low alerts the customer to the severity of the outcomes without giving insight into the likely capability of the delivered product.
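Weakness 1 is easy to make concrete: under one plausible roll-up rule the parent WBS element rates High, under another it rates Low, and nothing in the qualitative scheme says which is right. A short sketch (both roll-up rules are assumptions, not part of the deck):

```python
# Illustration of the roll-up ambiguity: ten Low risks and one High risk
# at WBS Level 4 have no agreed parent rating at Level 3. Two plausible
# (assumed) roll-up rules disagree.
RANK = {"Low": 1, "Moderate": 2, "High": 3}
NAME = {v: k for k, v in RANK.items()}

children = ["Low"] * 10 + ["High"]

worst_case = NAME[max(RANK[c] for c in children)]                      # "High"
average = NAME[round(sum(RANK[c] for c in children) / len(children))]  # "Low"
print(f"Max rule: {worst_case}, averaging rule: {average}")
```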

10-73 Insert Melbourne Paper Here

