1 Department of Defense (DoD) Technology Readiness Assessment (TRA) Process
Jack Taylor, Office of the Deputy Assistant Secretary of Defense for Research, Ph: (571)

2 What is a TRA?
Definition: A systematic, metrics-based process that assesses the maturity of, and the risk associated with, critical technologies
Required by statute (10 U.S.C. §2366b) and DoD regulation, with TRA guidance issued May 11, 2011
Purpose:
To assist in determining whether the technologies of the program have acceptable levels of risk, based in part on the degree to which they have been demonstrated (including demonstration in a relevant environment)
To support risk-mitigation plans prepared by the PM
To support certification under 10 U.S.C. §2366b
To support technology maturity language for an Acquisition Decision Memorandum (ADM)
TRA-identified maturity considerations are becoming explicit in Test & Evaluation Strategies (TES) and Test and Evaluation Master Plans (TEMP), e.g., WIN-T, GMLRS, CH-53K

3 10 U.S.C. § 2366b Certification Required Before Milestone B
The MDA must certify that:
A business case analysis shows the program is affordable, appropriate trade-offs among cost, schedule, and performance objectives have been made, reasonable cost and schedule estimates have been developed, and funding is available
On the basis of a PDR, the program demonstrates a high likelihood of accomplishing its intended mission
Market research has been conducted prior to technology development to reduce duplication of existing technology and products
An analysis of alternatives was completed
The technology in the program has been demonstrated in a relevant environment
The program complies with all relevant policies, regulations, and directives
The MDA may waive the certification if the MDA determines that, but for such a waiver, the Department would be unable to meet critical national security objectives

4 When is a TRA Conducted?
Milestone B or any other certification decision event
Preliminary TRA to be completed prior to the Milestone Decision Authority (MDA) Defense Acquisition Board (DAB) Pre-Milestone B Program Review that precedes Engineering and Manufacturing Development (EMD) Request for Proposal (RFP) release
Update after Preliminary Design Review (PDR) and prior to Milestone B (basis for 2366b certification)
When designated by the MDA
An early evaluation of technology maturity conducted shortly after Milestone A should help refine critical technologies
[Timeline graphic: the acquisition life cycle from the Materiel Development Decision and Materiel Solution Analysis, through Technology Development (with competitive prototyping), Engineering and Manufacturing Development (program initiation), and Production and Deployment, to Operations and Support, marked with Milestones A, B, and C, PDR, CDR post-assessment, and the FRP Review]

5 New TRA Guidance Issued May 11, 2011

6 Improving Milestone Process Effectiveness – Preliminary TRA
PDUSD(AT&L) Memo, June 23, 2011: “Our current milestone review process does not provide adequate opportunity for the MDA to review program plans prior to release of the final RFP, the point at which the Department’s requirements, schedule, planned program content, and available funding should be available for review. Program managers shall plan for and MDAs shall conduct a Pre-EMD Review before releasing the final RFP for the EMD phase.”
Draft documents include: LFTE Plan, APB, Exit Criteria, SEP, TEMP
Supporting documents: CARD, CDD, ICE, DoD Component Cost Estimate, Preliminary TRA, STAR

7 The TRA Process
Establish a TRA plan and schedule
Form a Subject Matter Expert (SME) team
Identify technologies to be assessed
Collect evidence of maturity
Assess technology maturity
Prepare, coordinate, and submit a TRA report
Review and evaluation

8 SME Team Qualifications
Subject matter expertise and independence from the program are the two principal qualifications for SME team membership. The SME team:
Works closely with the PM throughout the TRA process
Reviews the performance, technical requirements, and designs being considered for inclusion in the program
In conjunction with the PM and ASD(R&E), reviews the PM-provided list of critical technologies to assess and recommends additions or deletions
Assesses whether adequate risk reduction to enter EMD (or other contemplated acquisition phase) has been achieved for all technologies under consideration, including, specifically, demonstration in a relevant environment
Prepares the SME comments in the TRA report, including (1) the SME team credentials and (2) the SME team findings, conclusions, and supporting evidence

9 Critical Technologies
Critical technologies are those that may pose major technological risk during development, particularly during the Engineering and Manufacturing Development (EMD) phase of acquisition. They are identified in the context of a program’s systems engineering process, based on a comprehensive view of the most current system performance and technical requirements and design, and on the program’s established technical work breakdown structure.

10 Technology Readiness Levels
TRL 1: Basic principles observed and reported
TRL 2: Technology concept and/or application formulated
TRL 3: Analytical and experimental critical function and/or characteristic proof of concept
TRL 4: Component and/or breadboard validation in a laboratory environment
TRL 5: Component and/or breadboard validation in a relevant environment
TRL 6: System/subsystem model or prototype demonstration in a relevant environment
TRL 7: System prototype demonstration in an operational environment
TRL 8: Actual system completed and qualified through test and demonstration
TRL 9: Actual system proven through successful mission operations
Maturity increases from TRL 1 to TRL 9. Technology Readiness Levels (TRLs) can serve as a helpful knowledge-based standard and shorthand for evaluating technology maturity, but they must be supplemented with expert professional judgment.
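As an illustration only (not part of the TRA Guidance), the TRL scale above can be captured as a simple lookup table with a check against the TRL 6 threshold cited for Milestone B; the constant and function names here are hypothetical.

```python
# Hypothetical sketch: the hardware TRL scale as a lookup table,
# with a simple check against the TRL 6 minimum cited for Milestone B.
TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental critical function and/or characteristic proof of concept",
    4: "Component and/or breadboard validation in a laboratory environment",
    5: "Component and/or breadboard validation in a relevant environment",
    6: "System/subsystem model or prototype demonstration in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

MILESTONE_B_MINIMUM_TRL = 6  # "demonstrated in a relevant environment" (10 U.S.C. 2366b)

def meets_milestone_b_threshold(assessed_trl: int) -> bool:
    """Return True if an assessed TRL meets or exceeds the Milestone B minimum."""
    if assessed_trl not in TRL_DESCRIPTIONS:
        raise ValueError(f"TRL must be 1-9, got {assessed_trl}")
    return assessed_trl >= MILESTONE_B_MINIMUM_TRL

# Example: an armor technology assessed at TRL 5 would not yet meet the threshold.
print(meets_milestone_b_threshold(5))  # False
print(meets_milestone_b_threshold(6))  # True
```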

11 Example: Armor Technologies
Armor development activity and corresponding TRL:
TRL 3: New design based on measured material properties and knowledge of existing armor performance
TRL 4: Ballistic measurements of armor coupons; environmental and structural testing of armor materials
TRL 5: Demonstration of armor panels at environmental extremes; measurement of the effects of panel size and support
TRL 6: Ballistic hull and turret testing of an armor design ready for fabrication
TRL 7: Ballistic testing of production armor installed on a full-up prototype vehicle

12 TRA Report Contents
Short description of the program
SME team membership and credentials
List of critical technologies that pose a potential risk to program execution success, with the PM’s assessment of the maturity of those technologies as demonstrated in a relevant environment and a description of any risk-mitigation plans
SME team findings, conclusions, supporting evidence, and major dissenting opinions
Cover letter signed by the CAE approving the report, forwarding any requests for waivers of the 10 U.S.C. §2366b certification requirement with supporting rationale, and providing other technical information deemed pertinent by the CAE and PM

13 ASD(R&E) Roles & Responsibilities
Reviews the TRA plan provided by the PM and provides comments regarding TRA execution strategy as appropriate
In conjunction with the PM and SME team, reviews the PM-provided list of critical technologies to assess and recommends additions or deletions
Provides the MDA independent recommendations concerning 10 U.S.C. §2366b certification
If a 10 U.S.C. §2366b waiver has been requested, provides a recommendation to the MDA, with supporting rationale, as to whether a waiver should be granted
Recommends technology maturity language for an Acquisition Decision Memorandum (ADM), noting, in particular, conditions under which new technology can be inserted into the program

14 Summary
The DoD TRA Guidance can be downloaded from the ASD(R&E) website.

15 BACKUP

16 TRL 5
Definition: Component and/or breadboard validation in a relevant environment.
Description: Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so they can be tested in a simulated environment. Examples include “high-fidelity” laboratory integration of components.
Supporting Information: Results from testing a laboratory breadboard system are integrated with other supporting elements in a simulated operational environment. How does the “relevant environment” differ from the expected operational environment? How do the test results compare with expectations? What problems, if any, were encountered? Was the breadboard system refined to more nearly match the expected system goals? One example of a relevant environment may be the expected electromagnetic environment or conditions.

17 TRL 6 Minimum Maturity at Milestone B
Definition: System/subsystem model or prototype demonstration in a relevant environment.
Description: A representative model or prototype system, well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology’s demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment.
Supporting Information: Results from laboratory testing of a prototype system that is near the desired configuration in terms of performance, weight, and volume. How did the test environment differ from the operational environment? Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level?
This level of maturity is achieved after the system or subsystem has been tested in a relevant environment, and what is tested should be very close to its final configuration.

18 TRL 7
Definition: System prototype demonstration in an operational environment.
Description: Prototype near or at the planned operational system. Represents a major step up from TRL 6 by requiring demonstration of an actual system prototype in an operational environment (e.g., in an aircraft, in a vehicle, or in space). Examples include testing the prototype in a test-bed aircraft.
Supporting Information: Results from testing a prototype system in an operational environment. Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level?

19 Software Technology Readiness Levels from 2005 to 2011

20 Trends in SW Development Methods, Tools and Techniques
Agile Development values:
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
Design patterns
Refactoring
Team Software Process (TSP) / Personal Software Process (PSP)
Use of Technology Readiness Levels (TRLs) for software

A profound shift in mindset distinguishes agile development from conventional approaches heavy on planning and up-front analysis. As expressed by the Agile Alliance [3], agile development emphasizes: "Individuals and interactions over processes and tools. Working software over comprehensive documentation. Customer collaboration over contract negotiation. Responding to change over following a plan. That is, while there is value in the items on the right, we value the items on the left more." In short, agile development regards system development primarily as a learning process. This perception alters the conventional risk/benefit tradeoff. For example, detailed long-term planning is seen as having little value; the details (at least) will undoubtedly change as the project team learns more during the early delivery cycles. Similarly, agile development prefers to substitute frequent, informal person-to-person (and optimally, face-to-face) interaction for formal textual or graphical documents, which always fail to be comprehensive and become impossible to maintain at an appropriate rate of change the closer they come to being comprehensive.

Refactoring is the process of changing a software system to improve the software's structure and performance without altering its functional behavior. Refactoring is used to eliminate, replace, or rewrite code to improve its efficiency and understandability, or to transform applications to use a suitable set of modern infrastructure support functions. Refactoring extracts and parameterizes methods; merges and consolidates similar methods; reduces the set of methods associated with a class to a minimal set of well-understood operations; improves the coupling, cohesion, and comprehensibility of the overall application; and reduces overall code duplication and redundancy. Through refactoring, the system design is improved throughout the entire development process: the software is kept clean, without duplication, as simple as possible, and yet complete, ready for any change that comes along. (Martin Fowler defines refactoring as "the process of changing a software system in such a way that it does not alter the external behavior of the code yet improves its internal structure" [2].) A minimal before/after refactoring sketch follows these notes.

The Software Engineering Institute has developed the Team Software Process (TSP) to help integrated engineering teams more effectively develop software-intensive products. This process method addresses many of the current problems of developing software-intensive products and shows teams and their management explicitly how to address them. For example, hardware-software projects in even very experienced groups generally have cost, schedule, and quality problems; testing is generally expensive and time-consuming, and often followed by many months of user trials before the products are fully usable. The Personal Software Process (PSP) framework is a data-driven feedback system that allows individual software engineers to continuously improve their personal processes by applying statistical process control techniques at the individual level. A PSP practitioner uses a defined software process to apply a set of practices to develop products, while collecting data as part of the development process. Figure 1 illustrates how the collected measurements are used to analyze and assess the impact of a practice on the product and/or process using a feedback loop. This feedback becomes an inherent part of all future product development processes. The framework therefore offers a road map for collecting data; by analyzing the data, engineers are able to modify their practices and thus improve predictability and quality.

In "Understanding and Using Patterns in Software Development", Dirk Riehle and Heinz Zullighoven give a broadly applicable definition of the term "pattern": a pattern is the abstraction from a concrete form which keeps recurring in specific non-arbitrary contexts. The authors point out that, within the software patterns community, the notion of a pattern is "geared toward solving problems in design." More specifically, the concrete form which recurs is that of a solution to a recurring problem. But a pattern is more than just a battle-proven solution to a recurring problem: the problem occurs within a certain context and in the presence of numerous competing concerns, and the proposed solution involves some kind of structure which balances these concerns, or "forces", in the manner most appropriate for the given context. Using the pattern form, the description of the solution tries to capture the essential insight which it embodies, so that others may learn from it and make use of it in similar situations. The pattern is also given a name, which serves as a conceptual handle to facilitate discussing the pattern and the jewel of information it represents. A definition which more closely reflects its use within the patterns community is: a pattern is a named nugget of instructive information that captures the essential structure and insight of a successful family of proven solutions to a recurring problem that arises within a certain context and system of forces. A slightly more compact definition which might be easier to remember is: a pattern is a named nugget of insight that conveys the essence of a proven solution to a recurring problem within a certain context amidst competing concerns.

Characteristics of a pattern:
It solves a problem: patterns capture solutions, not just abstract principles or strategies.
It is a proven concept: patterns capture solutions with a track record, not theories or speculation.
The solution isn't obvious: many problem-solving techniques (such as software design paradigms or methods) try to derive solutions from first principles; the best patterns generate a solution to a problem indirectly, a necessary approach for the most difficult problems of design.
It describes a relationship: patterns don't just describe modules; they describe deeper system structures and mechanisms.
The pattern has a significant human component: all software serves human comfort or quality of life, and the best patterns explicitly appeal to aesthetics and utility.
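To make the refactoring notes above concrete, here is a minimal, hypothetical before/after sketch (not from the original briefing): duplicated report-formatting code is consolidated by extracting a parameterized helper, so the internal structure improves while the external behavior, i.e., the text produced, is unchanged. The function names and program name are invented for illustration.

```python
# Hypothetical example: behavior-preserving refactoring by extracting a
# parameterized helper to remove duplicated code.

# Before: two near-identical functions duplicate the formatting logic.
def format_cost_report_before(program, cost_growth_pct):
    return f"{program}: cost growth {cost_growth_pct:.1f}%"

def format_schedule_report_before(program, delay_months):
    return f"{program}: schedule delay {delay_months:.1f} months"

# After: the shared structure is extracted into one parameterized helper.
def _format_metric(program, label, value, unit):
    """Single place that owns the report line layout."""
    return f"{program}: {label} {value:.1f}{unit}"

def format_cost_report(program, cost_growth_pct):
    return _format_metric(program, "cost growth", cost_growth_pct, "%")

def format_schedule_report(program, delay_months):
    return _format_metric(program, "schedule delay", delay_months, " months")

# External behavior is unchanged: both versions produce identical output.
assert format_cost_report("Example Program", 41.0) == format_cost_report_before("Example Program", 41.0)
assert format_schedule_report("Example Program", 13.0) == format_schedule_report_before("Example Program", 13.0)
```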

21 Technology Readiness Assessments (TRAs) Definitions
A TRA is a systematic, metrics-based process that assesses the maturity of Critical Technology Elements (CTEs)
A technology element is "critical" if the system being acquired depends on this technology element to meet operational requirements with acceptable development cost and schedule and with acceptable production and operation costs, and if the technology element or its application is either new or novel
Environment is key to "new or novel"
CTEs may be hardware, software, manufacturing, or life-cycle related
Uses Technology Readiness Levels (TRLs) as the metric; adequate maturity (TRL 6 or greater) at MS B is largely based on experience with prototypes or previous usage in a relevant environment
A TRA is not a risk assessment or a design review; it does not address system integration or imply that the technology is right for the job

Software development can be risky. If you have a COTS or GOTS product that is already mature, that risk can be reduced. TRAs attempt to identify the risk of software development. The TRA is an assessment of technology maturity at a single point in time; TRLs do not capture obsolescence via expiring licenses or lack of vendor support.

What regulation or directive makes this mandatory? The DoD Instruction of May 12, 2003 makes TRAs a regulatory information requirement at MS B and C for all programs. For ACAT ID and ACAT IAM programs, the TRAs are submitted to DUSD(S&T). Note that all programs are required to conduct a TRA, but only the large programs of special interest are subject to OSD and Congressional oversight. The TRA differs from nearly all other information requirements in that it is a scientifically based report prepared by a panel of subject matter experts independent of the program. The report identifies the scientific evidence used to assess the maturity of CTEs.

Who determines the Critical Technology Elements (CTEs)? The CTE definition shown on the slide is taken from the TRA Deskbook (referenced later in the presentation). The PM and his/her technical support staff make an initial cut at determining the CTEs by working through the systems architecture (WBS for hardware) and determining whether the technologies meet the criteria in this definition: criticality to an operational requirement or affordability, and "new or novel." (More on "new or novel" in the next slide.) A best practice is for the independent panel (that makes the maturity assessment) to also be part of the CTE identification process. The independent panel should review the PM's initial determination and make recommendations for additions and deletions based on its subject matter expertise.

1. Are you saying that the assessment is done by this "independent panel"? I assume this panel is from the organization's S&T Executive? The panel is not from the S&T Executive's organization; they are not staffed for this. The independent panel should be composed of subject matter experts from the field. They may be from industry, universities, FFRDCs, other Services, or elsewhere in the Government. Non-government people are paid for their time (and maybe some government people too, depending on how they account for things). Everyone is reimbursed for travel.

2. Since all programs must have a TRA, do all programs get assessed by the organization's S&T independent panel, or just the ACAT IAMs because of their congressional oversight? The extent to which the military departments do TRAs below ACAT IAM varies. The Service determines who is responsible; for example, it may be a PEO or it may be a systems command.

3. I could not find who makes the final TRL rating and who approves it. The TRA will contain the panel's recommendation. For ACAT IAM, it is endorsed by the S&T Executive (usually in a cover letter), but I have seen instances where the S&T Executive has changed it in the cover letter. Since TRAs are signed out by the Component Acquisition Executive to OSD, he is the final approving authority.

The TRA is a regulatory information requirement at MS B and C for all programs!

22 Lessons Learned Identifying and Assessing Software CTEs
CTEs must both impact an operational requirement and be "new or novel"
A CTE is "new or novel" if it is to be used in a different environment, for example:
Physical environment: mechanical components, processors, servers, and electronics; kinetic and kinematic; thermal and heat transfer; electrical and electromagnetic; climatic (weather, temperature, particulates); network infrastructure
Logical environment: software (algorithm) interfaces; security interfaces; Web enablement
Data environment: data formats and databases; anticipated data rates, data delay, and data throughput; data packaging and framing
Security environment: connection to firewalls; security appliqués; rates and methods of attack
User and use environment: scalability; upgradability; user behavior adjustments; user interfaces; organizational change/realignments with system impacts; implementation plan (see the screening sketch following these notes)
TRL assignment: readiness in a relevant environment (TRL 6) requires a detailed architecture; readiness in an operational environment (TRL 7) requires evidence of acceptable performance under system loading, user interaction, and a realistic communications environment

The program does not have to be the first application of a technology for the technology to meet the "new or novel" CTE criterion. A technology may have been applied previously, but if that was in an environment different from the planned operational environment, then the technology still satisfies the "new or novel" criterion and should be identified as a CTE if it is also critical to affordability or an operational requirement. For example, something should not be automatically excluded as a CTE possibility on the basis of being a mature COTS product, because the environment in which it is to be applied may be new.

Consider the DoD networked security environment. The security environment includes hardware components (e.g., firewalls, network gateways), logical components (e.g., potential virtual circuits), and data. Requirements for the security environment can often be derived from IA requirements. In addition, the systems architecture can be a source of information. The rates and methods of attack during wartime and peacetime may also be elements of the security environment. Technical experts in IT and network security can be helpful in defining and evaluating the security environment. An important question is the anticipated difference in environment in wartime as compared with peacetime. Often, security requirements tighten during wartime, and evaluators should take care in defining those differences. It may well be the case that a COTS product has never been used in a security environment relevant to DoD needs, in which case neither TRL 6, the entry criterion for MS B, nor TRL 7, the entry criterion for MS C, will have been achieved.
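As an illustration only (not from the TRA Deskbook), the environment categories above can be treated as a simple screening checklist: a candidate technology is flagged as potentially "new or novel" if its intended use differs from its prior use in any category. The category keys follow the slide; the data structure, function name, and example values are hypothetical.

```python
# Hypothetical screening sketch: flag a candidate CTE as potentially
# "new or novel" if its intended environment differs from where it has
# been used before, in any of the categories named on the slide.
ENVIRONMENT_CATEGORIES = (
    "physical",   # mechanical, thermal, electromagnetic, climatic, network infrastructure
    "logical",    # software/algorithm interfaces, security interfaces, Web enablement
    "data",       # formats, rates, delay, throughput, packaging and framing
    "security",   # firewalls, security appliques, rates and methods of attack
    "user_use",   # scalability, upgradability, user behavior and interfaces
)

def possibly_new_or_novel(prior_use: dict, intended_use: dict) -> list:
    """Return the categories in which the intended environment differs from prior use.

    A non-empty result suggests the technology may satisfy the "new or novel"
    criterion and deserves SME review; an empty result is not proof of maturity.
    """
    return [
        category
        for category in ENVIRONMENT_CATEGORIES
        if intended_use.get(category) != prior_use.get(category)
    ]

# Example: a mature COTS gateway previously fielded on an enterprise network
# but now intended for a tactical, contested network environment.
prior = {"physical": "fixed data center", "security": "enterprise firewall"}
intended = {"physical": "vehicle-mounted, field conditions", "security": "contested, wartime attack rates"}
print(possibly_new_or_novel(prior, intended))  # ['physical', 'security']
```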

23 Measuring Technology Maturity (Software)
Phases (most to least mature): System Test, Launch & Operations; System/Subsystem Development; Technology Demonstration; Technology Development; Research to Prove Feasibility; Basic Technology Research.
Software TRL definitions (TRL 9 down to TRL 1):
TRL 9: Actual system proven through successful mission-proven operational capabilities
TRL 8: Actual system completed and mission qualified through test and demonstration in an operational environment
TRL 7: System prototype demonstration in an operational high-fidelity environment
TRL 6: Module and/or subsystem validation in a relevant end-to-end environment
TRL 5: Module and/or subsystem validation in a relevant environment
TRL 4: Module and/or subsystem validation in a laboratory environment, i.e., software prototype development environment
TRL 3: Analytical and experimental critical function and/or characteristic proof of concept
TRL 2: Technology concept and/or application formulated
TRL 1: Basic principles observed and reported

TRLs were originally developed by NASA for measuring the technology maturity of hardware systems. Their purpose was to avoid transition to the next stage of program development until the technology was "ready." (The next slide shows how moving forward with immature technologies leads to schedule slips and cost increases.) In practice, DoD found it difficult to use these NASA-based hardware TRLs for software. This slide shows a slight modification of the TRL definitions as adapted for software. The Deskbook also provides a description of each TRL and supporting information that the independent assessor would look for in order to make the TRL determination. Below is a comparison of TRL 6 for hardware and software.
Definition. Hardware: System/subsystem model or prototype demonstration in a relevant environment. Software: Module and/or subsystem validation in a relevant end-to-end environment.
Description. Hardware: A representative model or prototype system, well beyond that of TRL 5, is tested in a relevant environment; this represents a major step up in a technology's demonstrated readiness, and examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment. Software: The level at which the engineering feasibility of a software technology is demonstrated; this level extends to laboratory prototype implementations on full-scale realistic problems in which the software technology is partially integrated with existing hardware/software systems.
Supporting Information. Hardware: Results from laboratory testing of a prototype system that is near the desired configuration in terms of performance, weight, and volume. How did the test environment differ from the operational environment? Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level? Software: Results from laboratory testing of a prototype package that is near the desired configuration in terms of performance, including physical, logical, data, and security interfaces; comparisons between tested environment and operational environment analytically understood; analysis and test measurements quantifying contribution to system-wide requirements such as throughput, scalability, and reliability; analysis of human-computer (user environment) interaction begun.

24 Technology Readiness Assessments: Importance to the Milestone Decision Authority (MDA)
The MDA uses the information to support a decision to initiate a program. Trying to apply immature technologies has led to technical, schedule, and cost problems during systems acquisition. The TRA was established as a control to ensure that critical technologies are mature, based on what has been accomplished.
According to a GAO review of 54 DoD programs*:
Only 15% of programs began systems development and demonstration with mature technology (TRL 7)
Programs that started with mature technologies averaged 9% cost growth and a 7-month schedule delay
Programs that did not have mature technologies averaged 41% cost growth and a 13-month schedule delay (a quick comparison of these figures follows this slide's notes)
Banner: Defense Bill, HR 1815, Section 801 requires the MDA to certify, prior to Milestone B, that the technology in the program has been demonstrated in a relevant environment (TRL 6)

As the DAE, DUSD(AT&L) Krieg has already expressed his interest in TRAs. At a September 27, 2005 Senate Armed Services Committee hearing, he testified: “Technology maturity is a factor in reducing program risk, thereby reducing near and long-term program costs. We implemented Technology Maturity assessments to assess if acquisition programs require more mature technology before entering the next phase. In addition, we have increased the number of demonstrations and prototypes, further ensuring adequate technology maturity and military utility by ‘trying before buying.’ ” Note that the words ‘trying before buying’ parallel the Packard Commission recommendation to ‘fly before you buy.’
The GAO data shown include both ACAT ID and ACAT IAM programs. Note that GAO’s definition of maturity for MS B is TRL 7, vice the TRL 6 that DoD adopted. The figures above represent development cost and schedule. The banner shows new Congressional interest: for the past several years, Congress received an annual report describing programs initiated with immature technologies (less than TRL 6); now the reporting is more continuous. Note that this reporting requirement is for MDAPs only.
1. I don’t understand why GAO would have a different TRL rating acceptance than DoD; DoD would always be wrong then? Why aren’t they the same so we can avoid GAO scrutiny? In its report, GAO recommended that DoD require TRL 7 at MS B. DoD did not accept that recommendation and instead required TRL 6. In the subsequent report that I quoted, GAO made the comparison to its own recommendation, not to the DoD policy.
* Defense Acquisitions: Assessments of Selected Major Weapon Programs, GAO, March 2005
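The GAO figures quoted above lend themselves to a quick worked comparison. This small sketch (illustrative only; it simply re-uses the percentages and months from the slide, and the variable names are invented) computes the average penalty for entering development with immature technology.

```python
# Illustrative arithmetic using the GAO figures quoted on the slide.
mature = {"cost_growth_pct": 9, "schedule_delay_months": 7}      # started with TRL 7 technology
immature = {"cost_growth_pct": 41, "schedule_delay_months": 13}  # started with less mature technology

extra_cost_growth = immature["cost_growth_pct"] - mature["cost_growth_pct"]
extra_delay = immature["schedule_delay_months"] - mature["schedule_delay_months"]

print(f"Additional average cost growth: {extra_cost_growth} percentage points")  # 32
print(f"Additional average schedule delay: {extra_delay} months")                # 6
```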

25 Technology Readiness Assessments Value to the Program
The PM uses the expertise of the assessment team and the rigor and discipline of the process to allow for:
Early, in-depth review of the conceptual product baseline
Periodic in-depth reviews of maturation events
Highlighting (and in some cases discovering) critical technologies and other potential technology risk areas that require management attention (and possibly additional resources)
The PM, PEO, and CAE use the results of the assessment to:
Optimize the acquisition strategy and thereby increase the probability of a successful outcome
Determine capabilities to be developed in the next increment
Focus technology investment

The information on this slide is not theoretical; the material is an adaptation of things that have actually happened in the Navy. A best practice is to include all CTEs in the program’s risk database and develop Technology Maturation Plans (TMPs) for how and when each CTE will mature to TRL 9. Aside from the formal TRAs at MS B and MS C, the status of maturation activities should be monitored during the program’s technical reviews so that actions can be taken if there is a problem. This also facilitates how the program’s leadership team keeps tabs on something that has proven to be a driver of cost growth and schedule slippage.
TRAs also help PMs with a problem documented by GAO in its report recommending that DoD adopt TRAs. GAO said “Program managers’ ability to reject immature technologies is hampered by (1) untradable requirements that force acceptance of technologies despite their immaturity and (2) reliance on tools that fail to alert the managers of the high risks that would prompt such a rejection.” [GAO/NSIAD ] GAO also noted that a commercial best practice was to minimize technology development during product development and to match requirements with technological capability before product development is launched.
For more information, contact Mr. Jack Taylor or see the TRA Guidance.

26 TRA Process Overview
[Process graphic: the PM collects data, coordinates with the S&T Executive, and keeps DUSD(S&T) informed (PM responsibility); as a best practice, an independent review team appointed by the S&T Executive verifies the data; the S&T Executive appoints the independent review team (the PM funds it) and coordinates the TRA; the Acquisition Executive submits it; review is a DUSD(S&T) responsibility]

The Component S&T Executive is responsible for conducting TRAs. For programs from independent agencies such as DLA or DISA, the CIO has often been designated as the S&T Executive.
The PM also plays a big role. The PM needs to set the schedule for the TRA, starting well before MS B to allow sufficient time for review and iteration; it is recommended that the process begin at least 6 months prior to MS B for ACAT IAM programs. It is also a best practice to integrate the TRA schedule into the program’s Integrated Master Schedule. The PM funds the TRA: when the TRA requirement was established, no additional resources were allocated to the Component S&T Executives for this purpose. Costs cover convening the independent panel (at least twice physically for each TRA, once for CTE identification and once for CTE maturity assessment) and preparation of the report.
Although the PM (with the independent panel, as a best practice) is responsible for identifying the CTEs, other organizations may play a role. Since the Component S&T Executive is responsible for the TRA, that office may add CTEs. In addition, DUSD(S&T) may want something included as a CTE; it is obviously a good idea to accept such recommendations, since DUSD(S&T) reports to the DAB on this subject.
The DUSD(S&T) review of the TRA may have three outcomes: (1) the TRA is accepted; (2) there is disagreement and DUSD(S&T) conducts its own independent assessment (this has never been done); or (3) changes are requested, sometimes before the DAB and sometimes after the DAB as directed in an ADM.

