Industrial Methods for Effective Development and Test of Defense Systems
Ernest Seglie, Science Advisor, Operational Test & Evaluation
Ernest.Seglie@osd.mil
The Situation

Previous*
- Require an operational perspective in DT to identify failure modes early
- Require technologies to have demonstrated maturity before becoming part of a program
- Combine data across development to improve design
- Improve expertise
- Archive data to support

Current**
- Field new capability rapidly
- Engage early in requirements determination and test planning
- Integrate all testing using design of experiments to facilitate
- Substantially improve suitability; assure reliability growth to meet requirements (a hedged numerical sketch of growth tracking follows this list)

* NRC Report, 2006, Testing of Defense Systems in an Evolutionary Environment
** DOT&E Annual Report, December 2009
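For illustration only: the point above about assuring reliability growth can be made concrete with a simple tracking calculation. The sketch below fits a Crow-AMSAA (power-law) reliability growth model to failure times from a hypothetical test program and compares the demonstrated MTBF against a hypothetical requirement; the model choice, failure times, and threshold are assumptions for illustration, not material from the briefing.

```python
# Minimal sketch: tracking reliability growth with a Crow-AMSAA (power-law NHPP)
# model. The failure times and MTBF requirement below are hypothetical.
import math

def crow_amsaa_fit(failure_times, test_end):
    """MLE of the power-law intensity rho(t) = lam * beta * t**(beta - 1)
    for time-truncated testing: n failures observed by time test_end."""
    n = len(failure_times)
    beta = n / sum(math.log(test_end / t) for t in failure_times)
    lam = n / test_end ** beta
    return lam, beta

def current_mtbf(lam, beta, test_end):
    """Instantaneous (demonstrated) MTBF at the end of test."""
    return 1.0 / (lam * beta * test_end ** (beta - 1))

if __name__ == "__main__":
    # Hypothetical cumulative operating hours at which failures occurred.
    failures = [35, 110, 240, 490, 900, 1400, 2200, 3300]
    total_hours = 4000.0
    required_mtbf = 600.0  # hypothetical requirement

    lam, beta = crow_amsaa_fit(failures, total_hours)
    mtbf = current_mtbf(lam, beta, total_hours)

    # beta < 1 indicates a decreasing failure intensity, i.e. reliability growth.
    print(f"beta = {beta:.2f} (growth if < 1), current MTBF = {mtbf:.0f} h")
    print("meets requirement" if mtbf >= required_mtbf else "growth program must continue")
```

A fitted beta below one indicates that the failure intensity is decreasing, i.e. reliability is growing; whether the demonstrated MTBF meets the stated requirement is the question a growth-tracking review would examine.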
Three Aspects of the Problem
- Determination of technological maturity
- Early identification of failure modes
- Integration of test data, field performance, and modeling and simulation
Technological Maturity
- Congress has identified technological immaturity as a cause of delays in system development, cost increases, and reduced system performance when fielded.
- The suitability and effectiveness of individual components of defense systems are often assessed against given requirements and specifications.
- Assessing technological maturity in isolation is difficult in itself; moreover, components that are reliable in isolation may still pose interoperability difficulties with the rest of the system.
- How much testing should be devoted to components as isolated systems, and how much to components functioning within the parent system?
- Evaluate components efficiently to determine whether or not they are sufficiently mature for use. This will include a discussion of how the assessment of maturity of software components differs from that of hardware components.
Finding Failure Modes
- The earlier a failure mode is identified, the less expensive the design changes needed to correct it.
- How can failure modes be identified earlier in system development, and are such techniques system dependent, or can generally applicable principles and practices be discovered?
- To what extent is operational realism important in testing schemes: what realism needs to be present, what can be safely simulated, and what can be safely ignored?
- What is meant by the envelope of operational performance for a system, and how far beyond that envelope should one test to discover design flaws and system limitations?
Integrated Testing, Combining Information
- Can testing be planned more effectively to enable combining test data and thereby facilitate early and efficient learning? What statistical tools are needed?
- How can the best use be made of data from component-level testing and from modeling and simulation?
- Field performance data can be extremely useful for identifying design deficiencies in systems developed in stages.
- Relevant data sources include field use and data collection; system performance in DT and OT; modeling and simulation; and results from these sources at earlier stages of development, for closely related systems, or for systems with identical components.
- These disparate sources reflect the operation of a system in very different contexts, and in some cases different systems, so developing statistical models for combining the data is a challenge (a hedged sketch of one way to pool such data follows this list).
- Field performance data could also feed back into a review of the results from modeling and simulation and from developmental and operational testing. How should this feedback mechanism operate to improve modeling and simulation and developmental and operational test and evaluation?
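One common way to combine test data from disparate sources, in the spirit of the questions above, is to down-weight data collected in less representative contexts before pooling. The sketch below uses a conjugate beta-binomial model with discount weights (a simple power-prior-style adjustment); the source names, counts, and weights are hypothetical illustrations, not data or methods prescribed by DOT&E.

```python
# Minimal sketch: combining binomial success/failure data from different test
# sources with a discounted (power) prior in a conjugate beta-binomial model.
# The counts and discount weights below are hypothetical; real data would
# require checking that the sources measure comparable conditions.

def discounted_beta_posterior(sources, a0=1.0, b0=1.0):
    """Each source is (successes, trials, weight); weight in [0, 1] down-weights
    data collected in a context different from operational use."""
    a, b = a0, b0
    for successes, trials, weight in sources:
        a += weight * successes
        b += weight * (trials - successes)
    return a, b

if __name__ == "__main__":
    sources = [
        (45, 50, 0.3),    # developmental testing, heavily discounted
        (180, 200, 0.2),  # modeling and simulation runs, heavily discounted
        (27, 30, 1.0),    # operational testing, taken at face value
        (60, 70, 0.5),    # field data from a closely related system
    ]
    a, b = discounted_beta_posterior(sources)
    mean = a / (a + b)
    print(f"posterior Beta({a:.1f}, {b:.1f}); mean reliability = {mean:.3f}")
```

The discount weights encode the judgment that an M&S run or a DT trial carries less evidential weight than an operational trial; choosing and defending those weights is precisely the statistical modeling challenge noted above.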
Approach
Assemble a panel of industry and defense systems experts to collectively examine:
- Best practices
- DoD challenges to implementing those best practices
- Ways to address the challenges