1 Modeling Architecture for Technology, Research and EXperimentation (MATREX)
Program Briefing to VV&A Technical Working Group, 6 Aug 2004
Ms. Phil Zimmerman (with lots of help from Ken Steiner and John Tufarolo)
Speaker notes: Good morning. I'm CPT Steve Nerenberg, Deputy STO Manager of the MATREX STO. For those of you who came to see The Matrix movie, I'm sorry to disappoint you. The purpose of this morning's briefing is to inform the RDA domain community about the MATREX program.

2 Agenda
- Current: STO Overview
- Current: Problem Domain
- VV&A: useful (or more problematic?)
- Aim: Architectural Approach
- Future: Enterprise Architecture
- Current: Using other good ideas…
- The challenges within the federation because of MATREX usage

3 MATREX STO – Support System of Systems Issues
Purpose:
- Design a persistent, distributed simulation capability leveraging Government and industry assets
- Provide a persistent, distributed simulation environment for evaluation of Future Combat Systems and Future Force concepts
Product:
- Realistic engineering representation of systems and environments for SMART
- Reusable environment where sub-system models can be integrated into an established architecture for analysis and technology trade-offs
- Component repository to support "building block" implementations
- Code generation to bind components automatically
- Meta-model to ensure appropriate model interactions
- Provide RDECOM expertise, M&S tools, and solutions to the LSI, PEOs, and PMs
- Primary RDECOM M&S base for FCS increment 2
Payoff:
- Realistic representation of components
- Common reference implementation for component and server integration
- Accredited facilities for distributed applications
- Multi-resolution for enhanced large-scale exercises
- Reduced time and cost for development and evaluation of Future Force concepts and products
- Assist PMs in pursuing feasible system options and approaches
[Diagram labels: Joint Virtual Battlespace (JVB); Virtual Distributed Laboratories for Modeling & Simulation; developing a flexible architecture to support modeling and simulation (M&S) for SMART; developing NEBC Systems of Systems M&S capability; M&S system integration expertise; engineering component models; domain expertise (SMEs); best-of-breed models; developing a distributed network infrastructure to support the Distributed Virtual Laboratory capability]

4 MATREX Payoffs
The RDECOM M&S framework for Systems of Systems RD&E:
- The basis of the FCS LSI Simulation Virtual Framework (SVF)
- A critical component to enable "concept to product" for the warfighter
- Enables critical collaboration between Requirements, S&T, and T&E
- Formal collaboration with TRADOC Futures Center and ATEC
[Diagram labels: WARFIGHTER CONCEPT; M&S Tool Reuse; Feedback; BLCSE; VPG; MATREX; S&T; ACDEP; TEST; TRADOC Futures Center; RDECOM; ATEC]
From the DTO: "This M&S toolset and architecture, with improved representations of C4ISR, weapons effects, force effectiveness, etc., will support multi-resolution experiments in support of the evaluation of system-of-systems concepts and developments (e.g., Army Transformation). The simulation environment will support engineering trade studies on the impact of information, information systems (sensors, communications, decision aids), and new tactics, techniques, and procedures (TTPs). It will also enhance the design, development, test and evaluation, training, and analysis of network-centric warfare systems and concepts. In addition, this M&S toolset will: reduce timelines and cost for fielding of systems through the implementation of SMART; enable the warfighter to experiment with current and future systems, systems of systems, and doctrine in a physics-based common synthetic environment; and provide an improved understanding of complex problems and systems for acquisition decisions that accelerate fielding of equipment."

5 MATREX Domain
- Our focus is system design trade-offs and how technology can be leveraged for force transformation
- We operate within the context of scenarios such as a TRADOC high-resolution scenario and approved system books
- Our environment provides engineering-level models within a Unit of Action
- We federate with Joint and Unit of Employment level simulations to provide larger context
- We leverage TRADOC for scenarios (both high and low resolution) and TTPs
- We leverage ATEC: DT component testing (we support with an interface); OT system-of-systems testing (we support with operational context interfaces); AEC
- We leverage OneSAF: OTB, exposing interfaces to plug in engineering-level models (uses the OTB unit behaviors); OOS, coordinating to align architecture and leverage tool sets
Speaker notes: How do we do this? Well, we don't do it alone. There is a lot of expertise out there. We work with other agencies (TRADOC, ATEC) and leverage other programs, e.g., OneSAF. For example, we plug engineering-level models into OTB, and we are coordinating with Objective OneSAF to align our architecture in order to leverage the OOS tool set, so we don't have to reinvent the wheel.

6 Mission: Support System of Systems Issues
Provide the capability for flexible, traceable, and measurable System of Systems simulations for Army and Joint RD&E:
- Unit of Action in the context of Unit of Employment and Joint systems and C2 nodes
- Consistent representation of natural and manmade environmental factors
- System of Systems command and control with communication, network, and HPM effects
- Dismounted infantry and behaviors
- Variable resolution and fidelity
- Platform representation (comms, sensors, mobility, weapons, etc.)
From the DTO: "A framework will be built for integrating into a common simulation environment Army/Joint simulations of various fidelity, both legacy and new simulations, with dynamic C2 and data flows that span the full battlefield spectrum from JTF to entity level." (Emphasize Joint.)

7 MATREX Architectural Approach
Current approach: each simulation maintains its own data, weather, terrain, and algorithms (Simulation 1 with Data 1, Weather 1, Terrain 1, Algorithms 1; likewise Simulations 2 and 3). MATREX approach: a single point of execution with run-time monitoring and DCA, where MLRS, M1A2, and Comanche models all use shared core services (data, weather, terrain, algorithms), so all simulations use consistent data and algorithms. Common functional components; flexible, composable, reusable; cost-effective analysis.
Speaker notes: Under the current approach, each model maintains its own data, terrain, etc. That is not a problem when you are using the simulation by itself, but when you use them together there is lots of potential for error from mismatches in data representation, e.g., a line-of-sight calculation: a tank in one simulation has trees, the tank in another doesn't, so one tank can see and shoot the other. Unfair playing field. Additionally, overhead cost is high: lots of time and resources to integrate different models to play together for one use, and many code changes are needed for the next event. The MATREX approach is a flexible environment with consistent data and algorithm representation: the architecture allows you to introduce different models while maintaining consistent data and algorithms. Pre-coordinate to ensure everything is as consistent as possible; post-coordinate to interpret inconsistencies between models.
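To make the contrast concrete, here is a minimal hypothetical sketch (all classes and data are invented for illustration; this is not MATREX code) of the failure mode the notes describe: two simulations carrying their own terrain copies disagree on line of sight, while routing the query through one shared service keeps every federate consistent.

```python
# Hypothetical illustration of per-simulation data vs. a shared core service.
# All names and data here are invented for this sketch, not MATREX APIs.

class TerrainService:
    """A single, shared source of terrain truth (the 'core services' idea)."""
    def __init__(self, trees):
        self.trees = set(trees)          # grid cells that block line of sight

    def line_of_sight(self, cell_a, cell_b):
        # Trivial stand-in for a real LOS algorithm: blocked if any tree
        # cell lies on the straight path between the two cells.
        path = range(min(cell_a, cell_b) + 1, max(cell_a, cell_b))
        return not any(cell in self.trees for cell in path)

# Current approach: each simulation keeps its own copy of the terrain.
sim1_trees = {3}        # Simulation 1's terrain database has a tree at cell 3
sim2_trees = set()      # Simulation 2's database never ingested the trees

sim1_los = TerrainService(sim1_trees).line_of_sight(0, 5)   # blocked by tree
sim2_los = TerrainService(sim2_trees).line_of_sight(0, 5)   # sees clear path
print(sim1_los, sim2_los)   # False True -> one tank can shoot, the other cannot

# MATREX approach: both simulations query the same core service,
# so the answer is consistent no matter which federate asks.
shared = TerrainService({3})
print(shared.line_of_sight(0, 5), shared.line_of_sight(5, 0))   # False False
```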

8 MATREX Enterprise Architecture (To Be)
[Diagram: the MATREX Exercise Meta-Model and MATREX Object / Component Meta-Model feed composition tools, the MATREX Exercise Description, and MATREX Domain Model Descriptions; user-defined algorithms/behaviors sit in a container support framework over an adaptation layer and collaboration infrastructure (RTI, TENA MW, J2EE); external connections include JOINT and others, TRADOC BLCSE, OneSAF scenario and composition tools, and ATEC SEIT/VPG; an object/component code generator produces deployment and execution scripts and configuration files. Color legend: yellow = standard definitions or tools; blue = user-defined definitions or code.]
From the DTO: "This framework will have a standardized process for integrating physics-based models provided by the RDECOM laboratories and other organizations as appropriate."
Object Meta-Model:
- Describes the basic object building blocks that are used to develop components and applications (attributes, methods, inheritance, containment)
- An example would be a TSPI (time-space position information) object that encapsulates entity-state information and provides conversion utilities
Component Meta-Model:
- Describes reusable elements (coarser-grained than objects) that have configurability and deployment characteristics
Execution Meta-Model:
- Describes the physical allocation of components to processes and computing resources
- Describes scenario initialization information and operating performance requirements
Development phase:
- Define objects/components using a standard definition template (e.g., XML, XSLT)
- Generate container "glue" code to assist user development and/or integration
- Generate adaptor code to support the appropriate infrastructure (e.g., HLA, TENA, J2EE) and supervisor functions
Deployment phase:
- Capture exercise characteristics using standards (e.g., XML, XSLT)
- Generate configuration files and deployment and execution scripts
- Generate configuration files and/or code to specialize infrastructure runtime aspects
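As an illustration of the definition-template-to-code-generator flow, here is a minimal hypothetical sketch that parses an XML component definition and emits a skeleton of container "glue" code. The element names and generated class shape are invented for this example; the actual MATREX template schema is not shown in this briefing.

```python
# Minimal sketch of template-driven code generation, assuming a made-up
# component-definition schema; the real MATREX templates are not shown here.
import xml.etree.ElementTree as ET

COMPONENT_XML = """
<component name="TspiPublisher" infrastructure="HLA">
  <attribute name="position" type="WorldLocation"/>
  <attribute name="velocity" type="VelocityVector"/>
</component>
"""

def generate_glue(xml_text):
    """Turn a component definition into skeleton container code."""
    comp = ET.fromstring(xml_text)
    name = comp.get("name")
    attrs = [(a.get("name"), a.get("type")) for a in comp.findall("attribute")]
    lines = [f"class {name}Container:  # generated 'glue' code"]
    lines.append("    def __init__(self, adaptor):")
    lines.append(f"        self.adaptor = adaptor  # e.g., {comp.get('infrastructure')} adaptor")
    for attr_name, attr_type in attrs:
        lines.append(f"        self.{attr_name} = None  # type: {attr_type}")
    lines.append("    def publish(self):")
    lines.append("        self.adaptor.send({a: getattr(self, a) for a in %r})"
                 % [a for a, _ in attrs])
    return "\n".join(lines)

print(generate_glue(COMPONENT_XML))
```

The same definition could equally drive an adaptor generator for a different infrastructure (TENA, J2EE), which is the point of keeping the component description separate from any one runtime.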

9 Process Definition to Simulation Technology
[Diagram: from Army Doctrine through OV-6c Mission Threads and OV-6b State Diagrams, via Architecture Decomposition, to the Simulation Architecture Definition; labeled as the technical challenge.]

10 A Standard Format for Simulation Scenarios
A common, controlled, extensible, standardized format that completely describes a military scenario for simulation. All users of the MSDL format should cooperate with each other to further the standard format and share lessons learned.
[Diagram: scenario definition tools/processes (SSDE, MSDE, CSAT (MATREX), legacy scenario tools) produce a Simulation Independent Definition (SID) as <MSDL/>; scenario transformation tools/processes (including a transformer to MSDL and CERDEC-M System Engineering (TCAT)) produce Simulation Dependent Definitions (SSD): OOS simulation-specific <SSDL/>, MATREX simulation-specific <Remote Creates/>, and MATREX simulation-specific config files.]
The simulation architecture definition covers:
- Federate/platform mapping
- Federate/platform behavior mapping
- Routing number / platform mapping
- C3 node generation algorithms
- Naming scheme
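The transformation step can be pictured with a small hypothetical sketch: a simulation-independent, MSDL-like unit entry is mapped to a simulation-specific config line. The element and field names below are invented for illustration and are far simpler than the real MSDL schema.

```python
# Hypothetical SID -> SSD transformation; the XML schema here is invented
# for illustration and is much simpler than the actual MSDL standard.
import xml.etree.ElementTree as ET

SCENARIO = """
<scenario>
  <unit id="A-1" type="M1A2" lat="34.12" lon="-118.32"/>
  <unit id="B-7" type="Comanche" lat="34.20" lon="-118.40"/>
</scenario>
"""

# Simulation-dependent mapping: scenario unit types -> one simulation's
# internal platform identifiers (part of the federate/platform mapping).
PLATFORM_MAP = {"M1A2": "PLT_TANK_M1A2", "Comanche": "PLT_ROTARY_RAH66"}

def to_sim_config(xml_text):
    """Emit simulation-specific 'remote create' style config lines."""
    lines = []
    for unit in ET.fromstring(xml_text).findall("unit"):
        platform = PLATFORM_MAP[unit.get("type")]
        lines.append(f"CREATE {unit.get('id')} {platform} "
                     f"{unit.get('lat')} {unit.get('lon')}")
    return "\n".join(lines)

print(to_sim_config(SCENARIO))
# CREATE A-1 PLT_TANK_M1A2 34.12 -118.32
# CREATE B-7 PLT_ROTARY_RAH66 34.20 -118.40
```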

11 Validation Methods (DA PAM 5-11)
Conceptual model/structural validation:
- Peer/independent review
Output validation:
- Face validation
- Comparison to other M&S
- Stress test and sensitivity analysis
- Animation, graphics playback, visualization
- Turing tests (real or model results?)
- Model-test-model (historical use)
Both:
- Functional decomposition (SME review)
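As one concrete instance of the stress-test and sensitivity-analysis method, here is a hedged sketch: sweep an input parameter of a toy model (the model and its parameters are invented, not drawn from any MATREX federate) and confirm the output moves in the expected direction.

```python
# Illustrative sensitivity analysis on a toy model; the model and its
# parameters are invented here, not taken from any MATREX federate.

def detection_probability(sensor_range_km, target_distance_km):
    """Toy stand-in for a sensor model under validation."""
    if target_distance_km >= sensor_range_km:
        return 0.0
    return 1.0 - target_distance_km / sensor_range_km

# Sweep the sensor range and confirm the output responds plausibly
# (detection probability should not decrease as range grows).
distances = [2.0, 5.0, 8.0]
for rng in [4.0, 6.0, 8.0, 10.0]:
    probs = [round(detection_probability(rng, d), 2) for d in distances]
    print(f"range={rng:>4} km -> P(detect) at {distances} km: {probs}")
```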

12 MATREX VV&A Challenges
- Federation agreements
- FOM enumerations
- FOM encoding
- Use variations

13 Challenge: Informal mechanism for capturing Federation Agreements
- Data exchange via HLA is defined via a Federation Object Model (FOM)
- Precise in data definitions and types
- Lacks semantic information describing when data will be sent and received, and by what components
- Auxiliary documents are used to capture this additional information ("federation agreements" and "Functional Description Documents" (FDDs) are used in MATREX)
- These documents are created in many forms in various federations, but all serve the same basic purpose
- This information is a critical part of describing and conveying how federate interfaces are intended to work
- The informal means of capturing this data allows for incomplete, ambiguous, or non-existent interface definitions
Speaker notes: Data exchanged over the RTI is defined in the Federation Object Model (FOM). The FOM includes precise data definitions and types, but does not include all of the semantic information describing when data will be sent and received. This additional information may be captured in federation agreement documents. These documents are instantiated in many different forms in other HLA federations, but all serve the same basic purpose. The downside to capturing this information in document format is that it virtually eliminates any chance of automated testing of the interfaces described in the documents. For example, a federation's FOM may define a specific interaction called DETONATION with specific parameters and data types; however, the FOM does not prescribe what circumstances would trigger a federate to generate this interaction, or what a receiving federate should do upon receipt of it. This interface description of FOM object attribute exchanges and/or interaction exchanges is a critical part of describing and conveying how federate interfaces are intended to work together. If a means to capture these interface descriptions and FOM data in a testing tool is developed and evolved, then an automated means to test federates and their use of the interfaces is possible.
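A hedged sketch of the idea in that last sentence: if an agreement such as "a weapon-fire interaction must be followed by a detonation" is captured in machine-readable form rather than prose, a test harness can check a federate's traffic automatically. The rule schema and event-log format below are invented for illustration; only the detonation example itself comes from the briefing.

```python
# Hypothetical machine-readable federation agreement and checker.
# The rule schema and the event-log format are invented for this sketch.

AGREEMENTS = [
    # (trigger interaction, required follow-up, max allowed gap in seconds)
    ("WeaponFire", "Detonation", 30.0),
]

def check_log(events, agreements):
    """events: list of (timestamp, interaction_name) observed on the RTI."""
    violations = []
    for trigger, follow_up, window in agreements:
        for t, name in events:
            if name != trigger:
                continue
            seen = any(n == follow_up and t < t2 <= t + window
                       for t2, n in events)
            if not seen:
                violations.append(
                    f"{trigger} at t={t}: no {follow_up} within {window}s")
    return violations

log = [(0.0, "WeaponFire"), (2.5, "Detonation"), (40.0, "WeaponFire")]
print(check_log(log, AGREEMENTS))
# ['WeaponFire at t=40.0: no Detonation within 30.0s']
```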

14 Challenge: FOM Enumeration values
- The FOM includes many enumerated-list data types (e.g., platform types, munitions types)
- Different systems use these enumerations differently: the sending model takes internal values and aligns (maps) them to a FOM value; the receiving model does the inverse
- This mapping typically lives in code, data files, or combinations thereof
- Issues include missing values; duplicate values; mis-aligned values; illegal values; values that are not represented (implemented) in the actual model; etc.
- Many, many challenges exist in this area
- Enumerations have one or more relationships to internal model specifics
- Changing enumeration mapping data can be as invasive as changing code: a "validated" model that is subject to updated mapping files can quickly become inoperable
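A minimal sketch of what an automated audit of such a mapping might look for, assuming invented FOM values and internal model values (nothing here is taken from the actual MATREX FOM):

```python
# Illustrative enumeration-mapping audit; the FOM enumerants and internal
# model values are invented, not taken from the MATREX FOM.

FOM_MUNITIONS = {1: "HE", 2: "HEAT", 3: "APFSDS"}          # FOM enumerants
INTERNAL_TO_FOM = {"he_round": 1, "heat_round": 2,
                   "sabot": 3, "flechette": 9}              # model's mapping

def audit_mapping(internal_to_fom, fom_values, implemented):
    """Flag the classic mapping failures named on the slide."""
    problems = []
    seen = {}
    for internal, fom in internal_to_fom.items():
        if fom not in fom_values:
            problems.append(f"illegal FOM value {fom} for '{internal}'")
        if fom in seen:
            problems.append(f"duplicate: '{internal}' and '{seen[fom]}' "
                            f"both map to {fom}")
        seen[fom] = internal
    for fom in fom_values:
        if fom not in internal_to_fom.values():
            problems.append(f"missing: FOM value {fom} has no internal mapping")
    for internal in internal_to_fom:
        if internal not in implemented:
            problems.append(f"'{internal}' is mapped but not implemented "
                            f"in the actual model")
    return problems

print(audit_mapping(INTERNAL_TO_FOM, FOM_MUNITIONS,
                    implemented={"he_round", "heat_round", "sabot"}))
# flags the illegal value 9 and the unimplemented 'flechette' entry
```

Running such an audit whenever a mapping file changes is one way to keep an updated mapping from silently invalidating a previously validated model.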

15 Challenge: FOM Data encoding
- Another federation-agreements item
- For example, XDR is a typical, popular choice
- It is insufficient to simply state "XDR encoding"; ambiguity remains in a few areas: optional attributes, "unsetting" attributes, and others
- Common mis-encoding can go undetected: two cooperative components that both encode "incorrectly" in the same manner can, in fact, pass testing; when a third party is added that encodes correctly, the three may not work together, and the impression is that the new addition is the cause of the problem
- This issue can grow rapidly as more and more components are added
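A hedged sketch of the optional-attribute ambiguity: XDR itself fixes big-endian, 4-byte-aligned encodings for primitives, but two reasonable conventions for an optional field (a presence flag vs. a sentinel value) produce different bytes on the wire. The two conventions shown are invented examples of the ambiguity, not actual MATREX agreements.

```python
# Two plausible, incompatible conventions for an 'optional' XDR attribute.
# XDR primitives are big-endian and 4-byte aligned ('>' plus 4-byte types
# in struct); the optional-field conventions themselves are invented here.
import struct

def encode_with_flag(value):
    """Convention A: boolean 'present' flag, then the value if present."""
    if value is None:
        return struct.pack(">i", 0)               # absent
    return struct.pack(">ii", 1, value)           # present + payload

def encode_with_sentinel(value):
    """Convention B: always send the value; -1 means 'unset'."""
    return struct.pack(">i", -1 if value is None else value)

print(encode_with_flag(42).hex())       # '000000010000002a'
print(encode_with_sentinel(42).hex())   # '0000002a'
print(encode_with_flag(None).hex())     # '00000000'
print(encode_with_sentinel(None).hex()) # 'ffffffff'
# Two federates that both use Convention B interoperate fine and pass a
# pairwise test; a third federate using Convention A cannot parse their
# data, and looks like the culprit even though the agreement was ambiguous.
```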

16 Challenge: Differences in Composition and Scale
- The test environment comprises a specific collection of components, often a representative subset of the intended fielded system
- Practicality and cost limit the size and scale of testing systems
- Operation of the same system at a fielded location often involves different compositions and different equipment than the tested system
- These differences can lead to a system that works correctly at test but fails to operate properly in the final fielded configuration
- Robust evaluation of "fair fight" issues is needed, including effects of network disconnectivity and variable transmission rates, for various configurations of live, virtual, and constructive hardware and software components linked together via wide-area network

17 Challenge: Differences in data and personnel
- Many data-driven elements cause the tested system to operate differently than the fielded system
- Data sets used for testing are rarely (for a variety of reasons) the same data used in applications of the system
- Many parts of the tested system incorporate human inputs
- These data differences can drastically change the output of the piece parts
- Test personnel become ultra-familiar with the system and operate it in one particular manner
- Needed: development of component-level V&V test plans, including the level of testing fidelity and validation of human performance models, and rolling those up into a V&V plan for the entire federation…

18 Questions
Maj. Don Carter: Don.carlo.carter@us.army.mil
Ken Steiner:
John Tufarolo:
Many issues noted herein are taken from both personal experience and from the following reference paper: Tufarolo, J., Canova, B., and Page, E. A Case Study of Verification, Validation, and Accreditation for Advanced Distributed Simulation. ACM Transactions on Modeling and Computer Simulation, Vol. 7, No. 3, July 1997.

19 Backups

20 DA PAM 5-11 – Method Overview
[Flowchart based on DA PAM 5-11: M&S requirements drive the M&S approach (use available as-is, modify existing, or develop new) and the corresponding VV&A requirements; planning and assessment planning feed development (conceptual model, design, M&S), V&V (with "fix it", "replace it", and "change plans" loops), accreditation, and finally use of the M&S.]

