1 ABSVal Goal: Adapt accepted Best Practices of VV&A* to simulation models that…
a. Display emergent behavior
b. Are used to model military effects on population dynamics and social phenomena as well as military decision making
c. Support analyses
* maybe refine the general Best Practices
2 B.L.U.F. WHAT I THINK WE LEARNED or CONFIRMED
VV&A universally reviled.
– EXCEPTION: DMSO employees and alumni
VV&A general principles do not map well to the analysis domain.
– describing anticipated use is problematic
– analysis = exploration
ABS translates to Models exhibiting Emergent Behavior.
– pre-production experimentation focused on achieving top-down control (building predictive capability) over dynamics that initially display emergence
– scientifically examining emergence is interesting
We can help analysts and decision makers distinguish good analysis from bad analysis.
Minimally acceptable sim-to-app validation is recognizable, achievable, useful, and rare.
3 OUTLINE
1. SIMULATION MODELS
2. A LITTLE EMERGENT BEHAVIOR
3. THINKING ANALYSIS
4. EXPOSING THE MATCH
4 “In contrast to this interest in model-related technology, there has been far too little interest in the substance of the models and the validity of the lessons learned from using them. In our view, the DoD does not appreciate that in many cases the models are built on a base of sand.”
The Base of Sand Problem: A White Paper on the State of Military Combat Modeling, Paul K. Davis and Donald Blumenthal
5 WORKABLE DEFINITIONS
Conceptual Model – Description of the system in some abstracted/symbolic formalism, usually mathematics.
Verification – The simulation executable faithfully reflects the Conceptual Model.
Validation – The degree to which the system described in the conceptual model is appropriate for supporting the intended use.
Accreditation – The judgement, made by someone responsible for the outcome, that the simulation is adequately verified and valid for the intended use.
6 COMMENTS
Conceptual models are always incomplete.
Verification of a simulation is a scientific endeavor if the conceptual model is complete.
A simulation is never “Valid.”
Analytical intended uses are difficult to deal with…
– Repetition is very rare.
– Analysts have no way to scientifically express the intended use.
– Analysts often accept very poor data and models, and often express grave caveats for their results.
7 IDEALIZED DEVELOPMENT PROCESS
[Diagram: natural system → conceptual model → executable code → ideal sim, linked by formal transitions T(M) and by implementation and design]
8 IDEALIZED DEVELOPMENT PROCESS
[Diagram: natural system → conceptual model → executable code → ideal sim; transition labels: abstraction, modeling, mapping to sim design pattern, software design, coding and testing, formal transitions T(M), implementation and design]
9 IDEALIZED DEVELOPMENT PROCESS
[Diagram: natural system → conceptual model → executable code → ideal sim; transition labels: abstraction, modeling, mapping to sim design pattern, software design, coding and testing, formal transitions T(M), implementation and design]
Annotation: Driven by analytic task. More later…
10 REALITY FOR BIG-IRON SIMS
[Diagram: natural system → executable code → ideal sim; transition labels: abstraction, data development]
11 FOCUS
[Diagram: the conceptual model → executable code → ideal sim portion is marked FOR A GIVEN ANALYSIS; the natural system is marked FOR ANOTHER DAY]
12 "The more complex the model, the harder it is to distinguish unusual emergent behavior from programming bugs."
Douglas Samuelson, Renowned Operations Research Analyst
13 ABSVal PROJECT
ABSVal Framework
Test Examples
– Pythagoras COIN
– SZ/BZ Obstacle Reduction Analysis
Conclusions
14 VALIDATION
First-Principles Validation
– Assess the theory behind the conceptual model, and predict its impact on the ensuing analysis
– Examine the implementation of the theory, and predict its impact on the ensuing analysis
– Examine the combinations of theories used together
Results Validation
– Compare output data to data from another source: a historical case, another model, or intuition
15 VALIDATION
First-Principles Validation
– Assess the theory behind the conceptual model, and predict its impact on the ensuing analysis
– Examine the implementation of the theory, and predict its impact on the ensuing analysis
– Examine the combinations of theories used together
Results Validation
– Compare output data to data from another source: a historical case, another model, or intuition
VALIDATION = EXPOSITION + ASSESSMENT
The EVIDENCE on which acceptance is based.
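The results-validation bullet above (comparing output data to data from another source) can be made concrete with a small sketch. This is an illustrative Python example, not part of the original briefing: the function name, the MOE values, and the reference data are all hypothetical.

```python
# Hedged sketch of "results validation": compare simulation output to data
# from another source (a historical case or another model). All numbers
# below are hypothetical illustrations.
import statistics

def mean_gap_in_sds(sim_runs, reference):
    """How far apart are the two sample means, in pooled-SD units?

    A large gap is negative information for validation; a small gap is
    weak supporting evidence, never proof of validity.
    """
    pooled_sd = statistics.stdev(sim_runs + reference)
    return abs(statistics.mean(sim_runs) - statistics.mean(reference)) / pooled_sd

sim_runs  = [0.61, 0.58, 0.64, 0.60, 0.63]   # hypothetical MOE per replication
reference = [0.59, 0.62, 0.60, 0.61, 0.58]   # hypothetical historical values

gap = mean_gap_in_sds(sim_runs, reference)
print(f"mean gap = {gap:.2f} pooled SDs")
```

A comparison like this is evidence, not a verdict: as the slide says, validation is exposition plus assessment, and the accreditor still has to judge whether the gap matters for the intended use.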
16 [Diagram: natural system → conceptual model → executable code → ideal sim, with formal transitions T(M) and implementation and design]
Simulations displaying emergent behavior are difficult to validate because it is difficult to predict their behavior from the Conceptual Model. Therefore there is greater pressure to use results validation.
17 “All models are wrong, some are useful.”
George Box, Wartime Statistician
18 ANALYSIS
Predict the response (absolute)
Predict the response (as compared to a baseline)
Predict the functional form of the response for a set of independent variables
Predict the sign of the gradient (set of 1st derivatives)
Is there any response?
Predict the min/max of the response over a high-dimensional domain
Predict x_i in [L_i, U_i] such that response > c
Characterize the probabilistic nature of the response
Compared to the physical sciences, these are very humble goals. Might a medical/biological mindset be more appropriate?
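One of the humble goals above, predicting the sign of the gradient, can be sketched in a few lines. The stochastic response function here is a toy stand-in (not any model from the briefing), and the trick shown, common random numbers, is a standard variance-reduction technique for comparing simulation runs.

```python
# Sketch: estimate the SIGN of the gradient of a noisy simulation response
# via finite differences. The response function is a hypothetical toy.
import random

def sim_response(x, seed):
    """Hypothetical stochastic response: rising in x, with Gaussian noise."""
    rng = random.Random(seed)
    return 2.0 * x + rng.gauss(0.0, 0.5)

def gradient_sign(x, h=0.1, reps=50):
    """Finite-difference sign estimate using common random numbers:
    the same seed drives both the x+h and x-h runs, so the noise cancels."""
    diffs = [sim_response(x + h, s) - sim_response(x - h, s) for s in range(reps)]
    mean_diff = sum(diffs) / reps
    return 1 if mean_diff > 0 else -1

print(gradient_sign(1.0))   # expected +1: this toy response rises in x
```

Even this modest goal presumes the analyst can run the model repeatedly under controlled randomness, which is exactly the kind of computational experience the briefing's bottom line calls for.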
19 …more ANALYSIS
Provide the best decision support possible, and include a useful assessment of the value of the analysis vis-à-vis the questions and issues at hand.
20 IDEAL STUDY PROCESS
1. DETERMINE THE QUESTION
2. DETERMINE THE MOEs and the EEAs
3. FIND or BUILD the BEST (SIMULATION) MODEL
4. PRODUCTION RUNS
5. PRESENT (and DEFEND) RESULTS
21 SIMULATION-SUPPORTED ANALYSIS
Baseline/Excursion or Factorial Experiment
Driven to answer Analysis Questions
Key Elements of Analysis
Constraints, Limitations, and Assumptions
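The factorial experiment named above can be sketched as a run matrix: the full-factorial design is just the Cartesian product of the factor levels. The factors and levels below are invented for illustration, not drawn from the briefing.

```python
# Minimal sketch of a full-factorial experiment design.
# Factor names and levels are hypothetical.
from itertools import product

factors = {
    "sensor":  ["baseline", "upgraded"],
    "terrain": ["urban", "jungle", "alpine"],
    "threat":  ["light", "heavy"],
}

# One run specification per combination of levels.
run_matrix = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(run_matrix))   # 2 * 3 * 2 = 12 design points
```

A baseline/excursion design is the degenerate case: one baseline row plus one row per single-factor change.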
22 Schism
Agent-based simulations use modular rules and local reasoning to produce realistic and/or interesting emergent aggregate behavior.
– Surprise is good**
Successful simulation testing (core to face/results validation) is based on demonstrating credibility across the range of potential input.
– Surprise is not good**
** Refined later in this talk
23 GOAL: STOP BEING SURPRISED
[Cycle: Surprise → Explore → Explain → Accept/reject; once in control, no more surprises, proceed to Production Runs]
How do we tell about this experience?
1. “Unnatural acts” reflect negatively on a sim
2. Once we achieve top-down control, is there still emergent behavior?
24 ELEMENTS: Adaptation of the Yost Scale
SIMULATION DYNAMICS
based on accepted physical laws
based on accepted social dynamics
based on common sense
distillation
– simple model relic required to facilitate actions
– simple model relic required to maintain consistency
top-down human intervention
DATA
authoritative value
measured
witnessed
argued by logic
sensible range
guess/arbitrary
dimensionless
RELEVANT DYNAMICS + REQUIRED DATA = ELEMENT
e.g. underwater detection using observed detection range data in a cookie-cutter model
25 ELEMENTS: Adaptation of the Yost Scale
SIMULATION DYNAMICS
based on accepted physical laws
based on accepted social dynamics
based on common sense
distillation
– simple model relic required to facilitate actions
– simple model relic required to maintain consistency
top-down human intervention
DATA
authoritative value
measured
witnessed
argued by logic
sensible range
guess/arbitrary
dimensionless
RELEVANT DYNAMICS + REQUIRED DATA = ELEMENT
e.g. underwater detection using observed detection range data in a cookie-cutter model
[Annotation: CONTROLLABLE ABSTRACTION, ANALYTICALLY DESIRABLE]
26 “It’s the Data, Stupid.”
George Akst, Phalanx, DEC 07
27 Constraints, Limitations, and Assumptions Guide
TRADOC Analysis Center, 255 Sedgwick Avenue, Fort Leavenworth, KS 66027-2345
TRAC-TD-05-011 (rev. 1), January 2008
Mike Bauman
28 PARSING SOURCES OF VARIABILITY
CORE: drive the results of your experiment; align with the key elements of analysis
DYNAMIC CONTEXT: impacts the circumstances relevant to exercising the core model dynamics; creates situations, not elements of analysis
CASES: details necessary to support the model; cases to be considered to achieve analytical goals
C.L.A.: Constraints, Limitations, and Assumptions necessary to scope the analysis and interpret the results
29 IMPACT ON ANALYSIS
Agent-based design is reputed to enable fast and easy construction of dynamic context.
Dynamic Context elements can display emergent behavior to add variability.
– Emergent behavior is often not predictable/controllable.
Big-iron simulations often have parametric (knob) control over Case elements.
– impossible to promote these to Dynamic Context or Core elements
– should NOT be elements of analysis
Ideally, analysts should have the most faith in their Core elements.
– should have high-quality data (high on the Yost scale)
– should have well-studied dynamics (high on the Yost scale)
– must not display uncontrolled emergent behavior
Limitations on the Core = Limitations of the simulation for analytical purposes.
Core and Dynamic Context elements should be results-proven to be consistent with SME judgement (explainable 1st derivative).
Core elements should be results-proven to be highly influential (see Scientific Method of Choosing Model Fidelity).
30 IMPACT ON ANALYSIS
Agent-based design is reputed to enable fast and easy construction of dynamic context.
Dynamic Context elements can display emergent behavior to add variability.
– Emergent behavior is often not predictable/controllable.
Big-iron simulations often have parametric (knob) control over Case elements.
– impossible to promote these to Dynamic Context or Core elements
– should NOT be elements of analysis
Ideally, analysts should have the most faith in their Core elements.
– should have high-quality data (high on the Yost scale)
– should have well-studied dynamics (high on the Yost scale)
– must not display uncontrolled emergent behavior
Limitations on the Core = Limitations of the simulation for analytical purposes.
Core and Dynamic Context elements should be results-proven to be consistent with SME judgement (explainable 1st derivative).
Core elements should be results-proven to be highly influential (see Scientific Method of Choosing Model Fidelity).
A taxonomy for sources of variability reflecting the relationship between model dynamics and analytical goals.
** Jargon for communicating how a sim element relates to the analysis.
** Identifies the appropriate role for elements with emergent behavior in an analysis.
31 GOLDEN GATE BRIDGE: a solid connection…
[Image: simulation/data capabilities linked to analytical requirements]
32 TACOMA NARROWS BRIDGE: or, not so much.
[Image: simulation/data capabilities weakly linked to analytical requirements]
33 RECOMMENDED HANDLING
Core:            Experimental    (n Parametric Settings)
Dynamic Context: Stochastic      (1 Parametric Setting)
Cases:           Discrete Cases  (m Cases)
CLA:             Static          (1 fixed set of assumptions)
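The recommended handling above can be sketched as run-matrix construction: factorial Core settings crossed with enumerated Cases and stochastic replication seeds for Dynamic Context, with the CLA held fixed across every run. Factor names and counts below are hypothetical.

```python
# Sketch of the recommended handling: Core is experimental (factorial),
# Cases are discrete and enumerated (never averaged over), Dynamic Context
# varies stochastically via replication seeds, and CLA stays fixed.
# All names and counts are hypothetical.
from itertools import product

core_settings = [{"weapon": w, "sight": s}          # experimental: n settings
                 for w, s in product(["A", "B"], ["day", "night"])]
cases = ["urban", "jungle"]                          # discrete: m cases
replications = range(3)                              # stochastic dynamic context
cla = {"kinetic_only": True}                         # 1 fixed set of assumptions

runs = [{"core": c, "case": k, "seed": r, "cla": cla}
        for c, k, r in product(core_settings, cases, replications)]
print(len(runs))   # 4 core settings * 2 cases * 3 replications = 24
```

Keeping Cases as a separate axis makes the "don't average over these cases" rule mechanical: results are reported per case, never pooled across the case dimension.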
34 EXAMPLE
Question: What is the tactical value of LW components to a rifle squad?
Core: weapon, computer/comm/SA, sight/NVG
Dynamic Context: paths of maneuver, acquisitions and detections, paths & actions of threat, attrition, …
Cases: terrain type (urban, jungle, alpine), scale (company, platoon), mission (HVT, defend a FOB)
CLA: kinetic outcome, unambiguous threat, terrain representation
35 EXAMPLE
Question: What is the tactical value of LW components to a rifle squad?
Core: weapon, computer/comm/SA, sight/NVG
Dynamic Context: paths of maneuver, acquisitions and detections, paths & actions of threat, attrition, …
Cases: terrain type (urban, jungle, alpine), scale (company, platoon), mission (HVT, defend a FOB)
CLA: kinetic outcome, unambiguous threat, terrain representation
[Annotations: emergent behavior dynamics fit here; don’t average over these cases]
36 “Those claims to knowledge that are potentially falsifiable can then be admitted to the body of empirical science, and then further differentiated according to whether they are (so far) retained or indeed are actually falsified.”
Karl Popper, Philosopher of Science
37 NEGATIVE INFORMATION for IN-VALIDATION
Elements not data-driven
Elements not controllable
Element displays undesired emergent behavior
Element displays unexplainable 1st-order influence (results schism unexplainable)
Element not in the anticipated layer
– level of influence is more/less than anticipated by the analyst
– dynamics or data are… too low on the Yost scale, or mismatched vis-à-vis the Yost scale
38 NEGATIVE INFORMATION = IN-VALIDATION?
[Scale: no concern ↔ show stopper]
Negative information scopes the analytical value of results.
Analyst’s art: responsibly expand this scope.
“This approach uses a very unrealistic model of certain dynamics, but it creates adequate dynamic context to stimulate the core elements in a way useful to our analytic goals.”
39 “Computer programs should be verified, models should be validated, and analysts should be accredited.”
Alfred G. Brandstein, Renowned Military Operations Research Analyst, Founder of Project Albert
40 THE ANALYST
Prior to any experience with the simulation, can the Analyst…
– Pose analytic questions mathematically?
– Describe the experiment?
– Identify Core vs. Dynamic Context elements?
– Specify CLA elements?
– Evaluate Core elements on the Yost scale?
– Disclose all outcomes the analyst anticipates matching with the simulation (Test Cases)?
Once experience has been gained, can the Analyst…
– Explain changes to the anticipated Core/Dynamic Context/Case/CLA classification?
– Describe all testing and tuning required?
– Quantify the level of influence of each Core & Dynamic Context element statistically?
– Avoid integrating (averaging over) Cases?
– Explain the impact of each CLA on the results?
– Statistically determine the level of agreement of the simulation outcomes with the Test Cases?
The resulting analysis should be peer-reviewed.
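The checklist item on quantifying each element's level of influence statistically can be sketched as a main-effects calculation from a small two-level experiment. The data rows below are hypothetical, invented purely to show the arithmetic.

```python
# Sketch: quantify element influence as a main effect, i.e. the mean
# response at the +1 level minus the mean response at the -1 level.
# Rows are hypothetical (core_level, context_level, MOE) observations.
obs = [
    (-1, -1, 0.40), (-1, +1, 0.44),
    (+1, -1, 0.62), (+1, +1, 0.66),
]

def main_effect(data, idx):
    """Main effect of factor idx: mean(response | level=+1) - mean(response | level=-1)."""
    hi = [y for *levels, y in data if levels[idx] == +1]
    lo = [y for *levels, y in data if levels[idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

print(round(main_effect(obs, 0), 3))   # core main effect, roughly 0.22 (large)
print(round(main_effect(obs, 1), 3))   # context main effect, roughly 0.04 (small)
```

In this toy, the Core element dominates the response while the Dynamic Context element barely moves it, which is the pattern the briefing says a defensible analysis should be able to demonstrate, not merely assert.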
41 Bottom line
42 Have lots of computational experience with your model.
Understand and be able to control its emergent behavior.
Plan and execute experiments; document them.
Disclose the relationship between each important sim element and the analytical goal:
– Core
– Dynamic Context
– Cases
– CLA
43 QUESTIONS?