Validation Methodology for Agent-Based Simulations Workshop
Perspectives on Agent-Based Simulation and VV&A
Dr. Bob Sheldon
Joint and External Analysis Branch, Operations Analysis Division
Marine Corps Combat Development Command
01 May 2007
Overview
- VV&A and Agent-Based Simulation (ABS) thoughts from Dr. George Akst, Senior Analyst, Marine Corps Combat Development Command (MCCDC)
- MORS historical perspectives on VV&A and ABS
- Personal reflections
Perspectives from Dr. Akst
It’s the data, stupid!
- How do you come up with data for parameter Z = x.x %? Especially a problem for Irregular Warfare (IW).
- Sometimes, model developers who are structuring algorithms don’t worry about data and assume the data can be developed after the fact.
- Dr. Kirk Yost’s data triage: consider data sources when building models
  - Generally accepted (produced regularly by some believable source)
  - Semi-valid (reasonable information derived from various sources)
  - Judgment and knobs
- If you start with meaningless data and execute a design of experiments with 2^10 runs (just because you can), then you will get 2^10 useless results (see the run-count sketch after this list).
- To be useful, ABS need to provide more than just simplistic insights; ABS should go beyond being an automated tool that regurgitates SME intuition.
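The run-count arithmetic above is worth making concrete. A minimal Python sketch, assuming ten hypothetical two-level factors, shows how quickly a full-factorial design reaches 2^10 = 1,024 runs; none of those outputs is any better than the data fed in:

```python
# Minimal sketch (hypothetical factors): a full-factorial design over 10
# binary factors yields 2**10 = 1,024 runs. If the input data are
# meaningless, each of those 1,024 outputs is equally meaningless.
from itertools import product

NUM_FACTORS = 10                    # hypothetical two-level factors
design = list(product([0, 1], repeat=NUM_FACTORS))
print(len(design))                  # 1024 design points
```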
More Dilbert Data
Perspectives from Dr. Akst
- Two ends of the spectrum:
  - Engineering-level model: should very closely predict how the system would operate in the real world
  - Campaign-level model: measures the relative differences that changes to forces, tactics, or equipment have on the outcome
- Trying to literally match a combat model’s results with some other set of results (real world, experiment, or another model) is not realistic.
- What validation is:
  - Failure to invalidate after concerted effort
  - Ascertaining that results are “plausible”: no obvious logic flaws, and results are “reasonable” and “relatively consistent” with past modeling results (a sketch of such a consistency check follows below)
- From “Musings on Verification, Validation, and Accreditation (VV&A) of Analytical Combat Simulations,” Phalanx, September 2006
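As one way to make the “relatively consistent with past modeling results” criterion concrete, here is a minimal sketch of an output-validation check. The tolerance, the baseline/excursion values, and the past-results figure are all hypothetical, not drawn from the Phalanx article:

```python
# Minimal sketch of a "relative consistency" check: compare the relative
# change (excursion vs. baseline) produced by a campaign-level model against
# the relative change seen in past modeling results. Numbers are hypothetical.
def relative_change(baseline: float, excursion: float) -> float:
    return (excursion - baseline) / baseline

def relatively_consistent(model: float, past: float, tol: float = 0.25) -> bool:
    """Plausible if the model's relative change is within tol of past results."""
    return abs(model - past) <= tol

model_delta = relative_change(baseline=100.0, excursion=85.0)  # -15% attrition
past_delta = -0.10                                             # past studies: about -10%
print(relatively_consistent(model_delta, past_delta))          # True under tol=0.25
```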
MORS Meetings on VV&A
- Simulation Validation (SIMVAL), October 1990
- SIMVAL II, April 1992
- SIMVAL '94, September 1994
- Simulation Validation tutorial, MORSS & ALMC, 1995 (Pete Knepell)
- SIMVAL '99: Making VV&A Effective and Affordable, January 1999
Evolving Validation Topics in MORS
- Descriptive validity, structural validity, predictive validity
- Structural validation, output validation
- Conceptual model validation, data validation, and output validation
MORS Meetings on ABS
- New Techniques: A Better Understanding of their Application to Analysis, November 2002 (included a 1-day tutorial on agent-based models)
- Agent-Based Models and Other Analytic Tools in Support of Stability Operations, October 2005
- Plus substantial coverage in MORSS working groups, e.g., WG 31 (Computing Advances in Military OR) and WG 32 (Social Science Methods)
Personal Reflections
- How to validate (or invalidate) counter-intuitive results (e.g., surprise)
  - Clay Thomas: “Analysis either verifies your intuition or educates your intuition.”
- Simple visualization helps validation
  - Gantt chart example for sortie generation (a sketch follows after this list)
  - Provide visualization that SMEs understand
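A minimal sketch of the kind of sortie-generation Gantt chart mentioned above, using matplotlib's broken_barh; the tail numbers and sortie windows are hypothetical:

```python
# Minimal sketch of a sortie-generation Gantt chart (hypothetical data):
# one horizontal lane per aircraft tail, one segment per sortie.
import matplotlib.pyplot as plt

# (start_hour, duration_hours) per sortie; values are illustrative only
sorties = {
    "Tail 01": [(0, 2), (5, 3), (10, 2)],
    "Tail 02": [(1, 2), (6, 2)],
    "Tail 03": [(2, 3), (8, 3)],
}

fig, ax = plt.subplots()
for row, (tail, spans) in enumerate(sorties.items()):
    ax.broken_barh(spans, (row * 10, 8))   # one lane per aircraft
ax.set_yticks([row * 10 + 4 for row in range(len(sorties))])
ax.set_yticklabels(list(sorties.keys()))
ax.set_xlabel("Hours into the flying day")
ax.set_title("Sortie generation timeline")
plt.show()
```

A chart like this lets an SME eyeball turnaround times and gaps directly, which is often how implausible sortie rates get caught.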
Personal Reflections (Cont’d)
- Ready access to source code helps
  - Example: the effect of a (0,1) parameter (a sweep sketch follows below)
- Good mathematical documentation is a plus
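One way ready source-code access pays off is that you can see exactly where a (0,1)-bounded tuning parameter enters the model and then sweep it. This is a minimal sketch with a hypothetical stand-in model, run_model, not any particular simulation's code:

```python
# Minimal sketch of a sensitivity sweep over a (0,1)-bounded parameter of a
# hypothetical model function; the functional form is illustrative only.
import numpy as np

def run_model(p: float) -> float:
    """Stand-in for a simulation whose source uses a (0,1) tuning parameter."""
    return 1.0 / (1.0 + np.exp(-10 * (p - 0.5)))   # hypothetical response

for p in np.linspace(0.05, 0.95, 10):
    print(f"p = {p:0.2f} -> output = {run_model(p):0.3f}")
```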
Personal Reflections (Cont’d)
Comparing counter-intuitive results to “intuitive” results: a case study
- At a Project Albert workshop, the agent-based model Socrates gave counter-intuitive results.
- Simulated attrition results varied over 3 phases with 2 breakpoints.
- When I fit a Lanchester linear model to the results, the regions where the fit was “bad” corresponded to the counter-intuitive results (see the fitting sketch after this list).
- Drill-down investigation explained these anomalies: the mysterious results were due to scenario data and tuning parameters.
- “Comparing the Results of a Nonlinear Agent-Based Model to Lanchester’s Linear Model,” Maneuver Warfare Science 2002
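A minimal sketch of fitting Lanchester's linear law (losses per step proportional to the product of the two force levels) to attrition output and flagging regions of poor fit; the Red/Blue force histories below are hypothetical, not the Socrates data:

```python
# Minimal sketch of fitting Lanchester's linear law to time-series attrition
# from an agent-based run. Linear law: dR/dt = -a * R * B, so per-step
# losses should satisfy loss_t ~ a * R_t * B_t; steps with large residuals
# flag phases where the agent-based results depart from the Lanchester fit.
import numpy as np

# Hypothetical Red/Blue force levels over time from an ABS run
R = np.array([100, 92, 85, 70, 52, 48, 45, 40, 36, 33], dtype=float)
B = np.array([ 80, 76, 72, 66, 60, 58, 56, 54, 52, 51], dtype=float)

losses = R[:-1] - R[1:]          # Red losses at each step
x = R[:-1] * B[:-1]              # linear-law predictor R_t * B_t
a = (x @ losses) / (x @ x)       # least-squares slope through the origin

residuals = losses - a * x
bad_fit = np.abs(residuals) > 2 * residuals.std()
print(f"fitted attrition coefficient a = {a:.5f}")
print("steps flagged as poor fit:", np.flatnonzero(bad_fit))
```

Steps flagged by the residual test are candidates for drill-down, mirroring the breakpoint investigation described above.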
Questions?
[Image: Juan Muñoz, Five Seated Figures, 1996, Hirshhorn Museum and Sculpture Garden]