VV&A in Human, Social, Cultural Behavior (HSCB) Simulation
Dr. Jimmie McEver, Dr. David T. Signori, Dr. Mike Smeltzer
Evidence Based Research, Inc.


1 VV&A in Human, Social, Cultural Behavior (HSCB) Simulation
Dr. Jimmie McEver, Dr. David T. Signori, Dr. Mike Smeltzer
Evidence Based Research, Inc.
Based on work funded by DARPA: VV&A of COMPOEX, Validating Large Scale Simulation of Socio-Political Phenomena (SBIR)

2 What is VV&A?
VV&A: Verification, validation, and accreditation (of models and simulations)
–Verification: The process of determining that a model implementation and its associated data accurately represent the developer's conceptual description and specifications
–Validation: The process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model
–Accreditation: The official certification that a model, simulation, or federation of models and simulations and its associated data are acceptable for use for a specific purpose
Source: U.S. DoD Modeling and Simulation Coordinating Office (MSCO) Glossary

3 VV&A Objective and Reality
The stated objective of VV&A is to minimize the risk associated with the use of a model or simulation
–Ensure that avoidable errors are eliminated
–Signal that a model or simulation is suited for its intended use
VV&A activities are most rigorously conducted during model development
–Even then, VV&A is often poorly resourced and underemphasized
Despite this, VV&A is often seen as a universal "seal of approval" that a model is accurate and can be trusted
–Models and simulations taken off the shelf for later use may undergo some review of their applicability, but they are seldom extensively re-validated for the new use

4 Sources of Risk in M&S
[Figure: risk chain from the real world (or a proxy for or perception of the real world) through hard and soft data and a conceptual model to the coded model. V&V activities appear at each transition: conceptual model creation and validation, model creation and verification, data verification and data V&V, model validation, and model-use risk; theory, data, and SME judgment support validation. Note: most PMESII values are not observable.]

5 Complex M&S Challenges for VV&A
The simulation is usually complex
–Very large code size
–Behavior not decomposable into independent code elements
–Simulation behavior results from interactions between elements of code during run-time
–Massively multi-dimensional variable space
The system being simulated is complex
–History is only one iteration of what could have happened
–Phenomena not being modeled affect system behavior
–Data collection needs are massive
Different circumstances may require complete revalidation
–Not possible to validate over the full range of possible configurations and uses (see the sampling sketch after this slide)
Usually, no one entity was responsible for development
–Individual components, validated separately, are integrated into a larger simulation
–Integration of the components is a model in itself that is often not validated at all
Almost impossible for new users to pick up a model and make good judgments about appropriate use
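The impossibility of exhaustive validation is easy to make concrete: even a crude two-level discretization of the ~5,000 variables cited later in this briefing yields 2^5000 configurations, so validation cases must be sampled rather than enumerated. The sketch below is one hedged illustration using Latin hypercube sampling from SciPy; the dimension and case counts are placeholders, not COMPOEX figures.

# Why exhaustive validation is impossible, and one way to sample instead.
# With 5,000 variables at only 2 settings each there are 2**5000 configurations;
# Latin hypercube sampling picks a small, space-filling subset to exercise.

from scipy.stats import qmc

n_variables = 50          # illustrative; real HSCB models may have thousands
n_cases = 200             # validation runs we can actually afford

sampler = qmc.LatinHypercube(d=n_variables, seed=1)
cases = sampler.random(n=n_cases)   # values in [0, 1), shape (n_cases, n_variables)

# Each row is one candidate validation configuration, to be scaled to real ranges.
print(cases.shape)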

6 Zack's Forms of Knowledge Ignorance
Source: Zack, M.H., "Managing Organizational Ignorance," Knowledge Directions, Vol. 1, Summer 1999; cited in Leedom, D., "Knowledge Representations for Military C2 Teams and Organizations," Final Working Draft, March 2004.

7 Rethinking VV&A of Complex Simulations
Traditional approaches to VV&A are poorly suited to HSCB simulations (and, more generally, to simulations of complex phenomena)
Need to recognize limitations and shift the emphasis of VV&A activities
–Away from trying to eliminate all risk of use
–Toward eliminating the risk that can be eliminated, and characterizing residual risk in ways that allow users to make good decisions about whether to use the simulation and how to use it while remaining aware of the risks involved
Need methods and tools to improve the transparency of the simulation to the user and establish the credibility of a simulation for a given use

8 Characterization and Mitigation of Risk Associated with Simulation Use
Risk can be mitigated by:
–Reducing uncertainty in the simulation (refining the model, improving the data)
–Applying the simulation in ways consistent with degree of validity and appropriate level of risk to be assumed
[Figure: notional matrix, for a given simulation and situation, relating uncertainty within the simulation to the potential consequences of that uncertainty (low/medium/high), which vary by type of use: inform thinking, possible alternative futures, surprise avoidance, relative COA assessment, predictive COA selection. A hypothetical encoding follows this slide.]
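The mapping the slide sketches (uncertainty within the simulation and type of use jointly determine the consequence of relying on the output) can be captured as a simple lookup that a user or tool could consult before committing to a run. The sketch below is a hypothetical Python encoding; the consequence values are illustrative placeholders, not the notional figures from the slide.

# Hypothetical risk screen: consequence of simulation uncertainty by type of use.
# The table values are illustrative placeholders, not the slide's notional matrix.

UNCERTAINTY_LEVELS = ("low", "med", "high")

# CONSEQUENCE[use][uncertainty] -> risk of acting on the simulation's output
CONSEQUENCE = {
    "inform thinking":              {"low": "low", "med": "low",  "high": "med"},
    "possible alternative futures": {"low": "low", "med": "med",  "high": "med"},
    "surprise avoidance":           {"low": "low", "med": "med",  "high": "high"},
    "relative COA assessment":      {"low": "med", "med": "high", "high": "high"},
    "predictive COA selection":     {"low": "med", "med": "high", "high": "high"},
}

def risk_of_use(use: str, uncertainty: str) -> str:
    """Return the (illustrative) consequence level for a given use and uncertainty."""
    if uncertainty not in UNCERTAINTY_LEVELS:
        raise ValueError(f"unknown uncertainty level: {uncertainty}")
    return CONSEQUENCE[use][uncertainty]

if __name__ == "__main__":
    for use in CONSEQUENCE:
        print(f"{use:30s} at high uncertainty -> {risk_of_use(use, 'high')}")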

9 Building Credibility of Social Science Models
Establishing face validity of a model is a critical step in building credibility among users
–Need to understand how the model behaves
–Determine whether the model behaves sensibly
–How to identify the right expertise?
Users check to see if the model responds to input changes in reasonable and useful ways (a probe sketch follows this slide)
–Matches user judgment of what should happen
–Prompts the user to rethink his/her mental model in light of the simulation's description of what's going on
Attempting to establish face validity for large, complex social science models can be challenging
–Identifying which variables should be tested
–Understanding the cause of the model output
Need framework and tools to guide and support users
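One concrete way to support the face-validity checks described above is a directional sensitivity probe: perturb a single input, rerun the simulation, and compare the sign of the output change against the SME's stated expectation. The sketch below assumes a hypothetical run_simulation(inputs) wrapper around the model under review; the function and argument names are illustrative.

# Minimal face-validity probe: does the model respond to an input change
# in the direction subject-matter experts expect?
# `run_simulation` is a hypothetical wrapper around the model under review.

from typing import Callable, Dict

def directional_probe(
    run_simulation: Callable[[Dict[str, float]], Dict[str, float]],
    baseline: Dict[str, float],
    input_name: str,
    delta: float,
    output_name: str,
    expected_sign: int,      # +1: output should rise, -1: should fall, 0: ~no change
    tolerance: float = 1e-6,
) -> bool:
    """Return True if the output moves in the SME-expected direction."""
    perturbed = dict(baseline)
    perturbed[input_name] = baseline[input_name] + delta

    base_out = run_simulation(baseline)[output_name]
    pert_out = run_simulation(perturbed)[output_name]
    change = pert_out - base_out

    if expected_sign == 0:
        return abs(change) <= tolerance
    return change * expected_sign > 0

A battery of such probes, one per SME expectation, produces a transparent and repeatable record of which behaviors match judgment and which should prompt the user to revisit either the model or their own mental model.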

10 HSCB Simulations Have Many User Communities
"User," as a collective term in this briefing, refers to three different user classes:
–Consumers: People (and possibly processes) that run simulations and interpret the results
–R&D Managers: Individuals responsible for managing the development of a simulation model and ensuring that the product is meaningful and useful
–Developers: Model builders responsible for the construction of a simulation model under the guidance of R&D managers and/or subject matter experts
Important to reflect on all three classes

11 Simulation Modelers
May be:
–M&S experts
–Social scientists
–R&D managers
–Software engineers
–Smart people
–Others
Often the models integrate components from multiple complex domains
Sometimes the SMEs are involved; sometimes not
Users are rarely involved, and often only after the fact

12 Additional Complicating Factors
Modelers make many assumptions and create very complicated black boxes that model even more complex dynamic systems
Example: COMPOEX
–Around 5,000 simulation variables for one exercise
–Thousands of equations
–12,000 to 19,000 factors total
M&S sometimes goes beyond the science
Are there scientific techniques that can aid users (consumers, managers, modelers, SMEs)?
–Allow users to explore the credibility of the model
–Inform users on whether use of the model is appropriate for the user's purpose
–Facilitate users' decisions on how to use the simulation

13 Questions
How can we extract and communicate representations of simulation behavior that are:
–At the right level of accuracy and detail to enable understanding of what's going on (i.e., suitable for use)
–Cognitively manageable (how do we help people manage complex information/knowledge?)
–Sensitive to, and able to capture, temporal dynamics
–Helpful in keeping users from oversimplifying
–Able to capture relevant interdependence between factors
Generation of phase spaces?
–Interactive, computer-assisted rather than fully automated
Use of correlation matrices?
–Computational limitations
Time-phased correlation matrices? (see the sketch below)
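One way to make the time-phased correlation matrix idea concrete is to slice the simulation output into time windows and compute a variable-by-variable correlation matrix for each window, so interdependencies that persist can be distinguished from ones that shift during the run. A minimal NumPy sketch, assuming the output has already been collected as a (time_steps, variables) array:

# Time-phased correlation matrices: one correlation matrix per time window,
# so shifting interdependencies between simulation variables become visible.
# Assumes `output` is a (time_steps, n_variables) array of simulation results.

import numpy as np

def time_phased_correlations(output: np.ndarray, window: int) -> list:
    """Return a list of (n_variables x n_variables) correlation matrices,
    one per non-overlapping time window."""
    matrices = []
    for start in range(0, output.shape[0] - window + 1, window):
        segment = output[start:start + window]       # rows = time, cols = variables
        matrices.append(np.corrcoef(segment, rowvar=False))
    return matrices

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = rng.normal(size=(300, 5))                 # stand-in for simulation output
    for i, corr in enumerate(time_phased_correlations(demo, window=100)):
        print(f"window {i}: max off-diagonal |r| = "
              f"{np.max(np.abs(corr - np.eye(corr.shape[0]))):.2f}")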