Prediction of Computational Quality for Aerospace Applications
Michael J. Hemsch, James M. Luckring, Joseph H. Morrison (NASA Langley Research Center)

Presentation transcript:

Slide 1: Prediction of Computational Quality for Aerospace Applications
Michael J. Hemsch, James M. Luckring, Joseph H. Morrison
NASA Langley Research Center
Elements of Predictability Workshop, November 13-14, 2003, Johns Hopkins University

Slide 2: Outline
- Breakdown of the problem (again), with a slight twist
- The issue for most of aerospace: non-computationalists are doing the applications computations
- What are they doing now?
- What can we do to help?

Slide 3: Breakdown of Tasks

Experimentation:
- Measuring the measurement system: random error characterization using standard artifacts
- Discrimination testing of the measurement system: systematic error characterization
- QA checks against the above measurements during customer testing: process output of interest
- Calibration of instruments: traceability to standards (off-line)

Computation:
- Measuring the computational process: characterization of process variation using standard problems
- Model-to-model and model-to-reality discrimination: systematic error characterization
- QA checks against the above measurements during computation for a customer: solution verification
- Verifying that the coding is correct: traceable operational definition of the process (off-line)
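The "solution verification" entry above is where much of the quantitative work lives. Below is a minimal sketch of one standard technique, which the slide names but does not detail: estimating the observed order of accuracy and Roache's Grid Convergence Index (GCI) from results on three systematically refined grids. All numerical values are illustrative placeholders, not data from the talk.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p for a constant grid refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """Grid Convergence Index on the fine grid; fs is the usual safety factor."""
    return fs * abs((f_medium - f_fine) / f_fine) / (r**p - 1.0)

# Hypothetical drag coefficients on coarse, medium, and fine grids
f3, f2, f1 = 0.02850, 0.02762, 0.02731
r = 2.0  # grid refinement ratio
p = observed_order(f3, f2, f1, r)
print(f"observed order of accuracy p = {p:.2f}")
print(f"fine-grid GCI = {gci_fine(f2, f1, r, p):.3%}")
```

The GCI then serves as the numerical-uncertainty estimate that the QA checks can carry forward into a customer computation.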

Slide 4: The Key Question for Applications
"How is the applications person going to convince the decision maker that the computational process is good enough?"

Slide 5: Our Tentative Answer
Based on observation of aero engineers trying to use CFD on real-life design problems: it is the quantitative explanatory force of an approach that creates acceptance.

Slide 6: How Can Quantitative "Explanatory Force" Be Provided?
Break it down into two questions:
- How do I know that I am predicting the right physics at the right place in the inference space?
- How accurate are my results if I do have the right physics at the right place in the inference space?
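One common way to make the second question quantitative, offered here as an illustration rather than anything the slides prescribe, is the comparison-error calculation later codified in ASME V&V 20: judge E = S - D against a validation uncertainty built from numerical, input, and experimental contributions. A minimal sketch with hypothetical values:

```python
import math

# All values hypothetical
S = 0.0274        # simulation result (e.g., drag coefficient)
D = 0.0266        # experimental result
u_num = 0.0003    # numerical uncertainty (e.g., from a grid-convergence study)
u_input = 0.0002  # input/parameter uncertainty
u_D = 0.0004      # experimental uncertainty

E = S - D                                          # comparison error
u_val = math.sqrt(u_num**2 + u_input**2 + u_D**2)  # validation uncertainty
print(f"E = {E:+.4f}, u_val = {u_val:.4f}")
# |E| well above u_val would indicate a modeling error that stands out
# from the noise floor of the comparison.
```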

Slide 7: Airfoil Stall Classification
[figure: airfoil stall classification chart]

Slide 8: Boundaries Among Stall Types
[figure: boundaries among stall types]

Slide 9: The applications person needs a process that can be:
- Controlled
- Evaluated
- Improved
(i.e., a predictable process)

Slide 10: Creating a Predictable Process...
[diagram] The process takes geometry, flight conditions, etc. as inputs and produces predicted coefficients, flow features, etc. as outputs, subject to:
- Controllable input (assignable-cause variation)
- Uncontrolled input from the environment (variation that we have to live with, e.g., numerics, parameter uncertainty, model-form uncertainty, users)
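Read as a block diagram, the slide treats the computational process as a function of controllable inputs with uncontrolled variation superimposed. A minimal sketch of what characterizing that variation can look like, using a hypothetical surrogate (run_cfd_process) in place of a real CFD run and Monte Carlo sampling of one uncontrolled parameter:

```python
import random
import statistics

def run_cfd_process(alpha_deg, model_param):
    """Hypothetical surrogate standing in for the real CFD process."""
    return 0.11 * alpha_deg * (1.0 + 0.05 * (model_param - 1.0))

alpha = 4.0  # controllable input: angle of attack in degrees
# Uncontrolled input: a model parameter varying about its nominal value
samples = [run_cfd_process(alpha, random.gauss(1.0, 0.03)) for _ in range(2000)]
print(f"mean CL = {statistics.mean(samples):.4f}, "
      f"sigma = {statistics.stdev(samples):.4f}")
```

The output spread is the "variation we have to live with": it bounds how repeatable the process can be even with the controllable inputs held fixed.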

Slide 11: Critical Levels of Attainment for a Predictable Process
- A defined set of steps
- Stable and replicable
- Measurable
- Improvable

Slide 12: What It Takes to Have an Impact...
- Historically, practitioners have created their designs (and the disciplines they work in) with very little reference to researchers.
- Practitioners who are successfully using aero computations already know what it takes to convince a risk taker.
- If we want to have an impact on practitioners, we will have to build on what they are already doing.

Slide 13: What It Takes to Have an Impact...
Good questions:
- Are researchers going to be an integral part of the applications uncertainty quantification process, or are we going to be irrelevant?
- What specific impact on practitioners do I want to have with a particular project?
- What process/product improvement am I expecting from that project?

Slide 14: What It Takes to Have an Impact...
We can greatly improve, systematize, and generalize the process that practitioners are successfully using right now. The key watchwords for applications are:
- Practicality, as in mission analysis and design
- Alacrity, as in "I want to use it right now."
- Impact, as in "Will my customer buy in?" and "Am I willing to bet my career (and my life) on my prediction?"

Slide 15: Actions
Establish working groups like the AIAA Drag Prediction Workshop (DPW):
- Select a small number of focus problems
- Use those problems:
  - to demonstrate the prediction-uncertainty strategies
  - to find out just how tough this problem really is
For right now:
- Run multiple codes, different grid types, multiple models, etc. (see the sketch after this list)
- Work data sets that fully capture the physics of the application problem of interest
- Develop process best practices and find ways to control and evaluate them
- Develop experiments to determine our ability to predict uncertainty and to predict the domain boundaries where the physics changes
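For multi-code, multi-grid exercises of the DPW kind, the collective results are often summarized with a robust center and scatter band. A minimal sketch, assuming hypothetical drag values in counts; the median/MAD choice is one common robust option, not a prescription from the slides:

```python
import statistics

# Hypothetical drag results (counts) from different code/grid/model combinations
results = [272.1, 268.4, 275.9, 270.2, 281.3, 269.7, 273.5, 266.8]

center = statistics.median(results)
# Median absolute deviation, scaled to estimate sigma for normal data
mad = statistics.median(abs(x - center) for x in results)
sigma_est = 1.4826 * mad
print(f"median = {center:.1f} counts, "
      f"approx. 2-sigma scatter band = +/- {2 * sigma_est:.1f} counts")
```

A band like this gives the applications person a concrete, defensible number to put in front of a decision maker, which is exactly the acceptance problem the earlier slides pose.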

Slide 16: Breakout Questions/Issues
1. Defining predictability in the context of the application
2. The logical or physical reasons for lack of predictability
3. The possibility of isolating the reducible uncertainties with a view to dealing with them (either propagating them or reducing them)
4. The role of experimental evidence in understanding and controlling predictability
5. The possibility of gathering experimental evidence
6. The role that modeling plays in limiting predictability
7. Minimum requisite attributes of predictive models
8. The role played by temporal and spatial scales, and possible mitigating actions and models