February 2012. Offered by: C.E. Brockway, Brockway Engineering; Jim Brannon, Leonard Rice Engineers, Inc.; John Koreny, HDR Inc.; Willem Schreuder, Principia Mathematica.


February 2012

Offered by: C.E. Brockway, Brockway Engineering; Jim Brannon, Leonard Rice Engineers, Inc.; John Koreny, HDR Inc.; Willem Schreuder, Principia Mathematica; Dave Colvin, Leonard Rice Engineers, Inc.
Reviewed by: Dave Blew, Idaho Power Company; Jon Bowline, Idaho Power Company
January 20, 2012

Need for Uncertainty Analysis
- Administrators and users need to analyze the potential risk in relying on model output
- The most likely predicted outcome is needed
- The range, or probability of occurrence, of specific magnitudes or outcomes is needed
- Administrative water rights decisions (discretionary or not) require technically sound and scientifically supported knowledge of model capabilities

Model Uncertainty Sources
- Conceptual uncertainty
- Mathematical uncertainty
- Parameter uncertainty
- Internal calibration uncertainty (model overspecification)
- Calibration target uncertainty
- PREDICTIVE UNCERTAINTY: the integrated effect, or accuracy with which the model can simulate the response of an output parameter, representing all sources of uncertainty
(Outlined by C. Brendeke)

Current ESPAM2 Uncertainty Analysis
- Provides an estimate of the range of values for a specific output, due to parameter adjustments, within which the model will remain calibrated
- May be called a dual-model approach
- "Remain calibrated" is modeler-defined
- Assumes that both the conceptual model and the input data are correct (without error) when determining the range of predictions
- Separate analyses are required to determine the impact of unadjustable input data or conceptual model uncertainty
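The "remain calibrated" criterion above can be sketched as an acceptance test: a perturbed parameter set is retained only if the model's misfit against calibration targets stays below a modeler-defined threshold. This is a hypothetical illustration, not IDWR code; the target values, threshold, and RMSE criterion are all assumptions for the sketch.

```python
import math

def rmse(simulated, observed):
    """Root-mean-square error between simulated and observed target values."""
    n = len(observed)
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n)

def remains_calibrated(simulated, observed, threshold):
    """True if a perturbed run still meets the modeler-defined calibration criterion."""
    return rmse(simulated, observed) <= threshold

# Illustrative calibration targets vs. a perturbed simulation
observed = [100.0, 250.0, 175.0]
simulated = [104.0, 246.0, 171.0]
print(remains_calibrated(simulated, observed, threshold=5.0))  # True: RMSE = 4.0
```

Because the threshold is modeler-defined, two analysts can accept different parameter ranges from the same runs, which is one reason the slide calls for documenting the criterion.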

Monte Carlo Uncertainty Analysis
- More rigorous; time consuming; requires more computing power
- Unconstrained: random range of input parameters
- Model may not remain calibrated
- Null-space analysis: many calibrated models
- Monte Carlo analysis provides a probability distribution for any chosen output parameter
- Modeler may define the range of input parameters as a measure of input-parameter uncertainty
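A minimal sketch of the unconstrained Monte Carlo approach described above: sample inputs from modeler-defined ranges and build a probability distribution for a chosen output. The model function, parameter names (T, R), and ranges are placeholders standing in for a full ESPAM-style run, not the actual model.

```python
import random
import statistics

def model_output(T, R):
    """Hypothetical model response (stand-in for a full groundwater-model run)."""
    return 50.0 / T + 0.2 * R

random.seed(42)  # reproducible draws
N = 10_000
outputs = []
for _ in range(N):
    # Modeler-defined input ranges express input-parameter uncertainty
    T = random.uniform(5.0, 15.0)   # transmissivity (illustrative units)
    R = random.uniform(10.0, 30.0)  # recharge (illustrative units)
    outputs.append(model_output(T, R))

# The sorted sample approximates the output's probability distribution
outputs.sort()
mean = statistics.fmean(outputs)
p05, p95 = outputs[int(0.05 * N)], outputs[int(0.95 * N)]
print(f"mean = {mean:.2f}, 90% interval = [{p05:.2f}, {p95:.2f}]")
```

In a null-space variant, each sampled parameter set would additionally be filtered through a "remains calibrated" test so that only calibrated models contribute to the distribution.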

Probability and Risk Analysis
- A quantifiable range of simulated values, with associated probability, is helpful
- Sometimes the user is concerned with the range and probability of output when a single parameter is varied (personal knowledge of the uncertainty in a specific parameter)
- The current procedure provides no probability distribution or confidence limits on output
- Utilization of sensitivity analysis during calibration can provide insight into improving the calibration
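The single-parameter case above is a one-at-a-time sensitivity sweep: vary one input over its believed range while holding the others fixed, and report the induced range of a chosen output. The discharge function, parameter names, and ranges below are illustrative assumptions, not ESPAM2 relationships.

```python
def spring_discharge(et, recharge):
    """Stand-in output: spring discharge declining with evapotranspiration (ET)."""
    return 200.0 - 1.5 * et + 0.8 * recharge

def one_at_a_time(f, base, param, low, high, steps=11):
    """Sweep one parameter across [low, high], others fixed; return output (min, max)."""
    values = []
    for i in range(steps):
        trial = dict(base)
        trial[param] = low + (high - low) * i / (steps - 1)
        values.append(f(**trial))
    return min(values), max(values)

base = {"et": 40.0, "recharge": 100.0}
lo, hi = one_at_a_time(spring_discharge, base, "et", 30.0, 50.0)
print(f"ET sweep changes discharge over [{lo:.1f}, {hi:.1f}]")  # [205.0, 235.0]
```

Unlike Monte Carlo, this yields a range but no probabilities; it answers "how sensitive is this output to ET?" rather than "how likely is a given outcome?".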

Recommendations
- Complete the ESPAM2 calibration as soon as possible
- Complete the uncertainty analysis as outlined, as soon as possible
- Fully document the procedure and results
- Complete the proposed verification as soon as possible: verification is first priority; pre-calibration-period verification is lower priority
- Compare similar output from ESPAM1.1 and ESPAM2
- Officially adopt ESPAM2; modify and improve the transfer tool and guidelines

USE OF ESPAM2 MODEL
The ESHMC has a responsibility to advise the Director and staff to:
1. Develop the best available scientific tool for evaluating ESPA hydrologic relationships
2. Provide hydrologic guidelines for the use of the model in administrative decisions
3. Provide guidance on technical deficiencies and the real meaning of simulation results
4. Provide adequate information to the Director and model users to understand and defend the model

USE OF ESPAM2 - Uncertainty
ESPAM2 is a tool for making decisions and developing policy; neither ESPAM2 nor the ESHMC makes policy.
Uncertainty analysis can:
1. Provide insight into sensitive parameters and the need for additional data
2. Provide guidance in improving calibration
3. Provide insight into output parameter variability and bias, and assist in evaluating the risk in use of model output

USE OF MODEL FOR WATER RIGHTS ADMINISTRATION
ESHMC and IDWR staff do not necessarily tell the Director how to use the output of the model. ESHMC and IDWR staff can tell the Director:
1. What the model can do (capabilities)
2. What the model can't do
3. What the output really means, technically:
   - What are the confidence limits?
   - What is the variability or bias in simulated output?
   - How sensitive is a specific spring output to ET?
   NOT: How much is the impact from junior ground water users costing spring users? Can junior ground water users afford to mitigate, or ???
4. Whether a particular method of use of the model is defensible (statistically or analytically)

SUMMARY
- ESPAM2 is the best scientific tool available to IDWR
- The model needs to be completed and adopted as soon as possible
- The current calibration, uncertainty analysis, validation, and ESPAM1.1 comparison need to be completed as planned
- A complete analysis and documentation, including model capabilities and limits, should be prepared
- The current utilization of the trim-line concept as a surrogate for model uncertainty is not defensible; another protocol utilizing documented uncertainty analyses should be adopted