Uncertainty analysis and Model Validation

Final Project: Summary of Results & Conclusions

Calibration Targets
In a real-world problem we need to establish model-specific calibration criteria and define targets, including their associated error. [Figure: two calibration targets plotted as a calibration value with an error bar (values shown: 20.24 m, 0.80 m); one target has a smaller associated error, the other a relatively large associated error.]

Smith Creek Valley (Thomas et al., 1989) Calibration Objectives
1. Heads within 10 ft of measured heads (allows for measurement error and interpolation error).
2. Absolute mean residual between measured and simulated heads close to zero (0.22 ft) and standard deviation minimal (4.5 ft) (a small sketch of these statistics follows below).
3. Head difference between layers 1 & 2 within 2 ft of field values.
4. Distribution of ET and ET rates match field estimates.
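As a concrete illustration of objective 2, here is a minimal Python sketch (not part of the original exercise) that computes the mean residual, absolute mean residual, and standard deviation of residuals; the head values are made up purely for illustration.

```python
import numpy as np

# Hypothetical measured and simulated heads (ft) at the observation targets.
measured = np.array([5012.3, 5008.7, 5021.4, 4998.2, 5005.9])
simulated = np.array([5013.1, 5007.5, 5020.2, 4999.8, 5004.6])

residuals = measured - simulated

mean_residual = residuals.mean()       # should be close to zero
arm = np.abs(residuals).mean()         # absolute mean residual ("ARM")
std_residual = residuals.std(ddof=1)   # spread of the residuals

print(f"Mean residual:          {mean_residual:6.2f} ft")
print(f"Absolute mean residual: {arm:6.2f} ft")
print(f"Std of residuals:       {std_residual:6.2f} ft")
```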

Also need to identify calibration parameters and their reasonable ranges.

Calibration and prediction ARM by group (ARM = absolute mean residual):
Group | Calibration ARM h (at targets) | Calibration ARM ET (×10^7) | Prediction ARM h (at targets) | Prediction ARM h (at pumping wells)
1 | 0.92 | 1.38 | 1.60 | 4.16
2 | 0.73 | 1.11 | 1.99 | 3.03
3 | 0.69 | 0.51 | 0.95 | 1.76
4 | 1.34 | 1.27 | 1.46 | 2.57
5 | 1.56 | 0.89 | 2.79 | 1.43
6 | 1.29 | 0.16 | 2.58 | 2.92

Calibration to Fluxes
When recharge rate (R) is a calibration parameter, calibrating to fluxes can help in estimating K and/or R.

In this example, flux information helps calibrate K. By Darcy's law, q = K I, where I is the hydraulic gradient computed from the heads H1 and H2; with the flux q and the gradient known, K can be back-calculated. [Figure: flow section between heads H1 and H2 with K unknown.]
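A minimal sketch of that back-calculation, with illustrative numbers for the measured flux, heads, and spacing (none of these values come from the exercise):

```python
# Back-calculate hydraulic conductivity K from Darcy's law q = K * I,
# using a measured specific discharge and two observed heads.
h1, h2 = 103.5, 101.2        # observed heads (m) -- illustrative values
dx = 250.0                   # distance between the observation points (m)
q = 0.012                    # measured specific discharge (m/day)

gradient = (h1 - h2) / dx    # hydraulic gradient I
K = q / gradient             # estimated hydraulic conductivity (m/day)
print(f"Estimated K = {K:.1f} m/day")
```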

In this example, discharge information helps calibrate R.

In our example, total recharge is known/assumed to be 7.14E08 ft3/year and discharge = recharge. All water discharges to the playa. Calibration to ET merely fine-tunes the discharge rates within the playa area.

Calibration and prediction ARM by group (ARM = absolute mean residual):
Group | Calibration ARM h (at targets) | Calibration ARM ET (×10^7) | Prediction ARM h (at targets) | Prediction ARM h (at pumping wells)
1 | 0.92 | 1.38 | 1.60 | 4.16
2 | 0.73 | 1.11 | 1.99 | 3.03
3 | 0.69 | 0.51 | 0.95 | 1.76
4 | 1.34 | 1.27 | 1.46 | 2.57
5 | 1.56 | 0.89 | 2.79 | 1.43
6 | 1.29 | 0.16 | 2.58 | 2.92

[Figures: include results from 2000, 2001, and 2003.]

Particle Tracking
Predicted travel time and capture location (PW = pumping well) for each particle, by group:
Truth: 802 PW1 | 1913 playa | 620 playa | 310 PW4 | 1933 PW5 | 690 playa | 2009 PW2
Groups 1–2 (entries incomplete): 3.93E6 PW4 | 252 PW4 | 1084 playa | 1576 playa
Group 3: 1.21E6 PW1 | 2.15E6 PW2 | 3.90E6 playa | 1110 PW4 | 1.58E6 playa | 1860 playa | 893 playa
Group 4: 1200 PW1 | 1900 PW2 | 6.7E6 PW3 | 290 PW4 | 2800 PW5 | 760 PW1 | 1200 PW2
Group 5: 1295 PW1 | 3160 PW2 | 503 playa | 986 PW4 | 605 PW4 | 316 PW1 | 3100 PW2
Group 6: 3100 PW1 | 982 PW1 | 4.9E5 playa | 603 PW4 | 1450 PW4 | 2000 PW1 | 1380 PW2

Observations
- Predicted ARM > calibrated ARM.
- Predicted ARM at pumping wells > predicted ARM at nodes with targets.
- Flow predictions are more robust (consistent among different calibrated models) than transport (particle tracking) predictions.

Conclusions
- Calibrations are non-unique.
- A good calibration (even if ARM = 0) does not ensure that the model will make good predictions.
- You can never have enough field data.
- Modelers need to maintain a healthy skepticism about their results.
- An uncertainty analysis is needed to accompany calibration results and predictions.

Uncertainty in the Calibration
Involves uncertainty in:
- Targets
- Parameter values
- Conceptual model, including boundary conditions, zonation, geometry, etc.

Ways to analyze uncertainty in the calibration
- Sensitivity analysis
- Use an inverse model (automated calibration) to quantify uncertainties and optimize the calibration (a toy sketch follows below).
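Automated calibration is commonly done with inverse codes such as PEST or UCODE. As a toy analogue only, the Python sketch below (assuming numpy and scipy are available) calibrates transmissivity T and recharge R of a simple analytic 1D model to synthetic head observations plus one flux observation; the model, values, and weights are all illustrative. It also echoes the earlier point that a flux measurement helps separate R from T, since heads alone constrain only their ratio.

```python
import numpy as np
from scipy.optimize import least_squares

# 1D steady flow with uniform recharge R, no-flow at x = 0 and fixed head hL
# at x = L has the analytic solution
#   h(x) = hL + R * (L**2 - x**2) / (2 * T)
# and an outflow flux per unit width Q = R * L at x = L.
L, hL = 1000.0, 100.0
x_obs = np.array([100.0, 300.0, 500.0, 700.0, 900.0])  # head observation points (m)

def heads(T, R, x):
    return hL + R * (L**2 - x**2) / (2.0 * T)

# Synthetic "observations" from known parameters plus noise (stand-ins for field data).
rng = np.random.default_rng(0)
h_obs = heads(500.0, 2.0e-4, x_obs) + rng.normal(0.0, 0.02, x_obs.size)
Q_obs = 2.0e-4 * L + rng.normal(0.0, 0.005)             # measured outflow (m2/d per m)

def residuals(params):
    T, R = params
    return np.append(heads(T, R, x_obs) - h_obs,        # head residuals
                     10.0 * (R * L - Q_obs))            # weighted flux residual

fit = least_squares(residuals, x0=[200.0, 1.0e-4],
                    bounds=([1.0, 1e-6], [5e3, 1e-2]))  # automated calibration
T_est, R_est = fit.x
print(f"Estimated T = {T_est:.0f} m2/d, R = {R_est:.2e} m/d")
```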

Uncertainty in the Prediction
- Reflects uncertainty in the calibration.
- Involves uncertainty in how parameter values (e.g., recharge) will vary in the future.

Ways to quantify uncertainty in the prediction
- Sensitivity analysis
- Stochastic simulation

MADE site – Feehley and Zheng, 2000, WRR 36(9).

A Monte Carlo analysis considers 100 or more realizations.
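A minimal sketch of the idea (assuming numpy; all values are illustrative, not from the exercise): sample an uncertain parameter for each realization, run a simple prediction, and summarize the spread of the results.

```python
import numpy as np

# Monte Carlo sketch: sample hydraulic conductivity from a lognormal
# distribution, compute a simple travel-time "prediction" for each
# realization, and report percentiles of the resulting distribution.
rng = np.random.default_rng(42)
n_real = 200                                # 100 or more realizations

K = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=n_real)  # K (m/d)
porosity, gradient, distance = 0.25, 0.002, 500.0

velocity = K * gradient / porosity          # average linear velocity (m/d)
travel_time = distance / velocity           # predicted travel time (d)

p05, p50, p95 = np.percentile(travel_time, [5, 50, 95])
print(f"Travel time: median {p50:.0f} d, 90% interval {p05:.0f}-{p95:.0f} d")
```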

Stochastic modeling option in GW Vistas

Ways to quantify uncertainty in the prediction
- Sensitivity analysis
- Scenario analysis (a brief sketch follows below)
- Stochastic simulation
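For contrast with stochastic simulation, here is a minimal scenario-analysis sketch: instead of sampling parameters randomly, the prediction is evaluated under a few discrete future conditions. The scenario names, numbers, and the simple response function are purely illustrative stand-ins for real model runs.

```python
# Evaluate a simple drawdown prediction under discrete future scenarios.
scenarios = {
    "baseline":       {"pumping": 1000.0, "recharge": 2.0e-4},
    "drought":        {"pumping": 1500.0, "recharge": 1.2e-4},
    "reduced demand": {"pumping":  700.0, "recharge": 2.0e-4},
}

def predicted_drawdown(pumping, recharge):
    # Stand-in for a real model run (e.g., a MODFLOW simulation); here just a
    # simple linear response used for illustration.
    return 0.004 * pumping - 5000.0 * recharge

for name, s in scenarios.items():
    dd = predicted_drawdown(**s)
    print(f"{name:>15}: predicted drawdown = {dd:.2f} m")
```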

Model Validation
How do we “validate” a model so that we have confidence that it will make accurate predictions?

Modeling Chronology
1960s: Flow models are great!
1970s: Contaminant transport models are great!
1975: What about uncertainty of flow models?
1980s: Contaminant transport models don't work (because of failure to account for heterogeneity).
1990s: Are models reliable? Concerns over the reliability of predictions arose from efforts to model a geologic repository for high-level radioactive waste.

“The objective of model validation is to determine how well the mathematical representation of the processes describes the actual system behavior in terms of the degree of correlation between model calculations and actual measured data” (NRC, 1990)

What constitutes validation? (code vs. model)
NRC study (1990): Model validation is not possible.
Oreskes et al. (1994), paper in Science:
- Calibration = forced empirical adequacy
- Verification = assertion of truth (possible in a closed system, e.g., testing of codes)
- Validation = establishment of legitimacy (does not contain obvious errors); confirmation, confidence building

How to build confidence in a model
- Calibration (history matching): steady-state calibration(s), transient calibration
- “Verification”: requires an independent set of field data
- Post-audit: requires waiting for the prediction to occur
- Models as interactive management tools

HAPPY MODELING!

Have a good summer!