Model Performance Evaluation: Calibration and Validation

Calibration and validation
Prepared by: Ilyas Masih and Marloes Mul, UNESCO-IHE, The Netherlands

Evaluation of model performance
Assessing the performance of a model requires subjective and/or objective estimates of the "closeness" of the simulated behaviour of the model to observations.
Qualitative (subjective) assessment: used to assess the systematic (e.g., over- or under-prediction) and dynamic (e.g., timing, rising limb, falling limb and base flow) behaviour of the model.
Objective assessment: requires a mathematical estimate of the error between the simulated and observed hydrologic variable(s), i.e. an objective or efficiency criterion.

Evaluation of model performance
The confidence in using a model (for reasoning and the formulation of testable predictions) depends on how 'close' the model is to reality.
The comparative evaluation of observations and simulations considers the degree of accuracy (bias), the degree of precision (uncertainty) and the degree of correspondence ('sameness' of the quantities in question).
Calibration is the process of adjusting model input values (usually parameters) so that the simulated behaviour of the model matches the observations as closely as possible.

Schematic of the model evaluation and calibration process (Vrugt, 2004)

Calibration
Improving the model results by changing parameters
Used when the understanding of the processes is insufficient
Used when the information is insufficient to determine the parameters directly

Parameters
Obtained by direct measurement
Derived from observations (such as the K-value)
Derived by calibration

Commonly used variables against which water systems models are evaluated and calibrated
Water balance components: e.g. streamflows, groundwater levels, actual evapotranspiration, soil moisture
Water quality: e.g. sediment, nitrogen, phosphorus
Water allocation: demand and supply
Other? (e.g. reservoir volume/level and releases)

Commonly used objective functions (criteria) in the evaluation and calibration of water systems models
Bias
Mean square error; root mean square error; relative root mean square error
Coefficient of determination
Nash-Sutcliffe efficiency
Other? (e.g. comparison of the slopes of the observed and simulated flow duration curves)

Objective functions: example
Nash-Sutcliffe efficiency (NSE), coefficient of determination (R2) and bias (error, %)
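As a minimal illustration of how these criteria are typically computed from paired observed and simulated series (a sketch, not part of the original slides; the function names and the use of Python/NumPy are assumptions):

```python
# Sketch of commonly used objective functions for comparing simulated and
# observed series (e.g. daily streamflow); function names are illustrative.
import numpy as np

def bias_percent(obs, sim):
    """Relative volume error (%); negative values indicate underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def rmse(obs, sim):
    """Root mean square error, in the units of the variable."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.mean((sim - obs) ** 2))

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.corrcoef(obs, sim)[0, 1] ** 2

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means no better than
    predicting the observed mean, and negative values mean worse than that."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
```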

Calibration approaches: manual calibration
Most widely used calibration method
Visual comparison of measured and simulated data
Semi-intuitive trial-and-error process for parameter adjustment
Can produce excellent model calibrations, but manual calibration...
is highly labor-intensive (human resources)
is difficult to learn
procedures are model-dependent
results are user-dependent

Calibration approaches: automatic calibration
A procedure that optimizes an objective function by systematically searching the parameter space according to a fixed set of rules
Examples:
Monte Carlo method
Genetic algorithms
Shuffled Complex Evolution
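A minimal sketch of the Monte Carlo (random search) idea, not taken from the slides: draw parameter sets from predefined ranges, run the model for each set, and keep the set with the best objective-function value. The "model" below is a toy one-parameter linear reservoir, included only so the loop runs end to end; the parameter range, number of trials and synthetic data are all illustrative assumptions.

```python
# Sketch of Monte Carlo (random-search) calibration against a toy model.
import numpy as np

rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.5, scale=5.0, size=365)        # synthetic daily rainfall

def linear_reservoir(rain, k):
    """Toy rainfall-runoff model: a single store drained at rate k per day."""
    storage, q = 0.0, np.zeros_like(rain)
    for t, p in enumerate(rain):
        storage += p                                     # add today's rainfall
        q[t] = k * storage                               # outflow proportional to storage
        storage -= q[t]
    return q

# Pseudo-observations: the toy model run with k = 0.2 plus measurement noise.
obs = linear_reservoir(rain, k=0.2) + rng.normal(0.0, 0.1, size=rain.size)

def nse(obs, sim):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

best_k, best_nse = None, -np.inf
for _ in range(2000):                                    # fixed number of random trials
    k = rng.uniform(0.01, 0.9)                           # sample the parameter space
    score = nse(obs, linear_reservoir(rain, k))
    if score > best_nse:                                 # keep the best parameter so far
        best_k, best_nse = k, score

print(f"best k = {best_k:.3f}, NSE = {best_nse:.3f}")
```

Genetic algorithms and Shuffled Complex Evolution replace the blind random sampling with rules that concentrate the search around promising parameter sets, but the basic loop of running the model and evaluating an objective function stays the same.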

Calibration approaches: manual vs automatic calibration
"The debate on whether manual or automatic calibration of hydrological models is superior is likely to remain inconclusive. However, better understanding of the catchment is the key for the success of both approaches." (Masih, 2011)

A note of caution on the use of objective functions (from Krause et al.):
"In general, many efficiency criteria contain a summation of the error term (difference between the simulated and the observed variable at each time step) normalized by a measure of the variability in the observations. To avoid the canceling of errors of opposite sign, the summation of the absolute or squared errors is often used for many efficiency criteria. As a result, an emphasis is placed on larger errors while smaller errors tend to be neglected. Since errors associated with high streamflow values tend to be larger than those associated with errors for lower values, calibration (both manual and automatic) attempts aimed at minimizing these types of criteria often lead to fitting the higher portions of the hydrograph (e.g., peak flows) at the expense of the lower portions (e.g., baseflow). Further, different efficiency criteria may place emphasis on different systematic and/or dynamic behavioural errors, making it difficult for a hydrologist to clearly assess model performance."

A note of caution on the use of objective functions
Every objective function has strengths and weaknesses
What we can do to overcome these limitations:
Select an appropriate objective function according to the study objectives
Use more than one objective function (multi-objective calibration)
Use more than one variable in calibration (e.g. ET and Q)
Calibrate at more than one observation point (e.g. several flow gauges within a basin rather than only one station)
What else?
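One way to act on the first two points, sketched under stated assumptions (not from the slides): combine several criteria into a single score, e.g. NSE on raw flows (which stresses peak flows), NSE on log-transformed flows (which stresses low flows and so counteracts the peak-flow emphasis noted above) and a penalty on the volume bias. The weights and the small offset added before taking logarithms are illustrative choices.

```python
# Sketch of a weighted multi-objective score combining three criteria.
import numpy as np

def nse(obs, sim):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def multi_objective_score(obs, sim, w_peak=0.4, w_low=0.4, w_bias=0.2):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    score_peak = nse(obs, sim)                                # emphasizes high flows
    score_low = nse(np.log(obs + 0.01), np.log(sim + 0.01))   # emphasizes low flows
    rel_bias = abs(sim.sum() - obs.sum()) / obs.sum()         # relative volume error
    return w_peak * score_peak + w_low * score_low - w_bias * rel_bias
```

The same idea extends to calibrating against more than one variable or station: compute a criterion for each series and aggregate the values into one score, or keep them separate and look for parameter sets that perform acceptably on all of them.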

Examples of model calibration and evaluation: Masih et al., 2010, HBV model application, Iran

Examples of model calibration and evaluation: Masih et al., 2010, HBV model application, Iran
Calibration (Oct. 1988-Sep. 1994): R2 = 0.91; NSE = 0.91; bias = -5%
Observed (blue), simulated (red)

Some examples: model calibration and evaluation
Example from Masih et al., 2011, SWAT model application:
What do you observe by visually inspecting these graphs?
Which one is good or bad? What could be the reason?
Can we improve the fit through the calibration process, or is this the best we can achieve?

HYPE model application in Sweden (Lindström et al., 2010)
Figure 5 | Examples of local model evaluations from different river basins in Sweden (gray: simulation, black: observations). The model was optimized to the observations in each particular diagram. S: snow depth at Ljusnedal (R2 = 0.92); E: evaporation at Norunda (R2 = 0.59; Grelle 2008); G: groundwater level at Svartberget (R2 = 0.90); F: soil frost depth at Svartberget (R2 = 0.84, Lindström et al. 2002); W: lake water level in Lake Möckeln (R2 = 0.95); Q: discharge at Torsebro (R2 = 0.94); O18: 18O content at Stubbetorp (R2 = 0.54, Andersson & Lepistö 1998); TN: total nitrogen at JRK research basin O19 (R2 = 0.68); TP: total phosphorus at JRK research basin AB5 (R2 = 0.41).

Validation
Model should be able to reproduce field observations not used in calibration
Can be carried out as long as an independent data set has been kept aside

Validation involves:
Ensuring theories and assumptions are correct
Ensuring the computer programming code is correct
Checking the water balance calculation
Checking that parameters take valid values
Comparing against a second series of observed data
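A minimal sketch of how such a split-sample test can be organized (not from the slides; the CSV file, the column names and the use of pandas are assumptions). The periods follow the Masih et al. (2010) example on the next slide: the parameters are estimated on Oct. 1988-Sep. 1994 only, and the same fixed parameter set is then evaluated on the independent period Oct. 1994-Sep. 2001.

```python
# Sketch of evaluating one fixed parameter set on calibration and validation periods.
import numpy as np
import pandas as pd

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical file: daily observed and simulated flows from a single model run
# that used the parameter set obtained during calibration.
df = pd.read_csv("results.csv", parse_dates=["date"], index_col="date")

periods = {
    "calibration": df.loc["1988-10-01":"1994-09-30"],   # used to fit the parameters
    "validation": df.loc["1994-10-01":"2001-09-30"],    # kept aside, never used in fitting
}

for name, part in periods.items():
    print(f"{name}: NSE = {nse(part['observed'], part['simulated']):.2f}")
```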

Examples of model validation: Masih et al., 2010, HBV model application, Iran. The same parameter set is used for validation as for calibration.
Validation (Oct. 1994-Sep. 2001): R2 = 0.81; NSE = 0.67; bias = 20%
Observed (blue), simulated (red)

Examples of model validation: Merz and Blöschl, 2004, HBV model application to 308 Austrian catchments

Model evaluation: calibration and validation
If simulations are good in both the calibration and validation periods, this indicates that the model can be used with confidence for its intended purpose.
Most often the model performance during the validation period is lower than during calibration, but it can be the other way around as well; the modeller should try to find the reasons for high or low performance.
Sensitivity and uncertainty analyses are also integral parts of the calibration and validation process and should be properly included in a modelling study.
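As a minimal illustration of a local (one-at-a-time) sensitivity check, not from the slides: perturb a calibrated parameter by a small amount, rerun the model and record how the objective function responds. The sketch reuses rain, obs, linear_reservoir() and nse() from the Monte Carlo sketch above; the ±10% perturbation size is an arbitrary choice.

```python
# Sketch of a one-at-a-time sensitivity check around a calibrated parameter value.
# Assumes rain, obs, linear_reservoir() and nse() from the Monte Carlo sketch above.
base_k = 0.2                                            # e.g. the calibrated value
base_score = nse(obs, linear_reservoir(rain, base_k))

for factor in (0.9, 1.0, 1.1):                          # perturb the parameter by +/-10%
    k = base_k * factor
    score = nse(obs, linear_reservoir(rain, k))
    print(f"k = {k:.3f}: NSE = {score:.3f} (change = {score - base_score:+.3f})")
```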

Discussion and Questions