V. Nonlinear Regression Objective-Function Surfaces

Thus far, we have:
- Parameterized the forward model
- Obtained head and flow observations and their weights
- Calculated and evaluated sensitivities of the simulated observations to each parameter

Now the parameter-estimation process can be used to find the "best" set of parameter values: an optimization problem. Before getting into the mathematics behind parameter estimation, we first examine this process graphically.

V. Nonlinear Regression Objective-Function Surfaces

Sum-of-squared-weighted-residuals objective function:

$$S(b) = \underbrace{\sum_{i=1}^{NH} \omega_i \left[y_i - y_i'(b)\right]^2}_{\text{HEADS}} + \underbrace{\sum_{j=1}^{NQ} \omega_j \left[y_j - y_j'(b)\right]^2}_{\text{FLOWS}} + \underbrace{\sum_{k=1}^{NPR} \omega_k \left[P_k - P_k'(b)\right]^2}_{\text{PRIOR}}$$

where the $y$ are observed values, the $y'(b)$ are their simulated equivalents, the $P$ are prior-information values, and the $\omega$ are weights. The goal of nonlinear regression is to find the set of model parameters b that minimizes S(b).

Objective-Function Surfaces - continued Weighted squared errors are dimensionless, so quantities with different units can be summed in the objective function. Increasing the weight on an observation increases the contribution of that observation to S(b).
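The weighted summation above can be sketched in a few lines of Python. The observation values, standard deviations, and coefficients of variation below are invented for illustration; they are not the exercise data, and this is a sketch rather than what UCODE_2005 actually computes internally.

```python
import numpy as np

def sswr(obs, sim, weights):
    """Sum of squared weighted residuals: S(b) = sum_i w_i * (obs_i - sim_i)^2."""
    r = np.asarray(obs) - np.asarray(sim)
    return float(np.sum(np.asarray(weights) * r**2))

# Heads (meters) and a flow (m^3/d) can be summed once weighted,
# because each weighted squared residual is dimensionless.
heads_obs = [100.0, 98.5, 97.2]
heads_sim = [100.4, 98.0, 97.5]
w_heads = [1.0 / 0.5**2] * 3            # weight = 1/variance; sd = 0.5 m

flow_obs, flow_sim = [-4.4], [-4.1]
w_flow = [1.0 / (0.10 * 4.4)**2]        # coefficient of variation = 10% of |obs|

S = sswr(heads_obs + flow_obs, heads_sim + flow_sim, w_heads + w_flow)
```

Increasing `w_flow` increases the flow observation's contribution to `S`, which is exactly how weighting controls each observation's influence on the regression.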

Objective-Function Surfaces - continued The objective function has as many dimensions as there are model parameters. For a 2-parameter problem, the objective function can be calculated for many pairs of parameter values, and the resulting objective-function surface can be contoured.

Steady-State Problem as a Two-Parameter Problem The original six-parameter model is re-posed so that the six defined parameters are combined to form two parameters: KMult and RchMult. [K_RB is problematic to include in KMult when using MODFLOW; its omission from KMult is not a problem because K_RB is insensitive.] When KMult = 1.0: as when HK_1, HK_2, VK_CB, and K_RB equal their starting values in the six-parameter model. When RchMult = 1.0: as when RCH_1 and RCH_2 equal their starting values in the six-parameter model.

Steady-State Problem as a Two-Parameter Problem

With the problem posed in terms of KMult and RchMult:
- Use UCODE_2005 in Evaluate Objective Function mode to calculate S(b) using many sets of values for KMult and RchMult.
- Values of KMult and RchMult range from 0.1 to 10; use many values for each within this range. With 100 values of each, there are 100 x 100 = 10,000 sets of parameter values.
- Plot the value of S(b) for each set of parameter values.
- Contour the resulting objective-function surface.
- Examine how the objective-function surface changes given different observation types and weights.
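The grid-evaluation procedure above can be sketched in Python. The `forward` function here is a hypothetical stand-in for the real forward model (the exercise runs MODFLOW via UCODE_2005); its head/flow expressions, the observations, and the weights are invented purely so the surface has a well-defined minimum near (1, 1).

```python
import numpy as np

def forward(kmult, rchmult):
    """Toy stand-in for the forward model (NOT the MODFLOW model):
    heads depend only on the ratio RchMult/KMult; flow depends on recharge."""
    heads = 100.0 + 5.0 * rchmult / kmult
    flow = -4.4 * rchmult
    return np.array([heads]), np.array([flow])

obs_heads, obs_flow = np.array([105.0]), np.array([-4.4])
w_h, w_q = 4.0, 5.17                       # illustrative weights

def S(kmult, rchmult):
    h, q = forward(kmult, rchmult)
    return np.sum(w_h * (obs_heads - h)**2) + np.sum(w_q * (obs_flow - q)**2)

# Evaluate S(b) on a log-spaced 100 x 100 grid over [0.1, 10] per parameter
k_vals = np.logspace(-1, 1, 100)
r_vals = np.logspace(-1, 1, 100)
surface = np.array([[S(k, r) for r in r_vals] for k in k_vals])

# Locate the grid minimum; `surface` could be passed to a contouring routine
i, j = np.unravel_index(np.argmin(surface), surface.shape)
best_k, best_r = k_vals[i], r_vals[j]
```

With the array in hand, `matplotlib.pyplot.contour(r_vals, k_vals, surface)` would draw the objective-function surface analogous to the book's Figure 5-4.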

Steady-State Problem as a Two-Parameter Problem

[Figure: objective-function surfaces (Book, Fig. 5-4, p. 82); contours of the objective function calculated for combinations of the two parameters. Three panels: (a) heads only; (b) with the flow weighted using a coefficient of variation of 10%; (c) with the flow weighted using a coefficient of variation of 1%.]

Parameter Nonlinearity of Darcy's Law (Hill and Tiedeman, 2007)

Why aren't the objective functions symmetric about the minimum (the trough, when parameters are correlated)?

Darcy's Law: Q = -KA (dh/dx), so h = h0 - (Q/(KA)) x.

dh/dQ = -x/(KA): linear in Q (the derivative does not depend on Q).
dh/dK = (Q/(K^2 A)) x: nonlinear in K (the derivative depends on K).

Nonlinearity makes it much harder to estimate parameter values.
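This asymmetry is easy to verify numerically. The sketch below uses the head solution h = h0 - (Q/(KA)) x with invented values of h0, A, and x: equal steps in Q change the head by equal amounts, while equal steps in K change it by shrinking amounts.

```python
def head(x, Q, K, h0=100.0, A=10.0):
    """h(x) = h0 - (Q / (K * A)) * x -- linear in Q, nonlinear in K.
    h0, A, and x are illustrative values, not exercise data."""
    return h0 - (Q / (K * A)) * x

x = 50.0

# Equal steps in Q produce equal head changes (linear in Q):
dh_Q1 = head(x, 2.0, 1.0) - head(x, 1.0, 1.0)
dh_Q2 = head(x, 3.0, 1.0) - head(x, 2.0, 1.0)

# Equal steps in K produce unequal head changes (nonlinear in K):
dh_K1 = head(x, 1.0, 2.0) - head(x, 1.0, 1.0)
dh_K2 = head(x, 1.0, 3.0) - head(x, 1.0, 2.0)
```

Here `dh_Q1 == dh_Q2`, but `dh_K1` and `dh_K2` differ, which is why the objective-function contours are compressed on one side of the minimum and stretched on the other along the K axis.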

DO EXERCISE 5.1a: Assess relation of objective-function surfaces to parameter correlation coefficients.

Exercise 5.1a - questions
1. Use Darcy's Law to explain why all the parameters are completely correlated when only hydraulic-head observations are used.
2. Why does adding a single flow measurement make such a difference in the objective-function surface?
3. Given that the addition of one observation prevents the parameters from being completely correlated, what effect do you expect any error in the flow measurement to have on the regression results?
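The first question can be checked numerically with a toy model. In the hypothetical 1-D recharge-driven system below (the formula and all numbers are invented, not the exercise model), heads depend only on the ratio W/K, so doubling both recharge and K leaves every head unchanged; only a flow observation distinguishes the two parameter sets.

```python
import numpy as np

def heads(x, W, K, h0=100.0, L=100.0):
    """Toy 1-D model: recharge W drives flow toward a boundary at x = L.
    h(x) = h0 + (W / K) * (L*x - x^2 / 2); heads see only the ratio W/K."""
    return h0 + (W / K) * (L * x - x**2 / 2)

x = np.array([20.0, 50.0, 80.0])
h_base = heads(x, W=1e-3, K=1.0)
h_scaled = heads(x, W=2e-3, K=2.0)       # both parameters doubled

same_heads = np.allclose(h_base, h_scaled)   # True: heads cannot tell them apart

# The discharge to the boundary, Q = W * L, DOES change -- a flow
# observation breaks the complete correlation:
Q_base, Q_scaled = 1e-3 * 100.0, 2e-3 * 100.0
```

Because `same_heads` is true while the flows differ by a factor of two, head observations alone constrain only W/K, and the single flow measurement is what pins down the individual parameter values; any error in that measurement therefore propagates directly into both estimates.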

Introduction to the Performance of the Gauss-Newton Method: Effect of MAX-CHANGE

- Goal of the modified Gauss-Newton (MGN) method: find the minimum value of the objective function.
- MGN iterates; each iteration moves toward the minimum of an approximate objective function.
- Approximation: linearize the model about the current set of parameter values.
- If the approximate and true objective functions are very different, the minimum of the approximate objective function may be far from the true minimum.
- It is often advantageous to restrict the method so that, in any one iteration, the parameter values are not allowed to change too much: use damping.
- MAX-CHANGE: a user-specified value that partly controls the damping. MAX-CHANGE is the maximum fractional change allowed in one regression iteration.
- Example: if MAX-CHANGE = 2 and the parameter value is 1.1, the new value is allowed to be between 1.1 +/- (2 x 1.1), i.e. between -1.1 and 3.3.
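The MAX-CHANGE restriction can be sketched as a per-parameter clip on the proposed update. This is only an illustration of the idea; the actual damping in codes such as UCODE_2005 is more elaborate than clipping each parameter independently.

```python
def apply_max_change(b_old, b_new, max_change=2.0):
    """Clip a proposed Gauss-Newton update so that no parameter changes
    by more than max_change times its current magnitude (a sketch of the
    MAX-CHANGE idea, not the exact UCODE_2005 algorithm)."""
    clipped = []
    for old, new in zip(b_old, b_new):
        limit = max_change * abs(old)                 # largest allowed step
        step = max(-limit, min(limit, new - old))     # clamp the proposed step
        clipped.append(old + step)
    return clipped

# MAX-CHANGE = 2 with a current value of 1.1 allows 1.1 +/- 2.2,
# i.e. any new value in [-1.1, 3.3]:
result = apply_max_change([1.1], [10.0])   # proposed 10.0 is clipped to ~3.3
```

A proposed value already inside the allowed interval passes through unchanged, so the damping only activates when the linearized step overshoots.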

DO EXERCISE 5.1b: Examine the performance of the modified Gauss-Newton method for the two-parameter lumped problem.

Exercise 5.1b – questions in first bullet
- Do the regression runs converge to optimal parameter values?
- How do the estimated parameter values compare among the different regression runs?
- Explain the difference in the progression of parameter values during these regression runs.

Exercise: Plot regression results on the objective-function surface for the model calibrated with ONLY HEAD DATA.

Four regression runs with different starting values or different maximum step sizes:
- Run 1 (MaxChange = 10,000): start near the trough.
- Run 2 (MaxChange = 10,000): start far away; let the regression take big steps.
- Runs 3 and 4 (MaxChange = 0.5): start far away; force small steps.

[Table of K and Rch values by regression iteration for the four runs; values not recoverable from the transcript.]

The regression converged in 3 of the runs! Are those parameter estimates unique?

Exercise: Plot regression results on the objective-function surface for the model calibrated with HEAD AND FLOW DATA.

Same starting values and maximum step sizes as in the previous exercise.

[Table of K and Rch values by regression iteration for the four runs; values not recoverable from the transcript.]

The regression again converged in 3 of the runs. Now do we have a calibrated model with unique parameter estimates?

Effects of Correlation and Insensitivity

[Figure: contours of a linear objective function in (b1, b2) space, with the minimum marked. No correlation; b1 is less sensitive, so the contours are stretched in the b1 direction. Contour extents along the axes indicate ~Var(b1) and ~Var(b2).]

Effects of Correlation and Insensitivity

[Figure: contours of a linear objective function in (b1, b2) space, with the minimum marked. Strong, negative correlation; the contours are rotated away from the parameter axes into an elongated trough.]

Effects of Correlation and Insensitivity

[Figure: objective-function value plotted against parameter values along a section through the surface; the minimum is not well defined.]

[Figure: contours of a linear objective function with strong, negative correlation, with the minimum marked and annotated with ~Var(b1) and ~Var(b2), showing the large variances implied by the elongated trough.]

Effects of Correlation and Insensitivity

Insensitivity:
- Stretches the contours in the direction of the insensitive parameter.
- Very insensitive = very uncertain.

Correlations:
- Rotate the contours away from the parameter axes.
- Uncertainty from one parameter can be passed into another parameter!
- Create parameter combinations that give equivalent results.
- Increase the non-uniqueness.
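The correlation effect described above can be computed from the sensitivity (Jacobian) matrix. In the sketch below the sensitivity numbers are invented and the weight matrix is the identity; the covariance is unscaled (the common multiplier cancels in the correlation coefficient). When the two sensitivity columns are proportional, as with heads only, the parameter correlation has magnitude 1.

```python
import numpy as np

# Columns hold sensitivities d(simulated value)/d(parameter) for two
# parameters; the numbers are illustrative, not from the book's exercise.
# The second column is exactly twice the first -> complete correlation.
X_heads = np.array([[1.0, 2.0],
                    [2.0, 4.0],
                    [0.5, 1.0]])

def param_corr(X):
    """Correlation coefficient between the two parameter estimates, from the
    (unscaled) covariance matrix (X^T X)^-1. pinv is used because X^T X is
    singular when the parameters are completely correlated."""
    cov = np.linalg.pinv(X.T @ X, rcond=1e-10)
    return cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])

corr = param_corr(X_heads)                 # magnitude 1: completely correlated

# One flow observation with a different sensitivity pattern breaks the
# proportionality, pulling |corr| below 1:
X_with_flow = np.vstack([X_heads, [1.0, 0.0]])
corr2 = param_corr(X_with_flow)
```

A correlation magnitude near 1 corresponds to the rotated, open-ended trough in the contour figures: parameter combinations along the trough give equivalent fits, and the regression cannot choose among them.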