
Quiz 2

Schaefer model with index of abundance. The index is assumed linearly proportional to biomass, with catchability constant q: the predicted index is I_t = q B_t, which is compared to the observed index value. Unfished biomass = K. Schaefer MB (1954) Some aspects of the dynamics of the population important to the management of the commercial marine fisheries. Inter-American Tropical Tuna Commission Bulletin 1(2):25-56
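The slide's relationship can be sketched in code. This is a minimal illustration, not the lecture's spreadsheet: the logistic (Schaefer) surplus-production update and all parameter values here are standard-model assumptions supplied for the example.

```python
# Sketch of a Schaefer surplus-production model with an abundance index.
# Dynamics: B_{t+1} = B_t + r*B_t*(1 - B_t/K) - C_t; predicted index I_t = q*B_t.
# Parameter values and the catch series are illustrative, not from the lecture.

def schaefer_predicted_index(r, K, q, catches, B0=None):
    """Project biomass forward and return the predicted index I_t = q * B_t."""
    B = K if B0 is None else B0  # start at unfished biomass = K
    index = []
    for C in catches:
        index.append(q * B)  # index assumed linearly proportional to biomass
        # logistic surplus production minus catch (floored to stay positive)
        B = max(B + r * B * (1 - B / K) - C, 1e-9)
    return index

# Hypothetical catch series
print(schaefer_predicted_index(r=0.3, K=1000.0, q=0.001, catches=[50, 60, 70, 80]))
```

Fitting then amounts to choosing r, K, and q so these predicted index values come close to the observed ones.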

Simulated lobster CPUE data: a one-way trip. (5 Lobster simulation.xlsx, sheet "Simulate")

r = 0.05, K = 40271, lnSSQ = 0.902; r = 0.6, K = 9993, lnSSQ = 0.326; r = 1.0, K = 6355, lnSSQ = 0.359. (5 Lobster simulation.xlsx, sheet "Simulate")

r = 0.05, K = 40271, lnSSQ = 0.902; r = 0.6, K = 9993, lnSSQ = 0.326; r = 0.20, K = 22550, lnSSQ = 0.611; r = 1.0, K = 6355, lnSSQ = 0.359.

“Lobster” model fits. All the models fit the index data very well, but terminal harvest rates (u2005) range from 0.18 to >1. We need auxiliary information about harvest rates. Expert knowledge: from length data it is clear that almost all lobster are caught, so harvest rates have been around 0.7 in recent years. Solution: add a term (u2005 - 0.7)^2 to the SSQ.
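The penalty-term idea can be sketched directly. This is an illustrative implementation, assuming a log-scale SSQ (consistent with the lnSSQ values above); the function name, example numbers, and the weight on the penalty are assumptions, not the lecture's spreadsheet.

```python
# Sketch: add an auxiliary-information term to the log-scale SSQ so that
# solutions whose 2005 harvest rate strays from the expert value 0.7 are
# penalized. The weight and example values are illustrative assumptions.
import math

def penalized_ssq(observed, predicted, u2005, target_u=0.7, weight=1.0):
    # log-scale sum of squared deviations between observed and predicted index
    ssq = sum((math.log(o) - math.log(p)) ** 2
              for o, p in zip(observed, predicted))
    # auxiliary term: pulls the fit toward harvest rate ~ 0.7 in 2005
    return ssq + weight * (u2005 - target_u) ** 2

print(penalized_ssq([1.0, 0.9], [1.0, 0.95], u2005=0.9))
```

Minimizing this penalized objective trades off fit to the index against agreement with the expert harvest-rate information.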

Unequal weighting. Two open questions: how to weight multiple data sources, and how to make probabilistic statements about the results.

Lobster lessons learned. A one-way trip is not very informative: harvest and growth are confounded. We can add outside information by including new terms in the SSQ.

Summary of SSQ fitting. Make the predicted close to the observed! A simple approach that can be applied to simple or very complex models. Find the hypothesis that comes closest to the data, and also find competing hypotheses that fit the data nearly as well; we always want to understand the fit of competing hypotheses.

Next steps. Models often have many sources of data. Move to likelihood, which provides a logical method of weighting alternative data sources and also lets us make more probabilistic statements about competing fits.

Nonlinear function minimization

Readings: Hilborn and Mangel, The Ecological Detective, chapter 11; Numerical Recipes, 3rd Edition: The Art of Scientific Computing, chapter 11 (mostly online: http://apps.nrbook.com/empanel/index.html#)

Warning: some of the material in this lecture looks complex and very scary, but implementing the equations is intuitive and good practice. The algebra is presented for completeness, and we won't deal with it outside the lecture. You can master this subject!

The general problem: to find a maximum or minimum on a multi-dimensional surface.

Hill climbing in reverse. Conceptually we want to walk downhill until every direction is up, to minimize the objective function (e.g. sum of squares).

How to avoid false summits (local minima). Make sure you are at a true minimum: look in all directions; in Solver, restart from the solution to make sure it has converged; and start Solver from a number of different points to see if they all end up at the same place.

Multiple starting points

Basic approach: gradient method. Start with a guess about x. See which direction is down (calculate the slope dy/dx). Look at the slope and the change in slope (1st and 2nd derivatives) to estimate how far we have to go until we reach the bottom. Move in that direction, then start over again.
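The steps above can be sketched as a simple gradient descent loop. This is a minimal illustration, not the lecture's method: the fixed step size and the example function are assumptions for demonstration.

```python
# Minimal sketch of the gradient method: repeatedly step downhill along the
# negative slope until the slope is (nearly) zero. Step size is an
# illustrative assumption; a real implementation would adapt it.

def gradient_descent(f, dfdx, x0, step=0.1, tol=1e-8, max_iter=10_000):
    x = x0
    for _ in range(max_iter):
        slope = dfdx(x)          # which direction is down?
        if abs(slope) < tol:     # flat: we have reached the bottom
            break
        x -= step * slope        # move downhill, then start over
    return x

# Example: minimize f(x) = (x - 3)^2, whose minimum is at x = 3
x_min = gradient_descent(lambda x: (x - 3) ** 2, lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))
```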

Newton’s minimization method (Ecological Detective p. 267). We want to find the minimum value of f(x).

Newton’s Method: 6 Newtons method.xlsx, sheet "Newton derivatives"

Two step-size choices shown: lambda = 1.9 and lambda = 0.2. (6 Newtons method.xlsx, sheet "Newton derivatives")

6 Newtons method.xlsx, sheet "Newton derivatives"

Intuition. At a function minimum the first derivative is zero. The second derivative is the rate of change of the first derivative. Divide the first derivative by the second derivative to get the number of units of x to jump to find where the first derivative is zero. Set lambda < 1 to prevent overshooting.
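The intuition above corresponds to the damped Newton update x_new = x - lambda * f'(x) / f''(x). A minimal sketch, with an assumed test function and lambda value for illustration:

```python
# Sketch of the damped Newton update: jump toward the point where f'(x) = 0,
# scaled by lambda < 1 to prevent overshooting. Test function and lambda
# are illustrative assumptions.

def newton_minimize(dfdx, d2fdx2, x0, lam=0.8, tol=1e-10, max_iter=1000):
    x = x0
    for _ in range(max_iter):
        g = dfdx(x)
        if abs(g) < tol:          # first derivative ~ zero: at a stationary point
            break
        x -= lam * g / d2fdx2(x)  # damped Newton step
    return x

# Example: f(x) = x^4 - 2x^2 has a local minimum at x = 1
x_min = newton_minimize(lambda x: 4 * x**3 - 4 * x,
                        lambda x: 12 * x**2 - 4,
                        x0=2.0)
```

Note the stopping rule only checks f'(x) = 0, so from a different start the same code could land on a maximum or a different minimum, which is exactly why multiple starting points matter.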

Basic theory. If the curve is quadratic (as it is in linear models), then the second derivative is uniform over the entire range of x and you can jump straight to the minimum. If the curve is not quadratic, then the derivatives change with x and you have to iterate. (6 Newtons method.xlsx, sheet "Newton derivatives")

Numerical derivatives. (6 Newtons method.xlsx, sheet "Newton numerical")

6 Newtons method.xlsx, sheet "Newton numerical"

Why numerical derivatives? Many real-life functions and models do not have easy formulae from which derivatives can be directly calculated. But it is always easy to change a parameter by a small amount (Δ) and rerun the function to get a close numerical approximation to the derivative.
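The perturb-and-rerun idea can be written in a couple of lines. This sketch uses a central difference; the choice of Δ is an illustrative assumption (too large loses accuracy, too small loses precision to floating-point rounding).

```python
# Sketch of a numerical derivative: perturb the parameter by a small delta
# in each direction and rerun the function (central difference).
# The default delta is an illustrative choice.

def numerical_derivative(f, x, delta=1e-5):
    return (f(x + delta) - f(x - delta)) / (2 * delta)

# Example: d/dx of x^2 at x = 3 is 6
print(numerical_derivative(lambda x: x * x, 3.0))
```

The same trick extends to second derivatives and to multi-parameter models, at the cost of a few extra function evaluations per parameter.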

Golden section search (Numerical Recipes; Ecological Detective pp. 271-273). One parameter only; no derivatives or differentiation required; the minimum must lie within the bounds; much faster than bisection search. Algorithm: select a lower bound L and upper bound U containing the answer; select two new interior points x1 and x2; calculate the function at those points; replace U or L with one of the new points, as appropriate, to narrow the bounds containing the minimum; select another new point to replace the one chosen as a new bound, and repeat. The choice of x1 and x2 is based on the "golden" ratio.
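The algorithm above can be sketched as follows. This is a minimal textbook-style implementation, not the lecture's spreadsheet version; the example function and tolerance are assumptions.

```python
# Sketch of golden section search: one-parameter, derivative-free minimization.
# The bracket [lo, hi] must contain the minimum. Interior points are placed
# using the golden ratio so one function value is reused each iteration.
import math

def golden_section_search(f, lo, hi, tol=1e-8):
    inv_phi = (math.sqrt(5) - 1) / 2          # "golden" ratio, ~0.618
    x1 = hi - inv_phi * (hi - lo)             # two interior points
    x2 = lo + inv_phi * (hi - lo)
    f1, f2 = f(x1), f(x2)
    while hi - lo > tol:
        if f1 < f2:                           # minimum lies in [lo, x2]
            hi, x2, f2 = x2, x1, f1           # old x1 becomes new x2
            x1 = hi - inv_phi * (hi - lo)
            f1 = f(x1)
        else:                                 # minimum lies in [x1, hi]
            lo, x1, f1 = x1, x2, f2           # old x2 becomes new x1
            x2 = lo + inv_phi * (hi - lo)
            f2 = f(x2)
    return (lo + hi) / 2

# Example: minimum of (x - 2)^2 on [0, 5]
print(round(golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0), 6))
```

Each iteration shrinks the bracket by a constant factor of ~0.618 while evaluating the function only once, which is why it beats bisection-style search on function calls.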

Golden section search diagram: step 1 brackets the minimum with L = 0, interior points x1 and x2, and U = 1; step 2 narrows the bracket to L = x1, with interior points x2 and x3.

Golden section search. (6 Golden search.xlsx)