6.2 Grid Search of Chi-Square Space


Overview:
- example data from a Gaussian-shaped peak are given and plotted
- initial coefficient guesses are made
- the basic grid search strategy is outlined
- an actual manual search is performed
- chi-square space is examined around the minimum
- an automatic Mathcad program is used
- one difficulty with a grid search is described

Non-Linear Equation

This is the same example used in Section 6.1: find the least-squares coefficients that minimize chi-square for the Gaussian equation. The following data were collected across the peak:

(45, 0.001)  (46, 0.010)  (47, 0.037)  (48, 0.075)
(49, 0.120)  (50, 0.178)  (51, 0.184)  (52, 0.160)
(53, 0.126)  (54, 0.064)  (55, 0.034)
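The slides reference the Gaussian equation without printing it. The sketch below assumes the unit-area form f(x) = exp(-(x - a1)²/(2·a0²)) / (a0·√(2π)), with a0 the width and a1 the center, and an unweighted chi-square (a plain sum of squared residuals); this assumed form reproduces the chi-square values quoted on the later slides:

```python
import math

# (x, y) data pairs from the slide
data = [(45, 0.001), (46, 0.010), (47, 0.037), (48, 0.075),
        (49, 0.120), (50, 0.178), (51, 0.184), (52, 0.160),
        (53, 0.126), (54, 0.064), (55, 0.034)]

def f(x, a):
    """Unit-area Gaussian; a[0] is the width, a[1] the center (assumed form)."""
    return math.exp(-(x - a[1]) ** 2 / (2 * a[0] ** 2)) / (a[0] * math.sqrt(2 * math.pi))

def chisqr(a):
    """Unweighted chi-square: sum of squared residuals over the data."""
    return sum((y - f(x, a)) ** 2 for x, y in data)

print(round(chisqr([2.00, 51.0]), 6))  # 0.000784, matching the manual search's start
```

That the assumed model gives exactly the 0.000784 quoted for the first guesses is the evidence that this is the Gaussian form the slides intend.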

Initial Coefficient Guesses

I used familiarity with a Gaussian for the initial guesses, g1,0 and g1,1 (the first guesses for a0 and a1). I set g1,1 equal to the mode, 51, and g1,0 equal to one fourth of the peak's baseline width, 11/4 = 2.75. The fit is shown at the left. Since a0 appeared too large, I tried 2.00; the right graph shows this change. This is good enough to start, so g1,0 = 2.00 and g1,1 = 51.0.

Basic Strategy

1. Start with the first guesses, g1,0 and g1,1, and compute chi-square.
2. Change g1,0 by an amount equal to the desired precision, in this case 0.01, and compute a new chi-square. If you don't know which direction to go, try both g1,0 + 0.01 and g1,0 - 0.01.
3. Continue in the same direction along a0 until the value of chi-square no longer decreases. The value producing the lowest chi-square is your second guess, g2,0.
4. Now switch to the other coefficient. Change g1,1 by an amount equal to the desired precision, in this case 0.1. Again, you might try both g1,1 + 0.1 and g1,1 - 0.1. Continue in the same direction until the value of chi-square no longer decreases. The value producing the lowest chi-square is your second guess, g2,1.
5. Repeat the process with g2,0 and g2,1, etc. Iterate until chi-square no longer changes.
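The single-axis step of this strategy can be sketched in Python. The unit-area Gaussian model is an assumption consistent with the chi-square values quoted on the following slides, and `walk_axis` is a hypothetical helper name, not part of the course's Mathcad program:

```python
import math

data = [(45, 0.001), (46, 0.010), (47, 0.037), (48, 0.075),
        (49, 0.120), (50, 0.178), (51, 0.184), (52, 0.160),
        (53, 0.126), (54, 0.064), (55, 0.034)]

def chisqr(a):
    """Unweighted chi-square for a unit-area Gaussian: width a[0], center a[1]."""
    peak = 1.0 / (a[0] * math.sqrt(2 * math.pi))
    return sum((y - peak * math.exp(-(x - a[1]) ** 2 / (2 * a[0] ** 2))) ** 2
               for x, y in data)

def walk_axis(a, i, step):
    """Try a[i] +/- step; keep stepping in whichever direction lowers
    chi-square, and stop as soon as chi-square no longer decreases."""
    a, best = list(a), chisqr(a)
    for s in (+step, -step):
        while True:
            trial = list(a)
            trial[i] += s
            c = chisqr(trial)
            if c >= best:
                break
            a, best = trial, c
    return a, best

# First pass along the a0 axis with g1 held at 51.0, precision 0.01:
a, c = walk_axis([2.00, 51.0], 0, 0.01)
print(round(a[0], 2), round(c, 6))  # stops at 2.14 or 2.15, chi-square near 0.00029
```

This reproduces the manual first pass: a0 walks up from 2.00 and halts where one more 0.01 step stops lowering chi-square.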

Varying g0 while g1 = 51.0

g0      g1      χ²
2.00    51.0    0.000784
2.10    51.0    0.000333
2.20    51.0    0.000350
2.15    51.0    0.000290
2.16    51.0    0.000294
2.14    51.0    0.000290

Looking at the fit on the last slide, I decided to try increasing a0. Increasing to 2.10 lowered chi-square. Moving in the same direction, 2.20 gave a higher chi-square, so the minimum lies between the two values. I split the difference using 2.15. Further investigation showed that either 2.14 or 2.15 could be used.

Varying g1 while g0 = 2.14

g0      g1      χ²
2.14    51.0    0.000290
2.14    52.0    0.015281
2.14    50.0    0.012648
2.14    50.5    0.003017
2.14    50.6    0.001906
2.14    50.7    0.001077
2.14    50.8    0.000530
2.14    50.9    0.000268

Moving along the a1 axis proved much more laborious, because the second guess was too far from the first. Continuing the grid search along g0 showed that 2.15 and 2.13 both gave a higher χ² when g1 = 50.9.

Chi-Square Space

A contour graph was constructed showing both the starting point and the region around the minimum, along with the route taken during the manual grid search. Chi-square space is smooth, with no apparent false minima. At the resolution used for the graph (Δa0 = 0.001, Δa1 = 0.01), the minimum occurs at a0 = 2.143 and a1 = 50.94, with χ²min = 0.000243.

Grid Search Program Description

A program that automatically computes the minimum in chi-square space is shown in the Mathcad worksheet, "6.2 Grid Search Program.mcd". It uses five functions, with x, y, and a as vector inputs and res as a scalar input:

- f(x,a): inputs are the x-data and the initial coefficient guesses, a; the output is the corresponding y-value as a scalar.
- chisqr(y,x,a): inputs are the x- and y-data and the a-coefficients; the output is chi-square at the location given by a.
- dir(y,x,a,res): inputs are the data, coefficients, and resolution along all coefficient axes, res; the output is a vector containing the coefficient and direction yielding the largest drop in chi-square.
- move(y,x,a,res): moves along a coefficient axis until a minimum in chi-square is found; the output is the a-vector at the minimum.
- grid(y,x,a,res): performs a grid search using the initial guesses given in the a-vector at the specified resolution; the output is a vector containing the coefficient values at the minimum.
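The Mathcad worksheet itself is not reproduced in the transcript, but the five-function decomposition can be sketched in Python. The slides define only the interfaces; the function bodies and the unit-area Gaussian model below are assumptions, and this sketch takes res as a per-axis resolution list rather than a scalar:

```python
import math

def f(x, a):
    """Model y-value at x; here a unit-area Gaussian, width a[0], center a[1]."""
    return math.exp(-(x - a[1]) ** 2 / (2 * a[0] ** 2)) / (a[0] * math.sqrt(2 * math.pi))

def chisqr(y, x, a):
    """Chi-square (unweighted sum of squared residuals) at the location a."""
    return sum((yi - f(xi, a)) ** 2 for xi, yi in zip(x, y))

def direction(y, x, a, res):
    """Coefficient index and signed step giving the largest drop in
    chi-square; a returned step of 0.0 means no single step improves."""
    best_i, best_s, best_c = 0, 0.0, chisqr(y, x, a)
    for i, r in enumerate(res):
        for s in (+r, -r):
            trial = list(a)
            trial[i] += s
            c = chisqr(y, x, trial)
            if c < best_c:
                best_i, best_s, best_c = i, s, c
    return best_i, best_s

def move(y, x, a, i, step):
    """Walk along coefficient axis i until chi-square stops decreasing."""
    a, c = list(a), chisqr(y, x, a)
    while True:
        trial = list(a)
        trial[i] += step
        ct = chisqr(y, x, trial)
        if ct >= c:
            return a
        a, c = trial, ct

def grid(y, x, a, res):
    """Grid search from the initial guesses a at per-axis resolution res."""
    a = list(a)
    while True:
        i, step = direction(y, x, a, res)
        if step == 0.0:
            return a
        a = move(y, x, a, i, step)

xs = [45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55]
ys = [0.001, 0.010, 0.037, 0.075, 0.120, 0.178, 0.184, 0.160, 0.126, 0.064, 0.034]
print(grid(ys, xs, [2.00, 51.0], [0.01, 0.1]))  # converges near [2.14, 50.9]
```

Each pass strictly lowers chi-square, so the loop terminates at a lattice point where no single step along any axis helps, matching the stopping rule of the manual search.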

Grid Search Program Output

Start with the initial guesses and a resolution about ten times finer than the resolution of the guesses. This result matches the manual search. If higher resolution is desired, use the above output as the initial guesses. Quit when the number of significant figures in chi-square exceeds the number of significant figures in the maximum amplitude: 0.184 has two significant figures, so stop when chi-square is stable to three significant figures. To estimate the standard deviation of the fit, divide chi-square by N-2 and take the square root, e.g. 0.005.
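The standard-deviation estimate quoted above can be checked directly from the numbers on these slides:

```python
import math

chisq_min = 0.000243  # minimum chi-square from the contour map
N = 11                # number of data points; two coefficients were fitted

# Standard deviation of the fit: sqrt(chi-square / (N - 2))
s = math.sqrt(chisq_min / (N - 2))
print(round(s, 4))  # 0.0052, i.e. roughly the 0.005 quoted on the slide
```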

Verifying the Grid Search Result

When using any method of searching chi-square space, it is important to verify that your solution is valid:
- examine a contour plot of chi-square space to see if you are at the true minimum
- compare the regression line to the data
- relocate the minimum using different starting guesses for the coefficients

To obtain different starting guesses, use the first set of guesses and the final coefficients to obtain three more sets of guesses, by stepping past each final coefficient by the amount the search moved:

Δa0 = 2.14 - 2.00 = +0.14    g0 = 2.14 + 0.14 = 2.28
Δa1 = 50.94 - 51.00 = -0.06  g1 = 50.94 - 0.06 = 50.88
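The reflected starting guesses can be generated mechanically; this small sketch just reproduces the slide's arithmetic:

```python
g_first = [2.00, 51.00]   # initial guesses for a0, a1
a_final = [2.14, 50.94]   # coefficients found by the grid search

# Step past each final coefficient by the amount the search moved,
# landing on the far side of the minimum from the original guess.
g_new = [af + (af - gf) for gf, af in zip(g_first, a_final)]
print([round(g, 2) for g in g_new])  # [2.28, 50.88]
```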

A Difficult Grid Search

When two equation coefficients have a covariance, chi-square space becomes tilted at an angle to the grid. The grid search algorithm has trouble with tilted chi-square space because it oscillates back and forth between coefficient axes. The straight-line example shown took 23 changes of axis to locate the minimum. Doing this by hand is tiring!