6.3 Gradient Search of Chi-Square Space


1 6.3 Gradient Search of Chi-Square Space
- the example data are from the same Gaussian peak used in Section 6.2
- the basic gradient search strategy is outlined
- the mathematical details of estimating the gradient are given
- an automatic Mathcad program is described
- the ease with which the gradient search can traverse "tilted" chi-square space is shown
6.3 : 1/8

2 Gaussian Example Continued
This is the same example used in Section 6.2: find the least-squares coefficients that minimize chi-square for the Gaussian equation. The following data were collected across the peak:
(45, 0.001) (46, 0.010) (47, 0.037) (48, 0.075) (49, 0.120) (50, 0.178) (51, 0.184) (52, 0.160) (53, 0.126) (54, 0.064) (55, 0.034)
The initial parameter guesses were g10 = 2.00 and g11 = 51.0.
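The data and initial guesses above can be set up in a few lines of Python. The model and chi-square definition are assumptions here, since Section 6.2 is not reproduced: the peak height of 0.184 near x = 51 is consistent with an area-normalized Gaussian where a[0] is the standard deviation and a[1] the center, and chi-square is taken as an unweighted sum of squared residuals.

```python
import numpy as np

# Data from the slide: eleven (x, y) points across the peak
x = np.array([45., 46, 47, 48, 49, 50, 51, 52, 53, 54, 55])
y = np.array([0.001, 0.010, 0.037, 0.075, 0.120, 0.178,
              0.184, 0.160, 0.126, 0.064, 0.034])

# Assumed model (hypothetical form -- Section 6.2 is not shown here):
# area-normalized Gaussian, a[0] = standard deviation, a[1] = center
def f(x, a):
    return np.exp(-(x - a[1])**2 / (2 * a[0]**2)) / (a[0] * np.sqrt(2 * np.pi))

# Unweighted chi-square (sum of squared residuals) -- also an assumption
def chisqr(y, x, a):
    return np.sum((y - f(x, a))**2)

guess = np.array([2.00, 51.0])   # g10, g11 from the slide
print(chisqr(y, x, guess))       # chi-square at the initial guesses
```

With these definitions the initial chi-square is already small, which is expected since the guesses were chosen close to the peak.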

3 Basic Strategy
1. Start with the first guesses, g10 and g11, and compute chi-square.
2. Choose a value of Δa for each axis that will give the desired parameter resolution.
3. At the location (g10, g11), compute the local direction of steepest descent (the negative of the mathematical gradient).
4. Continue in the direction of steepest descent until the value of chi-square no longer decreases. The values of the two coefficients producing the lowest chi-square are now the second guesses, g20 and g21.
5. Re-compute the local direction of steepest descent at the new location, (g20, g21), and again continue until chi-square no longer decreases. This gives the next guess at the coefficients.
6. Repeat the process at each new location until chi-square space becomes flat, i.e. the gradient is zero.
6.3 : 3/8

4 Mathematical Details
The direction of steepest descent is the negative of the gradient. For two coefficients this direction is −∇χ² = −(∂χ²/∂a0, ∂χ²/∂a1). Often the coefficient resolution, Δa, is too large to obtain reasonable estimates of the derivatives. Instead, each partial derivative is estimated using a fraction of the resolution, fΔa, where 0 < f < 1; a reasonable value is f = 0.01 to f = 0.1:
∂χ²/∂aj ≈ [χ²(aj + fΔaj) − χ²(aj)] / (fΔaj)
The step sizes have to be adjusted for each parameter so that travel is in the direction of steepest descent. With L the length of the descent vector and δj the step size along each parameter,
L = [(∂χ²/∂a0)² + (∂χ²/∂a1)²]^(1/2)     δj = −Δaj (∂χ²/∂aj) / L
6.3 : 4/8
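The finite-difference recipe above can be sketched in Python. This is only a sketch of the slide's verbal description (the original Mathcad formulas are not reproduced): the gradient is probed with a small step fΔa along each axis, and the resulting step is normalized so each parameter moves at most one resolution element along the unit descent vector.

```python
import numpy as np

def steepest_descent_step(chisq, a, da, frac=0.05):
    """Estimate -grad(chi-square) by forward differences with probe fΔa,
    then scale by Δa / L so the step points along steepest descent."""
    grad = np.empty_like(a)
    for j in range(len(a)):
        a_probe = a.copy()
        a_probe[j] += frac * da[j]                    # small probe: fΔa_j
        grad[j] = (chisq(a_probe) - chisq(a)) / (frac * da[j])
    L = np.sqrt(np.sum(grad**2))                      # length of gradient
    return -da * grad / L                             # δ_j = -Δa_j g_j / L

# Toy chi-square surface with a minimum at (2, 51) -- purely illustrative
chisq = lambda a: (a[0] - 2.0)**2 + (a[1] - 51.0)**2
step = steepest_descent_step(chisq, np.array([3.0, 50.0]), np.array([0.1, 0.1]))
```

Starting above and to the left of the toy minimum, the computed step moves a0 down and a1 up, i.e. downhill in chi-square.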

5 Gradient Search Program Description
A program that automatically computes the minimum in chi-square space is shown in the Mathcad worksheet, "6.3 Gradient Search Program.mcd". It uses five functions with x, y, and a as vector inputs and res as a scalar input:
- f(x,a): inputs are the x-data and the coefficient guesses, a; output is the corresponding y-value as a scalar.
- chisqr(y,x,a): inputs are the x- and y-data and the a-coefficients; output is chi-square at the location given by a.
- dir(y,x,a,res): inputs are the data, coefficients, and the resolution along all coefficient axes, res; output is a vector containing the coefficient step sizes that follow the steepest descent.
- move(y,x,a,res): moves along the direction of steepest descent until the gradient is zero; output is the a-vector at the minimum.
- grad(y,x,a,res): performs a gradient search using the initial guesses given in the a-vector at the specified resolution; output is a vector containing the coefficient values at the minimum.
6.3 : 5/8
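A Python mirror of these five functions might look like the following. It is a sketch under assumptions: the area-normalized Gaussian model and unweighted chi-square are guesses at the Section 6.2 definitions, dir is renamed direction to avoid Python's builtin, and move here marches along one fixed direction until chi-square stops decreasing, as in the Basic Strategy slide.

```python
import numpy as np

def f(x, a):
    # Assumed Gaussian from Section 6.2: a[0] = std. dev., a[1] = center
    return np.exp(-(x - a[1])**2 / (2 * a[0]**2)) / (a[0] * np.sqrt(2 * np.pi))

def chisqr(y, x, a):
    # Unweighted chi-square (an assumption; weighting was not shown)
    return np.sum((y - f(x, a))**2)

def direction(y, x, a, res, frac=0.05):
    # Coefficient step sizes along steepest descent, probed with frac*res
    g = np.array([(chisqr(y, x, a + frac * res * e) - chisqr(y, x, a))
                  / (frac * res[j])
                  for j, e in enumerate(np.eye(len(a)))])
    L = np.sqrt(np.sum(g**2))
    return -res * g / L if L > 0 else np.zeros_like(a)

def move(y, x, a, res):
    # March along one descent direction until chi-square stops decreasing
    d = direction(y, x, a, res)
    while chisqr(y, x, a + d) < chisqr(y, x, a):
        a = a + d
    return a

def grad(y, x, a, res, max_iter=200):
    # Repeat descent moves until no move lowers chi-square (flat gradient)
    for _ in range(max_iter):
        a_new = move(y, x, a, res)
        if np.allclose(a_new, a):
            break
        a = a_new
    return a

x = np.array([45., 46, 47, 48, 49, 50, 51, 52, 53, 54, 55])
y = np.array([0.001, 0.010, 0.037, 0.075, 0.120, 0.178,
              0.184, 0.160, 0.126, 0.064, 0.034])
best = grad(y, x, np.array([2.00, 51.0]), np.array([0.01, 0.01]))
```

Because each move follows the local gradient rather than a coordinate grid, the search can step diagonally, which is what lets it handle the "tilted" chi-square space discussed on the last slide.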

6 Gradient Search Program Output
Start with the initial guesses and a resolution about 100 times finer than the resolution of the guesses. This result matches the manual search. If higher resolution is desired, use the above output as the initial guesses. Note that these values differ slightly from those obtained with the grid search, ( , ), because the gradient search can locate points "off the grid." The minimum chi-square is the same value.

7 Chi-Square Space
The contour graph shows the route taken by the gradient search in locating the minimum. It took two moves to reach the minimum, but the second move covered a distance too small to show up on the graph. Note that the grid lines are not followed. The minimum χ² at the resolution used for the graph (Δa0 = 0.01, Δa1 = 0.01) occurs at a0 = and a1 = . χ²min =

8 Gradient Search with Covariance
The gradient search of "tilted" chi-square space works much better than a grid search. The grid search required 23 changes in coefficient axes while locating the minimum; the gradient search found the minimum with only three changes in direction!
