Physics 114: Lecture 16 Least Squares Fit to Arbitrary Functions


Physics 114: Lecture 16 Least Squares Fit to Arbitrary Functions John F. Federici NJIT Physics Department

More Star Wars jokes….

More Philosophy For this class, we are more interested in ANALYZING the data than in the details of exactly HOW to write all our own code from scratch. We will USE THE FORCE …… the Curve Fitting App…. so the details of HOW the fit is done are not important. However, you should understand the basic concept. So hang on for 5-6 slides while we get past the basic idea, and then we will fit arbitrary functions.

Nonlinear Least Squares Fitting The general means to fit curves, surfaces, or higher dimensions to data relies on minimizing chi-square, but there is no closed-form method to calculate the coefficients. As usual, for a function y(x), say, the chi-square is χ² = Σ_i [(y_i − y(x_i))/σ_i]². The techniques we developed in Chapters 6 and 7 only work when the parameters enter the function y(x) linearly, i.e. when y(x) can be written as a sum y(x) = Σ_k a_k f_k(x) in which the functions f_k(x) do not depend on the parameters a_k. When this is not the case, i.e. when y(x) depends on products or powers of the parameters, the minimization of chi-square results in coupled equations that in general cannot be solved. We may sometimes be able to linearize the problem, as we saw last time, but generally we need to proceed by trial and error.
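As a side note (not on the original slide), this chi-square is easy to evaluate in MATLAB for any trial model; all of the names below are placeholders:

% Reduced chi-square of a trial model ymodel (a function handle) against data yi
% with errors sig, for a model with npar free parameters.
chisq_red = @(ymodel, x, yi, sig, npar) ...
    sum(((yi - ymodel(x)) ./ sig).^2) / (numel(yi) - npar);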

Searching Parameter Space The basic approach could not be simpler, philosophically. The idea is simply to calculate a trial value of the function for a given set of parameter values, calculate the chi-square, and repeat for a large number of parameter sets until you find the minimum chi-square. As a concrete example, consider a Gaussian function with parameters a, b and c, y(x) = a exp[−((x − b)/c)²], fit to the set of data y_i = 0.0008, 0.0211, 0.2327, 1.0546, 1.9648, 1.5049, 0.4739 (measured at x = −3, −2, …, 3), with equal errors σ_i = σ = 0.2. Try the set of parameters a = 1, b = 1, c = 1. With these parameters, the function gives y(x) = 0.0000, 0.0001, 0.0183, 0.3679, 1.0000, 0.3679, 0.0183. The reduced chi-square is then about 18.4 (see the table two slides ahead). A plot of the data y_i and the curve y(x) is shown at right.

Searching Parameter Space (cont’d) Now we simply try other sets of parameters and continue to calculate chi-square, attempting to find a minimum chi-square such that the reduced chi-square is about 1. The plot shows the result of trying other values of a from 1 to 3, stepping by 0.5, while keeping b = 1 and c = 1. The reduced chi-square for this set of five curves is 18.43, 10.12, 5.78, 5.41, 9.02. It looks like a = 2.5, b = 1, c = 1 is the best so far. Now we change one of the other parameters by, say, setting b = 1.2 and varying a again, with c = 1. The plot is shown at right, and the reduced chi-square for this new set of curves is 17.95, 9.34, 4.66, 3.92, 7.11. It looks like a = 2.5, b = 1.2, c = 1 is the best so far.

Searching Parameter Space (cont’d) If we repeat this 3 more times, increasing b by 0.2 each time, we end up with the following 25 values of reduced chi-square:

c = 1.0      b=1.0    b=1.2    b=1.4    b=1.6    b=1.8
a=1.0      18.4311  17.9506  18.3422  19.6973  21.8997
a=1.5      10.1180   9.3389   9.8318  11.8644  15.2623
a=2.0       5.7779   4.6612   5.1924   7.9024  12.5586
a=2.5       5.4107   3.9175   4.4241   7.8115  13.7888
a=3.0       9.0164   7.1078   7.5267  11.5916  18.9528

Now let’s try c = 1.2, and repeat the whole thing again:

c = 1.2      b=1.0    b=1.2    b=1.4    b=1.6    b=1.8
a=1.0      15.3064  14.8579  15.3022  16.6286  18.7497
a=1.5       6.5330   5.8522   6.5054   8.4939  11.6843
a=2.0       2.4672   1.5487   2.4021   5.0517   9.3174
a=2.5       3.1090   1.9475   2.9921   6.3022  11.6491
a=3.0       8.4585   7.0485   8.2756  12.2454  18.6794
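For reference, here is a sketch (not from the slides) of how one of these tables could be generated in MATLAB. It assumes the seven data points are sampled at x = -3, -2, ..., 3 and that the Gaussian has the form y(x) = a*exp(-((x-b)/c)^2); with sigma = 0.2 and 7 - 3 = 4 degrees of freedom, these assumptions reproduce the tabulated values.

% Reduced chi-square over a 5 x 5 grid of (a, b) at fixed c (here c = 1.2).
x   = -3:3;                                   % assumed sample points
yi  = [0.0008 0.0211 0.2327 1.0546 1.9648 1.5049 0.4739];
sig = 0.2;                                    % equal errors
c   = 1.2;
avals = 1.0:0.5:3.0;
bvals = 1.0:0.2:1.8;
chi2nu = zeros(numel(avals), numel(bvals));
for i = 1:numel(avals)
    for j = 1:numel(bvals)
        ymod = avals(i) * exp(-((x - bvals(j))/c).^2);           % trial Gaussian
        chi2nu(i,j) = sum(((yi - ymod)/sig).^2) / (numel(yi) - 3);
    end
end
disp(chi2nu)   % compare with the c = 1.2 table above

Adding an outer loop over trial values of c and keeping the (a, b, c) combination with the smallest entry completes the grid search.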

Searching Parameter Space (cont’d) After additional trial c values, we find a best fit of a = 2.0, b = 1.2, c = 1.4. Here is the final fit with these values. But note that once we have the parameters, we can plot a much smoother Gaussian through the points by evaluating it at more x values. Note also that we could be even more precise by taking smaller steps.

Searching Parameter Space (cont’d) Here is the reduced chi-square for smaller steps, covering the same range of a and b, but now on a 10 x 10 grid and shown as an image. The minimum chi-square lies in a dip on a surface in “parameter space”. χ²ν = 1.29
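Continuing the sketch above (same x, yi, sig, and c), a finer grid can be displayed as an image in the same spirit as the figure on this slide; the 10-point ranges are an assumption:

% Finer 10 x 10 grid in (a, b) at fixed c, shown as an image of reduced chi-square.
avals = linspace(1.0, 3.0, 10);
bvals = linspace(1.0, 1.8, 10);
chi2nu = zeros(numel(avals), numel(bvals));
for i = 1:numel(avals)
    for j = 1:numel(bvals)
        ymod = avals(i) * exp(-((x - bvals(j))/c).^2);
        chi2nu(i,j) = sum(((yi - ymod)/sig).^2) / (numel(yi) - 3);
    end
end
imagesc(bvals, avals, chi2nu); colorbar
xlabel('b'); ylabel('a'); title('Reduced \chi^2 in parameter space')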

What is the Point Prof. Federici? The point, LUKE, is that MATLAB invokes an algorithm to search the parameter space for the set of parameters that minimizes R² or χ². Whether the algorithm is a very simple (but time-consuming) method of plotting out a grid, or uses calculus to locate a local minimum (setting derivatives equal to zero), the essential concept is that one is searching parameter space for a minimum in 2, 3, 4 or N dimensional space. So, now that you understand the basic concept, let’s implement it in MATLAB.

Example 1 Everyone turn on your laptops and type in the commands as I do to follow along.
>> d = [0 0.48 1.03 1.25 1.64 1.96 2.66 3.18 3.92];
>> theta = [0 0.174 .349 .522 .696 .87 1.044 1.218 1.392];
>> plot(theta,d,'.')
We will now fit this data with the following equation: d = 5 sin(θ) [1 − cos(θ)/√(n² − sin²θ)], where n is the fitting parameter.

CURVE FITTING TOOL Choose the x data to be “theta” and the y data to be “d”. Choose CUSTOM EQUATION and type in the equation you will be using: 5*sin(x)*(1-cos(x)/(n^2-(sin(x))^2)^(1/2)) Click on AUTO FIT. The tool will then try to fit. You MIGHT GET an ERROR; if so, open up FIT OPTIONS.

Example 1 – cont. The Fit Options panel shows the specific algorithm that is used to determine the best-fit parameters.

Example 1 – Cont. For each fitting parameter, you can change the upper and lower limits and also change your initial ‘guess’ at the final answer. Change the LOWER limit to be 1: the quantity you are solving for is a refractive index, which is typically 1 or larger. Sometimes when you do the fitting you need to provide a REASONABLE guess for the fitting parameter. Remember, MATLAB is searching parameter space for a minimum; you want to make sure that it finds a GLOBAL minimum and not just a local minimum.
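The same fit can also be run from the MATLAB command line instead of the Curve Fitting app; this is a sketch using the theta and d vectors entered earlier, with the start point and lower bound reflecting the discussion above:

ft = fittype('5*sin(x)*(1-cos(x)/(n^2-(sin(x))^2)^(1/2))', ...
             'independent', 'x', 'coefficients', {'n'});
opts = fitoptions(ft);
opts.StartPoint = 1.5;        % reasonable initial guess for a refractive index
opts.Lower      = 1;          % physical lower limit, as discussed above
[f, gof] = fit(theta(:), d(:), ft, opts)   % fit requires column vectors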

GOOD FITS!
General model: f(x) = 5*sin(x)*(1-cos(x)/(n^2-(sin(x))^2)^(1/2))
Coefficients (with 95% confidence bounds): n = 1.583 (1.425, 1.741)
Goodness of fit: SSE: 0.3621, R-square: 0.9723, Adjusted R-square: 0.9723, RMSE: 0.2127
Note that the nonlinear fit gives you the best-fit parameter with a 95% (i.e., about 2σ) uncertainty.
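If the fit was run from the command line as in the earlier sketch, the same numbers can be read back programmatically:

ci = confint(f)   % 95% confidence bounds on n, matching the bounds printed above
gof               % structure with the sse, rsquare, adjrsquare and rmse fields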

Example 2 – Interferometry Light waves INTERFERE through the PRINCIPLE OF SUPERPOSITION. The NET wave is the algebraic sum of the two individual waves. Depending on the phase relation between the waves (i.e., whether the peaks and valleys align or anti-align), one can achieve HIGH power output or LOW power output.

Experimental Setup PZT: lead (Pb) zirconate (Zr) titanate (Ti). The THICKNESS of the PZT is proportional to the voltage applied to the PZT.

Experimental Setup [Plot: measured light intensity at the photodetector versus voltage applied to the PZT]

Analysis of data The NUMBER of interference fringes that pass by the detector is related to HOW FAR the PZT stack moves. From the data, the goal is to EXTRACT how far the mirror on the PZT material moves. The quantities involved are the wavelength of the light (use 650 nm), the number of fringes N, which we calculate from the best-fit parameter A1, and the length of time that the mirror is moving.

Glitches in the data Why are there glitches in the data?

Voltage proportional to Thickness of PZT – Position of Mirror [Plot: PZT drive voltage (mirror position) versus time, with alternating segments labeled “Mirror moving Forward” and “Mirror moving Backward”]

Removing Data from Fitting Fit data ONLY when Mirror is moving at CONSTANT velocity in ONE direction!

Class EXERCISE
Download the Lecture 16 Class Exercise data file.
Create a plot that looks like the figure below to make sure that you have read the data in correctly.
Use the CURVE Fitting Tool with a ‘Custom Equation’ to fit the data: y = a*sin(b*x+c)+d
EXCLUDE the data outside of the specific time range so that you ONLY fit data for the mirror moving at constant velocity in one direction: choose “Tools…Exclude Outliers”, or you could instead use “Tools…Exclude by Rule” to limit the data.
As needed, edit the FIT OPTIONS for the starting points and limits of parameters a, b, c and d so that you get a reasonably good fit. You can estimate these values from the plotted data, which looks like the graph above.
With the best-fit parameters, use the equations on Slide 20 to calculate how far the mirror has moved. (A programmatic sketch of this workflow follows below.)
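For reference, here is a sketch of the same workflow done from the command line rather than the app. It assumes (these are not given on the slides) that the data file provides a time vector t and a detector-signal vector y, that [t1 t2] is the constant-velocity interval read off your plot, and that the fitted parameter b is the angular fringe frequency, so that one fringe corresponds to half a wavelength of mirror travel as on Slide 20.

% t, y : time and detector signal loaded from the class exercise data file
t1 = 0.2;  t2 = 0.8;                  % placeholder constant-velocity interval (s)
start = [1, 50, 0, 2];                % placeholder guesses for [a b c d]
ft   = fittype('a*sin(b*x+c)+d');
excl = excludedata(t(:), y(:), 'domain', [t1 t2]);   % true for points outside [t1 t2]
[f, gof] = fit(t(:), y(:), ft, 'Exclude', excl, 'StartPoint', start)
% Mirror displacement: N fringes pass during the interval, each fringe is lambda/2 of travel.
lambda = 650e-9;                      % wavelength of the light, in meters
N      = f.b * (t2 - t1) / (2*pi);    % number of fringes, assuming b is in rad/s
dist   = N * lambda / 2               % distance the mirror moved, in meters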