Physics 114: Lecture 16 Linear and Non-Linear Fitting Dale E. Gary NJIT Physics Department.

Mar 29, 2010

Reminder of Previous Results

Last time we showed that it is possible to fit a straight line of the form y = a + b x to any data by minimizing chi-square:

    χ² = Σ [(y_i − a − b x_i)/σ_i]²

We found that we could solve for the parameters a and b that minimize the difference between the fitted line and the data (with errors σ_i) as:

    a = (1/Δ)[Σ(x_i²/σ_i²) Σ(y_i/σ_i²) − Σ(x_i/σ_i²) Σ(x_i y_i/σ_i²)]
    b = (1/Δ)[Σ(1/σ_i²) Σ(x_i y_i/σ_i²) − Σ(x_i/σ_i²) Σ(y_i/σ_i²)]

where

    Δ = Σ(1/σ_i²) Σ(x_i²/σ_i²) − [Σ(x_i/σ_i²)]²

Note that if the errors σ_i are all the same, they cancel from these expressions, so they can be ignored: each Σ(1/σ_i²) becomes N/σ², and the common factor of σ² divides out.
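As a sketch of how these closed-form sums translate into code, here is a minimal Python/NumPy implementation (Python is used here as a stand-in for MatLAB; the function name and test values are my own):

```python
import numpy as np

def linear_fit(x, y, sigma):
    """Weighted least-squares fit of y = a + b*x using the closed-form sums."""
    w = 1.0 / sigma**2                       # weights 1/sigma_i^2
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    delta = S * Sxx - Sx**2                  # Delta
    a = (Sxx * Sy - Sx * Sxy) / delta        # intercept
    b = (S * Sxy - Sx * Sy) / delta          # slope
    return a, b

# Noisy line y = 3 + 2x with common error sigma = 0.3, as in the lecture example
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.5, 26)
y = 3.0 + 2.0 * x + rng.normal(0.0, 0.3, x.size)
a, b = linear_fit(x, y, np.full(x.size, 0.3))
print(a, b)   # should come out close to 3 and 2
```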

MatLAB Commands for Linear Fits

MatLAB has a “low level” routine called polyfit() that can be used to fit a linear function to a set of points (it assumes all errors are equal):

x = 0:0.1:2.5;
y = randn(1,26)*0.3 + 3 + 2*x;  % Makes a slightly noisy linear set of points, y = 3 + 2x
p = polyfit(x,y,1)              % Fits a straight line and returns the values of b and a in p
p = …                           % the echoed values (elided here) give the fit equation y = p(1)*x + p(2)

Plot the points, and overplot the fit using the polyval() function:

plot(x,y,'.');
hold on
plot(x,polyval(p,x),'r')

Here is the result. The points have a scatter of σ = 0.3 around the fit, as we specified above.
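For readers working in Python rather than MatLAB, numpy.polyfit and numpy.polyval mirror the MatLAB functions of the same names (an equivalent sketch, not from the original slides):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.5, 26)                     # like MatLAB's 0:0.1:2.5
y = 3.0 + 2.0 * x + rng.normal(0.0, 0.3, x.size)  # noisy line y = 3 + 2x

p = np.polyfit(x, y, 1)    # straight-line fit: p[0] is the slope b, p[1] the intercept a
fit = np.polyval(p, x)     # evaluate the fitted line, as with polyval() above
print(p)
```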

Fits with Unequal Errors

MatLAB has another routine called glmfit() (generalized linear model regression) that can be used to specify weights (unequal errors). Say the errors are proportional to the square root of y (like Poisson errors):

err = sqrt(y)/3;
hold off
errorbar(x,y,err,'.')   % Plots y vs. x, with error bars

Now calculate the fit using glmfit():

p = glmfit(x,y)                        % Does the same as polyfit(x,y,1) (assumes equal errors)
p = glmfit(x,y,'normal','weights',err) % Allows specification of errors (weights), but must include the 'normal' distribution type
p = circshift(p,1)                     % Unfortunately, glmfit returns a,b instead of b,a as polyval() wants;
                                       % circshift() has the effect of swapping the order of the p elements
hold on
plot(x,polyval(p,x),'r')
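A weighted fit can also be done in Python with numpy.polyfit, whose w argument multiplies the residuals, so for Gaussian errors σ_i the appropriate weight is 1/σ_i (a sketch of my own, not part of the slides; the Poisson-like errors mirror the example above):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 2.5, 26)
ytrue = 3.0 + 2.0 * x               # the underlying line y = 3 + 2x
err = np.sqrt(ytrue) / 3.0          # Poisson-like errors, as in the slide
y = ytrue + rng.normal(0.0, err)    # noise drawn with those per-point errors

# numpy.polyfit's w multiplies the residuals, so use w = 1/sigma_i
p = np.polyfit(x, y, 1, w=1.0 / err)
print(p)    # p[0] ~ slope b, p[1] ~ intercept a
```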

Error Estimation

We saw that when the errors of each point in the plot are the same, they cancel from the equations for the fit parameters a and b. If we do not, in fact, know the errors, but we believe that a linear fit is a correct fit to the points, then we can use the fit itself to estimate those errors.

First, consider the residuals (the deviations of the points from the fit, y_i − y(x_i)), calculated by

resid = y - polyval(p,x)
plot(x,resid)

You can see that this is a random distribution with zero mean, as it should be. As usual, you can calculate the variance,

    s² = [1/(N − m)] Σ (y_i − a − b x_i)²

where now there are two fewer degrees of freedom (m = 2) because we used the data to determine a and b.

Indeed, typing std(resid) gives a value close to the 0.3 we used.

Once we have the fitted line, either using individual weighting or assuming uniform errors, how do we know how good the fit is? That is, what are the errors in the determination of the parameters a and b?
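The residual-based error estimate can be sketched as follows (again a Python/NumPy stand-in for the MatLAB session; note the N − 2 in the denominator, reflecting the two fitted parameters):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.5, 26)
y = 3.0 + 2.0 * x + rng.normal(0.0, 0.3, x.size)

p = np.polyfit(x, y, 1)
resid = y - np.polyval(p, x)          # residuals y_i - y(x_i)

N = x.size
s2 = (resid**2).sum() / (N - 2)       # sample variance with N - 2 degrees of freedom
s = np.sqrt(s2)                       # estimate of the common error
print(s)   # should come out near the 0.3 used to generate the data
```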

Chi-Square Probability

Recall that the value of χ² is

    χ² = Σ [(y_i − a − b x_i)/σ_i]²

This should be about equal to the number of degrees of freedom, ν = N − 2 = 24 in this case. Since σ_i is a constant 0.3, we can bring it out of the summation, and calculate

sum(resid.^2)/0.3^2
ans = …        % the echoed value (elided here) comes out near the expected 24

As we mentioned last time, it is often easier to consider the reduced chi-square, χ²_ν = χ²/ν, which is about unity for a good fit. In this case χ²_ν = χ²/24 comes out slightly above 1. If we look this value up in Table C.4 of the text, we find that P ~ 0.4, which means that if we repeated the experiment multiple times, about 40% of them would be expected to have a larger chi-square.
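For an even number of degrees of freedom, the tail probability the slide reads from Table C.4 has a simple closed form, P(χ² > c) = e^(−c/2) Σ_{k=0}^{ν/2−1} (c/2)^k / k!, which can be used in place of the table (my own illustration, not from the slides):

```python
import math

def chi2_sf(c, nu):
    """P(chi^2 > c) for an even number of degrees of freedom nu,
    via the closed form of the upper incomplete gamma function."""
    assert nu % 2 == 0
    z = c / 2.0
    return math.exp(-z) * sum(z**k / math.factorial(k) for k in range(nu // 2))

nu = 24                          # N - 2 degrees of freedom in the example
print(chi2_sf(1.05 * nu, nu))    # a reduced chi-square slightly above 1: close to the P ~ 0.4 quoted above
print(chi2_sf(float(nu), nu))    # chi-square exactly equal to nu: just under 0.5
```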

Uncertainties in the Parameters

We started out searching for a linear fit, y = a + b x, where a and b are the parameters of the fit. What are the uncertainties in these parameters?

We saw in Lecture 14 that errors due to a series of measurements propagate to the result for, e.g., a according to

    σ_a² = Σ_i σ_i² (∂a/∂y_i)²

Since we have the expressions for a and b as

    a = (1/Δ)[Σ(x_j²/σ_j²) Σ(y_j/σ_j²) − Σ(x_j/σ_j²) Σ(x_j y_j/σ_j²)]
    b = (1/Δ)[Σ(1/σ_j²) Σ(x_j y_j/σ_j²) − Σ(x_j/σ_j²) Σ(y_j/σ_j²)]

the partial derivatives are (note that Δ is independent of y_i):

    ∂a/∂y_i = [1/(Δ σ_i²)][Σ(x_j²/σ_j²) − x_i Σ(x_j/σ_j²)]
    ∂b/∂y_i = [1/(Δ σ_i²)][x_i Σ(1/σ_j²) − Σ(x_j/σ_j²)]

Uncertainties in the Parameters (continued)

Inserting these into the expressions, after some algebra (see text page 109), we have

    σ_a² = (1/Δ) Σ(x_i²/σ_i²)
    σ_b² = (1/Δ) Σ(1/σ_i²)

In the case of a common σ_i = σ these become

    σ_a² = σ² Σx_i² / Δ      σ_b² = N σ² / Δ      with Δ = N Σx_i² − (Σx_i)²

For our example, this is calculated as

del = 26*sum(x.^2)-sum(x)^2
siga = sum(x.^2)*0.3^2/del    % gives the variance σ_a²
sigb = 26*0.3^2/del           % gives the variance σ_b²
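These closed-form variances can be checked with a quick numeric sketch (a Python/NumPy stand-in for the MatLAB lines above, using the same N = 26 points and σ = 0.3):

```python
import numpy as np

x = np.linspace(0.0, 2.5, 26)   # the 26 x-values from the example
N, sigma = x.size, 0.3

delta = N * (x**2).sum() - x.sum()**2     # Delta = N*Sum(x^2) - (Sum x)^2
var_a = (x**2).sum() * sigma**2 / delta   # sigma_a^2, like siga above
var_b = N * sigma**2 / delta              # sigma_b^2, like sigb above
print(np.sqrt(var_a), np.sqrt(var_b))     # sigma_a ~ 0.114, sigma_b ~ 0.078
```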