7. Least squares
7.1 Method of least squares
K. Desch – Statistical Methods of Data Analysis, SS10

The method of least squares is another important method to estimate parameters.

Connection with maximum likelihood:
- N independent Gauss-distributed random variables y_i, i = 1, …, N
- each y_i is related to another, exactly known variable x_i
- each y_i has unknown mean λ_i and known variance σ_i²
→ the y_i can be regarded as a measurement of an N-dimensional random vector.

The true values are given by a function λ(x; θ), i.e. λ_i = λ(x_i; θ).

Goal: estimate the parameters θ = (θ_1, …, θ_m) of λ(x; θ).
Log-likelihood function:

  log L(θ) = −(1/2) Σ_i (y_i − λ(x_i; θ))² / σ_i² + const.

This is maximized by finding the θ that minimize the quantity

  χ²(θ) = Σ_{i=1..N} (y_i − λ(x_i; θ))² / σ_i²

→ the method of least squares. The method is generalized to "arbitrary" probability distributions (also non-Gaussian).

Correlated y_i: if the y_i have a common N-dimensional Gaussian p.d.f. with a known covariance matrix V_ij, maximizing the likelihood is equivalent to minimizing

  χ²(θ) = Σ_{i,j} (y_i − λ(x_i; θ)) (V⁻¹)_{ij} (y_j − λ(x_j; θ)).
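The equivalence of maximizing the Gaussian log-likelihood and minimizing χ² can be checked numerically. A minimal sketch, assuming illustrative data and a one-parameter model λ(x; θ) = θ·x (both are assumptions for illustration, not taken from the slides):

```python
import numpy as np

# Hypothetical measurements y_i at known x_i with known errors sigma_i;
# illustrative one-parameter model lambda(x; theta) = theta * x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
sigma = np.array([0.1, 0.1, 0.2, 0.2])

def chi2(theta):
    lam = theta * x
    return np.sum((y - lam) ** 2 / sigma ** 2)

def neg2_log_L(theta):
    # Gaussian likelihood: -2 log L = chi2 + const, where the constant
    # does not depend on theta
    lam = theta * x
    return np.sum((y - lam) ** 2 / sigma ** 2 + np.log(2 * np.pi * sigma ** 2))

# Both functions differ only by a theta-independent constant,
# so a grid scan finds the same minimizing theta for both
thetas = np.linspace(0.8, 1.2, 401)
best_chi2 = thetas[np.argmin([chi2(t) for t in thetas])]
best_ll = thetas[np.argmin([neg2_log_L(t) for t in thetas])]
```

The scan returns the same estimate from both objectives, which is the content of the maximum-likelihood connection above.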
Linear least-squares fit (special case):

  λ(x; θ) = Σ_{j=1..m} a_j(x) θ_j

(λ is a linear function of θ; the a_j(x) are in general not linear in x, but they are fixed functions.)

→ can be solved analytically (although it is often solved numerically)
→ linear LS estimators are unbiased and have minimum variance among all linear estimators.

The value of λ at x_i can be written

  λ(x_i; θ) = Σ_j A_ij θ_j  with  A_ij = a_j(x_i),

so that

  χ²(θ) = (y − Aθ)ᵀ V⁻¹ (y − Aθ).

Minimum:  ∇_θ χ² = −2 Aᵀ V⁻¹ (y − Aθ) = 0.

Solution:

  θ̂ = (Aᵀ V⁻¹ A)⁻¹ Aᵀ V⁻¹ y,

if the inverse (Aᵀ V⁻¹ A)⁻¹ exists. The solutions are linear functions of the original measurements.
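The matrix solution above can be sketched in a few lines. The basis functions a_1(x) = 1, a_2(x) = x (a straight line) and the data values are illustrative assumptions:

```python
import numpy as np

# Sketch of theta_hat = (A^T V^-1 A)^-1 A^T V^-1 y for a linear model
# lambda(x; theta) = theta_1 * 1 + theta_2 * x (illustrative basis choice)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 5.0, 6.9, 9.1])
V = np.diag([0.1 ** 2] * 5)               # independent measurements: diagonal V

A = np.column_stack([np.ones_like(x), x])  # A_ij = a_j(x_i)
Vinv = np.linalg.inv(V)
U = np.linalg.inv(A.T @ Vinv @ A)          # covariance matrix of the estimators
theta_hat = U @ A.T @ Vinv @ y             # linear in the measurements y
```

Note that `theta_hat` is indeed a linear function of `y`, and that `U` does not involve `y` at all.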
Covariance matrix of the estimators, using error propagation:

  U = (Aᵀ V⁻¹ A)⁻¹.

Equivalently, the inverse covariance matrix is

  (U⁻¹)_{ij} = (1/2) ∂²χ² / ∂θ_i ∂θ_j,

which coincides with the RCF bound when the y_i are Gaussian distributed.

For the case of λ linear in θ, χ² is quadratic in θ → the contour χ²(θ) = χ²_min + 1 describes a (hyper)ellipsoid whose tangent planes give the standard deviations, θ̂_i ± σ_θ̂ᵢ.
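The identity (U⁻¹)_ij = (1/2) ∂²χ²/∂θ_i∂θ_j can be checked numerically with finite differences; since χ² is exactly quadratic for a linear model, the central difference is exact up to rounding. Data and basis functions below are illustrative assumptions:

```python
import numpy as np

# Check (U^-1)_ij = (1/2) d^2 chi^2 / d theta_i d theta_j numerically
# for lambda(x; theta) = theta_1 + theta_2 * x with equal errors sigma
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 5.0, 6.9, 9.1])
sigma = 0.1

A = np.column_stack([np.ones_like(x), x])
Uinv_analytic = A.T @ A / sigma ** 2       # A^T V^-1 A for V = sigma^2 * I

def chi2(theta):
    r = y - A @ theta
    return np.sum(r ** 2) / sigma ** 2

# Second derivatives by central finite differences at an arbitrary point
h = 1e-3
theta0 = np.array([1.0, 2.0])
H = np.zeros((2, 2))
for i in range(2):
    for j in range(2):
        ei = np.eye(2)[i] * h
        ej = np.eye(2)[j] * h
        H[i, j] = (chi2(theta0 + ei + ej) - chi2(theta0 + ei - ej)
                   - chi2(theta0 - ei + ej) + chi2(theta0 - ei - ej)) / (4 * h ** 2)

Uinv_numeric = H / 2                       # should reproduce Uinv_analytic
```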
Example: fit of a straight line,

  λ(x; m, c) = m·x + c.

The measurements y_i are statistically independent with errors σ_i → we look for the estimators m̂ and ĉ. One can apply the matrix method, but it is simpler to form the derivatives directly:

  ∂χ²/∂m = −2 Σ_i x_i (y_i − m x_i − c) / σ_i² = 0
  ∂χ²/∂c = −2 Σ_i (y_i − m x_i − c) / σ_i² = 0.
Solutions for m̂ and ĉ: with weights w_i = 1/σ_i² and weighted averages ⟨u⟩ = Σ_i w_i u_i / Σ_i w_i,

  m̂ = (⟨xy⟩ − ⟨x⟩⟨y⟩) / (⟨x²⟩ − ⟨x⟩²),  ĉ = ⟨y⟩ − m̂ ⟨x⟩.

These become simpler when all σ_i = σ: the weighted averages reduce to ordinary arithmetic means.
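A sketch of the equal-error closed-form solution (the data values are an illustrative assumption):

```python
import numpy as np

# Closed-form straight-line fit lambda(x; m, c) = m*x + c for equal errors
# sigma_i = sigma: the two chi^2 derivative conditions give
#   m_hat = (<xy> - <x><y>) / (<x^2> - <x>^2)
#   c_hat = <y> - m_hat * <x>
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 5.0, 6.9, 9.1])

xbar, ybar = x.mean(), y.mean()
m_hat = ((x * y).mean() - xbar * ybar) / ((x ** 2).mean() - xbar ** 2)
c_hat = ybar - m_hat * xbar
```

For unequal errors the same expressions hold with the averages replaced by weighted averages (weights 1/σ_i²).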
Variance and covariance of the estimators (for equal errors σ_i = σ):

  V[m̂] = σ² / (N (⟨x²⟩ − ⟨x⟩²))
  V[ĉ] = σ² ⟨x²⟩ / (N (⟨x²⟩ − ⟨x⟩²))
  cov[m̂, ĉ] = −σ² ⟨x⟩ / (N (⟨x²⟩ − ⟨x⟩²))

The variances and covariance do not depend on the measurements y_i — only on the errors and the x_i!
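A sketch of these formulas; note that no `y` values appear anywhere, which makes the independence from the measurements explicit (σ and the x_i are illustrative assumptions):

```python
import numpy as np

# Variances and covariance of the straight-line estimators for equal
# errors sigma_i = sigma; they depend only on sigma and the x_i:
#   V[m]      = sigma^2 / (N * D)
#   V[c]      = sigma^2 * <x^2> / (N * D)
#   cov[m, c] = -sigma^2 * <x> / (N * D)
# where D = <x^2> - <x>^2 measures the spread of the x values.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
sigma = 0.1
N = len(x)
D = (x ** 2).mean() - x.mean() ** 2

V_m = sigma ** 2 / (N * D)
V_c = sigma ** 2 * (x ** 2).mean() / (N * D)
cov_mc = -sigma ** 2 * x.mean() / (N * D)
```

For these inputs the results agree with the covariance matrix U = (AᵀV⁻¹A)⁻¹ from the matrix method.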
For any point x, the error on the fitted line λ̂(x) = m̂·x + ĉ follows from error propagation with correlated variables:

  σ²_λ̂(x) = x² V[m̂] + V[ĉ] + 2x cov[m̂, ĉ].
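A sketch of this error band; the numerical values of V[m̂], V[ĉ], and cov[m̂, ĉ] are the illustrative ones used above (an assumption, not course data). The negative covariance makes the band narrowest near the mean of the x_i:

```python
import numpy as np

# Error propagation for the fitted line at an arbitrary point x,
# keeping the correlation between m_hat and c_hat:
#   sigma_lambda^2(x) = x^2 * V[m] + V[c] + 2 * x * cov[m, c]
V_m, V_c, cov_mc = 0.001, 0.006, -0.002    # illustrative values

def sigma_lambda(x):
    return np.sqrt(x ** 2 * V_m + V_c + 2 * x * cov_mc)

# Band width at the left edge, at the mean of the x_i (x = 2), at the right edge
widths = [sigma_lambda(x) for x in (0.0, 2.0, 4.0)]
```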
Errors on both x and y:

[Figure: data points P_i with error bars in both x and y around a straight line; h_i denotes the perpendicular distance from point P_i to the line.]

Scale x and y such that σ_x = σ_y. The line is then found by minimizing the sum of the squared perpendicular distances of the points from the line:

  Σ_i h_i² → min.
Solution: parametrize the line by its angle and its offset, and set the derivatives of Σ_i h_i² with respect to both parameters to zero. The stationarity conditions have two solutions, corresponding to the minimum and the maximum of Σ_i h_i²; a case distinction selects the minimizing one.
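A minimal numerical sketch of this perpendicular-distance ("orthogonal") fit, assuming already-scaled coordinates and illustrative data: the minimizing line passes through the centroid along the principal eigenvector of the centred scatter matrix, and the minimum of Σ h_i² is the smallest eigenvalue.

```python
import numpy as np

# Orthogonal-distance straight-line fit (assumes sigma_x = sigma_y after
# scaling). Illustrative data:
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.9, 3.1, 5.0, 6.9, 9.1])

xc, yc = x - x.mean(), y - y.mean()
C = np.array([[np.sum(xc * xc), np.sum(xc * yc)],
              [np.sum(xc * yc), np.sum(yc * yc)]])   # centred scatter matrix

eigvals, eigvecs = np.linalg.eigh(C)   # ascending eigenvalues
v = eigvecs[:, -1]                     # direction of largest spread
m_perp = v[1] / v[0]                   # slope of the orthogonal-distance line
c_perp = y.mean() - m_perp * x.mean()  # line passes through the centroid

h2_sum = eigvals[0]                    # minimized sum of squared distances
```

Comparing with the ordinary fit of the same data shows a slightly steeper slope, as expected when the x errors are no longer ignored.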
Least squares with binned data:

So far λ(x; θ) was an arbitrary function; now λ is proportional to the p.d.f. f(x; θ) of the random variable x.

n measurements of x → histogram with N bins, with y_i = number of entries in bin i.

The number of entries predicted in bin i is

  λ_i(θ) = n p_i(θ) = n ∫_{bin i} f(x; θ) dx.

Minimize the quantity

  χ²(θ) = Σ_{i=1..N} (y_i − λ_i(θ))² / σ_i²  with  σ_i² = λ_i(θ)  (Poisson error).
Alternative ("modified least-squares method", MLS): use σ_i² = y_i instead. This is numerically simpler, but gives a worse estimation of the errors (especially if y_i is small).

If the normalization factor n is not known, it can be fitted as well: the predicted number of entries becomes λ_i(θ, ν) = ν p_i(θ), and ν̂ is the estimator of the normalization.

Goodness-of-fit: χ²_min is itself a random variable, distributed according to the χ² distribution with

  number of degrees of freedom = number of measured points − number of fitted parameters.
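A sketch contrasting the two error choices on binned data; the histogram entries and the flat model are illustrative assumptions:

```python
import numpy as np

# Binned least squares: y_i entries in bin i, prediction lam_i = n * p_i.
# Standard LS uses Poisson errors sigma_i^2 = lam_i; the modified method
# (MLS) uses sigma_i^2 = y_i, numerically simpler but worse for small y_i.
y = np.array([18.0, 25.0, 21.0, 16.0, 20.0])   # histogram entries (illustrative)
n = y.sum()                                    # total number of entries
p = np.full(5, 0.2)                            # flat p.d.f.: equal bin probability

lam = n * p                                    # predicted entries per bin

chi2_ls = np.sum((y - lam) ** 2 / lam)         # Poisson errors lam_i
chi2_mls = np.sum((y - lam) ** 2 / y)          # modified method: errors y_i

# Goodness of fit: chi2_min follows a chi-square distribution with
# ndof = (number of bins) - (number of fitted parameters); here no shape
# parameter is fitted, but taking n from the data costs one constraint
ndof = len(y) - 1
```

Comparing `chi2_ls` with the χ² distribution for `ndof` degrees of freedom gives the goodness-of-fit; the two error conventions already differ slightly even at these moderate bin counts.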