Ch14: Linear Least Squares
14.1: Intro
Fitting a pth-order polynomial requires finding p + 1 coefficients from the data; thus a straight line (p = 1) is determined by its slope and intercept. The least squares (LS) method finds the parameters by minimizing the sum of the squared deviations of the fitted values from the actual observations.
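In symbols: for data (x_1, y_1), ..., (x_n, y_n), the LS fit of a pth-order polynomial chooses the coefficients that minimize

S(\beta_0, \dots, \beta_p) = \sum_{i=1}^{n} \big( y_i - \beta_0 - \beta_1 x_i - \dots - \beta_p x_i^p \big)^2.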
Predicting y (the response, or dependent variable) from x (the predictor, or independent variable):
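The formula on the slide does not survive in the text; in standard notation, the simple linear regression model referred to here is

y_i = \beta_0 + \beta_1 x_i + e_i, \qquad i = 1, \dots, n,

where the e_i are random errors with mean zero.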
14.2: Simple Linear Regression (linear in the parameters)
Regression is not merely fitting a line through the points; it models the conditional expectation E(Y | X = x).
14.2.1: Properties of the Estimated Slope and Intercept
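The least squares estimates of the intercept and slope are

\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}, \qquad \hat{\beta}_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^n (x_i - \bar{x})^2},

and both are unbiased when the errors have mean zero.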
Variance-covariance of the \hat{\beta}'s, under the assumptions of Theorem A:
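Assuming uncorrelated errors with mean zero and common variance \sigma^2 (the usual assumptions), the standard results are

\operatorname{Var}(\hat{\beta}_1) = \frac{\sigma^2}{\sum_i (x_i - \bar{x})^2}, \qquad \operatorname{Var}(\hat{\beta}_0) = \frac{\sigma^2 \sum_i x_i^2}{n \sum_i (x_i - \bar{x})^2}, \qquad \operatorname{Cov}(\hat{\beta}_0, \hat{\beta}_1) = -\frac{\sigma^2 \bar{x}}{\sum_i (x_i - \bar{x})^2}.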
Inferences about the \beta's: in the previous result, the error variance \sigma^2 is generally unknown and must be estimated from the data.
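Replacing \sigma^2 by the estimate s^2 = \sum_i \hat{e}_i^2 / (n - 2) gives estimated standard errors s_{\hat{\beta}_0} and s_{\hat{\beta}_1}; for normally distributed errors, the standardized quantities

\frac{\hat{\beta}_j - \beta_j}{s_{\hat{\beta}_j}}, \qquad j = 0, 1,

follow a t distribution with n - 2 degrees of freedom, which yields confidence intervals and tests for the \beta's.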
14.2.2: Assessing the Fit
Recall that the residuals are the differences between the observed and the fitted values: \hat{e}_i = y_i - \hat{y}_i. Plot the residuals against the x-values; ideally, the plot looks like a horizontal blur, i.e., no systematic pattern remains and a linear model is reasonable. Caution: the errors are assumed to have mean zero and to be homoscedastic (constant variance), independent of the predictor x.
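That is to say, in symbols:

E(e_i) = 0, \qquad \operatorname{Var}(e_i) = \sigma^2 \text{ for all } i, \qquad \operatorname{Cov}(e_i, e_j) = 0 \text{ for } i \neq j.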
Steps in Linear Regression:
1. Fit the regression model (mathematics):
– Pick a method: least squares or an alternative
– Plot the data: y versus g(x)
– Compute the regression estimates and the residuals
– Check for linearity and outliers (plot the residuals)
– More diagnostics (beyond the scope of this class)
2. Statistical inference (statistics):
– Check the error assumptions
– Check for normality (if violated, transform the data)
– If the form is nonlinear (beyond the scope of this class)
These steps are sketched in code below.
Least Squares Java applet: http://www.math.tamu.edu/FiniteMath/Classes/LeastSquares/LeastSquares.html
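A minimal sketch of step 1 in Python (hypothetical data; numpy's least squares solver stands in for whatever method is chosen):

    import numpy as np

    # Hypothetical data; in practice x and y are the observed pairs.
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    # Fit y = b0 + b1*x by least squares.
    X = np.column_stack([np.ones_like(x), x])    # design matrix: [1, x]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Regression estimates and residuals.
    fitted = X @ beta
    residuals = y - fitted

    # Diagnostic: plotted against x, the residuals should look like a
    # horizontal blur around zero; any pattern suggests nonlinearity.
    print("estimates (b0, b1):", beta)
    print("residuals:", residuals)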
14.2.3: Correlation & Regression
A close relationship exists between correlation analysis and fitting straight lines by the least squares method.
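Specifically, the sample correlation coefficient r and the least squares slope \hat{\beta}_1 satisfy

r = \hat{\beta}_1 \frac{s_x}{s_y},

where s_x and s_y are the sample standard deviations of the x's and y's. In particular, r has the same sign as the fitted slope, and r = 0 exactly when the fitted line is horizontal.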
14.3: Matrix Approach to Linear Least Squares
We have already fitted straight lines (p = 1). What if p > 1? We first collect some linear algebra tools.
Formulation of the Least Squares problem:
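Stack the responses into a vector y, let X denote the n \times (p+1) design matrix, and write the model as y = X\beta + e. Least squares chooses \hat{\beta} to minimize

\lVert y - X\beta \rVert^2 = \sum_{i=1}^n \big( y_i - (X\beta)_i \big)^2,

which leads to the normal equations X^T X \hat{\beta} = X^T y; when X^T X is nonsingular,

\hat{\beta} = (X^T X)^{-1} X^T y.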
14.4: Statistical Properties of Least Squares Estimates
14.4.1: Vector-valued Random Variables
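For a random vector Y = (Y_1, \dots, Y_n)^T, the mean vector E(Y) has components E(Y_i), and the covariance matrix

\Sigma_{YY} = E\big[ (Y - \mu)(Y - \mu)^T \big]

has (i, j) entry \operatorname{Cov}(Y_i, Y_j).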
Cross-covariance matrix:
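For random vectors X (m \times 1) and Y (n \times 1), the cross-covariance matrix is

\Sigma_{XY} = E\big[ (X - \mu_X)(Y - \mu_Y)^T \big],

the m \times n matrix whose (i, j) entry is \operatorname{Cov}(X_i, Y_j).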
14.4.2: Mean and Covariance of Least Squares Estimates
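Under the standard assumptions (errors with mean zero and common variance \sigma^2, uncorrelated), the estimates are unbiased and their covariance matrix has a closed form:

E(\hat{\beta}) = \beta, \qquad \operatorname{Cov}(\hat{\beta}) = \sigma^2 (X^T X)^{-1}.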
14.4.3: Estimation of the Common Variance of the Random Errors
In order to make inferences about the \beta's, one must first obtain an estimate of the error variance \sigma^2 (if unknown).
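With k fitted coefficients (k = p + 1 for a pth-order polynomial), the usual unbiased estimate is

s^2 = \frac{\lVert y - X\hat{\beta} \rVert^2}{n - k} = \frac{RSS}{n - k}.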
14.4.4: Residuals & Standardized Residuals
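In matrix form, the residuals are

\hat{e} = y - X\hat{\beta} = (I - P)y, \qquad P = X(X^T X)^{-1} X^T,

where P is the projection ("hat") matrix, and \operatorname{Cov}(\hat{e}) = \sigma^2 (I - P). Because the residuals have unequal variances, the standardized residuals \hat{e}_i / \big( s \sqrt{1 - p_{ii}} \big), with p_{ii} the ith diagonal entry of P, are often examined instead.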
14.4.5: Inference about \beta
Recall Section 14.4 for the statistical properties of the least squares estimates, with the additional assumption that the errors are independent and normally distributed.
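Under normality, each standardized estimate

\frac{\hat{\beta}_j - \beta_j}{s_{\hat{\beta}_j}}, \qquad s_{\hat{\beta}_j} = s \sqrt{ \big[ (X^T X)^{-1} \big]_{jj} },

follows a t distribution with n - k degrees of freedom (k = the number of fitted coefficients), giving confidence intervals \hat{\beta}_j \pm t_{n-k}(\alpha/2)\, s_{\hat{\beta}_j}.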
14.5: Multiple Linear Regression
This section generalizes Section 14.2 (Simple Linear Regression) to multiple linear regression through an example of polynomial regression.
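A minimal sketch in Python (hypothetical data), treating a quadratic fit as multiple regression with predictors 1, x, and x^2:

    import numpy as np

    # Hypothetical data for a quadratic (p = 2) polynomial fit.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([1.2, 2.8, 7.1, 13.2, 21.9, 32.5])

    p = 2
    X = np.vander(x, p + 1, increasing=True)     # columns: 1, x, x^2
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Estimate of the error variance: s^2 = RSS / (n - (p + 1)).
    residuals = y - X @ beta
    s2 = residuals @ residuals / (len(y) - (p + 1))
    print("coefficients:", beta)
    print("s^2 estimate:", s2)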