Ch14: Linear Least Squares 14.1: INTRO: Fitting a pth-order polynomial requires finding (p+1) coefficients from the data; thus a straight line (p=1) is determined by its slope and intercept. The LS (Least Squares) method finds the parameters by minimizing the sum of the squared deviations of the fitted values from the actual observations.
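As a quick illustration (a minimal sketch with synthetic data, using NumPy's standard polyfit routine), fitting a pth-order polynomial by least squares returns exactly p+1 coefficients:

```python
import numpy as np

# Synthetic data: a noisy quadratic (hypothetical example values).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 1.0 + 2.0 * x - 0.3 * x**2 + rng.normal(scale=1.0, size=x.size)

# A pth-order polynomial fit yields p+1 coefficients.
for p in (1, 2):
    coeffs = np.polyfit(x, y, deg=p)   # highest-degree coefficient first
    print(f"p = {p}: {p + 1} coefficients ->", coeffs)
```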

Predicting y (the response, or dependent variable) from x (the predictor, or independent variable). Formula: $y_i = \beta_0 + \beta_1 x_i + e_i$, where the $e_i$ are random errors.

14.2: Simple Linear Regression (linear in the parameters). Regression is NOT merely fitting a line; it estimates the conditional mean E(Y|X=x). Properties of the estimated slope and intercept: the least squares estimates are $\hat\beta_1 = \dfrac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^2}$ and $\hat\beta_0 = \bar y - \hat\beta_1 \bar x$.
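A minimal sketch (synthetic data, NumPy only) computing these closed-form estimates directly:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 5, 40)
y = 2.0 + 0.8 * x + rng.normal(scale=0.5, size=x.size)  # true beta0=2, beta1=0.8

xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)  # slope estimate
b0 = ybar - b1 * xbar                                           # intercept estimate
print(f"beta1_hat = {b1:.3f}, beta0_hat = {b0:.3f}")
```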

Variance-covariance of the betas: under the assumptions of Theorem A (errors uncorrelated with mean 0 and common variance $\sigma^2$), $\operatorname{Var}(\hat\beta_1) = \dfrac{\sigma^2}{\sum_i (x_i - \bar x)^2}$, $\operatorname{Var}(\hat\beta_0) = \sigma^2\left[\dfrac{1}{n} + \dfrac{\bar x^2}{\sum_i (x_i - \bar x)^2}\right]$, and $\operatorname{Cov}(\hat\beta_0, \hat\beta_1) = \dfrac{-\sigma^2 \bar x}{\sum_i (x_i - \bar x)^2}$.

Inferences about the betas: in the previous result, replacing the unknown $\sigma$ by its estimate $s$ gives the standard errors $s_{\hat\beta_0}$ and $s_{\hat\beta_1}$, and $\dfrac{\hat\beta_j - \beta_j}{s_{\hat\beta_j}} \sim t_{n-2}$ for $j = 0, 1$, which yields confidence intervals and hypothesis tests for the coefficients.

14.2.2: Assessing the Fit. Recall that the residuals are the differences between the observed and the fitted values: $\hat e_i = y_i - \hat y_i$. The residuals should be plotted versus the x-values. Ideal: the plot should look like a horizontal blur, which suggests that a linear model is reasonable. Caution: the errors are assumed to have zero mean and to be homoscedastic (constant variance), independent of the predictor x. That is to say: $E(e_i) = 0$ and $\operatorname{Var}(e_i) = \sigma^2$ for all i, regardless of the value of $x_i$.
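A minimal residual-plot sketch (synthetic data; matplotlib for the plot):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
x = np.linspace(0, 5, 40)
y = 2.0 + 0.8 * x + rng.normal(scale=0.5, size=x.size)   # synthetic data

# Closed-form least squares fit, then residuals.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)

plt.scatter(x, resid)
plt.axhline(0.0, linestyle="--")
plt.xlabel("x")
plt.ylabel("residual")
plt.title("Residuals vs. predictor: ideally a horizontal blur")
plt.show()
```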

Steps in Linear Regression:
1. Fit the Regression Model (Mathematics)
   - Pick a method: Least Squares or another criterion
   - Plot the data Y versus g(x)
   - Compute the regression estimates and residuals
   - Check for linearity and outliers (plot the residuals)
   - More diagnostics (beyond the scope of this class)
2. Statistical Inference (Statistics)
   - Check the error assumptions
   - Check for normality (if violated, transform the data)
   - If the form is nonlinear, then (beyond the scope of this class)
Least Squares Java applet: LeastSquares.html

14.2.3: Correlation & Regression. A close relation exists between correlation analysis and fitting straight lines by the Least Squares method: the estimated slope and the sample correlation coefficient r are linked by $\hat\beta_1 = r \, \dfrac{s_y}{s_x}$, where $s_x$ and $s_y$ are the sample standard deviations. In particular, the fitted slope is zero exactly when the sample correlation is zero.
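A small numerical check of this identity (synthetic data; NumPy's corrcoef and std):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 1.5 * x + rng.normal(scale=2.0, size=200)

r = np.corrcoef(x, y)[0, 1]                       # sample correlation
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
print(slope, r * y.std(ddof=1) / x.std(ddof=1))   # the two values agree
```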

14.3: Matrix Approach to Linear Least Squares. We've already fitted straight lines (p=1). What if p > 1? We investigate some linear algebra tools.

Formulation of the Least Squares problem: stack the observations as $\mathbf{y} = \mathbf{X}\boldsymbol\beta + \mathbf{e}$, where $\mathbf{y}$ is $n \times 1$, the design matrix $\mathbf{X}$ is $n \times (p+1)$, and $\boldsymbol\beta$ is $(p+1) \times 1$; minimize $\lVert \mathbf{y} - \mathbf{X}\boldsymbol\beta \rVert^2$. Setting the gradient to zero gives the normal equations $\mathbf{X}^T\mathbf{X}\hat{\boldsymbol\beta} = \mathbf{X}^T\mathbf{y}$; when $\mathbf{X}^T\mathbf{X}$ is invertible, $\hat{\boldsymbol\beta} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{y}$.
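A minimal sketch of the matrix formulation (synthetic quadratic data; in practice np.linalg.lstsq is preferred to forming the inverse explicitly):

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 30)
X = np.column_stack([np.ones_like(x), x, x**2])   # design matrix, p = 2
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=x.size)

# Normal equations (illustrative only).
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)
# Numerically stable least squares solver.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_ne, beta_ls)
```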

14.4: Statistical Properties of Least Squares Estimates. 14.4.1: Vector-Valued Random Variables. For a random vector $\mathbf{Y} = (Y_1, \dots, Y_n)^T$, the mean vector is $E(\mathbf{Y}) = (EY_1, \dots, EY_n)^T$ and the covariance matrix is $\Sigma_{YY} = E\left[(\mathbf{Y} - \boldsymbol\mu)(\mathbf{Y} - \boldsymbol\mu)^T\right]$.

Cross-covariance matrix: for random vectors $\mathbf{X}$ and $\mathbf{Y}$ with means $\boldsymbol\mu_X$ and $\boldsymbol\mu_Y$, $\Sigma_{XY} = E\left[(\mathbf{X} - \boldsymbol\mu_X)(\mathbf{Y} - \boldsymbol\mu_Y)^T\right]$. A useful property: for fixed matrices A and B, $\operatorname{Cov}(A\mathbf{X}, B\mathbf{Y}) = A\,\Sigma_{XY}\,B^T$.

14.4.2: Mean and Covariance of Least Squares Estimates. If $\mathbf{y} = \mathbf{X}\boldsymbol\beta + \mathbf{e}$ with $E(\mathbf{e}) = \mathbf{0}$ and $\operatorname{Cov}(\mathbf{e}) = \sigma^2 I$, then $E(\hat{\boldsymbol\beta}) = \boldsymbol\beta$ (the estimates are unbiased) and $\operatorname{Cov}(\hat{\boldsymbol\beta}) = \sigma^2 (\mathbf{X}^T\mathbf{X})^{-1}$.
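A Monte Carlo sanity check of these two facts (a sketch with a fixed synthetic design):

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 25)
X = np.column_stack([np.ones_like(x), x])
beta, sigma = np.array([1.0, 3.0]), 0.5

# Repeatedly simulate errors and refit; compare empirical moments to theory.
fits = []
for _ in range(5000):
    y = X @ beta + rng.normal(scale=sigma, size=x.size)
    fits.append(np.linalg.lstsq(X, y, rcond=None)[0])
fits = np.array(fits)

print("empirical mean :", fits.mean(axis=0))           # ~ beta
print("empirical cov  :\n", np.cov(fits.T))            # ~ sigma^2 (X'X)^-1
print("theoretical cov:\n", sigma**2 * np.linalg.inv(X.T @ X))
```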

14.4.3: Estimation of the common variance of the random errors. In order to make inferences about $\boldsymbol\beta$, one must estimate the parameter $\sigma^2$ (if unknown). With $p+1$ fitted coefficients, an unbiased estimate is $s^2 = \dfrac{\lVert \mathbf{y} - \mathbf{X}\hat{\boldsymbol\beta} \rVert^2}{n - p - 1} = \dfrac{\text{RSS}}{n - p - 1}$.
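A sketch computing this estimate for the straight-line case (p = 1, so n - 2 degrees of freedom; synthetic data):

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0, 1, 50)
X = np.column_stack([np.ones_like(x), x])
y = X @ np.array([1.0, 3.0]) + rng.normal(scale=0.5, size=x.size)  # true sigma = 0.5

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = np.sum((y - X @ beta_hat) ** 2)   # residual sum of squares
n, k = X.shape                          # k = p + 1 fitted coefficients
s2 = rss / (n - k)
print(f"s^2 = {s2:.4f} (true sigma^2 = 0.25)")
```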

14.4.4: Residuals & Standardized Residuals. The fitted values are $\hat{\mathbf{y}} = \mathbf{X}\hat{\boldsymbol\beta} = \mathbf{P}\mathbf{y}$, where $\mathbf{P} = \mathbf{X}(\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T$ is the hat matrix, and the residuals $\hat{\mathbf{e}} = \mathbf{y} - \hat{\mathbf{y}} = (\mathbf{I} - \mathbf{P})\mathbf{y}$ have covariance $\sigma^2(\mathbf{I} - \mathbf{P})$. The standardized residuals are $\hat e_i / \left(s\sqrt{1 - p_{ii}}\right)$, which should behave roughly like standard normal variates if the model is adequate.
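A sketch computing standardized residuals via the hat matrix (synthetic data; the explicit inverse is for clarity, not efficiency):

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 50)
X = np.column_stack([np.ones_like(x), x])
y = X @ np.array([1.0, 3.0]) + rng.normal(scale=0.5, size=x.size)

P = X @ np.linalg.inv(X.T @ X) @ X.T     # hat matrix
resid = y - P @ y                        # residuals (I - P) y
n, k = X.shape
s = np.sqrt(np.sum(resid**2) / (n - k))  # estimate of sigma
std_resid = resid / (s * np.sqrt(1.0 - np.diag(P)))
print(std_resid[:5])                     # roughly N(0, 1) if the model fits
```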

14.4.5: Inference about $\boldsymbol\beta$. Recall Section 14.4 for the statistical properties of the Least Squares estimates. With the additional assumption that the errors are independent $N(0, \sigma^2)$, $\hat{\boldsymbol\beta}$ is normally distributed and $\dfrac{\hat\beta_j - \beta_j}{s_{\hat\beta_j}} \sim t_{n-p-1}$, where $s_{\hat\beta_j} = s\sqrt{\left[(\mathbf{X}^T\mathbf{X})^{-1}\right]_{jj}}$; this yields confidence intervals and tests for each coefficient.
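A sketch of t-based 95% confidence intervals for the coefficients (synthetic straight-line data; SciPy's t distribution):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
x = np.linspace(0, 1, 50)
X = np.column_stack([np.ones_like(x), x])
y = X @ np.array([1.0, 3.0]) + rng.normal(scale=0.5, size=x.size)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
n, k = X.shape
s2 = np.sum((y - X @ beta_hat) ** 2) / (n - k)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))   # standard errors
tcrit = stats.t.ppf(0.975, df=n - k)                 # two-sided 95% critical value

for j, (b, e) in enumerate(zip(beta_hat, se)):
    print(f"beta_{j}: {b:.3f} +/- {tcrit * e:.3f}")
```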

14.5: Multiple Linear Regression. This section generalizes Section 14.2 (Simple Linear Regression) to Multiple Linear Regression, illustrated through an example of polynomial regression: a polynomial in x is a linear model in the predictors $x, x^2, \dots, x^p$.
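A closing sketch making the point concrete: a quadratic fit is just multiple regression with the columns 1, x, and x^2 in the design matrix (synthetic data):

```python
import numpy as np

rng = np.random.default_rng(9)
x = np.linspace(-2, 2, 60)
y = 0.5 - 1.0 * x + 2.0 * x**2 + rng.normal(scale=0.3, size=x.size)

# Polynomial regression = multiple linear regression on powers of x.
X = np.column_stack([np.ones_like(x), x, x**2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("beta_hat =", beta_hat)   # close to (0.5, -1.0, 2.0)
```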