Ordinary Least Squares (OLS) Regression


Ordinary Least Squares (OLS) Regression: What is it?
Closely allied with correlation: interested in the strength of the linear relationship between two variables.
One variable is specified as the dependent variable.
The other variable is the independent (or explanatory) variable.

Regression Model
Y = a + bX + e
What is Y? What is a? What is b? What is X? What is e?

Elements of the Regression Line
a = Y-intercept (what Y is predicted to equal when X = 0)
b = slope (the change in Y associated with a one-unit increase in X)
e = error (the difference between the observed Y and the predicted Y, "Y hat")

Regression
Can quantify precisely the relative importance of a variable.
Can quantify how much variance is explained by a variable (or set of variables).
Used more often than any other statistical technique.

The Regression Line
Y = a + bX + e, where Y = sentence length and X = prior convictions.
Each point represents the number of priors (X) and the sentence length (Y) of a particular defendant.
The regression line is the best-fitting line through the overall scatter of points.

X and Y are observed. We need to estimate a & b

Calculus 101: The Least Squares Method and Differential Calculus
Differentiation is a very powerful tool that is used extensively in model estimation.
Practical applications of differentiation usually take the form of minimization/optimization problems or rate-of-change problems.

How do you draw the line when a line could be drawn in almost any direction?
The method of least squares: draw the line that minimizes the sum of the squared distances of the points from the line (Σe²).
This is a minimization problem, so we can use differential calculus to estimate the line.
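In symbols (a standard statement of the criterion, not copied from the slide), the least-squares line is the pair (a, b) that minimizes the sum of squared errors:

```latex
% Least-squares criterion: choose a and b to minimize the sum of squared errors
\hat{a},\ \hat{b} \;=\; \operatorname*{arg\,min}_{a,\,b}\ f(a,b),
\qquad
f(a,b) \;=\; \sum_{i=1}^{n} e_i^{2}
        \;=\; \sum_{i=1}^{n} \bigl(y_i - a - b x_i\bigr)^{2}.
```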

Least Squares Method
x   y   Deviation = y - (a + bx)   d²
0   1   1 - a                      (1 - a)² = 1 - 2a + a²
1   3   3 - a - b                  (3 - a - b)² = 9 - 6a + a² - 6b + 2ab + b²
2   2   2 - a - 2b                 (2 - a - 2b)² = 4 - 4a + a² - 8b + 4ab + 4b²
3   4   4 - a - 3b                 (4 - a - 3b)² = 16 - 8a + a² - 24b + 6ab + 9b²
4   5   5 - a - 4b                 (5 - a - 4b)² = 25 - 10a + a² - 40b + 8ab + 16b²
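As a check on the algebra, here is a minimal SymPy sketch (mine, not from the slides) that rebuilds the d² column and its sum from the five data points (0, 1), (1, 3), (2, 2), (3, 4), (4, 5) read off the table:

```python
# Rebuild the d^2 column of the least-squares table symbolically (SymPy sketch).
import sympy as sp

a, b = sp.symbols('a b')
data = [(0, 1), (1, 3), (2, 2), (3, 4), (4, 5)]  # (x, y) pairs from the table

d_squared = [sp.expand((y - (a + b * x)) ** 2) for x, y in data]
for (x, y), d2 in zip(data, d_squared):
    print(f"x={x}, y={y}:  {d2}")

# Summing the expanded squares reproduces f(a, b) = 55 - 30a + 5a^2 - 78b + 20ab + 30b^2
print("f(a, b) =", sp.expand(sum(d_squared)))
```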

Summing the squares of the deviations yields:
f(a, b) = 55 - 30a + 5a² - 78b + 20ab + 30b²
Calculate the first-order partial derivatives of f(a, b):
f_b = -78 + 20a + 60b   and   f_a = -30 + 10a + 20b
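For the general case (a standard derivation, not shown on the slide), differentiating the sum of squared deviations term by term shows where these expressions come from:

```latex
% Partial derivatives of f(a,b) = sum_i (y_i - a - b x_i)^2
f_a \;=\; \frac{\partial f}{\partial a}
      \;=\; -2 \sum_{i=1}^{n} \bigl(y_i - a - b x_i\bigr)
      \;=\; -2\sum_i y_i + 2na + 2b\sum_i x_i ,
\qquad
f_b \;=\; \frac{\partial f}{\partial b}
      \;=\; -2 \sum_{i=1}^{n} x_i \bigl(y_i - a - b x_i\bigr)
      \;=\; -2\sum_i x_i y_i + 2a\sum_i x_i + 2b\sum_i x_i^{2}.
```

With n = 5, Σx = 10, Σy = 15, Σxy = 39, and Σx² = 30 from the table, these reduce to the two expressions above.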

Set each partial derivative to zero. Manipulate f_a:
0 = -30 + 10a + 20b
10a = 30 - 20b
a = 3 - 2b

Substitute (3 - 2b) into f_b:
0 = -78 + 20a + 60b
  = -78 + 20(3 - 2b) + 60b
  = -78 + 60 - 40b + 60b
  = -18 + 20b
20b = 18
b = 0.9
Slope = 0.9

Substituting this value of b back into f_a to obtain a:
a = 3 - 2b = 3 - 2(0.9) = 3 - 1.8 = 1.2
Y-intercept = 1.2
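The whole calculus derivation can be checked with a few lines of SymPy (a sketch of mine, using the same five data points):

```python
# Verify the calculus derivation: differentiate f(a, b) and solve the two equations.
import sympy as sp

a, b = sp.symbols('a b')
data = [(0, 1), (1, 3), (2, 2), (3, 4), (4, 5)]

f = sum((y - (a + b * x)) ** 2 for x, y in data)   # sum of squared deviations
f_a, f_b = sp.diff(f, a), sp.diff(f, b)            # first-order partial derivatives

solution = sp.solve([f_a, f_b], [a, b])            # set both derivatives to zero
print(solution)   # {a: 6/5, b: 9/10}  ->  intercept 1.2, slope 0.9
```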

Estimating the Model (the Easy Way)
Calculating the slope (b)

Sum of Squares for X
Sum of Squares for Y
Sum of Products
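The slide's formulas did not survive transcription; the standard shortcut quantities and estimators they refer to are (my reconstruction of the usual formulas):

```latex
% Building blocks and closed-form OLS estimates
SS_X = \sum_i (x_i - \bar{x})^2, \qquad
SS_Y = \sum_i (y_i - \bar{y})^2, \qquad
SP   = \sum_i (x_i - \bar{x})(y_i - \bar{y}),
\\[4pt]
b = \frac{SP}{SS_X}, \qquad a = \bar{y} - b\,\bar{x}.
```

For the example data, SS_X = 10, SS_Y = 10, and SP = 9, so b = 9/10 = 0.9 and a = 3 - 0.9(2) = 1.2, matching the calculus result.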

We’ve seen these values before
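A short plain-Python sketch (mine, not from the slides) that computes these building blocks for the worked example and recovers the same slope and intercept:

```python
# Compute SS_X, SS_Y, SP and the OLS slope/intercept with the shortcut formulas.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 2, 4, 5]

x_bar = sum(xs) / len(xs)
y_bar = sum(ys) / len(ys)

ss_x = sum((x - x_bar) ** 2 for x in xs)                        # 10.0
ss_y = sum((y - y_bar) ** 2 for y in ys)                        # 10.0
sp_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))  # 9.0

b = sp_xy / ss_x          # slope = 0.9
a = y_bar - b * x_bar     # intercept = 1.2
print(f"SS_X={ss_x}, SS_Y={ss_y}, SP={sp_xy}, b={b}, a={a}")
```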

Regression is strongly related to Correlation
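One way to see the connection (standard identities, not on the slide): the correlation coefficient is built from the same three sums, and the slope is the correlation rescaled by the two standard deviations:

```latex
% Link between the correlation coefficient r and the OLS slope b
r = \frac{SP}{\sqrt{SS_X \, SS_Y}}, \qquad
b = \frac{SP}{SS_X} = r \, \sqrt{\frac{SS_Y}{SS_X}} = r \, \frac{s_Y}{s_X}.
```

In the example, r = 9 / √(10 · 10) = 0.9, and because SS_X = SS_Y the slope equals r.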

Calculating the Y-intercept (a)
Calculating the error term (e)
Y hat = predicted value of Y
e will be different for every observation. It is a measure of how much we are off in our prediction.
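To make the error term concrete, here is a small sketch (mine) that computes the predicted value and residual for each observation in the worked example; the residuals differ across observations and sum to zero:

```python
# Predicted values (y hat) and residuals (e = y - y_hat) for the fitted line.
a, b = 1.2, 0.9                      # intercept and slope from the worked example
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 2, 4, 5]

for x, y in zip(xs, ys):
    y_hat = a + b * x                # predicted Y for this observation
    e = y - y_hat                    # residual: how far off the prediction is
    print(f"x={x}, y={y}, y_hat={y_hat:.1f}, e={e:+.1f}")

# Residuals: -0.2, +0.9, -1.0, +0.1, +0.2  (they sum to zero for an OLS fit)
```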