OLS Regression

What is it? Closely allied with correlation: both are interested in the strength of the linear relationship between two variables. One variable is specified as the dependent variable; the other variable is the independent (or explanatory) variable.

Regression Model

Y = a + bx + e

What is Y? What is a? What is b? What is x? What is e? What is Y-hat?

Elements of the Regression Line

a = Y-intercept (what Y is predicted to equal when X = 0)
b = slope (the change in Y associated with a unit increase in X)
e = error (the difference between the predicted Y, i.e. Y-hat, and the observed Y)

Regression

Can quantify precisely the relative importance of a variable
Can quantify how much variance is explained by a variable (or set of variables)
Is used more often than any other statistical technique

The Regression Line

Y = a + bx + e, where Y = sentence length and X = prior convictions. Each point represents the number of priors (X) and the sentence length (Y) of a particular defendant. The regression line is the best-fit line through the overall scatter of points.

X and Y are observed. We need to estimate a & b

Calculus 101

The least squares method rests on differential calculus. Differentiation is a very powerful tool used extensively in model estimation; practical applications of differentiation usually take the form of minimization/optimization problems or rate-of-change problems.

Calculus 101: calculating the rate of change, or slope, of a line. For a straight line, the slope is relatively simple to calculate.

Calculating the rate of change (slope) for a curve is a bit harder. Differential calculus: suppose we have a curve describing the variable y as some function of the variable x, say y = x^2.

It is possible to find a general expression involving the function f(x) that describes the slopes of the approximating sequence of secant lines, where h = x1 - x0 represents a small difference from the point of interest.

Let's take a cost-curve example: C(x) = x^2. What is the derivative at x = 3?

[f(3 + h) - f(3)] / h
= [(3 + h)^2 - 3^2] / h
= [(9 + 6h + h^2) - 9] / h
= (6h + h^2) / h
= 6 + h
= 6 (as h approaches 0)

So ∆y/∆x = 6 at x = 3.
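The limit above can be checked numerically (a sketch, not part of the original slides): for f(x) = x^2 the difference quotient at x = 3 equals 6 + h exactly, so it approaches 6 as h shrinks.

```python
# Finite-difference check of the derivative of f(x) = x^2 at x = 3.
def f(x):
    return x ** 2

def difference_quotient(x, h):
    """Slope of the secant line through (x, f(x)) and (x + h, f(x + h))."""
    return (f(x + h) - f(x)) / h

# For f(x) = x^2 the quotient works out to exactly 6 + h,
# so it tends to 6 as h approaches 0.
for h in [1.0, 0.1, 0.001]:
    print(h, difference_quotient(3, h))
```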

How does this relate to our regression model, which is a straight line?

How do you draw a line when the line could be drawn in almost any direction? The method of least squares: draw the line that minimizes the sum of the squared distances from the line (Σe^2). This is a minimization problem, and therefore we can use differential calculus to estimate the line.

X and Y are observed. We need to estimate a & b

Least Squares Method

x    y    deviation d = y - (a + bx)    d^2
0    1    1 - a          1 - 2a + a^2
1    3    3 - a - b      9 - 6a + a^2 - 6b + 2ab + b^2
2    2    2 - a - 2b     4 - 4a + a^2 - 8b + 4ab + 4b^2
3    4    4 - a - 3b     16 - 8a + a^2 - 24b + 6ab + 9b^2
4    5    5 - a - 4b     25 - 10a + a^2 - 40b + 8ab + 16b^2

Summing the squares of the deviations yields:

f(a, b) = 55 - 30a + 5a^2 - 78b + 20ab + 30b^2

Calculate the first-order partial derivatives of f(a, b):

f_a = -30 + 10a + 20b and f_b = -78 + 20a + 60b

Set each partial derivative to zero. Manipulating f_a:

0 = -30 + 10a + 20b
10a = 30 - 20b
a = 3 - 2b

Substitute (3 - 2b) into f_b:

0 = -78 + 20(3 - 2b) + 60b
0 = -78 + 60 - 40b + 60b
0 = -18 + 20b
20b = 18
b = 0.9

Slope = 0.9

Substituting this value of b back into f_a to obtain a:

10a = 30 - 20(0.9)
10a = 30 - 18 = 12
a = 1.2

Y-intercept = 1.2
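The hand derivation can be double-checked in code (a sketch, not from the slides), using the five data points implied by the deviations table, (0, 1), (1, 3), (2, 2), (3, 4), (4, 5). Solving the normal equations should reproduce a = 1.2 and b = 0.9.

```python
# Solve the least-squares normal equations for the five example points.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 2, 4, 5]
n = len(xs)

# Setting the partial derivatives of the sum of squared deviations
# to zero gives the normal equations:
#   n*a       + (sum x)*b   = sum y
#   (sum x)*a + (sum x^2)*b = sum xy
sx = sum(xs)
sy = sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))

b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
a = (sy - b * sx) / n                          # intercept
print(a, b)  # 1.2 0.9
```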

Estimating the model (the easy way)

Calculating the slope (b): b = SP / SSx, where

SSx = sum of squares for X = Σ(X - X-bar)^2
SSy = sum of squares for Y = Σ(Y - Y-bar)^2
SP = sum of products = Σ(X - X-bar)(Y - Y-bar)

Calculating the Y-intercept (a): a = Y-bar - b(X-bar)

Calculating the error term (e): e = Y - Y-hat, where Y-hat is the predicted value of Y. e will be different for every observation; it is a measure of how much we are off in our prediction.
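These shortcut formulas can be sketched in code (illustrative, reusing the same five example points from the hand derivation); they give the same slope and intercept as the calculus route.

```python
# Slope and intercept via sums of squares and products:
#   b = SP / SSx,  a = ybar - b * xbar,  e_i = y_i - (a + b * x_i)
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 2, 4, 5]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

ss_x = sum((x - xbar) ** 2 for x in xs)                    # SSx
sp = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))  # SP

b = sp / ss_x        # slope
a = ybar - b * xbar  # Y-intercept
errors = [y - (a + b * x) for x, y in zip(xs, ys)]  # e = Y - Y-hat
print(a, b)
```

Note that the residuals of a least-squares fit always sum to (approximately) zero, which is a quick sanity check on the computation.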

Regression is strongly related to Correlation