The Simple Linear Regression Model: Specification and Estimation (Hill et al., Chs. 3 and 4)
Presentation transcript:

3.1 Ch. 3 Simple Linear Regression
Regression serves three purposes:
1. To estimate relationships among economic variables, such as y = f(x) or c = f(i).
2. To test hypotheses about these relationships, such as: is the marginal propensity to consume less than 1.0, i.e., dc/di < 1 (do people save?).
3. To forecast/predict the value of one variable based on the value of another, such as: what will consumer spending be next year?

3.2 Population Regression Function
Y, the dependent variable: the variable whose behavior we seek to explain.
X, the independent variable: the variable that determines (in part) the variation in the dependent variable.
The model has two parts:
1) the deterministic portion: β1 + β2x
2) the stochastic portion: e
The population regression function is: y = β1 + β2x + e

3.3 Example: Weekly Food Expenditures
y_t = dollars spent each week on food items by household t
x_t = weekly income of household t
Suppose we believe x and y have a linear, causal relationship whereby variation in x "causes" variation in y: y = β1 + β2x. However, it is unlikely that the relationship will be exact, so we need to allow for an error term, e: y = β1 + β2x + e

3.4 The Error Term
The presence of the (unobservable) error term drives much of what we do. Suppose (incorrectly) that the relationship between X and Y did not have an error term, and suppose β1 = 30 and β2 = 0.15. Then for families with x = $480:
Y = β1 + β2x = 30 + 0.15(480) = $102
This would mean that all families with a weekly income of $480 spend $102 per week on food. Very unlikely. We are far more likely to see variation in weekly food expenditures among the families who earn $480 per week (or any other amount).
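The arithmetic in this example is easy to check with a short sketch; the values β1 = 30 and β2 = 0.15 are the ones assumed on this slide.

```python
# Deterministic part of the model, using the slide's assumed parameter values.
beta1, beta2 = 30.0, 0.15

def predicted_expenditure(income):
    """E(y|x) = beta1 + beta2 * x: mean weekly food spending at a given weekly income."""
    return beta1 + beta2 * income

print(predicted_expenditure(480))  # 30 + 0.15 * 480 = 102.0
```

Without an error term, every household earning $480 would be predicted to spend exactly this amount, which is the implausible implication the slide points out.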

3.5 The Error Term (con't)
Y is a random variable because it is the sum of a non-random component, (β1 + β2x), and a random component, e. The error term e picks up:
1. Omitted variables (unspecified factors) that influence the dependent variable
2. Effects of a non-linear relationship between Y and X
3. Unpredictable random behavior unique to each observation (the model is one of behavior)

3.6 Use Rules of Mathematical Expectation
From Chapter 2, you learned the following rules of mathematical expectation (the E(.) operator). Suppose Z is a random variable and a is a constant (it has no randomness):
E(Z + a) = E(Z) + a
Suppose b is also a constant:
E(bZ + a) = bE(Z) + a
In both rules, the E(.) operator passes through constants and applies only to the random variable.
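These rules can be illustrated by simulation. In this sketch Z is standard normal and the constants a = 5 and b = 3 are chosen arbitrarily; the sample analogue of E(bZ + a) is compared with bE(Z) + a.

```python
import random
import statistics

random.seed(42)
z = [random.gauss(0, 1) for _ in range(200_000)]  # draws of a random variable Z

a, b = 5.0, 3.0
lhs = statistics.fmean(b * zi + a for zi in z)  # sample mean of bZ + a
rhs = b * statistics.fmean(z) + a               # b * (sample mean of Z) + a

# The two agree up to float rounding, and both are near b*E(Z) + a = 5.
print(round(lhs, 2), round(rhs, 2))
```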

3.7 Assumptions of the Model
1. The value of y, for each value of x, is y = β1 + β2x + e; the model is linear in the parameters (β) and the error term is additive.
2. The average value of the random error e is zero: E(e) = 0.
3. The variance of the random error e is constant: var(e) = σ² = var(y) (homoskedasticity).
4. The error term is serially independent (uncorrelated with itself): cov(e_i, e_j) = cov(y_i, y_j) = 0 for i ≠ j.

3.8 Assumptions of the Model (con't)
5. The variable X is not random and must take at least two different values, so cov(X, e) = 0 [or E(Xe) = 0, because E(e) = 0 by assumption 2].
6. (optional) e is normally distributed with mean 0 and variance σ²: e ~ N(0, σ²)
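A data-generating process satisfying assumptions 1 through 6 can be simulated as follows. The parameter values (β1 = 30, β2 = 0.15, σ = 10) are illustrative only, not estimates from any dataset.

```python
import random

random.seed(1)
beta1, beta2, sigma = 30.0, 0.15, 10.0  # illustrative parameter values

# Assumption 5: x is fixed (not random) and takes more than one value.
x = [200, 400, 600, 800] * 250  # 1000 observations

# Assumptions 2, 3, 4, and 6: e drawn independently from N(0, sigma^2).
e = [random.gauss(0, sigma) for _ in x]

# Assumption 1: y is linear in the parameters with an additive error.
y = [beta1 + beta2 * xt + et for xt, et in zip(x, e)]

mean_e = sum(e) / len(e)
print(round(mean_e, 2))  # sample mean of e is close to E(e) = 0
```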

3.9 [Figure: the probability density function f(y_t) at two levels of household income, x_1 = $480 and x_2 = $800, with expenditure and income on the axes.]

3.10 Goal of Regression Analysis
The economic model is E(y|x) = β1 + β2x. The econometric model is y = β1 + β2x + e. We want to estimate this mean, which is just a line with an intercept and a slope:
β1 measures E(y|x = 0)
β2 measures the change in E(y) from a change in x

3.11 [Figure: probability density functions for e and y; f(e) is centered at 0, f(y) at β1 + β2x.]

3.12 The Framework
Population regression values: y_t = β1 + β2x_t + e_t (the data on y_t are generated by this econometric model)
Population regression line, the mean of y: E(y_t|x_t) = β1 + β2x_t
The parameters of the model are:
β1, the intercept
β2, the slope
σ², the variance of the error term e_t

3.13 The Estimation
The parameters β1, β2, and σ² are unknown to us and must be estimated. Take a sample of data on X and Y. The idea is to fit a line through the data points. Call this line the sample regression line:
ŷ_t = b1 + b2x_t
where b1 is the estimator for β1 and b2 is the estimator for β2. Define a residual as:
ê_t = y_t - ŷ_t
It measures the difference between the actual y value and the "fitted" (or "predicted") value.
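Fitted values and residuals follow directly from these definitions. In this sketch both the sample and the estimates (b1 = 0, b2 = 2) are invented for illustration; they are not the food expenditure data.

```python
# Hypothetical sample and hypothetical estimates.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.1, 3.9, 6.2, 7.8]
b1, b2 = 0.0, 2.0

y_hat = [b1 + b2 * xt for xt in x]             # fitted values on the sample regression line
resid = [yt - yh for yt, yh in zip(y, y_hat)]  # residuals: e_hat_t = y_t - y_hat_t

print(resid)  # approximately [0.1, -0.1, 0.2, -0.2]
```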

3.14 [Figure.]

3.15 Estimating the Population Regression Line: The Least Squares Principle
Any line through the data points will generate a set of residuals. We want the line that generates the smallest residuals, in the sense that their sum of squares is smallest. We call this the method of least squares.
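One way to see the least squares principle at work is a brute-force search, which is not how the estimates are actually computed (the following slides derive closed-form formulas) but makes the idea concrete: evaluate the sum of squared residuals over a grid of candidate lines and keep the smallest. The data here are invented.

```python
def ssr(b1, b2, xs, ys):
    """Sum of squared residuals for the candidate line y = b1 + b2*x."""
    return sum((yt - (b1 + b2 * xt)) ** 2 for xt, yt in zip(xs, ys))

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.1, 5.9, 8.2, 9.8]

# Coarse grid of candidate (intercept, slope) pairs.
candidates = [(a / 10, b / 10) for a in range(-20, 21) for b in range(0, 41)]
best = min(candidates, key=lambda p: ssr(p[0], p[1], xs, ys))

print(best)  # close to the exact least squares solution, b1 = 0.09, b2 = 1.97
```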

3.16 Obtaining the Least Squares Estimates
To obtain the least squares estimates of β1 and β2, we use calculus to find the values of the slope and intercept that minimize the sum of squared residuals. This provides formulas, called the least squares estimators, that we can use in any regression problem.

3.17 The sum of squared deviations:
S(β1, β2) = Σ_t (y_t - β1 - β2x_t)²

3.18 Minimize this sum with respect to β1 and β2 by taking partial derivatives, one with respect to β1 and another with respect to β2:
∂S/∂β1 = -2 Σ_t (y_t - β1 - β2x_t)
∂S/∂β2 = -2 Σ_t x_t(y_t - β1 - β2x_t)
When these two terms are set to zero, β1 and β2 become b1 and b2, because they no longer represent just any values of β1 and β2 but the special values that correspond to the minimum of S(β1, β2).

3.19 Setting the derivatives to zero gives two equations in two unknowns (b1 and b2). Therefore, solve these two equations for b1 and b2. Let's start with #1:
#1: Σ y_t = T b1 + b2 Σ x_t, which gives b1 = ȳ - b2 x̄
#2: Σ x_t y_t = b1 Σ x_t + b2 Σ x_t²

3.20 Now solve #2 for b2, substituting in our formula for b1:
b2 = (Σ x_t y_t - T x̄ ȳ) / (Σ x_t² - T x̄²)
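The least squares estimators being derived here can be coded directly. This sketch uses the standard raw-sums form b2 = (Σx_t y_t - T x̄ȳ)/(Σx_t² - T x̄²) together with b1 = ȳ - b2 x̄, on a tiny invented dataset that lies exactly on the line y = 1 + 2x, so the formulas should recover those values exactly.

```python
def ols(xs, ys):
    """Least squares estimators:
    b2 = (sum(x*y) - n*xbar*ybar) / (sum(x^2) - n*xbar^2), b1 = ybar - b2*xbar."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b2 = (sum(xt * yt for xt, yt in zip(xs, ys)) - n * xbar * ybar) / (
        sum(xt ** 2 for xt in xs) - n * xbar ** 2
    )
    b1 = ybar - b2 * xbar
    return b1, b2

# Data constructed to lie exactly on y = 1 + 2x.
b1, b2 = ols([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
print(b1, b2)  # 1.0 2.0
```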

3.21 And from slide 3.25, equivalently:
b2 = Σ (x_t - x̄)(y_t - ȳ) / Σ (x_t - x̄)²
or:
b2 = (T Σ x_t y_t - Σ x_t Σ y_t) / (T Σ x_t² - (Σ x_t)²)
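The equivalence of the two standard expressions for b2, the deviation-from-means form and the raw-sums form, can be spot-checked numerically on invented data:

```python
xs = [2.0, 4.0, 6.0, 8.0]
ys = [1.0, 3.0, 4.0, 7.0]
T = len(xs)
xbar, ybar = sum(xs) / T, sum(ys) / T

# Deviation-from-means form of b2:
b2_dev = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
    (x - xbar) ** 2 for x in xs
)

# Raw-sums form of b2:
b2_raw = (T * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    T * sum(x * x for x in xs) - sum(xs) ** 2
)

print(b2_dev, b2_raw)  # 0.95 0.95
```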

3.22 Estimates for Food Expenditure Data
How do we interpret these estimates?

3.23 [Figure.]

3.24 Regression Terminology
Concept: Definition
Dependent variable (y): the variable whose behavior we want to model.
Independent variable (x): the variable that we believe determines the dependent variable.
Parameters (coefficients) β1 and β2: define the nature of the relationship between Y and X.
Random error term (e): we expect the relationship between X and Y to have a random element.
Estimators b1 and b2: formulas for combining the sample data on X and Y to estimate the intercept and slope.
Fitted line ŷ_t = b1 + b2x_t: predicted values for Y, using the estimates of the intercept and slope.
Residual ê_t = y_t - ŷ_t: the difference between the actual and predicted values of Y.