The Simple Regression Model


The Simple Regression Model
Chapter 2, Wooldridge: Introductory Econometrics: A Modern Approach, 5e

The Simple Regression Model
Definition of the simple linear regression model:
y = β0 + β1 x + u
"Explains variable y in terms of variable x."
β0: intercept
β1: slope parameter
y: dependent variable, explained variable, response variable, …
x: independent variable, explanatory variable, regressor, …
u: error term, disturbance, unobservables, …

The Simple Regression Model
Interpretation of the simple linear regression model
The simple linear regression model is rarely applicable in practice, but its discussion is useful for pedagogical reasons.
"Studies how y varies with changes in x": Δy = β1 Δx, as long as Δu = 0.
By how much does the dependent variable change if the independent variable is increased by one unit? This interpretation is only correct if all other things remain equal when the independent variable is increased by one unit.

The Simple Regression Model
Example: Soybean yield and fertilizer
yield = β0 + β1 fertilizer + u, where u contains factors such as rainfall, land quality, presence of parasites, …
β1 measures the effect of fertilizer on yield, holding all other factors fixed.
Example: A simple wage equation
wage = β0 + β1 educ + u, where u contains factors such as labor force experience, tenure with current employer, work ethic, intelligence, …
β1 measures the change in hourly wage given another year of education, holding all other factors fixed.

The Simple Regression Model
When is there a causal interpretation?
Conditional mean independence assumption: E(u|x) = E(u), i.e. the explanatory variable must not contain information about the mean of the unobserved factors.
Example: wage equation, where u contains, e.g., intelligence. Here the conditional mean independence assumption is unlikely to hold, because individuals with more education will also be more intelligent on average.

The Simple Regression Model
Population regression function (PRF)
The conditional mean independence assumption, together with the normalization E(u) = 0, implies that E(y|x) = β0 + β1 x.
This means that the average value of the dependent variable can be expressed as a linear function of the explanatory variable.

The Simple Regression Model
Population regression function
For individuals with a given value of the explanatory variable x, the average value of y is E(y|x) = β0 + β1 x.

The Simple Regression Model
In order to estimate the regression model one needs data:
a random sample of n observations {(xi, yi): i = 1, …, n}.
First observation: (x1, y1); second observation: (x2, y2); third observation: (x3, y3); …; n-th observation: (xn, yn).
yi is the value of the dependent variable for the i-th observation; xi is the value of the explanatory variable for the i-th observation.

The Simple Regression Model
Fit a regression line through the data points as well as possible:
ŷ = β̂0 + β̂1 x (fitted regression line)
For example, the i-th data point is (xi, yi).

The Simple Regression Model
What does "as well as possible" mean?
Regression residuals: ûi = yi − ŷi = yi − β̂0 − β̂1 xi
Minimize the sum of squared regression residuals: min Σi ûi²
The minimizers are the ordinary least squares (OLS) estimates:
β̂1 = Σi (xi − x̄)(yi − ȳ) / Σi (xi − x̄)²,  β̂0 = ȳ − β̂1 x̄
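These closed-form expressions are easy to compute directly; below is a minimal sketch in Python (NumPy assumed; the data values are made up purely for illustration):

```python
import numpy as np

# Made-up illustrative data; any two equal-length arrays work.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS slope: sample covariance of (x, y) divided by the sample variation in x.
b1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# OLS intercept: forces the fitted line through the point of sample averages.
b0_hat = y.mean() - b1_hat * x.mean()

u_hat = y - (b0_hat + b1_hat * x)   # regression residuals
print(b0_hat, b1_hat, np.sum(u_hat ** 2))
```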

The Simple Regression Model
CEO salary and return on equity
Fitted regression: predicted salary = 963.191 + 18.501 roe
salary: salary in thousands of dollars; roe: return on equity (in percent) of the CEO's firm.
The intercept is the predicted salary when roe = 0.
If the return on equity increases by one percentage point, salary is predicted to increase by 18.501, i.e. by $18,501.
Causal interpretation?
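In practice such a regression would be run with a library rather than by hand. A sketch using statsmodels' formula interface, assuming a pandas DataFrame df with columns salary and roe (the values below are made up and only stand in for a real CEO-salary data set):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Made-up data frame standing in for a real CEO-salary data set.
df = pd.DataFrame({
    "salary": [1095.0, 1001.0, 1122.0, 578.0, 1368.0, 1145.0],  # thousands of dollars
    "roe":    [14.1, 10.9, 23.5, 5.9, 13.8, 20.0],              # return on equity, percent
})

results = smf.ols("salary ~ roe", data=df).fit()
print(results.params)     # estimated intercept and slope
print(results.rsquared)   # R-squared of the regression
```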

The Simple Regression Model
The fitted regression line (which depends on the sample) versus the unknown population regression line.

The Simple Regression Model
Wage and education
Fitted regression: predicted wage = −0.90 + 0.54 educ
wage: hourly wage in dollars; educ: years of education.
The intercept is the predicted wage at zero years of education (not meaningful in itself).
In the sample, one more year of education was associated with an increase in the hourly wage of $0.54.
Causal interpretation?

The Simple Regression Model
Voting outcomes and campaign expenditures (two parties)
Fitted regression: predicted voteA = 26.81 + 0.464 shareA
voteA: percentage of the vote for candidate A; shareA: candidate A's percentage share of total campaign expenditures.
If candidate A's share of spending increases by one percentage point, he or she receives 0.464 percentage points more of the total vote.
Causal interpretation?

The Simple Regression Model
Properties of OLS on any sample of data
Fitted or predicted values: ŷi = β̂0 + β̂1 xi
Residuals (deviations from the regression line): ûi = yi − ŷi
Algebraic properties of OLS regression:
(1) The residuals sum to zero: Σi ûi = 0.
(2) The sample covariance (and hence correlation) between the regressor and the residuals is zero: Σi xi ûi = 0.
(3) The point of sample averages (x̄, ȳ) lies on the regression line.
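These algebraic properties can be checked numerically; a short sketch with made-up data, using np.polyfit only as a convenient OLS routine:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1, b0 = np.polyfit(x, y, 1)        # slope and intercept of the OLS fit

y_hat = b0 + b1 * x                 # fitted values
u_hat = y - y_hat                   # residuals

print(np.isclose(u_hat.sum(), 0.0))              # (1) residuals sum to zero
print(np.isclose((x * u_hat).sum(), 0.0))        # (2) zero sample covariance with x
print(np.isclose(y.mean(), b0 + b1 * x.mean()))  # (3) (x-bar, y-bar) lies on the line
```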

The Simple Regression Model
For example, CEO number 12's salary was $526,023 lower than predicted using the information on his firm's return on equity.

The Simple Regression Model
Goodness-of-fit
Measures of variation: "How well does the explanatory variable explain the dependent variable?"
Total sum of squares: SST = Σi (yi − ȳ)², the total variation in the dependent variable.
Explained sum of squares: SSE = Σi (ŷi − ȳ)², the variation explained by the regression.
Residual sum of squares: SSR = Σi ûi², the variation not explained by the regression.

The Simple Regression Model
Decomposition of total variation: SST = SSE + SSR (total variation = explained part + unexplained part)
Goodness-of-fit measure (R-squared): R² = SSE/SST = 1 − SSR/SST
R-squared measures the fraction of the total variation that is explained by the regression.
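A sketch of the decomposition and of R-squared, again with made-up data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)       # total sum of squares
sse = np.sum((y_hat - y.mean()) ** 2)   # explained sum of squares
ssr = np.sum((y - y_hat) ** 2)          # residual sum of squares

print(np.isclose(sst, sse + ssr))       # SST = SSE + SSR
print(sse / sst, 1 - ssr / sst)         # two equivalent ways to compute R-squared
```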

The Simple Regression Model
CEO salary and return on equity: R² = 0.0132, i.e. the regression explains only 1.3% of the total variation in salaries.
Voting outcomes and campaign expenditures: R² = 0.856, i.e. the regression explains 85.6% of the total variation in election outcomes.
Caution: a high R-squared does not necessarily mean that the regression has a causal interpretation!

The Simple Regression Model
Incorporating nonlinearities: semi-logarithmic form
Regression of log wages on years of education: log(wage) = β0 + β1 educ + u
This changes the interpretation of the regression coefficient: the dependent variable is the natural logarithm of the wage, and 100·β1 is the (approximate) percentage change of the wage if years of education are increased by one year.

The Simple Regression Model
Fitted regression: predicted log(wage) = 0.584 + 0.083 educ
The wage increases by about 8.3% for every additional year of education (the return to education).
For example: the growth rate of the wage is 8.3% per year of education.
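Note that 8.3% is the approximation 100·β1; the exact percentage change implied by a semi-log coefficient is 100·(exp(β1) − 1). A one-line check of the difference:

```python
import math

b1 = 0.083                         # estimated semi-log coefficient on educ
print(100 * b1)                    # approximate return to education: 8.3 %
print(100 * (math.exp(b1) - 1))    # exact implied change: about 8.65 %
```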

The Simple Regression Model
Incorporating nonlinearities: log-logarithmic form
CEO salary and firm sales: log(salary) = β0 + β1 log(sales) + u
This changes the interpretation of the regression coefficient: the dependent variable is the natural logarithm of the CEO's salary, the explanatory variable is the natural logarithm of his or her firm's sales, and β1 is the percentage change of salary if sales increase by 1%.
Logarithmic changes are (approximately) percentage changes.
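In the log-log specification the slope is an elasticity; written out (a standard calculus approximation, not taken from the slides):

```latex
\log(\text{salary}) = \beta_0 + \beta_1 \log(\text{sales}) + u
\;\;\Rightarrow\;\;
\beta_1 = \frac{\partial \log(\text{salary})}{\partial \log(\text{sales})}
        \approx \frac{\%\Delta\,\text{salary}}{\%\Delta\,\text{sales}}
```

So, holding u fixed, a 1% increase in sales is associated with a β1% change in salary.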

The Simple Regression Model
CEO salary and firm sales: fitted regression predicted log(salary) = 4.822 + 0.257 log(sales)
For example: a 1% increase in sales is associated with a 0.257% increase in salary.
The log-log form postulates a constant-elasticity model, whereas the semi-log form assumes a semi-elasticity model.

The Simple Regression Model
Expected values and variances of the OLS estimators
The estimated regression coefficients are random variables because they are calculated from a random sample: the data are random and depend on the particular sample that has been drawn.
The question is what the estimators estimate on average and how large their variability in repeated samples is.

The Simple Regression Model
Standard assumptions for the linear regression model
Assumption SLR.1 (linear in parameters): in the population, the relationship between y and x is linear, y = β0 + β1 x + u.
Assumption SLR.2 (random sampling): the data {(xi, yi): i = 1, …, n} are a random sample drawn from the population; each data point therefore follows the population equation, yi = β0 + β1 xi + ui.

The Simple Regression Model
Discussion of random sampling: wage and education
The population consists, for example, of all workers of country A.
In the population, a linear relationship between wages (or log wages) and years of education holds.
Draw a worker completely at random from the population. The wage and the years of education of the worker drawn are random, because one does not know beforehand which worker is drawn.
Throw the worker back into the population and repeat the random draw n times.
The wages and years of education of the n sampled workers are used to estimate the linear relationship between wages and education.

The Simple Regression Model
The values drawn for the i-th worker are (xi, yi).
The implied deviation from the population relationship for the i-th worker is ui = yi − β0 − β1 xi.

The Simple Regression Model
Assumptions for the linear regression model (cont.)
Assumption SLR.3 (sample variation in the explanatory variable): the values of the explanatory variable are not all the same (otherwise it would be impossible to study how different values of the explanatory variable lead to different values of the dependent variable).
Assumption SLR.4 (zero conditional mean): E(ui|xi) = 0, i.e. the value of the explanatory variable must contain no information about the mean of the unobserved factors.

The Simple Regression Model
Theorem 2.1 (unbiasedness of OLS): under assumptions SLR.1 – SLR.4, E(β̂0) = β0 and E(β̂1) = β1.
Interpretation of unbiasedness:
The estimated coefficients may be smaller or larger, depending on the sample, which is the result of a random draw.
However, on average they will be equal to the values that characterize the true relationship between y and x in the population.
"On average" means: if sampling were repeated, i.e. if drawing the random sample and doing the estimation were repeated many times.
In a given sample, estimates may differ considerably from the true values.
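Unbiasedness can be illustrated by simulation: fix true parameters, draw many random samples from a population satisfying SLR.1 – SLR.4, estimate the coefficients in each sample, and average the estimates. A sketch with arbitrarily chosen true values β0 = 1 and β1 = 0.5:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 0.5          # true (population) parameters, chosen for illustration
n, n_samples = 100, 10_000       # sample size and number of repeated samples

estimates = np.empty((n_samples, 2))
for s in range(n_samples):
    x = rng.normal(4.0, 1.0, size=n)     # explanatory variable
    u = rng.normal(0.0, 2.0, size=n)     # error, independent of x (SLR.4 holds)
    y = beta0 + beta1 * x + u
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    estimates[s] = (b0, b1)

# Averages over repeated samples should be close to the true values (1.0, 0.5).
print(estimates.mean(axis=0))
```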

The Simple Regression Model
Variances of the OLS estimators
Depending on the sample, the estimates will be nearer to or farther from the true population values. How far can we expect our estimates to be from the true population values on average (= sampling variability)?
Sampling variability is measured by the estimators' variances.
Assumption SLR.5 (homoskedasticity): Var(u|x) = σ², i.e. the value of the explanatory variable must contain no information about the variability of the unobserved factors.

The Simple Regression Model
Graphical illustration of homoskedasticity: the variability of the unobserved influences does not depend on the value of the explanatory variable.

The Simple Regression Model
An example of heteroskedasticity: wage and education. The variance of the unobserved determinants of wages increases with the level of education.
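The contrast can be made concrete by simulating errors whose spread is constant in x versus errors whose spread grows with x (all numbers below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(8, 20, size=500)             # e.g. years of education

u_homo = rng.normal(0.0, 2.0, size=x.size)   # Var(u|x) = 4 for every value of x
u_hetero = rng.normal(0.0, 0.2 * x)          # standard deviation grows with x

# Compare the spread of the errors for low and high values of x.
low, high = x < 12, x > 16
print(u_homo[low].std(), u_homo[high].std())      # roughly equal
print(u_hetero[low].std(), u_hetero[high].std())  # clearly larger for high x
```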

The Simple Regression Model
Theorem 2.2 (variances of the OLS estimators): under assumptions SLR.1 – SLR.5, the variances of β̂0 and β̂1, conditional on the sample values of x, are given by the formulas below.
Conclusion: the sampling variability of the estimated regression coefficients will be higher, the larger the variability of the unobserved factors, and lower, the higher the variation in the explanatory variable.
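For reference, the standard formulas behind Theorem 2.2 (conditional on the sample values of x; σ² is the error variance) are:

```latex
\operatorname{Var}(\hat\beta_1) = \frac{\sigma^2}{\sum_{i=1}^{n}(x_i-\bar{x})^2},
\qquad
\operatorname{Var}(\hat\beta_0) = \frac{\sigma^2 \, n^{-1}\sum_{i=1}^{n} x_i^2}{\sum_{i=1}^{n}(x_i-\bar{x})^2}
```

Both variances grow with the error variance σ² and shrink as the total variation in x, Σi (xi − x̄)², grows, which is exactly the conclusion stated above.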

The Simple Regression Model
Estimating the error variance
Under homoskedasticity, the variance of u does not depend on x, i.e. the conditional variance is equal to the unconditional variance σ² = Var(u).
One could estimate the error variance by the sample variance of the residuals; unfortunately this estimate would be biased.
An unbiased estimate of the error variance is obtained by dividing the sum of squared residuals by the number of observations minus the number of estimated regression coefficients: σ̂² = SSR / (n − 2).

The Simple Regression Model
Theorem 2.3 (unbiasedness of the error variance): under assumptions SLR.1 – SLR.5, E(σ̂²) = σ².
Calculation of standard errors for regression coefficients: plug in σ̂² for the unknown σ² in the variance formulas and take the square root.
The estimated standard deviations of the regression coefficients are called "standard errors". They measure how precisely the regression coefficients are estimated.
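A sketch of these two steps in Python, with made-up data: estimate σ̂² = SSR/(n − 2), plug it into the variance formulas, and take square roots:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.3, 3.8, 6.1, 7.9, 10.2, 11.8])
b1, b0 = np.polyfit(x, y, 1)
u_hat = y - (b0 + b1 * x)                  # regression residuals

n = x.size
sigma2_hat = np.sum(u_hat ** 2) / (n - 2)  # unbiased estimate of the error variance
sst_x = np.sum((x - x.mean()) ** 2)        # total sample variation in x

se_b1 = np.sqrt(sigma2_hat / sst_x)                    # standard error of the slope
se_b0 = np.sqrt(sigma2_hat * np.mean(x ** 2) / sst_x)  # standard error of the intercept
print(se_b0, se_b1)
```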