Copyright © 2006 The McGraw-Hill Companies, Inc. All rights reserved. McGraw-Hill/Irwin The Two-Variable Model: Hypothesis Testing chapter seven.

7-2 The Classical Linear Regression Model Assumptions
- The regression model is linear in the parameters.
- The explanatory variables, X, are uncorrelated with the error term.
  - Always true if the X's are nonstochastic (fixed numbers, as in conditional regression analysis).
  - Stochastic X's require simultaneous-equations models.
- Given the value of X_i, the expected value of the disturbance term is zero: E(u | X_i) = 0. See Fig. 7-1.

7-3 Figure 7-1 Conditional distribution of the disturbances u_i.

7-4 More Assumptions of the CLRM
- The variance of each u_i is constant (homoscedastic): var(u_i) = σ². Individual Y values are spread around their mean with the same variance. See Fig. 7-2(a). Unequal variance is heteroscedasticity, Fig. 7-2(b).
- There is no correlation across the error terms (no autocorrelation): cov(u_i, u_j) = 0 for i ≠ j, i.e., the u_i are random. See Fig. 7-3.
- The model is correctly specified (no specification error or specification bias).

7-5 Figure 7-2 (a) Homoscedasticity (equal variance); (b) Heteroscedasticity (unequal variance).

7-6 Figure 7-3 Patterns of autocorrelation: (a) No autocorrelation; (b) positive autocorrelation; (c) negative autocorrelation.

7-7 Variances and Standard Errors
The CLRM assumptions allow us to estimate the variances and standard errors of the OLS estimators:
var(b_2) = σ² / Σ(X_i − X̄)² and var(b_1) = σ² ΣX_i² / (n Σ(X_i − X̄)²), with standard errors given by the square roots.
The unknown σ² is estimated by σ̂² = Σe_i² / (n − 2), where the e_i are the residuals and n − 2 is the degrees of freedom (more generally n − k, with k estimated parameters).
The standard error of the regression is σ̂ = √(Σe_i² / (n − 2)).
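A minimal sketch of these computations, using hypothetical data rather than the textbook's Lotto numbers:

```python
# Estimate a two-variable regression by OLS and compute the variance and
# standard-error estimates the CLRM assumptions justify. Data are made up.
import math

X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
Y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]
n = len(X)

x_bar = sum(X) / n
y_bar = sum(Y) / n
Sxx = sum((x - x_bar) ** 2 for x in X)
Sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))

b2 = Sxy / Sxx                  # slope estimate
b1 = y_bar - b2 * x_bar         # intercept estimate

residuals = [y - (b1 + b2 * x) for x, y in zip(X, Y)]
rss = sum(e ** 2 for e in residuals)

sigma2_hat = rss / (n - 2)                 # unbiased estimator of sigma^2
se_regression = math.sqrt(sigma2_hat)      # standard error of the regression
se_b2 = math.sqrt(sigma2_hat / Sxx)        # se(b_2)
se_b1 = math.sqrt(sigma2_hat * sum(x ** 2 for x in X) / (n * Sxx))  # se(b_1)
```

The degrees-of-freedom divisor n − 2 reflects the two estimated parameters, b_1 and b_2.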

7-8 Table 7-1 Computations for the Lotto example.

7-9 Gauss-Markov Theorem
Given the assumptions of the CLRM, the OLS estimators are BLUE (best linear unbiased estimators):
- b_1 and b_2 are linear estimators.
- E(b_1) = B_1 and E(b_2) = B_2: in repeated applications the means of the estimators converge to the true values (unbiased).
- The estimator of σ² is unbiased.
- b_1 and b_2 are efficient estimators (minimum variance among linear unbiased estimators).
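The unbiasedness claim can be illustrated by simulation. This is a sketch with assumed true values B_1 = 1 and B_2 = 2: averaged over many repeated samples, the OLS slope estimates should center on B_2.

```python
# Monte Carlo sketch of unbiasedness: E(b_2) = B_2 under the CLRM.
# True parameter values and the design are assumptions for illustration.
import random

random.seed(42)
B1, B2, sigma = 1.0, 2.0, 1.0
X = [float(i) for i in range(1, 21)]       # fixed (nonstochastic) regressors
x_bar = sum(X) / len(X)
Sxx = sum((x - x_bar) ** 2 for x in X)

slopes = []
for _ in range(2000):                      # "repeated applications"
    Y = [B1 + B2 * x + random.gauss(0.0, sigma) for x in X]
    y_bar = sum(Y) / len(Y)
    b2 = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / Sxx
    slopes.append(b2)

mean_b2 = sum(slopes) / len(slopes)        # close to B2 = 2.0
```

Any single b_2 misses the true value, but the average across samples does not: that is what unbiasedness means.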

7-10 Sampling Distributions of OLS Estimators
Under the assumption that the error term u_i of the PRF is normally distributed, u_i ~ N(0, σ²), the OLS estimators are themselves normally distributed:
b_1 ~ N(B_1, σ²_b1), b_2 ~ N(B_2, σ²_b2)
This follows from the property that any linear function of a normally distributed variable is itself normally distributed (and holds approximately in large samples by the Central Limit Theorem even without normality of u_i).

7-11 Figure 7-4 (Normal) sampling distributions of b_1 and b_2.

7-12 Hypothesis Testing
Suppose we want to test the hypothesis H_0: B_2 = 0. Since b_2 is normally distributed, we could use the standard normal distribution for hypotheses about its mean, except that its variance is unknown and must be estimated. We therefore use the t distribution:
t = (estimator − hypothesized value) / se = (b_2 − B_2) / se(b_2), with n − 2 degrees of freedom.
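A short sketch of the test, with made-up numbers for the slope estimate and its standard error:

```python
# t test of H_0: B_2 = 0. The estimate, its standard error, and the sample
# size are hypothetical illustration values, not results from the text.
b2 = 0.45        # hypothetical OLS slope estimate
se_b2 = 0.12     # hypothetical standard error of b2
B2_null = 0.0    # value of B_2 under H_0

t_stat = (b2 - B2_null) / se_b2    # (estimator - hypothesized value) / se

# With, say, n - 2 = 8 degrees of freedom, the two-tailed 5% critical value
# of the t distribution is 2.306; |t| beyond it rejects H_0.
t_crit = 2.306
reject = abs(t_stat) > t_crit
```

Here t = 3.75 exceeds the critical value, so H_0: B_2 = 0 would be rejected at the 5% level.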

7-13 Figure 7-7 One-tailed t test: (a) Right-tailed; (b) left-tailed.

7-14 Coefficient of Determination, r²
How good is the fitted regression line? Write the regression relationship in terms of deviations from mean values, then square it and sum over the sample. The resulting sums can be interpreted individually.

7-15 Coefficient of Determination, r²
The Total Sum of Squares (TSS) is composed of the Explained Sum of Squares (ESS) and the Residual Sum of Squares (RSS): TSS = ESS + RSS.
r² = ESS/TSS indicates the proportion of the total variation in Y explained by the sample regression function (SRF).
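The decomposition can be verified numerically. A sketch on hypothetical data:

```python
# TSS = ESS + RSS and r^2 = ESS/TSS for an OLS fit with intercept.
# The data below are made up for illustration.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [1.8, 4.1, 5.9, 8.2, 9.8]
n = len(X)

x_bar, y_bar = sum(X) / n, sum(Y) / n
Sxx = sum((x - x_bar) ** 2 for x in X)
b2 = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y)) / Sxx
b1 = y_bar - b2 * x_bar
Y_hat = [b1 + b2 * x for x in X]

TSS = sum((y - y_bar) ** 2 for y in Y)              # total variation in Y
ESS = sum((yh - y_bar) ** 2 for yh in Y_hat)        # variation explained by the SRF
RSS = sum((y - yh) ** 2 for y, yh in zip(Y, Y_hat)) # unexplained residual variation

r2 = ESS / TSS    # proportion of total variation explained, between 0 and 1
```

For OLS with an intercept the identity TSS = ESS + RSS holds exactly, which is what makes r² interpretable as a proportion.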

7-16 Figure 7-8 Breakdown of the total variation in Y_i.

7-17 Reporting Results of Regression Analysis
For simple regression, as in the Lotto example, report the fitted equation with the standard errors (and/or t statistics) beneath the coefficient estimates, along with r². For multiple equations and/or explanatory variables, see Table II in schooltrans.doc.

7-18 Caution: Forecasting
While we can calculate an estimate of Y for any given value of X using the regression results, the variance of that estimate increases as the chosen X value departs from the mean of X. Consider the confidence band in the Lotto example (see the following figure). Forecasts of Y for X values far from their mean and/or outside the range of the sample are unreliable and should be avoided.
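Under the CLRM, the variance of the estimated mean of Y at a point X_0 is σ²(1/n + (X_0 − X̄)²/Σ(X_i − X̄)²), which grows with the squared distance of X_0 from X̄. A sketch with hypothetical values for the estimated quantities:

```python
# Why the confidence band fans out: the standard error of the estimated mean
# of Y at X = x0 grows as x0 moves away from x_bar. All inputs are assumed
# illustration values, not the Lotto results.
import math

sigma2_hat = 1.5    # hypothetical estimate of sigma^2
n = 10              # hypothetical sample size
x_bar = 5.0         # hypothetical mean of X
Sxx = 82.5          # hypothetical sum of squared deviations of X

def se_mean_forecast(x0):
    # se of the estimated mean of Y at X = x0 under the CLRM
    return math.sqrt(sigma2_hat * (1.0 / n + (x0 - x_bar) ** 2 / Sxx))

se_at_mean = se_mean_forecast(5.0)    # narrowest point of the band, at x_bar
se_far = se_mean_forecast(12.0)       # wider far from x_bar
```

The band is narrowest at X̄ and widens on both sides, which is why extrapolation beyond the sample range is discouraged.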

7-19 Figure 95% confidence band for the true Lotto expenditure function.