Multiple Regression Applications


Multiple Regression Applications (Lecture 15)

Today's plan
- The relationship between R^2 and the F-test
- Restricted least squares, and testing the imposition of a linear restriction on the model

R^2
We know R^2 = ESS/TSS = Σ(Ŷ_i - Ȳ)² / Σ(Y_i - Ȳ)². We can rewrite this as R^2 = 1 - RSS/TSS.
Remember:
- If R^2 = 1, the model explains all of the variation in Y
- If R^2 = 0, the model explains none of the variation in Y

R^2 (2)
We know from the sum of squares identity that Σ(Y_i - Ȳ)² = Σ(Ŷ_i - Ȳ)² + Σ ê_i², or TSS = ESS + RSS.
Dividing by the total sum of squares we get 1 = R^2 + RSS/TSS.

R^2 (3)
Thus we have R^2 = 1 - RSS/TSS, or ESS = R^2 · TSS and RSS = (1 - R^2) · TSS.
If we divide the numerator and denominator of the F-test, F = [ESS/(k-1)] / [RSS/(n-k)], by the total sum of squares:
F = [R^2/(k-1)] / [(1 - R^2)/(n-k)]

F-stat in terms of R^2
Even if you're not given the residual sum of squares, you can compute the F-statistic as F = [R^2/(k-1)] / [(1 - R^2)/(n-k)].
Recalling our LINEST output (from L13.xls), we can substitute R^2 = 0.188.
We would reject the null at the 5% significance level but accept it at the 1% significance level.
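As a quick sketch, the formula above can be checked numerically. The R^2 = 0.188 is from the lecture's LINEST output, but the n = 32 and k = 3 used below are illustrative assumptions, since the transcript does not reproduce them:

```python
def f_from_r2(r2: float, n: int, k: int) -> float:
    """Overall F-statistic computed from R^2 alone:
    F = [R^2/(k-1)] / [(1-R^2)/(n-k)]."""
    return (r2 / (k - 1)) / ((1 - r2) / (n - k))

# R^2 = 0.188 is from the lecture; n and k here are hypothetical.
F = f_from_r2(0.188, n=32, k=3)
print(round(F, 2))  # 3.36 for these assumed n and k
```

The critical values the slide compares against would then come from the F(k-1, n-k) distribution table.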

Relationship between R^2 & F
- When R^2 = 0 there is no relationship between Y and the X variables; this can be written as Y = a. In this instance F = 0 and we accept the null.
- When R^2 = 1, all variation in Y is explained by the X variables. The F-statistic approaches infinity as its denominator goes to zero, so in this instance we always reject the null.

Restricted Least Squares
We now impose a linear restriction on a regression model and re-examine the relationship between R^2 and the F-test.
In restricted least squares we want to test a restriction such as H0: α + β = 1, where our model is lnY = a + α lnL + β lnK + e.
We can write β = 1 - α and substitute it into the model equation so that: (lnY - lnK) = a + α(lnL - lnK) + e
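The substitution step, written out (a short derivation not spelled out on the slide):

```latex
\begin{align*}
\ln Y &= a + \alpha \ln L + \beta \ln K + e \\
      &= a + \alpha \ln L + (1 - \alpha)\ln K + e && \text{imposing } \beta = 1 - \alpha \\
\ln Y - \ln K &= a + \alpha(\ln L - \ln K) + e
\end{align*}
```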

Restricted Least Squares (2)
We can rewrite our equation as G = a + αZ + e*, where G = (lnY - lnK) and Z = (lnL - lnK).
The model with G as the dependent variable is our restricted model: the equation we estimate under the assumption that the null hypothesis is true.

Restricted Least Squares (3)
How do we test one model against another? We take the unrestricted and restricted forms and compare them with an F-test. The F-statistic is:
F = [(RSS* - RSS)/q] / [RSS/(n-k)]
- the * refers to the restricted model
- q is the number of constraints; in this case q = 1 (α + β = 1)
- n - k is the degrees of freedom of the unrestricted model
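A minimal helper for this statistic (the function name and the demo numbers are illustrative, not from the lecture):

```python
def restriction_f(rss_restricted: float, rss_unrestricted: float,
                  q: int, df_unrestricted: int) -> float:
    """F-test for q linear restrictions:
    F = [(RSS* - RSS)/q] / [RSS/(n-k)],
    where RSS* is from the restricted model and RSS from the unrestricted one."""
    return ((rss_restricted - rss_unrestricted) / q) / (rss_unrestricted / df_unrestricted)

# Illustrative values: if imposing one restriction raises RSS from 1.5 to 2.0
# with 30 degrees of freedom left in the unrestricted model, F = 0.5 / 0.05 = 10.
print(restriction_f(2.0, 1.5, q=1, df_unrestricted=30))  # 10.0
```

Note that the statistic is zero when the restriction costs nothing (RSS* = RSS) and grows as the restriction worsens the fit.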

Testing linear restrictions
We wish to test the linear restriction imposed on the Cobb-Douglas log-linear model: a test for constant returns to scale, or the restriction H0: α + β = 1.
We will use the data in L14.xls to test this restriction; the calculations are worked out in L15.xls.

Testing linear restrictions (2)
The unrestricted regression estimated from the data gives slope coefficients α = 0.674 (standard error 0.026) and β = 0.447 (standard error 0.030).
Note the t-ratios for the coefficients:
- α: 0.674/0.026 = 26.01
- β: 0.447/0.030 = 14.98
Compared to a critical t-value of around 2 at the 5% significance level, both α and β are very precisely determined coefficients.

Testing linear restrictions (3)
Adding up the regression coefficients, we have 0.674 + 0.447 = 1.121. How do we test whether or not this sum is statistically different from 1?
First, we rewrite the restriction as β = 1 - α. Our restricted model is then (lnY - lnK) = a + α(lnL - lnK) + e, or G = a + αZ + e*.

Testing linear restrictions (4)
The procedure for estimation is as follows:
1. Estimate the unrestricted version of the model.
2. Estimate the restricted version of the model.
3. Collect RSS for the unrestricted model and RSS* for the restricted model.
4. Compute the F-test, where q is the number of restrictions (in this case q = 1) and (n - k) is the degrees of freedom of the unrestricted model:
F = [(RSS* - RSS)/q] / [RSS/(n-k)]
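The four steps can be sketched end to end. The L15.xls data is not reproduced in the transcript, so this sketch simulates Cobb-Douglas data with α + β = 1.2 (so the restriction is false) and runs both regressions with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
lnL = rng.normal(size=n)
lnK = rng.normal(size=n)
# Simulated technology with alpha = 0.7, beta = 0.5, so alpha + beta = 1.2.
lnY = 1.0 + 0.7 * lnL + 0.5 * lnK + rng.normal(scale=0.1, size=n)

# Step 1: unrestricted model  lnY = a + alpha*lnL + beta*lnK + e
X = np.column_stack([np.ones(n), lnL, lnK])
_, rss_u, _, _ = np.linalg.lstsq(X, lnY, rcond=None)

# Step 2: restricted model  (lnY - lnK) = a + alpha*(lnL - lnK) + e
G, Z = lnY - lnK, lnL - lnK
Xr = np.column_stack([np.ones(n), Z])
_, rss_r, _, _ = np.linalg.lstsq(Xr, G, rcond=None)

# Steps 3-4: collect the residual sums of squares and compute the F-statistic.
q, k = 1, 3
F = ((rss_r[0] - rss_u[0]) / q) / (rss_u[0] / (n - k))
# F is large here, so the (false) restriction alpha + beta = 1 is rejected.
```

Since the restriction does not hold in the simulated data, the restricted fit is much worse and F comes out far above any conventional critical value.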

Testing linear restrictions (5)
On L15.xls we find a sample of n = 32; estimating the unrestricted model gives RSS = 0.351 and R^2 = 0.963.

Testing linear restrictions (7)
The restricted model gives RSS* = 1.228 and R^2* = 0.871. We can use this information to compute our F-statistic:
F* = [(1.228 - 0.351)/1] / (0.351/29) ≈ 72.47
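Plugging the lecture's numbers into the formula (a quick arithmetic check):

```python
rss_restricted = 1.228    # RSS* from the restricted model
rss_unrestricted = 0.351  # RSS from the unrestricted model
q, df = 1, 29             # one restriction; n - k = 32 - 3 = 29
F = ((rss_restricted - rss_unrestricted) / q) / (rss_unrestricted / df)
print(round(F, 2))  # 72.46 with these rounded inputs; Excel's unrounded data give 72.4665
```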

Testing linear restrictions (8)
The F table value at the 5% significance level is F(0.05, 1, 29) = 4.18.
Since F* > F(0.05, 1, 29), we reject the null hypothesis that there are constant returns to scale.
NOTE: the dependent variables of the restricted and unrestricted models are different:
- dependent variable in the unrestricted version: lnY
- dependent variable in the restricted version: (lnY - lnK)

Testing linear restrictions (9)
We can also use R^2 to calculate the F-statistic, by dividing the numerator and denominator through by the total sum of squares. Using our definition of R^2 we can write:
F = [(R^2 - R^2*)/q] / [(1 - R^2)/(n-k)]

Testing linear restrictions (10)
NOTE: we cannot simply use the R^2 from the unrestricted model, since it has a different dependent variable. What we need is an unrestricted model whose dependent variable is G, obtained by subtracting lnK from both sides:
G = a + α(lnL - lnK) + (α + β - 1) lnK + e, where G = (lnY - lnK)
We can test the restriction this way because α + β - 1 = 0.121, since our estimates give α + β = 1.121.
Estimating this unrestricted model gives us the unrestricted R^2.

Testing linear restrictions (11)
From L15.xls we have R^2* = 0.871 and R^2 = 0.963. Our computed F-statistic is:
F = [(0.963 - 0.871)/1] / [(1 - 0.963)/29] ≈ 72.1
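The same arithmetic with the R^2 values (it differs slightly from the RSS-based 72.47 only because of rounding in the inputs):

```python
r2_unrestricted = 0.963  # unrestricted model, dependent variable G
r2_restricted = 0.871    # restricted model
q, df = 1, 29
F = ((r2_unrestricted - r2_restricted) / q) / ((1 - r2_unrestricted) / df)
print(round(F, 1))  # 72.1
```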

Testing linear restrictions (12)
On L15.xls we have 32 observations of output, employment, and capital. The spreadsheet shows regression output for the restricted and unrestricted models; the R^2 and sums of squares are in bold type, and the F-tests on the restriction are at the bottom of the sheet.
Excel gives us an F-statistic of 72.4665, against an F table value of 4.1830 at the 5% significance level. The probability of an F-statistic this large under the null (the p-value) is very small.

Testing linear restrictions (13)
From this we conclude that the model exhibits increasing returns to scale. We don't know the true value of α + β, but we can reject the restriction that there are constant returns to scale.