SEEMINGLY UNRELATED REGRESSION: INFERENCE AND TESTING
Sunando Barua, Binamrata Haldar, Indranil Rath, Himanshu Mehrunkar

Four Steps of Hypothesis Testing
1. Hypotheses: the null hypothesis (H0) states that the parameter(s) take a specific value (usually "no effect"); the alternative hypothesis (H1) states that the parameter value(s) fall in some alternative range of values ("an effect").
2. Test statistic: compares the data with what H0 predicts, often by finding the number of standard errors between the sample point estimate and the H0 value of the parameter. For example, the test statistic for Student's t-test is
   t = (point estimate − H0 value of the parameter) / (standard error of the estimate).

3. P-value (P): a probability measure of the evidence about H0, namely the probability (computed under the presumption that H0 is true) that the test statistic equals the observed value or a value even more extreme in the direction predicted by H1. The smaller the P-value, the stronger the evidence against H0.
4. Conclusion: if no decision is needed, report and interpret the P-value. If a decision is needed, select a cutoff point (such as 0.05 or 0.01) and reject H0 if the P-value ≤ that value.
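As a minimal illustration of steps 2 to 4, the SAS data step below computes a t statistic, its two-sided P-value and the 5% decision. The estimate, standard error and degrees of freedom used here are made-up numbers, not results from this study.

data t_test_sketch;
   b_hat  = 1.25;      /* hypothetical point estimate (not from this study) */
   b_null = 0;         /* value of the parameter under H0                   */
   se_b   = 0.48;      /* hypothetical standard error                       */
   df     = 13;        /* hypothetical degrees of freedom                   */
   t_stat = (b_hat - b_null) / se_b;          /* step 2: test statistic     */
   p_val  = 2*(1 - probt(abs(t_stat), df));   /* step 3: two-sided P-value  */
   reject = (p_val <= 0.05);                  /* step 4: decision at 5% LOS */
run;
proc print data=t_test_sketch; run;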

Seemingly Unrelated Regression

Firm                  Inv(t)   mcap(t-1)   nfa(t-1)   a(t-1)
Ashok Leyland         i_a      mcap_a      nfa_a      a_a
Mahindra & Mahindra   i_m      mcap_m      nfa_m      a_m
Tata Motors           i_t      mcap_t      nfa_t      a_t

Inv(t):      Gross investment at time t
mcap(t-1):   Value of the firm's outstanding shares at time t-1 (using NSE closing prices)
nfa(t-1):    Net fixed assets at time t-1
a(t-1):      Current assets at time t-1

System Specification

I = Xβ + Є,   E(Є) = 0,   E(ЄЄ′) = ∑ ⊗ I_17

SIMPLE CASE [σ_ij = 0, σ_ii = σ², so that ∑ ⊗ I_17 = σ²I_51]

Estimation: the OLS estimation method can be applied to the individual equations of the SUR model,
   β̂_OLS = (X′X)⁻¹X′I

SAS commands:
proc reg data=sasuser.ppt;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
run;

proc syslin data=sasuser.ppt sdiag sur;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
run;

Estimated equations
Ashok Leyland:        i_a = … + … mcap_a + … nfa_a + … a_a
Mahindra & Mahindra:  i_m = … + … mcap_m + … nfa_m + … a_m
Tata Motors:          i_t = … + … mcap_t + … nfa_t + … a_t

Regression results for Mahindra & Mahindra (OLS)
Dependent variable: i_m (investment)

Keeping the other explanatory variables constant, a 1-unit increase in mcap_m at t-1 results in an average increase of … units in i_m at t. Similarly, a 1-unit increase in nfa_m at t-1 results in an average increase of … units in i_m, and a 1-unit increase in a_m at t-1 results in an average increase of … units in i_m at t. From the P-values we can see that, at the 10% level of significance, the estimates of the mcap_m and a_m coefficients are significant.

Parameter Estimates
Variable    Label              DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   Intercept          …    …                    …                …         …
mcap_m      Mkt. cap           …    …                    …                …         …
nfa_m       Net fixed assets   …    …                    …                …         …
a_m         Assets             …    …                    …                …         …

GENERAL CASE [∑ is free]

Estimation: we need to use the GLS method of estimation, since the error variance-covariance matrix ∑ ⊗ I_17 of the SUR model is no longer of the form σ²I.
   β̂_GLS = [X′(∑ ⊗ I_17)⁻¹X]⁻¹ X′(∑ ⊗ I_17)⁻¹I

SAS command:
proc syslin data=sasuser.ppt sur;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
run;
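For intuition, the PROC IML sketch below evaluates the GLS formula above on simulated data. The sample size (T = 17) and number of parameters (K = 4) mirror the slides, but the regressors, the error covariance matrix Sigma and the coefficients are all made up; in practice Sigma is estimated from the OLS residuals of each equation, which is what the SUR option of PROC SYSLIN does internally.

proc iml;
   call randseed(123);
   T = 17;  K = 4;  M = 3;                      /* obs per firm, parameters, firms     */

   /* simulated regressors: intercept plus three explanatory variables per firm */
   X1 = j(T,1,1) || randnormal(T, j(1,3,0), I(3));
   X2 = j(T,1,1) || randnormal(T, j(1,3,0), I(3));
   X3 = j(T,1,1) || randnormal(T, j(1,3,0), I(3));
   X  = block(X1, X2, X3);                      /* stacked block-diagonal design       */

   Sigma = {1.0 0.4 0.3, 0.4 1.5 0.2, 0.3 0.2 2.0};  /* assumed error covariance       */
   e     = randnormal(T, j(1,3,0), Sigma);      /* contemporaneously correlated errors */
   I_vec = X*j(M*K,1,0.5) + shape(e`, M*T, 1);  /* stacked investment vector (fake)    */

   Omega_inv = inv(Sigma) @ I(T);               /* inverse of (Sigma (x) I_17)         */
   b_gls = inv(X`*Omega_inv*X) * X`*Omega_inv*I_vec;  /* GLS estimator                 */
   print b_gls;
quit;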

Estimated equations
Ashok Leyland:        i_a = … + … mcap_a + … nfa_a − 0.065 a_a
Mahindra & Mahindra:  i_m = … + … mcap_m + … nfa_m + … a_m
Tata Motors:          i_t = … + … mcap_t + 2.1 nfa_t + … a_t

Regression results for Mahindra & Mahindra (SUR)
Dependent variable: i_m (investment)

Keeping the other explanatory variables constant, a 1-unit increase in mcap_m at t-1 results in an average increase of … units in i_m at t. Similarly, a 1-unit increase in nfa_m at t-1 results in an average increase of 1.16 units in i_m, and a 1-unit increase in a_m at t-1 results in an average increase of 0.67 units in i_m at t. From the P-values we can see that, at the 10% level of significance, the estimates of the mcap_m and nfa_m coefficients are significant.

Parameter Estimates
Variable    DF   Parameter Estimate   Standard Error   t Value   Pr > |t|   Variable Label
Intercept   …    …                    …                …         …          Intercept
mcap_m      …    …                    …                …         …          mktcap
nfa_m       …    …                    …                …         …          netfixedassets
a_m         …    …                    …                …         …          assets

HYPOTHESIS TESTING
The appropriate framework for the test is constrained versus unconstrained estimation.

SIMPLE CASE 1 (σ_ij = 0, σ_ii = σ²)
ASHOK LEYLAND AND MAHINDRA & MAHINDRA
H0: β1 = β2        H1: β1 ≠ β2

Variable   Description                                     Value
σ_ii       Variance                                        σ²
σ_ij       Contemporaneous covariance                      0
N          Number of firms                                 2
T1         Number of observations of Ashok Leyland         17
T2         Number of observations of Mahindra & Mahindra   17
K          Number of parameters                            4

Unconstrained model:
[I1]   [X1   0 ] [β1]   [ε1]
[I2] = [ 0   X2] [β2] + [ε2]

Constrained model, imposing H0: β1 = β2 = β:
[I1]   [X1]       [ε1]
[I2] = [X2] β  +  [ε2]
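As a sketch of the difference between the two designs, the short PROC IML fragment below builds both: the unconstrained model uses a block-diagonal regressor matrix (one β per firm), while the constrained model simply stacks the two regressor matrices under a common β. X1 and X2 are simulated stand-ins for the two firms' 17 × 4 regressor matrices, used here only to make the dimensions concrete.

proc iml;
   call randseed(1);
   T = 17;
   X1 = j(T,1,1) || randnormal(T, j(1,3,0), I(3));   /* stand-in for Ashok Leyland */
   X2 = j(T,1,1) || randnormal(T, j(1,3,0), I(3));   /* stand-in for M&M           */

   X_uc = block(X1, X2);      /* unconstrained: separate beta1 and beta2 (34 x 8) */
   X_c  = X1 // X2;           /* constrained under H0: beta1 = beta2     (34 x 4) */

   dims = (nrow(X_uc) || ncol(X_uc)) // (nrow(X_c) || ncol(X_c));
   print dims[rowname={"unconstrained" "constrained"} colname={"rows" "cols"}];
quit;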

Residuals and sums of squares: ε̂_i = I_i − Î_i,   SS_i = ε̂_i′ε̂_i

SAS commands used to calculate the sums of squares:

Unconstrained model
proc syslin data=sasuser.ppt sdiag sur;
   al: model i_a = mcap_a nfa_a a_a;
   mm: model i_m = mcap_m nfa_m a_m;
run;

Constrained model
proc syslin data=sasuser.ppt sdiag sur;
   al: model i_a = mcap_a nfa_a a_a;
   mm: model i_m = mcap_m nfa_m a_m;
   joint: srestrict al.mcap_a = mm.mcap_m, al.nfa_a = mm.nfa_m,
          al.a_a = mm.a_m, al.intercept = mm.intercept;
run;

Unconstrained model: SS_al = … ;  SS_mm = … ;  SS_uc = SS_al + SS_mm = … ;  DOF_uc = T1 + T2 − K − K = 26
Constrained model:   SS_c = … ;  DOF_c = T1 + T2 − K = 30

Number of restrictions = DOF_c − DOF_uc = 4
F_cal = [(SS_c − SS_uc)/number of restrictions] / [SS_uc/DOF_uc] ~ F(4, 26) = …
The F_tab value at 5% LOS is 2.74.
Decision criterion: we reject H0 when F_cal > F_tab.
Therefore, we reject H0 at 5% LOS: not all the coefficients in the two coefficient vectors are equal.
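A small data step shows how F_cal, the 5% critical value and the decision are computed. The degrees of freedom (4 and 26) are taken from this slide, but the two sums of squares below are placeholders (illustrative values, not the study's results).

data f_test_sketch;
   ss_uc  = 150;                    /* placeholder unconstrained sum of squares */
   ss_c   = 230;                    /* placeholder constrained sum of squares   */
   q      = 4;                      /* number of restrictions = DOFc - DOFuc    */
   dof_uc = 26;                     /* T1 + T2 - 2K                             */
   f_cal  = ((ss_c - ss_uc)/q) / (ss_uc/dof_uc);
   f_tab  = finv(0.95, q, dof_uc);  /* 5% critical value, about 2.74            */
   p_val  = 1 - probf(f_cal, q, dof_uc);
   reject = (f_cal > f_tab);        /* reject H0 when F_cal > F_tab             */
run;
proc print data=f_test_sketch; run;

The same data step, with the appropriate sums of squares and degrees of freedom, covers the F tests on the remaining slides.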

SIMPLE CASE 2 (σ_ij = 0, σ_ii = σ²)
ASHOK LEYLAND, MAHINDRA & MAHINDRA AND TATA MOTORS
H0: β1 = β2 = β3        H1: not all βi are equal

Variable   Description                                     Value
σ_ii       Variance                                        σ²
σ_ij       Contemporaneous covariance                      0
N          Number of firms                                 3
T1         Number of observations of Ashok Leyland         17
T2         Number of observations of Mahindra & Mahindra   17
T3         Number of observations of Tata Motors           17
K          Number of parameters                            4

SAS commands used to calculate the sums of squares:

Unconstrained model
proc syslin data=sasuser.ppt sdiag sur;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
run;

Constrained model
proc syslin data=sasuser.ppt sdiag sur;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
   joint: srestrict al.mcap_a = mm.mcap_m = tata.mcap_t,
          al.nfa_a = mm.nfa_m = tata.nfa_t,
          al.a_a = mm.a_m = tata.a_t,
          al.intercept = mm.intercept = tata.intercept;
run;

Unconstrained model: SS_al = … ;  SS_mm = … ;  SS_tata = … ;  SS_uc = SS_al + SS_mm + SS_tata = … ;  DOF_uc = T1 + T2 + T3 − K − K − K = 39
Constrained model:   SS_c = … ;  DOF_c = T1 + T2 + T3 − K = 47

Number of restrictions = DOF_c − DOF_uc = 8
F_cal = [(SS_c − SS_uc)/number of restrictions] / [SS_uc/DOF_uc] ~ F(8, 39) = …
The F_tab value at 5% LOS is 2.18.
Decision criterion: we reject H0 when F_cal > F_tab.
Therefore, we reject H0 at 5% LOS: not all the coefficients in the three coefficient vectors are equal.

SIMPLE CASE 3 (σ_ij = 0, σ_ii = σ²)
ASHOK LEYLAND AND MAHINDRA & MAHINDRA
H0: β1 = β2        H1: β1 ≠ β2

Variable   Description                                     Value
σ_ii       Variance                                        σ²
σ_ij       Contemporaneous covariance                      0
N          Number of firms                                 2
T1         Number of observations of Ashok Leyland         17
T2         Number of observations of Mahindra & Mahindra   2    (T2 < K)
K          Number of parameters                            4

β2 of the unconstrained model cannot be estimated by OLS because (X2′X2) is not invertible: with T2 < K, |X2′X2| = 0.
NOTE: SS_uc = SS_1 + SS_2; SS_1 can be obtained, but SS_2 cannot be calculated due to insufficient degrees of freedom.
However, we can estimate the model for Ashok Leyland by OLS (SS_uc = SS_1, with T1 − K degrees of freedom).
Under the null hypothesis, we estimate the constrained model using the T1 + T2 observations (SS_c, with T1 + T2 − K degrees of freedom).
So we can do the test even when T2 = 1.

SAS command for the constrained model:
proc syslin data=sasuser.file1 sdiag sur;
   al: model i = mcap nfa a;
run;

Unconstrained model: SS_al = … ;  SS_uc = SS_al = … ;  DOF_uc = T1 − K = 13
Constrained model:   SS_c = … ;  DOF_c = T1 + T2 − K = 15

Number of restrictions = DOF_c − DOF_uc = 2
F_cal = [(SS_c − SS_uc)/number of restrictions] / [SS_uc/DOF_uc] ~ F(2, 13) = …
The F_tab value at 5% LOS is 3.81.
Decision criterion: we reject H0 when F_cal > F_tab.
Therefore, we reject H0 at 5% LOS: not all the coefficients in the two coefficient vectors are equal.

SIMPLE CASE 4 (PARTIAL TEST)

Y1 = X_a1 β_a1 + X_a2 β_a2 + ε1
(T1×1)   [T1×(k1−S)] [(k1−S)×1]   (T1×S) (S×1)   (T1×1)

Y2 = X_b1 β_b1 + X_b2 β_b2 + ε2
(T2×1)   [T2×(k2−S)] [(k2−S)×1]   (T2×S) (S×1)   (T2×1)

The full coefficient vectors are
β1 = (β11, β12, β13, β14, β15, β16)′        β2 = (β21, β22, β23, β24, β25, β26, β27)′

They are rearranged and partitioned so that the S coefficients to be compared form the second block:
β1 = (β11, β13, β15, β16 | β12, β14)′ = (β_a1′ | β_a2′)′
β2 = (β21, β23, β24, β25, β26 | β22, β27)′ = (β_b1′ | β_b2′)′
Only β_a2 and β_b2 need to be compared.

Ashok Leyland and Mahindra & Mahindra

Unconstrained model:
I1 = X_a1 β_a1 + X_a2 β_a2 + ε1
I2 = X_b1 β_b1 + X_b2 β_b2 + ε2

Constrained model, with the common block β on the shared regressors:
[I1]   [X_a1    0    X_a2] [β_a1]   [ε1]
[I2] = [  0   X_b1   X_b2] [β_b1] + [ε2]
                           [ β  ]

H0: β_a2 = β_b2 = β        H1: β_a2 ≠ β_b2

SAS commands used to calculate the sums of squares:

Unconstrained model
proc syslin data=sasuser.ppt sdiag sur;
   al: model i_a = mcap_a nfa_a a_a;
   mm: model i_m = mcap_m nfa_m a_m;
run;

Constrained model
proc syslin data=sasuser.ppt sdiag sur;
   al: model i_a = mcap_a nfa_a a_a;
   mm: model i_m = mcap_m nfa_m a_m;
   joint: srestrict al.nfa_a = mm.nfa_m, al.a_a = mm.a_m;
run;

Unconstrained model: SS_uc = … ;  DOF_uc = T1 + T2 − K1 − K2 = 26
Constrained model:   SS_c = … × 28 = … ;  DOF_c = T1 + T2 − (K1 + K2 − S) = 28

Number of restrictions = DOF_c − DOF_uc = S = 2
F_cal = [(SS_c − SS_uc)/number of restrictions] / [SS_uc/DOF_uc] ~ F(2, 26) = …
The F_tab value at 5% LOS is 3.37.
Decision criterion: we reject H0 when F_cal > F_tab.
Therefore, we reject H0 at 5% LOS: the compared coefficient blocks β_a2 and β_b2 are not equal.

GENERAL CASE (∑ is free)
ASHOK LEYLAND, MAHINDRA & MAHINDRA AND TATA MOTORS
H0: β1 = β2 = β3        H1: not H0

Variable   Description                                     Value
σ_ii       Variance                                        σ_i²
σ_ij       Contemporaneous covariance                      σ_ij
N          Number of firms                                 3
T1         Number of observations of Ashok Leyland         17
T2         Number of observations of Mahindra & Mahindra   17
T3         Number of observations of Tata Motors           17
K          Number of parameters                            4

SAS commands used to calculate the sums of squares:

Unconstrained model
proc syslin data=sasuser.ppt sur;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
run;

Constrained model
proc syslin data=sasuser.ppt sur;
   al:   model i_a = mcap_a nfa_a a_a;
   mm:   model i_m = mcap_m nfa_m a_m;
   tata: model i_t = mcap_t nfa_t a_t;
   joint: srestrict al.mcap_a = mm.mcap_m = tata.mcap_t,
          al.nfa_a = mm.nfa_m = tata.nfa_t,
          al.a_a = mm.a_m = tata.a_t,
          al.intercept = mm.intercept = tata.intercept;
run;

Unconstrained model: SS_uc = … × 39 = … ;  DOF_uc = T1 + T2 + T3 − K − K − K = 39
Constrained model:   SS_c = … × 47 = … ;  DOF_c = T1 + T2 + T3 − K = 47

Number of restrictions = DOF_c − DOF_uc = 8
F_cal = [(SS_c − SS_uc)/number of restrictions] / [SS_uc/DOF_uc] ~ F(8, 39) = …
The F_tab value at 5% LOS is 2.18.
Decision criterion: we reject H0 when F_cal > F_tab.
Therefore, we reject H0 at 5% LOS: not all the coefficients in the three coefficient vectors are equal.

CHOW TEST
MAHINDRA & MAHINDRA (period 1: … ; period 2: …)
H0: β11 = β12        H1: not H0
(β11 denotes the coefficient vector in period 1 and β12 the coefficient vector in period 2.)

Variable   Description                              Value
σ_ii       Variance                                 σ_i²
σ_ij       Contemporaneous covariance               σ_ij
T1         Number of observations for period 1      …
T2         Number of observations for period 2      …
K          Number of parameters                     4

SAS command: structural change test
proc autoreg data=sasuser.ppt;
   mm: model i_m = mcap_m nfa_m a_m / chow=(10);
run;

Result:
Test   Break Point   Num DF   Den DF   F Value   Pr > F
Chow            10        4        9      7.26        …

Inference: F(4, 9) = 3.63 at 5% LOS and F_cal = 7.26; the P-value is also very small. Since F_cal > F(4, 9), we reject H0 at 5% LOS: the coefficients differ between the two periods.
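The decision can be double-checked with the values reported on this slide (F_cal = 7.26 with 4 and 9 degrees of freedom); the data step below recomputes the 5% critical value and the P-value.

data chow_check;
   f_cal  = 7.26;                    /* Chow F statistic from the slide                 */
   f_tab  = finv(0.95, 4, 9);        /* 5% critical value, about 3.63                   */
   p_val  = 1 - probf(f_cal, 4, 9);  /* small, so H0 of stable coefficients is rejected */
   reject = (f_cal > f_tab);
run;
proc print data=chow_check; run;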

THANK YOU!