
Charles University, STAKAN III
Econometrics, Tenth Lecture
Jan Ámos Víšek
FSV UK, Institute of Economic Studies, Faculty of Social Sciences
Tuesday, 14.00 – 15.20

Schedule of today's talk
- Confidence intervals and regions
- Testing hypotheses about submodels
- Chow's test of identity of models

In the fourth lecture we had

Lemma
Assumptions: Let $e_1, \dots, e_n$ be iid. r.v.'s distributed as $N(0, \sigma^2)$. Moreover, let $X$ be of type $n \times p$ and $X^T X$ be regular. Put $s_n^2 = \frac{1}{n-p} \sum_{i=1}^{n} r_i^2$, where $r_i = Y_i - X_i^T \hat{\beta}^{(LS)}$ are the residuals. (Replacing the unknown $\sigma$ by the estimate $s_n$ is called studentization.)
Assertions: Then
$$\frac{\hat{\beta}_i^{(LS)} - \beta_i^0}{s_n \sqrt{\left[ (X^T X)^{-1} \right]_{ii}}} \sim t_{n-p},$$
i.e. it is distributed as Student with $n-p$ d.f. It immediately gives a $(1-\alpha) \cdot 100\%$ confidence interval for the $i$-th coordinate of $\beta^0$. (We shall start with this row on the next slide.)

So, the promised row: Hence the interval
$$\left( \hat{\beta}_i^{(LS)} - t_{n-p}(1-\tfrac{\alpha}{2}) \, s_n \sqrt{\left[ (X^T X)^{-1} \right]_{ii}}, \;\; \hat{\beta}_i^{(LS)} + t_{n-p}(1-\tfrac{\alpha}{2}) \, s_n \sqrt{\left[ (X^T X)^{-1} \right]_{ii}} \right)$$
covers $\beta_i^0$ with probability $1-\alpha$. What about a confidence interval for several parameters simultaneously? In the sixties, rectangular confidence intervals were studied, employing mainly the so-called Bonferroni inequality. From the beginning of the seventies, due to the fact that better results about the joint d.f. were available, mostly confidence regions in the shape of rotational ellipsoids were considered.
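As a quick numerical illustration (not part of the original slides), the following Python sketch computes this t-based interval on simulated data; the design, the "true" coefficients, and the seed are all invented for the example, and numpy/scipy are assumed available.

```python
# Hypothetical illustration: t-based CI for one regression coefficient.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 100, 3                                  # n observations, p regressors (incl. intercept)
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=1.5, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y                   # least squares estimate
r = Y - X @ beta_hat                           # residuals
s2 = r @ r / (n - p)                           # studentization: estimate of sigma^2

alpha, i = 0.05, 1                             # 95% interval for beta_1
half = stats.t.ppf(1 - alpha / 2, df=n - p) * np.sqrt(s2 * XtX_inv[i, i])
print(f"CI for beta_{i}: ({beta_hat[i] - half:.3f}, {beta_hat[i] + half:.3f})")
```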

Also in the fourth lecture we had

Corollary
Assumptions: Let $e_1, \dots, e_n$ be iid. r.v.'s distributed as $N(0, \sigma^2)$. Moreover, let $X$ be of type $n \times p$ and $X^T X$ be regular.
Assertions: Then
$$\frac{(\hat{\beta}^{(LS)} - \beta^0)^T X^T X \, (\hat{\beta}^{(LS)} - \beta^0)}{p \, s_n^2} \sim F_{p, n-p},$$
i.e. it is Fisher-Snedecor distributed. It immediately gives a $(1-\alpha) \cdot 100\%$ confidence region for $\beta^0$ (as a vector), an ellipsoid centered at $\hat{\beta}^{(LS)}$.
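A small sketch of how the ellipsoid can be used in practice: checking whether a candidate vector b lies inside the region. Again a hypothetical simulated example (invented values, numpy/scipy assumed).

```python
# Hypothetical illustration: membership in the F-based confidence ellipsoid.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=1.5, size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
s2 = (Y - X @ beta_hat) @ (Y - X @ beta_hat) / (n - p)

b = np.array([1.0, 2.0, -0.5])                 # candidate value of beta
F_stat = (beta_hat - b) @ (X.T @ X) @ (beta_hat - b) / (p * s2)
inside = F_stat <= stats.f.ppf(0.95, p, n - p) # inside the 95% ellipsoid?
print(f"F statistic {F_stat:.3f}; inside: {inside}")
```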

Confidence interval for the model value $x_0^T \beta^0$ at a given point $x_0$

Lemma
Assumptions: Let $e_1, \dots, e_n$ be iid. r.v.'s distributed as $N(0, \sigma^2)$. Moreover, let $X$ be of type $n \times p$ and $X^T X$ be regular. For a given $x_0 \in R^p$ put $\hat{Y}_0 = x_0^T \hat{\beta}^{(LS)}$.
Assertions: Then the confidence interval
$$\left( \hat{Y}_0 - t_{n-p}(1-\tfrac{\alpha}{2}) \, s_n \sqrt{x_0^T (X^T X)^{-1} x_0}, \;\; \hat{Y}_0 + t_{n-p}(1-\tfrac{\alpha}{2}) \, s_n \sqrt{x_0^T (X^T X)^{-1} x_0} \right)$$
covers $x_0^T \beta^0$ with probability $1-\alpha$.

Prior to proving the lemma, let us make an idea of what $x_0^T (X^T X)^{-1} x_0$ represents (assuming an intercept in the model). The first row (and of course also column) of $X^T X$ is $n \cdot (1, \bar{x}_2, \dots, \bar{x}_p)$, where $\bar{x}_j = \frac{1}{n} \sum_{i=1}^{n} x_{ij}$. Hence $X^T X$ can be written in the block form
$$X^T X = \begin{pmatrix} n & n \bar{x}^T \\ n \bar{x} & Z^T Z \end{pmatrix},$$
where $Z$ collects the non-intercept columns of $X$ and $\bar{x} = (\bar{x}_2, \dots, \bar{x}_p)^T$. Then, inverting by blocks,

continued

$$x_0^T (X^T X)^{-1} x_0 = \frac{1}{n} + (\tilde{x}_0 - \bar{x})^T \left[ \sum_{i=1}^{n} (\tilde{x}_i - \bar{x})(\tilde{x}_i - \bar{x})^T \right]^{-1} (\tilde{x}_0 - \bar{x}),$$
where $\tilde{x}_0$ denotes the non-intercept coordinates of $x_0$. Finally, in words: $x_0^T (X^T X)^{-1} x_0$ is (up to the constant $\frac{1}{n}$) a distance of $x_0$ from $\bar{x}$ (i.e. from a "gravity center" of the data) in a Riemann space with a metric tensor which is constant over the whole space.
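The decomposition above can be checked numerically. The sketch below (hypothetical simulated data, numpy assumed) compares $x_0^T (X^T X)^{-1} x_0$ with $\frac{1}{n}$ plus the squared distance of the non-intercept part of $x_0$ from the gravity center.

```python
# Hypothetical check of the "gravity center" decomposition.
import numpy as np

rng = np.random.default_rng(0)
n = 100
Z = rng.normal(size=(n, 2))                    # non-intercept regressors
X = np.column_stack([np.ones(n), Z])
x0 = np.array([1.0, 0.3, -0.7])                # intercept coordinate is 1

lhs = x0 @ np.linalg.inv(X.T @ X) @ x0

zbar = Z.mean(axis=0)
Zc = Z - zbar                                  # centered regressors
d = x0[1:] - zbar                              # deviation from the gravity center
rhs = 1 / n + d @ np.linalg.inv(Zc.T @ Zc) @ d
print(np.isclose(lhs, rhs))                    # True: both expressions agree
```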

Proof of Lemma
Let us recall that $\hat{\beta}^{(LS)} = (X^T X)^{-1} X^T Y$. We have
$$x_0^T \hat{\beta}^{(LS)} - x_0^T \beta^0 = x_0^T (X^T X)^{-1} X^T e$$
and
$$\mathrm{var}\left( x_0^T \hat{\beta}^{(LS)} \right) = \sigma^2 \, x_0^T (X^T X)^{-1} x_0 .$$

Proof - continued
From normality of $e$,
$$x_0^T \hat{\beta}^{(LS)} \sim N\left( x_0^T \beta^0, \; \sigma^2 \, x_0^T (X^T X)^{-1} x_0 \right).$$
From the 4-th lecture, $(n-p) s_n^2 / \sigma^2 \sim \chi^2_{n-p}$ and $s_n^2$ is (statistically) independent from $\hat{\beta}^{(LS)}$.

Proof - continued
Hence
$$\frac{x_0^T \hat{\beta}^{(LS)} - x_0^T \beta^0}{s_n \sqrt{x_0^T (X^T X)^{-1} x_0}} \sim t_{n-p},$$
i.e. the studentized deviation is Student with $n-p$ d.f. From it the assertion of the lemma follows. QED
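In code, the interval of the lemma looks as follows; a sketch on the same kind of simulated data as before, with $x_0$ and all constants invented for illustration.

```python
# Hypothetical illustration: CI for the model value x0' beta at a point x0.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=1.5, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y
s2 = (Y - X @ beta_hat) @ (Y - X @ beta_hat) / (n - p)

x0 = np.array([1.0, 0.3, -0.7])
fit = x0 @ beta_hat
half = stats.t.ppf(0.975, n - p) * np.sqrt(s2 * (x0 @ XtX_inv @ x0))
print(f"95% CI for x0'beta: ({fit - half:.3f}, {fit + half:.3f})")
```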

Confidence interval for the response variable $Y_0$

Lemma
Assumptions: Let $e_1, \dots, e_n$ be iid. r.v.'s distributed as $N(0, \sigma^2)$. Moreover, let $X$ be of type $n \times p$ and $X^T X$ be regular. For a given $x_0 \in R^p$ consider a new observation $Y_0 = x_0^T \beta^0 + e_0$ with $e_0 \sim N(0, \sigma^2)$ independent of $e_1, \dots, e_n$, and put $\hat{Y}_0 = x_0^T \hat{\beta}^{(LS)}$.
Assertions: Then the confidence interval
$$\left( \hat{Y}_0 - t_{n-p}(1-\tfrac{\alpha}{2}) \, s_n \sqrt{1 + x_0^T (X^T X)^{-1} x_0}, \;\; \hat{Y}_0 + t_{n-p}(1-\tfrac{\alpha}{2}) \, s_n \sqrt{1 + x_0^T (X^T X)^{-1} x_0} \right)$$
covers $Y_0$ with probability $1-\alpha$.

Proof
We have already recalled that $\hat{\beta}^{(LS)} = (X^T X)^{-1} X^T Y$. We have
$$Y_0 - \hat{Y}_0 = e_0 - x_0^T (X^T X)^{-1} X^T e$$
and, since $e_0$ is independent of $e_1, \dots, e_n$,
$$\mathrm{var}\left( Y_0 - \hat{Y}_0 \right) = \sigma^2 \left( 1 + x_0^T (X^T X)^{-1} x_0 \right).$$
From normality of $e_0$ and $e$,

Proof - continued
$$Y_0 - \hat{Y}_0 \sim N\left( 0, \; \sigma^2 \left( 1 + x_0^T (X^T X)^{-1} x_0 \right) \right).$$
Again from the 4-th lecture, $(n-p) s_n^2 / \sigma^2 \sim \chi^2_{n-p}$ and $s_n^2$ is (statistically) independent from $Y_0 - \hat{Y}_0$. Hence,

Proof - continued
$$\frac{Y_0 - \hat{Y}_0}{s_n \sqrt{1 + x_0^T (X^T X)^{-1} x_0}} \sim t_{n-p},$$
i.e. the studentized prediction error is Student with $n-p$ d.f. From it, and from the symmetry of the Student distribution, it finally gives the asserted interval. QED
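The prediction interval differs from the previous sketch only by the extra 1 under the square root, accounting for the new disturbance $e_0$. Again a hypothetical simulated example:

```python
# Hypothetical illustration: prediction interval for a new response Y0.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=1.5, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y
s2 = (Y - X @ beta_hat) @ (Y - X @ beta_hat) / (n - p)

x0 = np.array([1.0, 0.3, -0.7])
fit = x0 @ beta_hat
half = stats.t.ppf(0.975, n - p) * np.sqrt(s2 * (1 + x0 @ XtX_inv @ x0))  # note the "1 +"
print(f"95% prediction interval for Y0: ({fit - half:.3f}, {fit + half:.3f})")
```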

Confidence region for $x^T \beta^0$, simultaneously for all $x$

Lemma
Assumptions: Let $e_1, \dots, e_n$ be iid. r.v.'s distributed as $N(0, \sigma^2)$. Moreover, let $X$ be of type $n \times p$ and $X^T X$ be regular.
Assertions: Then the confidence region
$$x^T \hat{\beta}^{(LS)} \pm \sqrt{p \, F_{p, n-p}(1-\alpha)} \, s_n \sqrt{x^T (X^T X)^{-1} x}$$
covers $x^T \beta^0$ for all $x \in R^p$ simultaneously with probability $1-\alpha$.
The proof is a bit more involved and will be omitted.

Testing submodels
We shall test the hypothesis that the "true" model is
$$Y = \tilde{X} \tilde{\beta}^0 + e$$
against the alternative that the "true" model is
$$Y = X \beta^0 + e,$$
under the assumption that the columns of $\tilde{X}$ (of type $n \times q$) form a subset of the columns of $X$ (of type $n \times p$), $q < p$.

Assertion
Assumptions: Let $H = X (X^T X)^{-1} X^T$ and $\tilde{H} = \tilde{X} (\tilde{X}^T \tilde{X})^{-1} \tilde{X}^T$.
Assertions: Then the matrix $H - \tilde{H}$ is a projection matrix, i.e. symmetric and idempotent.

Testing submodels
[Figure: geometric illustration of the projections $HY$ and $\tilde{H}Y$ of the vector $Y$ onto the column spaces of $X$ and $\tilde{X}$.]

Proof of Assertion
Since the projection of the matrix $\tilde{X}$ into the space spanned by the columns of $X$ gives the same matrix $\tilde{X}$, i.e. $H \tilde{X} = \tilde{X}$, we have $H \tilde{H} = \tilde{H}$. Since the projection matrices are symmetric, we have also $\tilde{H} H = \tilde{H}$. Let's calculate
$$(H - \tilde{H})(H - \tilde{H}) = H H - H \tilde{H} - \tilde{H} H + \tilde{H} \tilde{H} = H - \tilde{H} - \tilde{H} + \tilde{H} = H - \tilde{H}.$$
QED
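A quick numerical confirmation of the assertion, on hypothetical random matrices (numpy assumed):

```python
# Hypothetical check: H - H_tilde is symmetric and idempotent.
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 30, 5, 2
X = rng.normal(size=(n, p))
X_sub = X[:, :q]                               # submodel: first q columns of X

H  = X @ np.linalg.inv(X.T @ X) @ X.T
Ht = X_sub @ np.linalg.inv(X_sub.T @ X_sub) @ X_sub.T
D = H - Ht
print(np.allclose(D, D.T), np.allclose(D @ D, D))   # True True
```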

Testing submodels - continued

Lemma
Assumptions: Let $e_1, \dots, e_n$ be iid. r.v.'s distributed as $N(0, \sigma^2)$. Moreover, let $X^T X$ and $\tilde{X}^T \tilde{X}$ be regular, with ranks $p$ and $q$, respectively. Further, let $\hat{\beta}^{(LS)}$ and $\hat{\tilde{\beta}}^{(LS)}$ be the corresponding least squares estimators of $\beta^0$ and $\tilde{\beta}^0$, and $S_p$ and $S_q$ the sums of squared residuals with respect to $X$ and $\tilde{X}$.
Assertions: Then
$$F = \frac{(S_q - S_p)/(p - q)}{S_p/(n - p)} \sim F_{p-q, \, n-p},$$
i.e. the statistic is distributed as Fisher-Snedecor with $p-q$ and $n-p$ degrees of freedom.

Proof of Lemma
Let us decompose
$$S_q = \| (I - \tilde{H}) Y \|^2 = \| (I - H) Y \|^2 + \| (H - \tilde{H}) Y \|^2 = S_p + \| (H - \tilde{H}) Y \|^2 .$$
Both matrices $I - H$ and $H - \tilde{H}$ are idempotent and $(I - H)(H - \tilde{H}) = 0$, i.e. the corresponding projections are orthogonal. The matrix $I - \tilde{H}$ is also idempotent, i.e.

Proof - continued
Along the same lines, $\mathrm{rank}(I - H) = n - p$ and $\mathrm{rank}(H - \tilde{H}) = p - q$, and since under the hypothesis $(I - H) Y = (I - H) e$ and $(H - \tilde{H}) Y = (H - \tilde{H}) e$, the Fisher-Cochran lemma says that
$$\frac{e^T (I - H) e}{\sigma^2} \sim \chi^2_{n-p}, \qquad \frac{e^T (H - \tilde{H}) e}{\sigma^2} \sim \chi^2_{p-q},$$
and the quadratic forms are independent. Moreover

Proof - continued
Along the same lines,
$$S_p = e^T (I - H) e \qquad \text{and} \qquad S_q - S_p = e^T (H - \tilde{H}) e,$$
and finally
$$F = \frac{(S_q - S_p)/(p - q)}{S_p/(n - p)} = \frac{e^T (H - \tilde{H}) e / (p - q)}{e^T (I - H) e / (n - p)} \sim F_{p-q, \, n-p}.$$
QED
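A sketch of the whole submodel test in Python, on hypothetical simulated data where the submodel hypothesis holds (the last $p-q$ regressors are irrelevant), so a large p-value is expected; all names and values are invented.

```python
# Hypothetical illustration: F-test of a submodel against the full model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, p, q = 200, 5, 3
X = rng.normal(size=(n, p))
Y = X[:, :q] @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)  # last p-q columns irrelevant

def ssr(M, y):
    """Sum of squared residuals of the LS regression of y on M."""
    b, *_ = np.linalg.lstsq(M, y, rcond=None)
    return np.sum((y - M @ b) ** 2)

S_p, S_q = ssr(X, Y), ssr(X[:, :q], Y)
F = ((S_q - S_p) / (p - q)) / (S_p / (n - p))
print(f"F = {F:.3f}, p-value = {stats.f.sf(F, p - q, n - p):.3f}")
```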

Chow test
Framework of the problem: Let us consider as a hypothesis the model
$$\begin{pmatrix} Y^1 \\ Y^2 \end{pmatrix} = \begin{pmatrix} X^1 \\ X^2 \end{pmatrix} \beta^0 + \begin{pmatrix} e^1 \\ e^2 \end{pmatrix}. \qquad (1)$$
E.g. the super index "1" stands for the first period (in question) while the super index "2" stands for the second one. As an alternative we consider the model
$$\begin{pmatrix} Y^1 \\ Y^2 \end{pmatrix} = \begin{pmatrix} X^1 & 0 \\ 0 & X^2 \end{pmatrix} \begin{pmatrix} \beta^{0,1} \\ \beta^{0,2} \end{pmatrix} + \begin{pmatrix} e^1 \\ e^2 \end{pmatrix}. \qquad (2)$$
In words, the hypothesis says that the parameters $\beta$'s were the same in both periods, while the alternative allows them to differ.

Chow test
Let us modify the framework:
$$\text{H:} \quad Y = \begin{pmatrix} X^1 \\ X^2 \end{pmatrix} \beta^0 + e, \qquad \text{A:} \quad Y = \begin{pmatrix} X^1 & 0 \\ 0 & X^2 \end{pmatrix} \begin{pmatrix} \beta^{0,1} \\ \beta^{0,2} \end{pmatrix} + e .$$

Chow test
It is immediately evident that we can obtain the first $q$ columns of the left matrix as combinations of the first $2q$ columns of the right matrix (the $j$-th column of the stacked matrix is the sum of the $j$-th and the $(q+j)$-th column of the block-diagonal one). So, we may utilize the previous lemma for testing the "left" model as a submodel of the "right" one.

Chow test

Lemma
Assumptions: Let $S_1$ and $S_2$ be the sums of the squared residuals in the models (1) and (2), respectively, with $n_1$ and $n_2$ observations in the two periods.
Assertions: Then
$$F = \frac{(S_1 - S_2)/q}{S_2/(n_1 + n_2 - 2q)} \sim F_{q, \, n_1 + n_2 - 2q},$$
i.e. the statistic is distributed as Fisher-Snedecor with $q$ and $n_1 + n_2 - 2q$ degrees of freedom.
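Finally, a sketch of the Chow test built directly on the submodel lemma. The two periods are simulated here with identical parameters, so the test should not reject; all data and constants are hypothetical, numpy/scipy assumed.

```python
# Hypothetical illustration: Chow test as a submodel test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n1, n2, q = 80, 70, 3
X1, X2 = rng.normal(size=(n1, q)), rng.normal(size=(n2, q))
beta = np.array([1.0, 2.0, -0.5])              # same parameters in both periods
Y = np.concatenate([X1 @ beta + rng.normal(size=n1),
                    X2 @ beta + rng.normal(size=n2)])

X_pooled = np.vstack([X1, X2])                 # model (1): common beta
X_split = np.block([[X1, np.zeros((n1, q))],   # model (2): separate betas
                    [np.zeros((n2, q)), X2]])

def ssr(M, y):
    """Sum of squared residuals of the LS regression of y on M."""
    b, *_ = np.linalg.lstsq(M, y, rcond=None)
    return np.sum((y - M @ b) ** 2)

S1, S2 = ssr(X_pooled, Y), ssr(X_split, Y)
F = ((S1 - S2) / q) / (S2 / (n1 + n2 - 2 * q))
print(f"Chow F = {F:.3f}, p-value = {stats.f.sf(F, q, n1 + n2 - 2 * q):.3f}")
```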


What is to be learnt from this lecture for exam?
Confidence intervals and regions
- of individual regression coefficients,
- simultaneous for all coefficients,
- for the value of the model at a given point,
- for the response variable.
Testing submodels.
Chow's test.
All you need is on http://samba.fsv.cuni.cz/~visek/