Large Sample Theory EC 532 Burak Saltoğlu.

Additional readings: W. Greene, Econometric Analysis (2012), pp. 1106-1128; F. Hayashi, Econometrics (2000).

Outline: Convergence in Probability; Laws of Large Numbers; Convergence of Functions; Convergence in Distribution: Limiting Distributions; Central Limit Theorems; Asymptotic Distributions.

Why large sample theory? So far we have studied the finite-sample (exact) distribution of the OLS estimator and its associated tests. If the regressors are endogenous (i.e. X and u are correlated), the exact finite-sample results no longer apply. Rather than making assumptions about a sample of a given size, large sample theory makes assumptions about the stochastic process that generates the sample.

Asymptotics asks: What happens to a random variable or a distribution as n tends to infinity? What is the approximate distribution under these limiting conditions? What is the rate of convergence to the limit? Two critical laws of statistics are studied under large sample theory: the Law of Large Numbers and the Central Limit Theorem.

Convergence in Probability. Definition: Let x_n be a sequence of random variables, where n is the sample size. x_n converges in probability to a constant c if values of x_n that are not close to c become increasingly unlikely as n increases; formally, for every ε > 0, lim_{n→∞} Pr(|x_n − c| > ε) = 0. If x_n converges in probability to c, we write plim x_n = c. All the mass of the probability distribution concentrates around c.
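The definition can be checked numerically. A minimal Monte Carlo sketch (my own illustration, not from the slides): for the sample mean of Bernoulli(0.5) draws, the probability Pr(|x̄_n − 0.5| > ε) should shrink toward zero as n grows.

```python
import random

random.seed(42)

def tail_prob(n, eps=0.05, reps=2000):
    """Fraction of replications in which |xbar_n - 0.5| exceeds eps."""
    hits = 0
    for _ in range(reps):
        xbar = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(xbar - 0.5) > eps:
            hits += 1
    return hits / reps

# The estimated tail probability shrinks as the sample size grows.
probs = [tail_prob(n) for n in (10, 100, 1000)]
print(probs)
```

The sample sizes and ε = 0.05 are arbitrary choices for the demonstration; any ε > 0 gives the same qualitative picture.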

Mean square convergence. Definition: x_n converges in mean square to c if lim_{n→∞} E[(x_n − c)²] = 0. Mean square convergence implies convergence in probability (by Chebyshev's inequality).

Consistency. Definition: An estimator θ̂_n of a parameter θ is a consistent estimator iff plim θ̂_n = θ.

Almost sure convergence. Intuitively: once the sequence gets close to c, it stays that way.

Almost sure convergence: alternative definition. The random variable x_n is said to converge almost surely to c if and only if Pr(lim_{n→∞} x_n = c) = 1.

Law of large numbers: strong versus weak. The Weak Law of Large Numbers is based on convergence in probability; the Strong Law of Large Numbers is based on almost sure convergence.

Laws of Large Numbers. Weak Law of Large Numbers (Khinchine): if x_1, x_2, … are i.i.d. with E(x_i) = μ, then plim x̄_n = μ. Remarks: 1) No finite variance assumption is needed. 2) It requires i.i.d. sampling.
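The first remark is worth seeing in action. A sketch (my own illustration, with an arbitrarily chosen distribution): Pareto draws with shape α = 1.5 have finite mean α/(α−1) = 3 but infinite variance, yet the sample mean still settles near 3, exactly as Khinchine's theorem promises.

```python
import random

random.seed(1)
mu = 1.5 / 0.5  # theoretical mean alpha/(alpha-1) = 3 for Pareto(1.5)
for n in (100, 10_000, 200_000):
    # Pareto(alpha=1.5): finite mean, infinite variance
    xbar = sum(random.paretovariate(1.5) for _ in range(n)) / n
    print(n, round(xbar, 3))
```

Convergence is visibly slower than in the finite-variance case, which is why the largest n here is 200,000.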

Strong law of large numbers (Kolmogorov): if x_1, x_2, … are independent with means μ_i and variances σ_i² satisfying Σ σ_i²/i² < ∞, then x̄_n − μ̄_n converges to 0 almost surely. Remarks: it is stronger because almost sure convergence is used and i.i.d.-ness is not required; with i.i.d. sampling we obtain almost sure convergence of x̄_n to the constant mean μ.

Convergence of Functions. Theorem (Slutsky): for a continuous function g, plim g(x_n) = g(plim x_n). Using the Slutsky theorem, we can write some rules for probability limits.

Rules for probability limits: 1) If plim x = c and plim y = d, then i) plim (x + y) = c + d, ii) plim xy = cd, iii) plim (x/y) = c/d (provided d ≠ 0). 2) For matrices X and Y with plim X = A and plim Y = B: i) plim X⁻¹ = A⁻¹ (A nonsingular), ii) plim XY = AB.
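These algebra rules can be illustrated numerically. A sketch (my own illustration, with arbitrarily chosen distributions): with sample means satisfying plim x̄_n = c = 2 and plim ȳ_n = d = 5, their sum, product and ratio approach c + d, cd and c/d.

```python
import random

random.seed(7)
n = 100_000
# Sample means of N(2,1) and N(5,1) draws: plims are 2 and 5.
xbar = sum(random.gauss(2.0, 1.0) for _ in range(n)) / n
ybar = sum(random.gauss(5.0, 1.0) for _ in range(n)) / n

print(round(xbar + ybar, 2))   # near c + d = 7
print(round(xbar * ybar, 2))   # near c * d = 10
print(round(xbar / ybar, 2))   # near c / d = 0.4
```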

Convergence in Distribution: Limiting Distributions. Definition: x_n with cdf F_n(x) converges in distribution to a random variable x with cdf F(x) if lim_{n→∞} F_n(x) = F(x) at every continuity point of F. Then F(x) is the limiting distribution of x_n, written x_n →d x.

Example: limiting distributions. Student's t distribution: given z ~ N(0,1) independent of w ~ χ²_n, t_n = z / √(w/n). Properties: (i) it is symmetric, positive everywhere and leptokurtic (has fatter tails than the normal distribution); (ii) its only parameter is n, the degrees of freedom; (iii) as n → ∞, t_n converges in distribution to N(0,1).
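A standard fact worth checking numerically (my own sketch, not from the slides): as the degrees of freedom grow, the t density approaches the standard normal density. The check below uses the closed-form t pdf and compares the two densities at the point x = 1.

```python
import math

def t_pdf(x, nu):
    """Student's t density with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def normal_pdf(x):
    """Standard normal density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# The gap between the two densities at x = 1 shrinks as nu grows.
for nu in (1, 5, 30, 200):
    print(nu, round(abs(t_pdf(1.0, nu) - normal_pdf(1.0)), 4))
```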

Rules for limiting distributions: 1) If x_n →d x and plim y_n = c, then x_n y_n →d cx. 2) As a corollary to the Slutsky theorem, if x_n →d x and g is a continuous function, then g(x_n) →d g(x).

3) If y_n has a limiting distribution and plim (x_n − y_n) = 0, then x_n has the same limiting distribution as y_n.

Central Limit Theorems. Lindeberg-Lévy Central Limit Theorem: if x_1, …, x_n are i.i.d. with finite mean μ and finite variance σ², then √n (x̄_n − μ) →d N(0, σ²). In words: the standardized mean of a sufficiently large number of i.i.d. random variables, each with finite mean and variance, is approximately normally distributed, whatever the underlying distribution.
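A quick simulation sketch of the theorem (my own illustration; the distribution, n = 50 and the 95% band are arbitrary choices): means of n draws from a skewed Exponential(1) distribution, standardized as z = √n (x̄_n − μ)/σ, should fall inside ±1.96 roughly 95% of the time if the normal approximation holds.

```python
import math
import random

random.seed(3)
n, reps = 50, 4000
inside = 0
for _ in range(reps):
    # Exponential(1) has mu = sigma = 1, so the standardization is simple.
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    z = math.sqrt(n) * (xbar - 1.0) / 1.0
    if abs(z) <= 1.96:
        inside += 1

coverage = inside / reps
print(coverage)  # close to 0.95
```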

Example: distribution of the sample mean. Let x_1, …, x_n be identically and independently distributed (i.i.d.) random variables; derive the distribution of the sample mean x̄_n = (1/n) Σ x_i.

Central limit idea (27.10.2009).

A simple Monte Carlo for the CLT (see Matlab: clt.m). Let x_1, x_2, …, x_n ~ N(0,1). Then E(x̄_n) = μ = 0, and as n → ∞, stdev(x̄_n) → 0.
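The Matlab script clt.m is not reproduced on the slides; the following is a Python sketch of the same experiment under the stated assumptions: compute x̄ over many replications for samples of n = 1, 5, 1000 standard normal draws and watch its standard deviation shrink toward 0 (theoretically 1/√n, i.e. 1, 0.447, 0.0316).

```python
import random
import statistics

random.seed(10)
reps = 2000
sd = {}
for n in (1, 5, 1000):
    # One sample mean per replication.
    means = [sum(random.gauss(0, 1) for _ in range(n)) / n for _ in range(reps)]
    sd[n] = statistics.stdev(means)
    print(n, round(statistics.mean(means), 3), round(sd[n], 3))
```

The simulated standard deviations line up with the values reported on the next slides (about 1, 0.47 and 0.031).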

n = 1 (x̄ = x_1, x_1 ~ N(0,1)): mean of x̄ across simulations = −0.0182, stdev = 0.98.

CLT when n = 5: average of x̄ = 0.01, stdev = 0.47.

n = 1000: average of x̄ = 0.002, stdev = 0.031.

The stdev of the sample average disappears as n grows.

End of the Lecture