
Tutorial 10: Limit Theorems


1 Tutorial 10: Limit Theorems
Weiwen LIU, April 3, 2017

2 Markov and Chebyshev Inequalities
These inequalities use the mean, and possibly the variance, of a random variable to draw conclusions about the probabilities of certain events. They are primarily useful in situations where exact values or bounds for the mean and variance of a random variable X are easily computable, but the distribution of X is either unavailable or hard to calculate.

3 Markov inequality
If a random variable X can only take nonnegative values, then
P(X ≥ a) ≤ E[X]/a, for all a > 0.
Loosely speaking, it asserts that if a nonnegative random variable has a small mean, then the probability that it takes a large value must also be small.
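One way to get a feel for the inequality is to compare the bound against simulated tail frequencies. A minimal sketch, assuming an Exp(1) test variable (the function name `markov_bound` and the test case are our own choices, not part of the tutorial):

```python
import random

def markov_bound(mean, a):
    """Markov upper bound on P(X >= a) for a nonnegative X with the given mean."""
    return mean / a

# Sanity check by simulation: X ~ Exp(1), so X >= 0 and E[X] = 1.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
for a in (1.0, 2.0, 4.0):
    freq = sum(s >= a for s in samples) / len(samples)
    assert freq <= markov_bound(1.0, a)  # the bound holds for every a > 0
```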

4 Example 1
Let X be uniformly distributed in the interval [0, 4] and note that E[X] = 2. Then the Markov inequality asserts that
P(X ≥ 2) ≤ 2/2 = 1,
P(X ≥ 3) ≤ 2/3 ≈ 0.67,
P(X ≥ 4) ≤ 2/4 = 0.5.

5 Example 1
Comparing with the exact probabilities
P(X ≥ 2) = 0.5,
P(X ≥ 3) = 0.25,
P(X ≥ 4) = 0,
we see that the bounds provided by the Markov inequality can be quite loose.
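The comparison in Example 1 can be checked directly. A small sketch (the helper names are ours) putting the Markov bound next to the exact uniform tail probability:

```python
def markov_bound(mean, a):
    """Markov upper bound on P(X >= a) for a nonnegative X."""
    return mean / a

def uniform_tail(a, lo=0.0, hi=4.0):
    """Exact P(X >= a) for X uniform on [lo, hi]."""
    if a <= lo:
        return 1.0
    if a >= hi:
        return 0.0
    return (hi - a) / (hi - lo)

# Bounds 1, 0.67, 0.5 vs. exact 0.5, 0.25, 0 -- loose, but always valid.
for a in (2, 3, 4):
    assert uniform_tail(a) <= markov_bound(2.0, a)
```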

6 Chebyshev inequality
If X is a random variable with mean μ and variance σ², then
P(|X − μ| ≥ c) ≤ σ²/c², for all c > 0.
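The inequality translates directly into a bound function. A minimal sketch (the cap at 1 is our addition, since any probability is at most 1):

```python
def chebyshev_bound(var, c):
    """Chebyshev upper bound on P(|X - mu| >= c), capped at 1."""
    return min(1.0, var / c**2)

assert chebyshev_bound(1.0, 2.0) == 0.25  # variance 1, deviation 2
assert chebyshev_bound(4/3, 1.0) == 1.0   # cap kicks in: uninformative
```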

7 Example 2: Uninformative case
Let X be uniformly distributed in [0, 4]. Let us use the Chebyshev inequality to bound the probability that |X − 2| ≥ 1. We have σ² = 16/12 = 4/3, and
P(|X − 2| ≥ 1) ≤ 4/3,
which is uninformative, since every probability is at most 1.

8 Example 2: Uninformative case
Let X be exponentially distributed with parameter λ = 1, so that E[X] = var(X) = 1. For c > 1, using the Chebyshev inequality, we obtain
P(X ≥ c) = P(X − 1 ≥ c − 1) ≤ P(|X − 1| ≥ c − 1) ≤ 1/(c − 1)².

9 Example 2: Uninformative case
The bound 1/(c − 1)² is again conservative compared to the exact answer P(X ≥ c) = e⁻ᶜ.

10 Weak law of large numbers
The law asserts that the sample mean of a large number of independent identically distributed random variables is very close to the true mean, with high probability.

11 Weak law of large numbers
Consider a sequence X₁, X₂, … of independent identically distributed random variables with mean μ and variance σ², and define the sample mean by
Mₙ = (X₁ + … + Xₙ)/n.

12 Weak law of large numbers
We have
E[Mₙ] = (E[X₁] + … + E[Xₙ])/n = nμ/n = μ,
and, using independence,
var(Mₙ) = var(X₁ + … + Xₙ)/n² = (var(X₁) + … + var(Xₙ))/n² = nσ²/n² = σ²/n.
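The σ²/n scaling of var(Mₙ) shows up clearly in simulation. A sketch, assuming Uniform[0, 1] variables (so σ² = 1/12; the function name and trial count are our choices):

```python
import random
import statistics

random.seed(1)

def empirical_var_of_mean(n, trials=20_000):
    """Empirical variance of M_n for i.i.d. X_i ~ Uniform[0, 1]."""
    means = [sum(random.random() for _ in range(n)) / n for _ in range(trials)]
    return statistics.pvariance(means)

# The empirical variance tracks sigma^2 / n = (1/12) / n.
for n in (1, 4, 16):
    assert abs(empirical_var_of_mean(n) - (1 / 12) / n) < 0.01
```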

13 Weak law of large numbers
Let X₁, X₂, … be independent identically distributed random variables with mean μ. For every ε > 0, we have
P(|Mₙ − μ| ≥ ε) = P(|(X₁ + … + Xₙ)/n − μ| ≥ ε) → 0, as n → ∞.
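The shrinking deviation probability can be estimated empirically. A sketch, assuming Uniform[0, 1] samples with μ = 0.5 (ε and the trial count are our choices):

```python
import random

random.seed(2)

def deviation_prob(n, eps=0.1, trials=5_000):
    """Empirical P(|M_n - 0.5| >= eps) for i.i.d. X_i ~ Uniform[0, 1]."""
    hits = 0
    for _ in range(trials):
        m = sum(random.random() for _ in range(n)) / n
        hits += abs(m - 0.5) >= eps
    return hits / trials

p10, p1000 = deviation_prob(10), deviation_prob(1000)
assert p1000 < p10  # the deviation probability tends to 0 as n grows
```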

14 Convergence in Probability
Let Y₁, Y₂, … be a sequence of random variables, and let a be a real number. We say that the sequence Yₙ converges to a in probability if, for every ε > 0, we have
lim(n→∞) P(|Yₙ − a| ≥ ε) = 0.

15 Example 3
Consider a sequence of independent random variables Xₙ that are uniformly distributed in the interval [0, 1], and let Yₙ = min{X₁, …, Xₙ}.

16 Example 3
The sequence of values of Yₙ cannot increase as n increases, and it will occasionally decrease, whenever a value of Xₙ smaller than all of the preceding values is obtained.

17 Example 3
Thus, we expect that Yₙ converges to zero. Indeed, for ε > 0, using the independence of the Xₙ, we have
P(|Yₙ − 0| ≥ ε) = P(X₁ ≥ ε, …, Xₙ ≥ ε) = P(X₁ ≥ ε) ⋯ P(Xₙ ≥ ε) = (1 − ε)ⁿ.

18 Example 3
In particular,
lim(n→∞) P(|Yₙ − 0| ≥ ε) = lim(n→∞) (1 − ε)ⁿ = 0.
Since this is true for every ε > 0, we conclude that Yₙ converges to zero in probability.
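The closed form (1 − ε)ⁿ makes Example 3 easy to verify numerically. A sketch (ε = 0.2 and the trial count are our choices):

```python
import random

random.seed(3)

def min_exceeds(n, eps=0.2, trials=20_000):
    """Empirical P(min(X_1, ..., X_n) >= eps) for i.i.d. X_i ~ Uniform[0, 1]."""
    hits = sum(min(random.random() for _ in range(n)) >= eps
               for _ in range(trials))
    return hits / trials

# Empirical frequency matches (1 - eps)^n, which vanishes as n grows.
for n in (1, 5, 20):
    assert abs(min_exceeds(n) - (1 - 0.2) ** n) < 0.02
```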

19 Example 4
In order to estimate f, the true fraction of smokers in a large population, Alvin selects n people at random. His estimator Mₙ is obtained by dividing Sₙ, the number of smokers in his sample, by n, i.e. Mₙ = Sₙ/n. Alvin chooses the sample size n to be the smallest possible number for which the Chebyshev inequality yields the guarantee
P(|Mₙ − f| ≥ ε) ≤ δ,
where ε and δ are some prespecified tolerances. Determine how the value of n recommended by the Chebyshev inequality changes in the following cases.
A) The value of ε is reduced to half its original value.
B) The probability δ is reduced to half its original value.

20 Example 4
Hint: P(|Mₙ − μ| ≥ ε) ≤ σ²/(nε²), for any ε > 0.
A) n′ = 4n
B) n′ = 2n
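A sketch of the computation behind the answers. For a Bernoulli sample, var(Xᵢ) = f(1 − f) ≤ 1/4, so the Chebyshev inequality gives P(|Mₙ − f| ≥ ε) ≤ 1/(4nε²), and the smallest n meeting the δ guarantee is n = ⌈1/(4δε²)⌉. The function name and the example tolerances are ours:

```python
import math

def chebyshev_sample_size(eps, delta, var=0.25):
    """Smallest n with var / (n * eps**2) <= delta (var <= 1/4 for Bernoulli X_i)."""
    return math.ceil(var / (delta * eps**2))

n = chebyshev_sample_size(0.1, 0.05)                 # baseline tolerances
assert chebyshev_sample_size(0.05, 0.05) == 4 * n    # A) halving eps -> 4n
assert chebyshev_sample_size(0.1, 0.025) == 2 * n    # B) halving delta -> 2n
```

Since n appears multiplied by ε², halving ε quadruples n; since n appears multiplied by δ to the first power, halving δ only doubles n.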

