4. Binomial Random Variable Approximations, Conditional Probability Density Functions and Stirling's Formula

Let $X$ represent a binomial r.v. Then, for large $n$,

$$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k}, \qquad q = 1 - p, \tag{4-1}$$

is difficult to evaluate directly. In this context, two approximations are extremely useful.
4.1 The Normal Approximation (DeMoivre-Laplace Theorem)

Suppose $n \to \infty$ with $p$ held fixed. Then for $k$ in the $\sqrt{npq}$ neighborhood of $np$, we can approximate

$$\binom{n}{k} p^k q^{n-k} \simeq \frac{1}{\sqrt{2\pi npq}}\, e^{-(k-np)^2/2npq}, \tag{4-2}$$

and we have

$$P(k_1 \le X \le k_2) \simeq \int_{x_1}^{x_2} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy, \tag{4-3}$$

where

$$x_1 = \frac{k_1 - np}{\sqrt{npq}}, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}}.$$
Thus if $k_1$ and $k_2$ in (4-1) are within or around the neighborhood of the interval $\big(np - \sqrt{npq},\; np + \sqrt{npq}\big)$, we can approximate the summation in (4-1) by an integration of the Gaussian in (4-2). In that case (4-1) reduces to (4-3). We can express (4-3) in terms of the normalized integral

$$\mathrm{erf}(x) = \frac{1}{\sqrt{2\pi}} \int_0^x e^{-y^2/2}\, dy = -\,\mathrm{erf}(-x), \tag{4-4}$$

which has been tabulated extensively (see Table 4.1).
Table 4.1: tabulated values of the normalized integral $\mathrm{erf}(x)$ in (4-4).
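If the table is not at hand, the same values can be generated from the standard library error function. A minimal Python sketch (our addition, not part of the original notes; the helper name `erf_table` is ours):

```python
# Sketch: the tabulated normalized integral
#   erf(x) = (1/sqrt(2*pi)) * integral_0^x exp(-y^2/2) dy
# expressed via the standard library error function math.erf.
import math

def erf_table(x: float) -> float:
    """Papoulis-style normalized Gaussian integral on [0, x]."""
    # math.erf(z) = (2/sqrt(pi)) * integral_0^z exp(-t^2) dt,
    # so substituting y = t*sqrt(2) gives the identity below.
    return 0.5 * math.erf(x / math.sqrt(2.0))

for x in (0.5, 0.7071, 1.0, 2.0, 3.0):
    print(f"erf({x}) = {erf_table(x):.4f}")
```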
Example 4.1: A fair coin is tossed 5,000 times. Find the probability that the number of heads is between 2,475 and 2,525.

Solution: We need $P(2475 \le X \le 2525)$. Here $n$ is large, so we can use the normal approximation. In this case $p = 1/2$, so that $np = 2500$ and $\sqrt{npq} \simeq 35.36$. Since $np - \sqrt{npq} \simeq 2465$ and $np + \sqrt{npq} \simeq 2535$, the approximation is valid for $k_1 = 2475$ and $k_2 = 2525$.
Example 4.1 (cont.): Here

$$x_1 = \frac{k_1 - np}{\sqrt{npq}} = -\frac{1}{\sqrt{2}}, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}} = \frac{1}{\sqrt{2}}.$$

From Table 4.1,

$$P(2475 \le X \le 2525) \simeq \mathrm{erf}(x_2) - \mathrm{erf}(x_1) = 2\,\mathrm{erf}(0.707) \simeq 0.52.$$
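As a cross-check (our sketch, not from the original notes), the exact binomial sum can be computed in integer arithmetic and compared with the estimate in (4-3):

```python
# Numerical check of Example 4.1: exact binomial sum vs. the
# DeMoivre-Laplace estimate of (4-3).
import math

n, p = 5000, 0.5
k1, k2 = 2475, 2525

# Exact: P(k1 <= X <= k2) = sum_k C(n,k) / 2^n, done in integer
# arithmetic to avoid floating-point underflow in 0.5**5000.
exact = sum(math.comb(n, k) for k in range(k1, k2 + 1)) / 2**n

# Normal approximation (4-3), via Phi(x) = 0.5*(1 + erf(x/sqrt(2))).
sigma = math.sqrt(n * p * (1 - p))
x1, x2 = (k1 - n * p) / sigma, (k2 - n * p) / sigma
phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
approx = phi(x2) - phi(x1)

print(f"exact  = {exact:.4f}")   # roughly 0.529
print(f"approx = {approx:.4f}")  # roughly 0.520
```

The small gap between the two numbers is the usual half-unit continuity correction, which (4-3) ignores.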
4.2 The Poisson Approximation

For large $n$, the Gaussian approximation of a binomial r.v. is valid only if $p$ is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$. What if $np$ is small, or if it does not increase with $n$? For example, suppose $n \to \infty$ and $p \to 0$ such that $np = \lambda$ is a fixed number.
The Poisson Approximation (cont.): Consider random arrivals such as telephone calls over a line. Let $n$ be the total number of calls in the interval $(0, T)$; as $T \to \infty$, we have $n \to \infty$. Suppose $\Delta$ is a small interval of duration, as in Fig. 4.2.

Fig. 4.2: a small interval $\Delta$ inside the observation interval $(0, T)$.
The Poisson Approximation (cont.): Let $p$ be the probability of a single call occurring in $\Delta$:

$$p = \frac{\Delta}{T}. \tag{4-5}$$

As $\Delta \to 0$, $p \to 0$, so the normal approximation is invalid here. Identify each call with a trial on the interval in Fig. 4.2: a call inside $\Delta$ is a "success" (H), and a call outside $\Delta$ is a "failure" (T). Then $P_n(k)$, the probability of obtaining $k$ calls (in any order) in an interval of duration $\Delta$, is

$$P_n(k) = \binom{n}{k} p^k (1-p)^{n-k}. \tag{4-6}$$
The Poisson Approximation (cont.): With $np = \lambda$, rewrite (4-6) as

$$P_n(k) = \frac{n(n-1)\cdots(n-k+1)}{n^k}\, \frac{\lambda^k}{k!}\, \Big(1 - \frac{\lambda}{n}\Big)^{n-k}, \tag{4-7}$$

so that

$$\lim_{n \to \infty,\; np = \lambda} P_n(k) = e^{-\lambda}\, \frac{\lambda^k}{k!}. \tag{4-8}$$

The right side of (4-8) represents the Poisson p.m.f.
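A short numerical illustration (ours, not in the original notes) of the limit in (4-8), holding $\lambda = np$ fixed while $n$ grows:

```python
# Sketch: the binomial p.m.f. approaches the Poisson p.m.f. as n grows
# with lam = n*p held fixed, per (4-7)-(4-8).
import math

lam, k = 2.0, 3  # lam = n*p fixed; compare P(X = k)
poisson = math.exp(-lam) * lam**k / math.factorial(k)
for n in (10, 100, 1000, 10000):
    p = lam / n
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"n={n:6d}: binomial={binom:.6f}  poisson={poisson:.6f}")
```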
Example 4.2 (Winning a Lottery): Suppose two million lottery tickets are issued, with 100 winning tickets among them. (a) If a person purchases 100 tickets, what is the probability of winning? (b) How many tickets should one buy to be 95% confident of having a winning ticket?

Solution: Let $p$ be the probability of buying a winning ticket:

$$p = \frac{100}{2 \times 10^6} = 5 \times 10^{-5}.$$
Example 4.2 (cont.): Let $X$ be the number of winning tickets and $n$ the number of purchased tickets. Here $n$ is large and $p$ is small, so $X$ has an approximate Poisson distribution with parameter $\lambda = np = 5 \times 10^{-5}\, n$.
Example 4.2 (cont.): (a) The probability of winning is

$$P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda}.$$

For $n = 100$, $\lambda = 5 \times 10^{-3}$, so $P(X \ge 1) \simeq \lambda = 0.005$.

(b) In this case we need $P(X \ge 1) = 1 - e^{-\lambda} \ge 0.95$, i.e., $e^{-\lambda} \le 0.05$, or $\lambda = np \ge \ln 20 \simeq 3$. Thus $n \ge 3 / (5 \times 10^{-5}) = 60{,}000$: one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
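The same two computations in a few lines of Python (our sketch):

```python
# Check of Example 4.2 under the Poisson model with lam = n*p.
import math

p = 100 / 2_000_000            # chance a single ticket wins, 5e-5

# (a) 100 tickets: P(at least one winner) = 1 - exp(-lam)
lam = 100 * p
print(f"(a) P(win) = {1 - math.exp(-lam):.6f}")   # ~0.005

# (b) smallest n with 1 - exp(-n*p) >= 0.95, i.e. n >= ln(20)/p
n_needed = math.ceil(math.log(20) / p)
print(f"(b) tickets needed = {n_needed}")          # ~59,915, about 60,000
```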
Example 4.3: A spacecraft has 100,000 components. The probability of any one component being defective is $2 \times 10^{-5}$. The mission will be in danger if five or more components become defective. Find the probability of such an event.

Solution: $n$ is large and $p$ is small, so the Poisson approximation is valid, with parameter $\lambda = np = 10^5 \times 2 \times 10^{-5} = 2$.
Example 4.3 (cont.):

$$P(X \ge 5) = 1 - \sum_{k=0}^{4} e^{-\lambda}\, \frac{\lambda^k}{k!} = 1 - e^{-2}\Big(1 + 2 + 2 + \frac{4}{3} + \frac{2}{3}\Big) = 1 - 7e^{-2} \simeq 0.052.$$
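A quick check of this tail probability (our sketch; the defect probability $2 \times 10^{-5}$ is the value used in the worked solution above):

```python
# Numerical check of Example 4.3, assuming the defect probability
# p = 2e-5 from the worked solution.
import math

n, p = 100_000, 2e-5
lam = n * p                    # Poisson parameter, here 2.0
p_tail = 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                   for k in range(5))
print(f"P(X >= 5) = {p_tail:.4f}")   # ~0.0527
```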
Conditional Probability Density Function

For any two events $A$ and $B$ with $P(B) \ne 0$,

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}. \tag{4-9}$$

Setting $A = \{X(\xi) \le x\}$ in (4-9) defines the conditional distribution function of $X$ given $B$:

$$F_X(x \mid B) = P\{X(\xi) \le x \mid B\} \tag{4-10}$$

$$= \frac{P\{(X(\xi) \le x) \cap B\}}{P(B)}, \tag{4-11}$$

and the conditional probability density function follows as

$$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx}. \tag{4-12}$$
Conditional Probability Density Function (cont.): $F_X(x \mid B)$ is a valid distribution function, since

$$F_X(+\infty \mid B) = 1, \qquad F_X(-\infty \mid B) = 0. \tag{4-13}$$

Further, since $F_X(x \mid B)$ is nondecreasing,

$$f_X(x \mid B) \ge 0, \tag{4-14}$$

and

$$\int_{-\infty}^{+\infty} f_X(x \mid B)\, dx = F_X(+\infty \mid B) = 1, \tag{4-15}$$

so $f_X(x \mid B)$ is a valid p.d.f. Moreover,

$$F_X(x \mid B) = \int_{-\infty}^{x} f_X(t \mid B)\, dt, \tag{4-16}$$

$$P(x_1 < X \le x_2 \mid B) = F_X(x_2 \mid B) - F_X(x_1 \mid B) = \int_{x_1}^{x_2} f_X(x \mid B)\, dx. \tag{4-17}$$
Example 4.4: Toss a coin and let $X(T) = 0$, $X(H) = 1$, with $P(H) = p$. Suppose $B = \{H\}$. Determine $F_X(x \mid B)$.

Solution: $F_X(x)$ has the staircase form shown in Fig. 4.3(a). We need $F_X(x \mid B)$ for all $x$. For $x < 0$, $\{X \le x\} = \emptyset$, so that $\{(X \le x) \cap B\} = \emptyset$ and $F_X(x \mid B) = 0$.

Fig. 4.3: (a) $F_X(x)$; (b) $F_X(x \mid B)$.
Example 4.4 (cont.): For $0 \le x < 1$, $\{X \le x\} = \{T\}$, so that $\{(X \le x) \cap B\} = \{T\} \cap \{H\} = \emptyset$ and $F_X(x \mid B) = 0$. For $x \ge 1$, $\{X \le x\} = \Omega$, so that $\{(X \le x) \cap B\} = B$ and $F_X(x \mid B) = P(B)/P(B) = 1$ (see Fig. 4.3(b)).
Example 4.5: Given $F_X(x)$, suppose $B = \{X \le a\}$. Find $f_X(x \mid B)$.

Solution: We will first determine $F_X(x \mid B)$. From (4-11) and $B$ as given above, we have

$$F_X(x \mid B) = \frac{P\{(X \le x) \cap (X \le a)\}}{P(X \le a)}. \tag{4-18}$$

For $x < a$, $(X \le x) \cap (X \le a) = (X \le x)$, so that

$$F_X(x \mid B) = \frac{F_X(x)}{F_X(a)}, \qquad x < a. \tag{4-19}$$
Example 4.5 (cont.): For $x \ge a$, $(X \le x) \cap (X \le a) = (X \le a)$, so that $F_X(x \mid B) = 1$. Thus

$$F_X(x \mid B) = \begin{cases} F_X(x)/F_X(a), & x < a, \\ 1, & x \ge a, \end{cases} \tag{4-20}$$

and hence

$$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx} = \begin{cases} f_X(x)/F_X(a), & x < a, \\ 0, & \text{otherwise}. \end{cases} \tag{4-21}$$
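To make (4-21) concrete (our sketch, not in the original notes), take $X$ exponential with rate 1 and $a = 2$; the conditional density is the original density rescaled by $1/F_X(a)$ below $a$, and it should integrate to 1:

```python
# Sketch of Example 4.5: conditioning an exponential r.v. on
# B = {X <= a} rescales the density by 1/F_X(a), per (4-21).
import math

rate, a = 1.0, 2.0
F = lambda x: 1.0 - math.exp(-rate * x) if x > 0 else 0.0   # exponential CDF
f = lambda x: rate * math.exp(-rate * x) if x > 0 else 0.0  # exponential pdf

def f_cond(x: float) -> float:
    """f_X(x | X <= a) = f_X(x)/F_X(a) for x < a, 0 otherwise."""
    return f(x) / F(a) if x < a else 0.0

# The conditional density should integrate to 1 over (0, a);
# a crude midpoint rule is enough to confirm it.
N = 100_000
total = sum(f_cond((i + 0.5) * a / N) * a / N for i in range(N))
print(f"integral of f(x|B) = {total:.6f}")   # ~1.0
```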
Fig. 4.4: (a) $F_X(x \mid B)$; (b) $f_X(x \mid B)$ for Example 4.5.
Example 4.6: Let $B$ represent the event $\{a < X \le b\}$ with $b > a$. For a given $F_X(x)$, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.

Solution:

$$F_X(x \mid B) = \frac{P\{(X \le x) \cap (a < X \le b)\}}{P(a < X \le b)} = \frac{P\{(X \le x) \cap (a < X \le b)\}}{F_X(b) - F_X(a)}. \tag{4-22}$$
Example 4.6 (cont.): For $x < a$, we have $\{(X \le x) \cap (a < X \le b)\} = \emptyset$, and hence

$$F_X(x \mid B) = 0. \tag{4-23}$$

For $a \le x < b$, we have $\{(X \le x) \cap (a < X \le b)\} = \{a < X \le x\}$, and hence

$$F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}. \tag{4-24}$$

For $x \ge b$, we have $\{(X \le x) \cap (a < X \le b)\} = \{a < X \le b\}$, so that

$$F_X(x \mid B) = 1. \tag{4-25}$$
Example 4.6 (cont.): Using (4-23)-(4-25), we get (see Fig. 4.5)

$$f_X(x \mid B) = \begin{cases} \dfrac{f_X(x)}{F_X(b) - F_X(a)}, & a < x \le b, \\ 0, & \text{otherwise}. \end{cases}$$

Fig. 4.5: $f_X(x \mid B)$, the density of $X$ restricted and renormalized to $(a, b]$.
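These formulas can be sanity-checked numerically (our sketch; the standard Gaussian and the interval $(-0.5, 1.5]$ are arbitrary choices of ours):

```python
# Sketch of Example 4.6: for standard Gaussian X and B = {a < X <= b},
# compare the closed-form conditional CDF (4-23)-(4-25) with an
# empirical CDF built from samples that land in (a, b].
import math
import random

a, b = -0.5, 1.5
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # Gaussian CDF
mass = Phi(b) - Phi(a)                                       # F_X(b) - F_X(a)

def F_cond(x: float) -> float:
    """F_X(x | a < X <= b) from (4-23)-(4-25)."""
    if x < a:
        return 0.0
    if x < b:
        return (Phi(x) - Phi(a)) / mass
    return 1.0

random.seed(0)
samples = [g for g in (random.gauss(0, 1) for _ in range(200_000))
           if a < g <= b]
x0 = 0.5
emp = sum(s <= x0 for s in samples) / len(samples)
print(f"formula: {F_cond(x0):.4f}   monte-carlo: {emp:.4f}")
```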
Conditional p.d.f. & Bayes' Theorem

First, we extend the conditional probability results to random variables. We know from Chapter 2 (2-41) that if $A_1, A_2, \ldots, A_n$ is a partition of $S$ and $B$ is an arbitrary event, then

$$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i). \tag{2-41}$$
1. Setting $B = \{X \le x\}$ in (2-41), we obtain

$$F_X(x) = \sum_{i=1}^{n} F_X(x \mid A_i)\, P(A_i), \qquad f_X(x) = \sum_{i=1}^{n} f_X(x \mid A_i)\, P(A_i).$$
2. From the identity

$$P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)},$$

it follows, with $B = \{X \le x\}$, that

$$P(A \mid X \le x) = \frac{F_X(x \mid A)}{F_X(x)}\, P(A). \tag{4-26}$$
3. Setting $B = \{x_1 < X \le x_2\}$ in the same identity, we conclude that

$$P(A \mid x_1 < X \le x_2) = \frac{F_X(x_2 \mid A) - F_X(x_1 \mid A)}{F_X(x_2) - F_X(x_1)}\, P(A) \tag{4-27}$$

$$= \frac{\int_{x_1}^{x_2} f_X(x \mid A)\, dx}{\int_{x_1}^{x_2} f_X(x)\, dx}\, P(A). \tag{4-28}$$
4. Further, let $x_1 = x$ and $x_2 = x + \Delta x$, so that in the limit as $\Delta x \to 0$,

$$P(A \mid X = x) = \frac{f_X(x \mid A)}{f_X(x)}\, P(A), \tag{4-29}$$

or

$$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{P(A)}. \tag{4-30}$$
Total probability theorem: Since $\int_{-\infty}^{+\infty} f_X(x \mid A)\, dx = 1$, integrating both sides of (4-30) gives

$$1 = \frac{1}{P(A)} \int_{-\infty}^{+\infty} P(A \mid X = x)\, f_X(x)\, dx, \tag{4-31}$$

or

$$P(A) = \int_{-\infty}^{+\infty} P(A \mid X = x)\, f_X(x)\, dx, \tag{4-32}$$

the continuous version of the total probability theorem.
Bayes' theorem: Using (4-32) in (4-30), we get the desired result:

$$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{\int_{-\infty}^{+\infty} P(A \mid X = x)\, f_X(x)\, dx}. \tag{4-33}$$
Example 4.7: Reexamine the coin tossing problem. Let $p = P(H)$ be the probability of obtaining a head in a toss. For a given coin, a priori $p$ can possess any value in the interval $(0, 1)$. In the absence of any additional information, we may assume the a priori p.d.f. $f_p(p)$ to be uniform in that interval. Now suppose we actually perform an experiment of tossing the coin $n$ times, and $k$ heads are observed. This is new information. How can we update $f_p(p)$?
Example 4.7 (cont.): Solution: Let $A$ = "$k$ heads in $n$ specific tosses." Since these tosses result in a specific sequence,

$$P(A \mid p = p) = p^k (1-p)^{n-k}, \tag{4-34}$$

and using (4-32) we get

$$P(A) = \int_0^1 P(A \mid p)\, f_p(p)\, dp = \int_0^1 p^k (1-p)^{n-k}\, dp = \frac{k!\,(n-k)!}{(n+1)!}. \tag{4-35}$$

The a posteriori p.d.f. $f_p(p \mid A)$ represents the updated information given the event $A$; from (4-30),

$$f_p(p \mid A) = \frac{P(A \mid p)\, f_p(p)}{P(A)} = \frac{(n+1)!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}, \qquad 0 < p < 1. \tag{4-36}$$

(See Fig. 4.6.)
Example 4.7 (cont.): Notice that the a posteriori p.d.f. of $p$ in (4-36) is not a uniform distribution, but a beta distribution (Fig. 4.7). We can use this a posteriori p.d.f. to make further predictions. For example, in the light of the above experiment, what can we say about the probability of a head occurring in the next ($(n+1)$th) toss?
Example 4.7 (cont.): Let $B$ = "head occurring in the $(n+1)$th toss, given that $k$ heads have occurred in the $n$ previous tosses." Clearly $P(B \mid p = p) = p$, and from (4-32),

$$P(B) = \int_0^1 P(B \mid p)\, f_p(p \mid A)\, dp = \int_0^1 p \cdot \frac{(n+1)!}{k!\,(n-k)!}\, p^k (1-p)^{n-k}\, dp = \frac{k+1}{n+2}. \tag{4-38}$$

Thus, if $n = 10$ and $k = 6$, then $P(B) = 7/12 \simeq 0.58$, which is more realistic compared to $p = 0.5$.
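The whole update can be carried out in exact arithmetic (our sketch, not from the original notes):

```python
# Sketch of Example 4.7: uniform prior on p, binomial evidence A,
# beta posterior (4-36), and predictive probability (k+1)/(n+2) of (4-38).
from fractions import Fraction
from math import factorial

n, k = 10, 6

# P(A) from (4-35): integral of p^k (1-p)^(n-k) over (0, 1)
P_A = Fraction(factorial(k) * factorial(n - k), factorial(n + 1))

# Predictive P(B) from (4-38): the posterior mean of p
P_B = Fraction(k + 1, n + 2)

print(f"P(A) = {P_A}")                       # 1/2310
print(f"P(B) = {P_B} = {float(P_B):.3f}")    # 7/12 ~ 0.583
```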