PROBABILITY AND STATISTICS FOR ENGINEERING
Hossein Sameti, Department of Computer Engineering, Sharif University of Technology
Let X represent a binomial r.v. Then
$$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k},$$
which is difficult to evaluate for large n. In this context, two approximations are extremely useful.
The Normal Approximation (De Moivre-Laplace Theorem)
As we know, if $npq \gg 1$ and $k_1$, $k_2$ are within a few $\sqrt{npq}$ of $np$, we have the approximation
$$P(k_1 \le X \le k_2) \simeq \int_{x_1}^{x_2} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy, \qquad \text{where } x_1 = \frac{k_1 - np}{\sqrt{npq}}, \quad x_2 = \frac{k_2 - np}{\sqrt{npq}}.$$
We can express this formula in terms of the normalized integral
$$\operatorname{erf}(x) = \frac{1}{\sqrt{2\pi}} \int_0^x e^{-y^2/2}\, dy,$$
which has been tabulated extensively.
Example: A fair coin is tossed 5,000 times. Find the probability that the number of heads is between 2,475 and 2,525.
Solution: We need $P(2475 \le X \le 2525)$. Since n is large, we can use the normal approximation. Here $p = 1/2$, so that $np = 2500$ and $\sqrt{npq} = \sqrt{1250} \approx 35.36$. Since $k_1 = 2475$ and $k_2 = 2525$ lie within $\sqrt{npq}$ of $np$, the approximation is valid.
Example - continued: Here
$$x_1 = \frac{2475 - 2500}{35.36} \approx -0.707, \qquad x_2 = \frac{2525 - 2500}{35.36} \approx 0.707.$$
Using the table, $P(2475 \le X \le 2525) \simeq 2\,\operatorname{erf}(0.707) \approx 0.52$.
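A quick numerical check of this example (not part of the original slides) is sketched below in Python. Note that Python's math.erf is the standard error function, not the normalized integral used above, so the standard normal c.d.f. is built from it explicitly.

```python
# Sketch: De Moivre-Laplace approximation for the coin-tossing example,
# n = 5000 fair tosses, P(2475 <= X <= 2525).
from math import erf, sqrt

def std_normal_cdf(x):
    # Phi(x) for a standard normal r.v., via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

n, p = 5000, 0.5
q = 1 - p
mu, sigma = n * p, sqrt(n * p * q)           # 2500 and ~35.36

x1 = (2475 - mu) / sigma                     # ~ -0.707
x2 = (2525 - mu) / sigma                     # ~ +0.707
approx = std_normal_cdf(x2) - std_normal_cdf(x1)
print(f"normal approximation: {approx:.4f}")  # ~ 0.52
```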
The Poisson Approximation: For large n, the Gaussian approximation of a binomial r.v. is valid only if p is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$. What if $np$ is small, or if it does not increase with n? For example, when $p \to 0$ as $n \to \infty$ such that $np = \lambda$ is a fixed number.
The Poisson Theorem: If $p \to 0$ as $n \to \infty$ such that $np \to \lambda$, then
$$P(X = k) = \binom{n}{k} p^k q^{n-k} \;\to\; e^{-\lambda}\,\frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots$$
The Poisson Approximation: With $p = \lambda/n$,
$$\binom{n}{k} p^k q^{n-k} = \frac{n(n-1)\cdots(n-k+1)}{n^k}\,\frac{\lambda^k}{k!}\left(1 - \frac{\lambda}{n}\right)^{n-k} \;\to\; e^{-\lambda}\,\frac{\lambda^k}{k!} \quad \text{as } n \to \infty.$$
Thus, the Poisson p.m.f. is
$$P(X = k) = e^{-\lambda}\,\frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots$$
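As an illustration (mine, not from the slides), the sketch below compares the exact binomial p.m.f. with its Poisson limit; the values n = 2000 and p = 0.0015 (so np = 3) are assumed purely for demonstration.

```python
# Sketch: exact binomial p.m.f. vs. its Poisson limit for large n, small p.
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

n, p = 2000, 0.0015          # illustrative values, lambda = np = 3
lam = n * p
for k in range(6):
    print(k, round(binom_pmf(k, n, p), 5), round(poisson_pmf(k, lam), 5))
```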
Example: Winning a Lottery. Suppose two million lottery tickets are issued, with 100 winning tickets among them. a) If a person purchases 100 tickets, what is the probability of winning?
Solution: The probability of buying a winning ticket is $p = \dfrac{100}{2{,}000{,}000} = 5 \times 10^{-5}$.
Winning a Lottery - continued: Let X be the number of winning tickets among the n = 100 purchased tickets. X has an approximate Poisson distribution with parameter $\lambda = np = 100 \times 5 \times 10^{-5} = 0.005$. So the probability of winning is
$$P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda} \approx 0.005.$$
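A minimal sketch of part (a), assuming only the numbers given on the slide:

```python
# Sketch of part (a): probability of at least one winning ticket among
# 100 purchased, using the Poisson approximation from the slide.
from math import exp

p = 100 / 2_000_000          # 5e-5, probability a single ticket wins
n = 100                      # tickets purchased
lam = n * p                  # 0.005
p_win = 1 - exp(-lam)        # P(X >= 1)
print(p_win)                 # ~ 0.005
```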
Winning a Lottery - continued: b) How many tickets should one buy to be 95% confident of having a winning ticket?
Solution: We need $P(X \ge 1) = 1 - e^{-\lambda} \ge 0.95$. But this requires $e^{-\lambda} \le 0.05$, or $\lambda = np \ge \ln 20 \approx 3$, so $n \ge 3/(5 \times 10^{-5}) = 60{,}000$. Thus one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
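A corresponding sketch for part (b), solving $1 - e^{-np} \ge 0.95$ for n:

```python
# Sketch of part (b): smallest n such that P(at least one winner) >= 0.95.
from math import ceil, exp, log

p = 5e-5
n_needed = ceil(log(20) / p)       # exp(-n*p) <= 0.05  <=>  n >= ln(20)/p
print(n_needed)                    # 59915, i.e. about 60,000 tickets
print(1 - exp(-n_needed * p))      # ~ 0.95
```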
Example: Danger in Space Mission. A spacecraft has 100,000 components. The probability of any one component being defective is small. The mission will be in danger if five or more components become defective. Find the probability of such an event.
Solution: Since n is large and p is small, we use the Poisson approximation with parameter $\lambda = np$, so that
$$P(X \ge 5) = 1 - \sum_{k=0}^{4} e^{-\lambda}\,\frac{\lambda^k}{k!}.$$
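A sketch of the calculation follows; the per-component defect probability p = 1e-5 used below is an assumed value for illustration only, not taken from the slide.

```python
# Sketch: P(five or more defective components) under the Poisson
# approximation, lambda = n*p.
from math import exp, factorial

n = 100_000
p = 1e-5                      # assumed value, for illustration only
lam = n * p                   # 1.0 under this assumption

p_danger = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(5))
print(p_danger)               # ~ 0.0037 with the assumed p
```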
Conditional Probability Density Function
For an event B with $P(B) > 0$, the conditional distribution function of X given B is
$$F_X(x \mid B) = P(X \le x \mid B) = \frac{P(\{X \le x\} \cap B)}{P(B)},$$
and the conditional p.d.f. is $f_X(x \mid B) = \dfrac{dF_X(x \mid B)}{dx}$. Further, $F_X(+\infty \mid B) = 1$ and $F_X(-\infty \mid B) = 0$, since for $x \to +\infty$ the event $\{X \le x\} \cap B$ tends to B, and for $x \to -\infty$ it tends to the impossible event.
Example: Toss a coin and let $X(T) = 0$, $X(H) = 1$. Suppose $P(T) = q$. Determine $F_X(x)$, which has the staircase form sketched in figures (a) and (b).
Solution: We need $F_X(x) = P(X \le x)$ for all x. For $x < 0$, $\{X \le x\} = \varnothing$, so that $F_X(x) = 0$.
Example - continued: For $0 \le x < 1$, $\{X \le x\} = \{T\}$, so that $F_X(x) = P(T) = q$. For $x \ge 1$, $\{X \le x\} = \Omega$, and $F_X(x) = 1$.
Example: Given $F_X(x)$, suppose $B = \{X \le a\}$. Find $F_X(x \mid B)$.
Solution: We will first determine
$$F_X(x \mid B) = \frac{P(\{X \le x\} \cap \{X \le a\})}{P(X \le a)}.$$
For $x < a$, $\{X \le x\} \cap \{X \le a\} = \{X \le x\}$, so that $F_X(x \mid B) = \dfrac{F_X(x)}{F_X(a)}$.
Example - continued: For $x \ge a$, $\{X \le x\} \cap \{X \le a\} = \{X \le a\}$, so that $F_X(x \mid B) = 1$. Thus
$$F_X(x \mid B) = \begin{cases} F_X(x)/F_X(a), & x < a, \\ 1, & x \ge a, \end{cases}$$
and hence
$$f_X(x \mid B) = \begin{cases} f_X(x)/F_X(a), & x < a, \\ 0, & x \ge a. \end{cases}$$
Figures (a) and (b) sketch $F_X(x \mid B)$ and $f_X(x \mid B)$.
Example: Let B represent the event $\{a < X \le b\}$ with $P(B) = F_X(b) - F_X(a) > 0$. For a given x, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.
Example - continued (Solution): For $x < a$, we have $\{X \le x\} \cap B = \varnothing$, and hence $F_X(x \mid B) = 0$. For $a \le x < b$, we have $\{X \le x\} \cap B = \{a < X \le x\}$, and hence
$$F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}.$$
For $x \ge b$, we have $\{X \le x\} \cap B = B$, so that $F_X(x \mid B) = 1$. Thus,
$$f_X(x \mid B) = \begin{cases} \dfrac{f_X(x)}{F_X(b) - F_X(a)}, & a \le x < b, \\ 0, & \text{otherwise.} \end{cases}$$
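As a concrete illustration (not from the slides), the sketch below evaluates $f_X(x \mid B)$ for a standard normal X with assumed endpoints a = -1 and b = 2, and checks numerically that the conditional density integrates to 1.

```python
# Sketch: conditional density of X given B = {a < X <= b}, standard normal X.
from math import erf, exp, pi, sqrt

def phi(x):          # standard normal p.d.f.
    return exp(-x * x / 2) / sqrt(2 * pi)

def Phi(x):          # standard normal c.d.f.
    return 0.5 * (1 + erf(x / sqrt(2)))

a, b = -1.0, 2.0                    # assumed endpoints, for illustration
p_B = Phi(b) - Phi(a)               # P(B) = F_X(b) - F_X(a)

def f_given_B(x):
    # f_X(x | B) = f_X(x) / P(B) inside (a, b], and 0 outside
    return phi(x) / p_B if a < x <= b else 0.0

# crude numerical check that the conditional density integrates to 1
dx = 1e-4
steps = round((b - a) / dx)
total = sum(f_given_B(a + (i + 0.5) * dx) for i in range(steps)) * dx
print(round(total, 4))              # ~ 1.0
```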
Conditional p.d.f & Bayes' Theorem: First, we extend the conditional probability results to random variables. We know that $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$. If $A_1, A_2, \ldots, A_n$ is a partition of S and B is an arbitrary event, then
$$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i).$$
By setting $B = \{X \le x\}$ we obtain
$$F_X(x) = \sum_{i=1}^{n} F_X(x \mid A_i)\, P(A_i), \qquad f_X(x) = \sum_{i=1}^{n} f_X(x \mid A_i)\, P(A_i).$$
Conditional p.d.f & Bayes' Theorem: Using $P(A \mid B) = \dfrac{P(B \mid A)\, P(A)}{P(B)}$, we obtain
$$P(A \mid X \le x) = \frac{P(X \le x \mid A)\, P(A)}{P(X \le x)} = \frac{F_X(x \mid A)\, P(A)}{F_X(x)}.$$
For the event $\{x_1 < X \le x_2\}$,
$$P(A \mid x_1 < X \le x_2) = \frac{\bigl(F_X(x_2 \mid A) - F_X(x_1 \mid A)\bigr) P(A)}{F_X(x_2) - F_X(x_1)}.$$
Conditional p.d.f & Bayes' Theorem: Let $x_1 = x$ and $x_2 = x + \Delta x$, so that in the limit as $\Delta x \to 0$,
$$P(A \mid X = x) = \frac{f_X(x \mid A)\, P(A)}{f_X(x)}.$$
We also get
$$P(A) = \int_{-\infty}^{\infty} P(A \mid X = x)\, f_X(x)\, dx \qquad \text{(Total Probability Theorem).}$$
Bayes' Theorem (continuous version): Using the total probability theorem in
$$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{P(A)},$$
we get the desired result
$$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{\int_{-\infty}^{\infty} P(A \mid X = x)\, f_X(x)\, dx}.$$
Example: Coin Tossing Problem Revisited. Let p denote the probability of obtaining a head in a toss. For a given coin, a priori p can possess any value in (0, 1). In the absence of any additional information, we take the p.d.f. $f_P(p)$ to be uniform on (0, 1). After tossing the coin n times, k heads are observed. How can we update $f_P(p)$ given this new information?
Solution: Let A = "k heads in n specific tosses." Since these tosses result in a specific sequence,
$$P(A \mid P = p) = p^k q^{n-k},$$
and using the Total Probability Theorem we get
$$P(A) = \int_0^1 p^k (1 - p)^{n-k}\, dp = \frac{(n-k)!\, k!}{(n+1)!}.$$
Example - continued: The a posteriori p.d.f. $f_{P \mid A}(p \mid A)$ represents the updated information given the event A. Using Bayes' theorem,
$$f_{P \mid A}(p \mid A) = \frac{P(A \mid P = p)\, f_P(p)}{P(A)} = \frac{(n+1)!}{(n-k)!\, k!}\, p^k (1-p)^{n-k}, \qquad 0 < p < 1.$$
This is a beta distribution. We can use this a posteriori p.d.f. to make further predictions. For example, in the light of the above experiment, what can we say about the probability of a head occurring in the next (n+1)th toss?
Example - continued: Let B = "head occurring in the (n+1)th toss, given that k heads have occurred in n previous tosses." Clearly $P(B \mid P = p) = p$. From the Total Probability Theorem,
$$P(B) = \int_0^1 p\, f_{P \mid A}(p \mid A)\, dp.$$
Substituting the a posteriori p.d.f. above, we get
$$P(B) = \int_0^1 \frac{(n+1)!}{(n-k)!\, k!}\, p^{k+1} (1-p)^{n-k}\, dp = \frac{k+1}{n+2}.$$
Thus, if n = 10 and k = 6, then $P(B) = 7/12 \approx 0.58$, which is more realistic compared to p = 0.5.
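A short sketch (mine, not from the slides) that reproduces both the closed form (k+1)/(n+2) and a direct numerical integration against the beta posterior, for n = 10 and k = 6:

```python
# Sketch: uniform prior on p, k heads in n tosses, beta posterior,
# and predictive probability (k+1)/(n+2) for the next toss.
from math import comb

def posterior(p, n, k):
    # f(p | A) = (n+1)! / (k! (n-k)!) * p^k (1-p)^(n-k), 0 < p < 1
    return (n + 1) * comb(n, k) * p**k * (1 - p)**(n - k)

n, k = 10, 6
# predictive probability of a head on toss n+1: closed form (k+1)/(n+2)
print((k + 1) / (n + 2))                       # 7/12 ~ 0.583

# numerical check: integrate p * f(p | A) over (0, 1) on a midpoint grid
dp = 1e-5
est = sum(p * posterior(p, n, k) * dp
          for p in (i * dp + dp / 2 for i in range(int(1 / dp))))
print(round(est, 4))                           # ~ 0.5833
```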