
1 4. Binomial Random Variable Approximations, Conditional Probability Density Functions and Stirling's Formula
Let X represent a Binomial r.v as in (3-42). Then from (2-30),
$$P(k_1 \le X \le k_2) = \sum_{k=k_1}^{k_2} P(X = k) = \sum_{k=k_1}^{k_2} \binom{n}{k} p^k q^{n-k}. \tag{4-1}$$
Since the binomial coefficient $\binom{n}{k} = \frac{n!}{(n-k)!\,k!}$ grows quite rapidly with n, it is difficult to compute (4-1) for large n. In this context, two approximations are extremely useful.

2 4.1 The Normal Approximation (DeMoivre-Laplace Theorem): Suppose $n \to \infty$ with p held fixed. Then for k in the $\sqrt{npq}$ neighborhood of np, we can approximate
$$\binom{n}{k} p^k q^{n-k} \simeq \frac{1}{\sqrt{2\pi npq}}\, e^{-(k-np)^2/2npq}, \tag{4-2}$$
and we have
$$\binom{n}{k} p^k q^{n-k} \simeq \frac{1}{\sqrt{2\pi npq}}\, e^{-x_k^2/2}, \qquad \text{where } x_k = \frac{k - np}{\sqrt{npq}}.$$
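As a quick numerical sanity check of (4-2), the short Python sketch below compares one exact binomial term with its Gaussian estimate; the values n = 1000, p = 0.4, k = 410 are arbitrary illustrative choices, not from the lecture.

```python
# Numerical check of the DeMoivre-Laplace approximation (4-2):
# compare an exact binomial probability with its Gaussian estimate.
from math import comb, exp, pi, sqrt

n, p, k = 1000, 0.4, 410        # illustrative values; k lies near np = 400
q = 1 - p
exact = comb(n, k) * p**k * q**(n - k)
approx = exp(-((k - n*p)**2) / (2*n*p*q)) / sqrt(2*pi*n*p*q)
print(exact, approx)            # the two values nearly coincide
```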

3 Thus if $k_1$ and $k_2$ in (4-1) are within or around the neighborhood of the interval $\bigl(np - \sqrt{npq},\; np + \sqrt{npq}\bigr)$, we can approximate the summation in (4-1) by an integration. In that case (4-1) reduces to
$$P(k_1 \le X \le k_2) \simeq \int_{x_1}^{x_2} \frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}\, dy, \tag{4-3}$$
where
$$x_1 = \frac{k_1 - np}{\sqrt{npq}}, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}}.$$
We can express (4-3) in terms of the normalized integral
$$\mathrm{erf}(x) = \frac{1}{\sqrt{2\pi}} \int_0^x e^{-y^2/2}\, dy \tag{4-4}$$
that has been tabulated extensively (see Table 4.1).
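Since Table 4.1 itself is not reproduced in this transcript, a one-line substitute is handy: the normalized integral (4-4) equals half the standard library error function evaluated at $x/\sqrt{2}$. A minimal sketch:

```python
# erf(x) as defined in (4-4): (1/sqrt(2*pi)) * integral_0^x exp(-y^2/2) dy.
# It equals 0.5 * math.erf(x / sqrt(2)).
import math

def erf_table(x: float) -> float:
    return 0.5 * math.erf(x / math.sqrt(2.0))

print(erf_table(0.7))   # ~0.2580, the kind of entry Table 4.1 lists
```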

4 Table 4.1: tabulated values of the normalized Gaussian integral $\mathrm{erf}(x)$ defined in (4-4). [Table not reproduced in this transcript.]

5 Example 4.1: A fair coin is tossed 5,000 times. Find the probability that the number of heads is between 2,475 and 2,525.
Solution: We need $P(2475 \le X \le 2525)$. Here n is large, so we can use the normal approximation. In this case $p = 1/2$, so that $np = 2500$ and $\sqrt{npq} \simeq 35.36$. Since $np - \sqrt{npq} \simeq 2465$ and $np + \sqrt{npq} \simeq 2535$, the approximation is valid for $k_1 = 2475$ and $k_2 = 2525$.

6 Example 4.1 (cont.): Here
$$x_1 = \frac{k_1 - np}{\sqrt{npq}} \simeq -0.7, \qquad x_2 = \frac{k_2 - np}{\sqrt{npq}} \simeq 0.7.$$
From Table 4.1, $\mathrm{erf}(0.7) \simeq 0.258$, so that
$$P(2475 \le X \le 2525) \simeq \mathrm{erf}(x_2) - \mathrm{erf}(x_1) = 2\,\mathrm{erf}(0.7) \simeq 0.516.$$
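The approximation can be checked against the exact binomial sum; a sketch, assuming SciPy is available (the small residual gap is the neglected continuity correction):

```python
# Example 4.1 revisited numerically: exact binomial probability
# vs. the normal approximation (4-3)/(4-4).
import math
from scipy.stats import binom

n, p = 5000, 0.5
k1, k2 = 2475, 2525
exact = binom.cdf(k2, n, p) - binom.cdf(k1 - 1, n, p)

s = math.sqrt(n * p * (1 - p))                 # sqrt(npq) ~ 35.36
x1, x2 = (k1 - n * p) / s, (k2 - n * p) / s
erf = lambda x: 0.5 * math.erf(x / math.sqrt(2))
approx = erf(x2) - erf(x1)                     # ~0.52
print(exact, approx)
```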

7 The Poisson Approximation: For large n, the Gaussian approximation of a binomial r.v is valid only if p is fixed, i.e., only if $np \gg 1$ and $npq \gg 1$. What if $np$ is small, or if it does not increase with n? For example, what happens as $n \to \infty$ and $p \to 0$ such that $np = \lambda$ is a fixed number?

8 The Poisson Approximation (cont.): Consider random arrivals such as telephone calls over a line. Let n be the total number of calls in the interval $(0, T)$; as $T \to \infty$ we have $n \to \infty$. Suppose $\Delta$ is a small interval of duration $\Delta$ as in Fig. 4.2. [Fig. 4.2: the interval (0, T) containing n random arrivals, with a small subinterval of duration $\Delta$.]

9 The Poisson Approximation (cont.): Let p be the probability of a single call occurring in $\Delta$; as $T \to \infty$, $p \to 0$, so the normal approximation is invalid here. For the interval $\Delta$ in Fig. 4.2, identify a "success" (H) with "a call inside $\Delta$" and a "failure" (T) with "a call outside $\Delta$". Then $P_n(k)$, the probability of obtaining k calls (in any order) in an interval of duration $\Delta$, is
$$P_n(k) = \binom{n}{k} p^k (1-p)^{n-k}. \tag{4-6}$$

10 The Poisson Approximation (cont.): With $np = \lambda$ held fixed, we can rewrite (4-6) as
$$P_n(k) = \frac{n(n-1)\cdots(n-k+1)}{k!}\, p^k (1-p)^{n-k} = \frac{\lambda^k}{k!} \Bigl(1 - \frac{1}{n}\Bigr)\cdots\Bigl(1 - \frac{k-1}{n}\Bigr)\Bigl(1 - \frac{\lambda}{n}\Bigr)^{n-k}, \tag{4-7}$$
so that as $n \to \infty$
$$P_n(k) \to e^{-\lambda}\, \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots \tag{4-8}$$
The right side of (4-8) represents the Poisson p.m.f. Thus for large n and small p, with $\lambda = np$ fixed, binomial probabilities are well approximated by the Poisson p.m.f.
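The limit (4-8) is easy to see numerically: holding $\lambda = np$ fixed while n grows, the binomial p.m.f. value approaches the Poisson one. A stdlib-only sketch (the values $\lambda = 3$ and k = 2 are arbitrary illustrative choices):

```python
# Convergence of the binomial p.m.f. to the Poisson p.m.f. (4-8)
# as n -> infinity with lam = n*p held fixed.
from math import comb, exp, factorial

lam, k = 3.0, 2                       # arbitrary illustrative values
poisson = exp(-lam) * lam**k / factorial(k)
for n in (10, 100, 1000, 10000):
    p = lam / n
    binomial = comb(n, k) * p**k * (1 - p)**(n - k)
    print(n, binomial, poisson)       # binomial column approaches the Poisson value
```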

11 Example 4.2 (Winning a Lottery): Suppose two million lottery tickets are issued, with 100 winning tickets among them. (a) If a person purchases 100 tickets, what is the probability of winning? (b) How many tickets should one buy to be 95% confident of having a winning ticket?
Solution: The probability p of buying a winning ticket is
$$p = \frac{100}{2 \times 10^6} = 5 \times 10^{-5}.$$

12 Example 4.2 (cont.): Let X be the number of winning tickets among the n purchased tickets. Here n is large and p is small, so X has an approximate Poisson distribution with parameter $\lambda = np$.

13 Example 4.2 (cont.): (a) With $n = 100$, $\lambda = np = 0.005$, so the probability of winning is
$$P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-\lambda} \simeq 0.005.$$
(b) In this case we need $P(X \ge 1) \ge 0.95$. But $1 - e^{-\lambda} \ge 0.95$ implies $\lambda = np \ge \ln 20 \simeq 3$, or
$$n \ge \frac{3}{5 \times 10^{-5}} = 60{,}000.$$
Thus one needs to buy about 60,000 tickets to be 95% confident of having a winning ticket!
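Both parts reduce to one-liners; a sketch using only the standard library:

```python
# Example 4.2 as a quick computation.
from math import exp, log

p = 100 / 2_000_000          # 5e-5, probability that one ticket wins
lam = 100 * p                # (a) n = 100 tickets -> lambda = 0.005
print(1 - exp(-lam))         # ~0.005, probability of at least one win

n_needed = log(20) / p       # (b) solve 1 - exp(-n*p) >= 0.95 for n
print(n_needed)              # ~59915, i.e. about 60,000 tickets
```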

14 Example 4.3: A spacecraft has 100,000 components ($n \to \infty$). The probability of any one component being defective is $2 \times 10^{-5}$ ($p \to 0$). The mission will be in danger if five or more components become defective. Find the probability of such an event.
Solution: Here n is large and p is small, so the Poisson approximation is valid with parameter
$$\lambda = np = 10^5 \times 2 \times 10^{-5} = 2.$$

15 Example 4.3 (cont.):
$$P(X \ge 5) = 1 - \sum_{k=0}^{4} e^{-\lambda}\, \frac{\lambda^k}{k!} = 1 - e^{-2}\Bigl(1 + 2 + 2 + \frac{4}{3} + \frac{2}{3}\Bigr) = 1 - 7e^{-2} \simeq 0.052.$$
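A sketch of the same computation (it assumes the reconstructed value $\lambda = 2$ above):

```python
# Example 4.3 via the Poisson tail: P(X >= 5) = 1 - sum_{k<5} e^{-lam} lam^k / k!
from math import exp, factorial

lam = 2.0   # assumes n*p = 2 as reconstructed above
p_danger = 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(5))
print(p_danger)   # ~0.052
```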

16 Conditional Probability Density Function: For any two events A and B, we have defined the conditional probability of A given B as
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) \neq 0. \tag{4-9}$$
Noting that the probability distribution function $F_X(x)$ is given by
$$F_X(x) = P\{X(\xi) \le x\}, \tag{4-10}$$
we may define the conditional distribution of the r.v X given the event B as
$$F_X(x \mid B) = P\{X(\xi) \le x \mid B\} = \frac{P\{(X(\xi) \le x) \cap B\}}{P(B)}. \tag{4-11}$$
Since the conditional distribution obeys all probability axioms, in particular
$$F_X(+\infty \mid B) = 1, \qquad F_X(-\infty \mid B) = 0. \tag{4-12}$$

17 Conditional Probability Density Function (cont.): Further,
$$P(x_1 < X(\xi) \le x_2 \mid B) = F_X(x_2 \mid B) - F_X(x_1 \mid B), \tag{4-13}$$
since for $x_2 \ge x_1$ we have $\{X \le x_2\} = \{X \le x_1\} \cup \{x_1 < X \le x_2\}$, a union of mutually exclusive events. The conditional density function is the derivative of the conditional distribution:
$$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx}, \tag{4-14}$$
so that
$$F_X(x \mid B) = \int_{-\infty}^{x} f_X(u \mid B)\, du, \tag{4-15}$$
with $f_X(x \mid B) \ge 0$ and
$$\int_{-\infty}^{\infty} f_X(x \mid B)\, dx = 1. \tag{4-16}$$
Moreover,
$$P(x_1 < X \le x_2 \mid B) = \int_{x_1}^{x_2} f_X(x \mid B)\, dx. \tag{4-17}$$

18 Example 4.4: Toss a coin and let $X(T) = 0$, $X(H) = 1$. Suppose $B = \{H\}$. Determine $F_X(x \mid B)$.
Solution: $F_X(x)$ has the staircase form shown in Fig. 4.3(a). We need $F_X(x \mid B)$ for all x. For $x < 0$, $\{X \le x\} = \varnothing$, so that $\{(X \le x) \cap B\} = \varnothing$ and $F_X(x \mid B) = 0$. [Fig. 4.3: (a) the unconditional distribution $F_X(x)$; (b) the conditional distribution $F_X(x \mid B)$.]

19 Example 4.4 (cont.): For $0 \le x < 1$, $\{X \le x\} = \{T\}$, so that $\{(X \le x) \cap B\} = \{T\} \cap \{H\} = \varnothing$ and $F_X(x \mid B) = 0$. For $x \ge 1$, $\{X \le x\} = \Omega$, so that $\{(X \le x) \cap B\} = B$ and $F_X(x \mid B) = P(B)/P(B) = 1$ (see Fig. 4.3(b)).

20 Example 4.5: Given $F_X(x)$, suppose $B = \{X \le a\}$. Find $f_X(x \mid B)$.
Solution: We will first determine $F_X(x \mid B)$. From (4-11) and B as given above, we have
$$F_X(x \mid B) = \frac{P\{(X \le x) \cap (X \le a)\}}{P(X \le a)}. \tag{4-18}$$
For $x < a$, $(X \le x) \cap (X \le a) = (X \le x)$, so that
$$F_X(x \mid B) = \frac{P(X \le x)}{P(X \le a)} = \frac{F_X(x)}{F_X(a)}, \qquad x < a. \tag{4-19}$$

21 Example 4.5 (cont.): For $x \ge a$, $(X \le x) \cap (X \le a) = (X \le a)$, so that $F_X(x \mid B) = 1$. Thus
$$F_X(x \mid B) = \begin{cases} F_X(x)/F_X(a), & x < a, \\ 1, & x \ge a, \end{cases} \tag{4-20}$$
and hence
$$f_X(x \mid B) = \frac{dF_X(x \mid B)}{dx} = \begin{cases} f_X(x)/F_X(a), & x < a, \\ 0, & \text{otherwise}. \end{cases} \tag{4-21}$$
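The result (4-20)-(4-21) can be illustrated by Monte Carlo: restricting samples of X to the event $B = \{X \le a\}$ reproduces the conditional distribution $F_X(x)/F_X(a)$. A sketch for an exponential X (the exponential choice and the values a = 1, x = 0.5 are illustrative, not from the lecture):

```python
# Monte Carlo illustration of (4-20): for B = {X <= a},
# F_X(x | B) = F_X(x) / F_X(a) on x < a.
import math
import random

a, rate = 1.0, 1.0
F = lambda x: 1 - math.exp(-rate * x)            # F_X(x) for an exponential r.v

samples = [random.expovariate(rate) for _ in range(200_000)]
inside = [x for x in samples if x <= a]          # condition on the event B
empirical = sum(x <= 0.5 for x in inside) / len(inside)
print(empirical, F(0.5) / F(a))                  # ~0.62 for both
```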

22 Example 4.5 (cont.): [Fig. 4.4: (a) the conditional distribution $F_X(x \mid B)$ in (4-20); (b) the conditional density $f_X(x \mid B)$ in (4-21).]

23 Example 4.6: Let B represent the event $\{a < X \le b\}$ with $b > a$. For a given $F_X(x)$, determine $F_X(x \mid B)$ and $f_X(x \mid B)$.
Solution:
$$F_X(x \mid B) = \frac{P\{(X \le x) \cap (a < X \le b)\}}{P(a < X \le b)} = \frac{P\{(X \le x) \cap (a < X \le b)\}}{F_X(b) - F_X(a)}. \tag{4-22}$$

24 Example 4.6 (cont.): For $x < a$, we have $\{X \le x\} \cap \{a < X \le b\} = \varnothing$, and hence
$$F_X(x \mid B) = 0. \tag{4-23}$$
For $a \le x < b$, we have $\{X \le x\} \cap \{a < X \le b\} = \{a < X \le x\}$, and hence
$$F_X(x \mid B) = \frac{F_X(x) - F_X(a)}{F_X(b) - F_X(a)}. \tag{4-24}$$
For $x \ge b$, we have $\{X \le x\} \cap \{a < X \le b\} = \{a < X \le b\}$, so that
$$F_X(x \mid B) = 1. \tag{4-25}$$

25 Example 4.6 (cont.): Using (4-23)-(4-25), we get (see Fig. 4.5)
$$f_X(x \mid B) = \begin{cases} \dfrac{f_X(x)}{F_X(b) - F_X(a)}, & a < x \le b, \\ 0, & \text{otherwise}. \end{cases}$$
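As a check that the conditional density above is a proper density, one can integrate it numerically over $(a, b]$; a sketch for a standard normal X, assuming SciPy (the interval endpoints are illustrative):

```python
# The density f_X(x) / (F_X(b) - F_X(a)) restricted to (a, b] integrates to 1.
from scipy.integrate import quad
from scipy.stats import norm

a, b = -0.5, 1.5                                  # illustrative interval
Z = norm.cdf(b) - norm.cdf(a)                     # F_X(b) - F_X(a)
total, _ = quad(lambda x: norm.pdf(x) / Z, a, b)
print(total)                                      # ~1.0
```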

26 Conditional p.d.f. & Bayes' theorem: First, we extend the conditional probability results to random variables. We know from chapter 2 (2-41) that if $A_1, A_2, \ldots, A_n$ is a partition of S and B is an arbitrary event, then
$$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i).$$

27 Conditional p.d.f. & Bayes' theorem: 1. Setting $B = \{X(\xi) \le x\}$ in (2-41) we obtain
$$F_X(x) = \sum_{i=1}^{n} F_X(x \mid A_i)\, P(A_i), \qquad f_X(x) = \sum_{i=1}^{n} f_X(x \mid A_i)\, P(A_i).$$

28 Conditional p.d.f. & Bayes' theorem: 2. From the identity
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(B \mid A)\, P(A)}{P(B)},$$
it follows that
$$P(A \mid X \le x) = \frac{F_X(x \mid A)\, P(A)}{F_X(x)}. \tag{4-26}$$

29 Conditional p.d.f. & Bayes' theorem: 3. Setting $B = \{x_1 < X(\xi) \le x_2\}$ in the identity above, we have
$$P(A \mid x_1 < X \le x_2) = \frac{P(x_1 < X \le x_2 \mid A)\, P(A)}{P(x_1 < X \le x_2)}, \tag{4-27}$$
and we conclude that
$$P(A \mid x_1 < X \le x_2) = \frac{\bigl[F_X(x_2 \mid A) - F_X(x_1 \mid A)\bigr]\, P(A)}{F_X(x_2) - F_X(x_1)}. \tag{4-28}$$

30 Conditional p.d.f. & Bayes' theorem (cont.): 4. Further, let $x_1 = x$ and $x_2 = x + \varepsilon$, $\varepsilon > 0$, so that in the limit as $\varepsilon \to 0$,
$$P(A \mid X = x) = \lim_{\varepsilon \to 0} P(A \mid x < X \le x + \varepsilon) = \frac{f_X(x \mid A)\, P(A)}{f_X(x)}, \tag{4-29}$$
or
$$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{P(A)}. \tag{4-30}$$

31 Conditional p.d.f. & Bayes' theorem (cont.): Total probability theorem. From (4-30), we also get
$$\int_{-\infty}^{\infty} P(A \mid X = x)\, f_X(x)\, dx = P(A) \int_{-\infty}^{\infty} f_X(x \mid A)\, dx = P(A), \tag{4-31}$$
or
$$P(A) = \int_{-\infty}^{\infty} P(A \mid X = x)\, f_X(x)\, dx. \tag{4-32}$$

32 Conditional p.d.f. & Bayes' theorem: Bayes' theorem. Using (4-32) in (4-30), we get the desired result
$$f_X(x \mid A) = \frac{P(A \mid X = x)\, f_X(x)}{\displaystyle\int_{-\infty}^{\infty} P(A \mid X = x)\, f_X(x)\, dx}. \tag{4-33}$$
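Equation (4-33) translates directly into a grid computation: multiply the prior density by the likelihood and normalize by the integral (4-32). A sketch with a uniform prior on (0,1) and a hypothetical likelihood $p^6(1-p)^4$, previewing Example 4.7 with n = 10, k = 6:

```python
# Grid-based Bayes' theorem (4-33): posterior ~ likelihood * prior,
# normalized by the total probability integral (4-32).
import numpy as np

x = np.linspace(0.0, 1.0, 1001)
dx = x[1] - x[0]
prior = np.ones_like(x)            # f_X(x): uniform prior on (0, 1)
lik = x**6 * (1 - x)**4            # P(A | X = x): hypothetical likelihood
post = lik * prior
post /= post.sum() * dx            # divide by the integral (4-32)
print((post * x).sum() * dx)       # posterior mean ~ 7/12, cf. Example 4.7
```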

33 Example 4.7: Reexamine the coin tossing problem. Let $p = P(H)$ be the probability of obtaining a head in a toss. For a given coin, a-priori p can possess any value in the interval (0,1). In the absence of any additional information, we may assume the a-priori p.d.f $f_p(p)$ to be a uniform distribution in that interval. Now suppose we actually perform an experiment of tossing the coin n times, and k heads are observed. This is new information. How can we update $f_p(p)$?

34 Example 4.7 (cont.): Solution: Let A = "k heads in n specific tosses". Since these tosses result in a specific sequence,
$$P(A \mid p) = p^k q^{n-k}, \qquad q = 1 - p, \tag{4-34}$$
and using (4-32) we get
$$P(A) = \int_0^1 P(A \mid p)\, f_p(p)\, dp = \int_0^1 p^k (1-p)^{n-k}\, dp = \frac{(n-k)!\, k!}{(n+1)!}. \tag{4-35}$$
The a-posteriori p.d.f $f_p(p \mid A)$ represents the updated information given the event A, and from (4-30),
$$f_p(p \mid A) = \frac{P(A \mid p)\, f_p(p)}{P(A)}.$$
[Fig. 4.6: the uniform a-priori p.d.f $f_p(p)$ on (0, 1).]

35 Example 4.7 (cont.): Carrying out the substitution,
$$f_p(p \mid A) = \frac{(n+1)!}{(n-k)!\, k!}\, p^k (1-p)^{n-k}, \qquad 0 < p < 1. \tag{4-36}$$
Notice that the a-posteriori p.d.f of p in (4-36) is not a uniform distribution, but a beta distribution (Fig. 4.7). We can use this a-posteriori p.d.f to make further predictions. For example, in the light of the above experiment, what can we say about the probability of a head occurring in the next (n+1)th toss?

36 Example 4.7 (cont.): Let B = "head occurring in the (n+1)th toss, given that k heads have occurred in n previous tosses". Clearly $P(B \mid p) = p$, and from (4-32),
$$P(B) = \int_0^1 P(B \mid p)\, f_p(p \mid A)\, dp = \int_0^1 p \cdot \frac{(n+1)!}{(n-k)!\, k!}\, p^k (1-p)^{n-k}\, dp = \frac{k+1}{n+2}. \tag{4-38}$$
Thus, if n = 10 and k = 6, then $P(B) = 7/12 \simeq 0.58$, which is more realistic compared to p = 0.5.
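The a-posteriori p.d.f (4-36) is a Beta(k+1, n-k+1) density, and (4-38) is exactly its mean; a sketch confirming this, assuming SciPy:

```python
# (4-38) as the mean of the Beta(k+1, n-k+1) posterior in (4-36).
from scipy.stats import beta

n, k = 10, 6
posterior = beta(k + 1, n - k + 1)           # the a-posteriori p.d.f (4-36)
print(posterior.mean(), (k + 1) / (n + 2))   # both ~0.5833
```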