

Chapter 4-5 DeGroot & Schervish

Conditional Expectation/Mean Let X and Y be random variables such that the mean of Y exists and is finite. The conditional expectation (or conditional mean) of Y given X = x is denoted by E(Y|x) and is defined to be the expectation of the conditional distribution of Y given X = x.

Conditional Expectation/Mean If Y has a continuous conditional distribution given X = x with conditional p.d.f. g2(y|x), then E(Y|x) = ∫ y g2(y|x) dy, where the integral is taken over all y. Similarly, if Y has a discrete conditional distribution given X = x with conditional p.f. g2(y|x), then E(Y|x) = Σ_y y g2(y|x).

Example The conditional p.f. of Y given X = 4 is g2(y|4) = f(4, y)/f1(4), which is the x = 4 column of the table divided by f1(4) = 0.208. Evaluating this at y = 0, 1, 2, 3 gives the conditional p.f., and the conditional mean of Y given X = 4 is then E(Y|4) = 0 · g2(0|4) + 1 · g2(1|4) + 2 · g2(2|4) + 3 · g2(3|4).
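The computation above can be sketched in Python. The table values here are hypothetical (the textbook's joint table is not reproduced in these slides); only the method matches the example.

```python
# Conditional mean E(Y|x) from a joint p.f. table.
# The joint p.f. f(x, y) below is made up for illustration.
f = {
    (4, 0): 0.02, (4, 1): 0.05, (4, 2): 0.09, (4, 3): 0.04,
    (5, 0): 0.10, (5, 1): 0.30, (5, 2): 0.25, (5, 3): 0.15,
}

def conditional_mean(f, x):
    """E(Y|x): weight each y by the conditional p.f. g2(y|x) = f(x, y)/f1(x)."""
    f1 = sum(p for (xi, _), p in f.items() if xi == x)  # marginal f1(x)
    return sum(y * p / f1 for (xi, y), p in f.items() if xi == x)
```

For the table above, `conditional_mean(f, 4)` averages y = 0, 1, 2, 3 against the x = 4 column renormalized by f1(4) = 0.20.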

Theorem Let X and Y be random variables such that Y has finite mean. Then E[E(Y|X)] = E(Y). Proof (continuous case): E[E(Y|X)] = ∫ E(Y|x) f1(x) dx = ∫∫ y g2(y|x) f1(x) dy dx = ∫∫ y f(x, y) dy dx = E(Y); the discrete case replaces the integrals with sums.

Example Suppose that a point X is chosen uniformly at random on the interval [0, 1]. Also, suppose that after the value X = x has been observed (0 < x < 1), a point Y is chosen uniformly on the interval [x, 1]. Determine the value of E(Y). For each given value of x (0 < x < 1), E(Y|x) equals the midpoint (1/2)(x + 1) of the interval [x, 1]. Therefore, E(Y|X) = (1/2)(X + 1) and E(Y) = E[E(Y|X)] = (1/2)[E(X) + 1] = (1/2)(1/2 + 1) = 3/4.
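A quick Monte Carlo check of this example (an illustrative sketch, not part of the slides): simulate the two-stage experiment and compare the sample mean of Y with 3/4.

```python
import random

# X uniform on [0, 1], then Y uniform on [X, 1]; E(Y) should be 3/4.
random.seed(0)
n = 200_000
total = 0.0
for _ in range(n):
    x = random.random()                      # X ~ Uniform[0, 1]
    total += x + (1 - x) * random.random()   # Y ~ Uniform[x, 1]
estimate = total / n
```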

Markov Inequality Suppose that X is a random variable such that Pr(X ≥ 0) = 1. Then for every real number t > 0, Pr(X ≥ t) ≤ E(X)/t. Proof (discrete case): E(X) = Σ_x x f(x) ≥ Σ_{x ≥ t} x f(x) ≥ t Σ_{x ≥ t} f(x) = t Pr(X ≥ t).

Chebyshev Inequality Let X be a random variable for which Var(X) exists. Then for every number t > 0, Pr(|X − E(X)| ≥ t) ≤ Var(X)/t². This follows from the Markov inequality applied to Y = [X − E(X)]², which satisfies Pr(Y ≥ 0) = 1 and E(Y) = Var(X).
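Both inequalities can be checked exactly on a small discrete distribution (the p.f. below is made up for illustration, not from the text):

```python
# Exact check of the Markov and Chebyshev inequalities.
pmf = {0: 0.4, 1: 0.3, 2: 0.2, 6: 0.1}  # an illustrative p.f.

mean = sum(x * p for x, p in pmf.items())               # E(X)
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # Var(X)

def markov_holds(t):
    """Pr(X >= t) <= E(X)/t for a nonnegative X."""
    return sum(p for x, p in pmf.items() if x >= t) <= mean / t

def chebyshev_holds(t):
    """Pr(|X - E(X)| >= t) <= Var(X)/t**2."""
    return sum(p for x, p in pmf.items() if abs(x - mean) >= t) <= var / t ** 2

checks = all(markov_holds(t) and chebyshev_holds(t) for t in (1, 2, 3))
```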

Properties of the Sample Mean Let X1, ..., Xn be a random sample from a distribution with mean μ and variance σ². Let X̄n be the sample mean. Then E(X̄n) = μ and Var(X̄n) = σ²/n. By linearity, E(X̄n) = (1/n) Σ E(Xi) = μ; since X1, ..., Xn are independent, Var(X̄n) = (1/n²) Σ Var(Xi) = σ²/n.

Determining the Required Number of Observations Suppose that a random sample is to be taken from a distribution for which the value of the mean μ is not known, but the standard deviation σ is 2 units or less. We shall determine how large the sample size must be in order to make the probability at least 0.99 that |X̄n − μ| will be less than 1 unit. Since σ² ≤ 2² = 4, the Chebyshev inequality gives, for every sample size n, Pr(|X̄n − μ| ≥ 1) ≤ Var(X̄n)/1² = σ²/n ≤ 4/n. Requiring 4/n ≤ 0.01 yields n ≥ 400, so a sample of 400 observations is sufficient.
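The sample-size calculation above can be packaged as a small helper (a sketch of the Chebyshev bound, with the function name chosen here for illustration):

```python
import math

def required_n(sigma_max, eps, prob):
    """Smallest n with Chebyshev bound sigma_max**2 / (n * eps**2) <= 1 - prob."""
    return math.ceil(sigma_max ** 2 / (eps ** 2 * (1 - prob)))

# The example from the text: sigma <= 2, accuracy 1 unit, probability 0.99.
n_needed = required_n(sigma_max=2, eps=1, prob=0.99)
```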

Bernoulli Distribution The simplest type of experiment has only two possible outcomes, call them 0 and 1. If X equals the outcome from such an experiment, then X is a member of the family of Bernoulli distributions. If n independent random variables X1,..., Xn all have the same Bernoulli distribution, then their sum is equal to the number of the Xi ’s that equal 1, and the distribution of the sum is a member of the binomial family.

Bernoulli Distribution A random variable X has the Bernoulli distribution with parameter p (0 ≤ p ≤ 1) if X can take only the values 0 and 1 and the probabilities are Pr(X = 1) = p and Pr(X = 0) = 1 − p. The p.f. of X can be written as follows: f(x|p) = p^x (1 − p)^(1−x) for x = 0, 1, and f(x|p) = 0 otherwise. To verify that this p.f. f(x|p) actually does represent the Bernoulli distribution specified by the given probabilities, it is simply necessary to note that f(1|p) = p and f(0|p) = 1 − p.

Bernoulli Distribution Suppose that 10 percent of the items produced by a certain machine are defective and the parts are independent of each other. We will sample n items at random and inspect them. Let Xi = 1 if the ith item is defective, and let Xi = 0 if it is nondefective (i = 1,..., n). Then the variables X1,..., Xn form n Bernoulli trials with parameter p = 1/10.

Binomial Distribution Let X = X1 + ... + X10, which equals the number of defective parts among the 10 sampled parts. The distribution of X is the binomial distribution with parameters 10 and 1/10.

Binomial Distribution A random variable X has the binomial distribution with parameters n and p if X has a discrete distribution for which the p.f. is as follows: f(x|n, p) = C(n, x) p^x (1 − p)^(n−x) for x = 0, 1, ..., n, and f(x|n, p) = 0 otherwise. In this distribution, n must be a positive integer, and p must lie in the interval 0 ≤ p ≤ 1.
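A minimal sketch of this p.f. in Python, applied to the 10-part sampling example above:

```python
from math import comb

def binomial_pf(x, n, p):
    """Binomial p.f.: C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Number of defectives among 10 parts with p = 1/10:
p_no_defectives = binomial_pf(0, 10, 0.1)   # equals 0.9**10
```

The probabilities sum to 1 over x = 0, ..., n, as any p.f. must.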

Theorems If the random variables X1, ..., Xn form n Bernoulli trials with parameter p, and if X = X1 + ... + Xn, then X has the binomial distribution with parameters n and p. If X1, ..., Xk are independent random variables, and if Xi has the binomial distribution with parameters ni and p (i = 1, ..., k), then the sum X1 + ... + Xk has the binomial distribution with parameters n = n1 + ... + nk and p.

The Variance of a Binomial Distribution Since X1, ..., Xn are Bernoulli trials with parameter p, E(Xi) = p for i = 1, ..., n. Since Xi² = Xi for each i, E(Xi²) = E(Xi) = p, so Var(Xi) = E(Xi²) − [E(Xi)]² = p − p² = p(1 − p). Because the Xi's are independent, Var(X) = Σ Var(Xi) = np(1 − p).

Bernoulli and Binomial Distributions Every random variable that takes only the two values 0 and 1 must have a Bernoulli distribution. However, not every sum of Bernoulli random variables has a binomial distribution. There are two conditions needed: The Bernoulli random variables must be mutually independent, and they must all have the same parameter.

Sampling without Replacement Suppose that a box contains A red balls and B blue balls. Suppose also that n ≥ 0 balls are selected at random from the box without replacement, and let X denote the number of red balls that are obtained. For cases with n ≥ 1, Xi = 1 if the ith ball drawn is red and Xi = 0 if not. Then each Xi has a Bernoulli distribution, but X1,..., Xn are not independent in general.

Sampling without Replacement

The Hypergeometric Distributions The distribution of X has the p.f. f(x|A, B, n) = C(A, x) C(B, n − x) / C(A + B, n) for max{0, n − B} ≤ x ≤ min{n, A}, and f(x|A, B, n) = 0 otherwise. X has the hypergeometric distribution with parameters A, B, and n.
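The same p.f. as a short Python sketch, including the support restriction:

```python
from math import comb

def hypergeometric_pf(x, A, B, n):
    """Hypergeometric p.f.; zero outside max{0, n-B} <= x <= min{n, A}."""
    if x < max(0, n - B) or x > min(n, A):
        return 0.0
    return comb(A, x) * comb(B, n - x) / comb(A + B, n)

# Example: 5 red and 7 blue balls, draw 4 without replacement.
probs = [hypergeometric_pf(x, 5, 7, 4) for x in range(5)]
```

The probabilities sum to 1, and the mean works out to nA/(A + B) as derived in the next slides.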

The Mean and Variance for a Hypergeometric Distribution For i = 1, ..., n, E(Xi) = Pr(Xi = 1) = A/(A + B). Since X = X1 + ... + Xn, the mean of X is the sum of the means of the Xi's: E(X) = nA/(A + B).

The Mean and Variance for a Hypergeometric Distribution Because the Xi's are not independent, the variance of X involves the covariances: Var(X) = Σ Var(Xi) + 2 Σ_{i<j} Cov(Xi, Xj), with Var(Xi) = AB/T² and Cov(Xi, Xj) = −AB/[T²(T − 1)], where T = A + B. Plugging these values back into the previous equation gives Var(X) = nAB(T − n) / [T²(T − 1)].

Comparison of Sampling Methods If we had sampled with replacement, the number of red balls would have the binomial distribution with parameters n and A/(A + B). In that case, the mean number of red balls would still be nA/(A + B), but the variance would be different. To see how the variances from sampling with and without replacement are related, let T = A + B denote the total number of balls in the box, and let p = A/T denote the proportion of red balls in the box. Then Var(X) can be rewritten as Var(X) = np(1 − p) · (T − n)/(T − 1).

Comparison of Sampling Methods The factor (T − n)/(T − 1) is the finite population correction: it is less than 1 for n > 1, so sampling without replacement has smaller variance than sampling with replacement, and the two variances coincide when n = 1.
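The comparison can be sketched directly from the two variance formulas (function names here are illustrative):

```python
def var_without_replacement(A, B, n):
    """Hypergeometric variance: n p (1-p) (T-n)/(T-1) with T = A + B."""
    T = A + B
    p = A / T
    return n * p * (1 - p) * (T - n) / (T - 1)

def var_with_replacement(A, B, n):
    """Binomial variance: n p (1-p) with p = A/(A+B)."""
    p = A / (A + B)
    return n * p * (1 - p)
```

For n > 1 the without-replacement variance is strictly smaller; for n = 1 the two agree.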

The Poisson Distributions Many experiments consist of observing the occurrence times of random arrivals. Examples include arrivals of customers for service, arrivals of calls at a switchboard, occurrences of floods and other natural and man-made disasters, and so forth. The family of Poisson distributions is used to model the number of such arrivals that occur in a fixed time period.

The Poisson Distributions Let λ > 0. A random variable X has the Poisson distribution with mean λ if the p.f. of X is as follows: f(x|λ) = e^(−λ) λ^x / x! for x = 0, 1, 2, ..., and f(x|λ) = 0 otherwise.

Example Suppose that the average number of accidents occurring weekly on a highway equals 3. Calculate the probability that there is at least one accident this week. Let X denote the number of accidents occurring on the highway in question during this week. Because it is reasonable to suppose that there are a large number of cars passing along that highway, each having a small probability of being involved in an accident, the number of such accidents should be approximately Poisson distributed. Hence, Pr(X ≥ 1) = 1 − Pr(X = 0) = 1 − e^(−3) ≈ 0.9502.
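The Poisson p.f. and this computation as a minimal Python sketch:

```python
import math

def poisson_pf(x, lam):
    """Poisson p.f.: e^{-lam} lam^x / x!."""
    return math.exp(-lam) * lam ** x / math.factorial(x)

# Probability of at least one accident in a week with mean 3:
p_at_least_one = 1 - poisson_pf(0, 3)
```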

The Poisson Approximation to Binomial Distributions When the value of n is large and the value of p is close to 0, the binomial distribution with parameters n and p can be approximated by the Poisson distribution with mean np. Suppose that in a large population the proportion of people who have a certain disease is 0.01. Determine the probability that in a random group of 200 people at least four people will have the disease. In this example, we can assume that the exact distribution of the number of people having the disease among the 200 people in the random group is the binomial distribution with parameters n = 200 and p = 0.01. Therefore, this distribution can be approximated by the Poisson distribution for which the mean is λ = np = 2. If X denotes a random variable having this Poisson distribution, then it can be found from the table of the Poisson distribution at the end of the textbook that Pr(X ≥ 4) ≈ 0.1428. Hence, the probability that at least four people will have the disease is approximately 0.1428. The exact binomial value is approximately 0.1420.
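The quality of the approximation in this example can be checked by computing both tails directly (a sketch, not the textbook's table lookup):

```python
import math
from math import comb

# Compare the exact binomial Pr(X >= 4) with its Poisson approximation
# for n = 200, p = 0.01 (so lam = np = 2).
def binomial_pf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pf(x, lam):
    return math.exp(-lam) * lam ** x / math.factorial(x)

exact = 1 - sum(binomial_pf(x, 200, 0.01) for x in range(4))
approx = 1 - sum(poisson_pf(x, 2.0) for x in range(4))
```

The two values agree to about three decimal places, as expected for large n and small p.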

The Negative Binomial Distributions Suppose that a machine produces parts that can be either good or defective. Let Xi = 1 if the ith part is defective and Xi = 0 otherwise. Assume that the parts are good or defective independently of each other with Pr(Xi = 1) = p for all i. An inspector observes the parts produced by this machine until she sees four defectives. Let X be the number of good parts observed by the time that the fourth defective is observed. The distribution of X is a negative binomial distribution.

The Negative Binomial Distributions Suppose that an infinite sequence of Bernoulli trials, each with probability of success p, is available. The number X of failures that occur before the rth success has the following p.f.: f(x|r, p) = C(r + x − 1, x) p^r (1 − p)^x for x = 0, 1, 2, ..., and f(x|r, p) = 0 otherwise.

The Geometric Distributions The most common special case of a negative binomial random variable is one for which r = 1: the number of failures until the first success. A random variable X has the geometric distribution with parameter p (0 < p < 1) if X has a discrete distribution for which the p.f. f(x|1, p) is as follows: f(x|1, p) = p(1 − p)^x for x = 0, 1, 2, ..., and f(x|1, p) = 0 otherwise.
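Both p.f.'s, with the geometric as the r = 1 special case, can be sketched as:

```python
from math import comb

def negative_binomial_pf(x, r, p):
    """p.f. of the number of failures X before the r-th success."""
    return comb(r + x - 1, x) * p ** r * (1 - p) ** x

def geometric_pf(x, p):
    """Geometric distribution: the special case r = 1."""
    return negative_binomial_pf(x, 1, p)
```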

Theorem If X1, ..., Xr are i.i.d. random variables and if each Xi has the geometric distribution with parameter p, then the sum X1 + ... + Xr has the negative binomial distribution with parameters r and p.

Properties of Negative Binomial and Geometric Distributions Moment Generating Function: Let X1, ..., Xr be a random sample of r geometric random variables each with parameter p. It is known from elementary calculus that for every number α (0 < α < 1), Σ_{x=0}^∞ α^x = 1/(1 − α). Hence, the m.g.f. ψ1(t) of X1 is ψ1(t) = E(e^{tX1}) = Σ_{x=0}^∞ e^{tx} p(1 − p)^x = p / [1 − (1 − p)e^t] for t < log[1/(1 − p)]. Each of X1, ..., Xr has the same m.g.f., namely ψ1. Hence, the m.g.f. of X = X1 + ... + Xr is ψ(t) = [ψ1(t)]^r.

Mean and Variance Let X1 have the geometric distribution with parameter p. We can find the mean and variance by differentiating the m.g.f. of the geometric distribution: E(X1) = ψ1′(0) = (1 − p)/p and Var(X1) = ψ1″(0) − [ψ1′(0)]² = (1 − p)/p². If X has the negative binomial distribution with parameters r and p, represent it as the sum X = X1 + ... + Xr of r independent random variables, each having the same distribution as X1; then E(X) = r(1 − p)/p and Var(X) = r(1 − p)/p².
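These moments can be verified numerically by summing the geometric p.f. directly (a sketch; the truncation length is an assumption chosen so the neglected tail is negligible):

```python
# Mean and variance of the geometric distribution from its p.f. p(1-p)^x;
# expect (1-p)/p and (1-p)/p**2 respectively.
def geometric_moments(p, terms=2000):
    pf = [p * (1 - p) ** x for x in range(terms)]
    mean = sum(x * f for x, f in enumerate(pf))
    var = sum((x - mean) ** 2 * f for x, f in enumerate(pf))
    return mean, var

g_mean, g_var = geometric_moments(0.25)   # expect 3 and 12
```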