Bernoulli Processes

Stochastic Processes
A stochastic process is a collection of random variables:
– {X(t), t ∈ T}
The index t may be either discrete:
– E.g., number of trials
or continuous:
– E.g., time, location

Bernoulli Processes
A Bernoulli process is a collection of independent, identically distributed Bernoulli random variables X_1, X_2, …:
– P(X_i = 1) = p
– P(X_i = 0) = 1 – p
– E(X_i) = p
– Var(X_i) = p – p^2 = p (1 – p)
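
A minimal simulation sketch in Python (not from the slides; p = 0.3 is an assumed illustrative value) checking that the sample mean and variance of i.i.d. Bernoulli trials come out near p and p(1 – p):

    import random

    p = 0.3            # assumed success probability, for illustration only
    n = 100_000        # number of independent trials

    trials = [1 if random.random() < p else 0 for _ in range(n)]
    mean = sum(trials) / n
    var = sum((x - mean) ** 2 for x in trials) / n
    print(mean, var)   # should be close to p = 0.3 and p*(1 - p) = 0.21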

Number of Successes in n Trials
The number of successes K in n trials is Binomial(n, p):
– P(K = k) = C(n, k) p^k (1 – p)^(n–k), k = 0, 1, …, n
– E(K) = n p
– Var(K) = n p (1 – p)
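
A brief sketch verifying the Binomial(n, p) formulas above numerically (n = 10, p = 0.5 are illustrative choices):

    from math import comb

    def binomial_pmf(k, n, p):
        # P(K = k) = C(n, k) * p**k * (1 - p)**(n - k)
        return comb(n, k) * p ** k * (1 - p) ** (n - k)

    n, p = 10, 0.5
    mean = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
    print(mean)        # equals n * p = 5.0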

Number of Trials until a Success
The number of trials N between two arrivals (the inter-arrival time) is Geometric(p):
– P(N = 1) = p
– P(N = 2) = p (1 – p), …
– P(N = n) = p (1 – p)^(n–1), n = 1, 2, …
– E(N) = 1/p
– Var(N) = (1 – p)/p^2
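
A numerical check of the geometric mean, truncating the infinite sum (p = 0.25 is an assumed value):

    def geometric_pmf(n, p):
        # P(N = n) = p * (1 - p)**(n - 1), n = 1, 2, ...
        return p * (1 - p) ** (n - 1)

    p = 0.25
    mean = sum(n * geometric_pmf(n, p) for n in range(1, 2000))   # truncated sum
    print(mean)        # close to 1/p = 4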

Number of Trials until a Success
The number of remaining trials until the next arrival is also Geometric(p):
– Regardless of how many trials have already elapsed!
The process is "memoryless".
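
A small sketch of the memoryless property, using the geometric tail P(N > n) = (1 – p)^n (p = 0.25 is an assumed value): the conditional tail given N > m matches the unconditional tail.

    p = 0.25

    def tail(n):
        # P(N > n) for Geometric(p): the first n trials are all failures
        return (1 - p) ** n

    m, k = 3, 5
    print(tail(m + k) / tail(m))   # P(N > m + k | N > m)
    print(tail(k))                 # same value: (1 - p)**5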

Number of Trials until the r-th Success
The number of trials N_r until the r-th success (the r-th-order inter-arrival time):
– Pascal distribution (Drake), or "negative binomial" distribution (Ross)
– P(r-th success on n-th trial) = P(r–1 successes in n–1 trials) × P(success on n-th trial)
  = C(n–1, r–1) p^r (1 – p)^(n–r), n = r, r + 1, …
– E(N_r) = r/p
– Var(N_r) = r (1 – p)/p^2
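
A short check of the Pascal formulas above (r = 3, p = 0.5 are illustrative values; the infinite sum is truncated):

    from math import comb

    def pascal_pmf(n, r, p):
        # P(r-th success on trial n) = C(n-1, r-1) * p**r * (1 - p)**(n - r)
        return comb(n - 1, r - 1) * p ** r * (1 - p) ** (n - r)

    r, p = 3, 0.5
    mean = sum(n * pascal_pmf(n, r, p) for n in range(r, 500))
    print(mean)        # close to r/p = 6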

# of Failures until the r-th Success
The number of failures K_r before the r-th success:
– Negative Binomial distribution (Drake)
– P(k failures before the r-th success) = P(k failures in the first k+r–1 trials) × P(success on trial k+r)
  = C(k+r–1, k) p^r (1 – p)^k, k = 0, 1, 2, …
– E(K_r) = r (1/p – 1) = r (1 – p)/p
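
The same kind of sketch for the failure-counting version (again with the illustrative r = 3, p = 0.5; note that K_r = N_r – r):

    from math import comb

    def negbin_pmf(k, r, p):
        # P(k failures before the r-th success) = C(k+r-1, k) * p**r * (1 - p)**k
        return comb(k + r - 1, k) * p ** r * (1 - p) ** k

    r, p = 3, 0.5
    mean = sum(k * negbin_pmf(k, r, p) for k in range(500))
    print(mean)        # close to r * (1 - p) / p = 3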

Sample Applications
– Number of customers wanting a particular service (Binomial)
– Number of defects out of n products (Binomial)
– Number of wasted items to complete an order (Negative Binomial)
– Number of customers until we run out (Pascal)
– Number of visits until a successful call (Geometric)
– Number of uses until an item fails (Geometric):
  – If it is memoryless!

Random Erasures
Consider a Bernoulli process with success probability p_1.
Now assume that each success is independently "erased" (turned into a failure) with probability p_2.
The resulting sequence of successes and failures is a Bernoulli process with success probability p_1 (1 – p_2).
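
A simulation sketch of this erasure (thinning) result, with assumed values p_1 = 0.6 and p_2 = 0.25:

    import random

    p1, p2 = 0.6, 0.25
    n = 200_000

    def erased_trial():
        success = random.random() < p1
        if success and random.random() < p2:   # each success erased with prob p2
            success = False
        return success

    rate = sum(erased_trial() for _ in range(n)) / n
    print(rate)        # close to p1 * (1 - p2) = 0.45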

Systematic Erasures
Consider a Bernoulli process with success probability p_1.
Now assume that every 2nd success is "erased" (turned into a failure).
Is the resulting sequence of successes and failures a Bernoulli process?
– Think about the assumptions of a Bernoulli process!
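
A simulation sketch (p_1 = 0.5 assumed) suggesting the answer: after every 2nd success is erased, two retained successes can never land on consecutive trials, so the trials are no longer independent and the thinned sequence is not a Bernoulli process.

    import random

    p1 = 0.5
    n = 200_000

    raw = [random.random() < p1 for _ in range(n)]
    thinned = []
    successes_seen = 0
    for s in raw:
        if s:
            successes_seen += 1
            thinned.append(successes_seen % 2 == 1)   # every 2nd success erased
        else:
            thinned.append(False)

    q = sum(thinned) / n     # marginal success rate, about p1/2
    adjacent = sum(1 for i in range(n - 1) if thinned[i] and thinned[i + 1]) / (n - 1)
    print(q * q, adjacent)   # independence would give about q*q; simulation gives 0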

Example
Fred is giving out samples of dog food:
– He makes calls door to door
– He leaves a sample when the door is answered and a dog is there
– P(door answered) = 3/4
– P(dog there) = 2/3
– Assume that "door answered" and "dog there" are independent (?)
– Assume also that successive calls are independent (?)

Example
By independence:
– P(door answered) = 3/4
– P(dog there) = 2/3
– P(door answered and dog there) = (3/4)(2/3) = 1/2

Example
Probability Fred gives out the 1st sample on the 3rd call:
– P(no samples on the 1st 2 calls) × P(sample on the 3rd call) = (1 – 1/2)^2 (1/2) = (1/4)(1/2) = 1/8
Probability Fred gives out the 5th sample on the 11th call, given that he gave out 4 samples in the 1st 8 calls:
– P[Fred gives out the (5–4)th sample on the (11–8)th call] = (1 – 1/2)^2 (1/2) = (1/4)(1/2) = 1/8
Geometric distribution!
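
A two-line check of these geometric probabilities (p = 1/2 from the previous slide):

    p = 0.5
    print(p * (1 - p) ** 2)   # 1st sample on the 3rd call: 1/8
    # By memorylessness, the 5th sample on the 11th call given 4 samples in the
    # first 8 calls is again a first success on the 3rd remaining call: also 1/8.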

Example
Probability Fred gives out the 2nd sample on the 5th call:
– P(1 success in the 1st 4 trials) × P(success on the 5th call)
  = C(4, 1) (1/2) (1 – 1/2)^3 × (1/2) = 4 (1/2)^5 = 1/8
– Pascal distribution
Probability Fred gives out the 2nd sample on the 5th call, given that he did not give out the 2nd sample on the 2nd call:
– P(2nd sample on 5th call)/P(2nd sample not on 2nd call) = [1/8]/[1 – (1/2)^2] = (1/8)/(3/4) = 1/6
– Conditional probability!
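
The same two numbers, computed in a short sketch (p = 1/2 as above):

    from math import comb

    p = 0.5
    p_2nd_on_5th = comb(4, 1) * p ** 2 * (1 - p) ** 3    # Pascal: 4 * (1/2)**5 = 1/8
    p_not_2nd_on_2nd = 1 - p ** 2                        # 1 - 1/4 = 3/4
    print(p_2nd_on_5th)                                  # 0.125
    print(p_2nd_on_5th / p_not_2nd_on_2nd)               # 1/6, about 0.1667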

Example
Fred needs a new supply after he gives out the last can:
– He starts with two cans
Probability that Fred makes ≥ 5 calls before he needs a new supply:
– Probability that Fred gives out the 2nd sample on or after the 5th call
  = 1 – P(Fred gives out the 2nd sample on the 2nd, 3rd, or 4th call)
  = 1 – (1/2)^2 – C(2,1)(1/2)^2(1/2) – C(3,1)(1/2)^2(1/2)^2
  = 1 – 1/4 – 1/4 – 3/16 = 5/16
– Each individual term is a Pascal probability!
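
A final sketch reusing the Pascal PMF to confirm the 5/16 result:

    from math import comb

    p = 0.5

    def pascal_pmf(n, r):
        # P(r-th success on trial n) with success probability p
        return comb(n - 1, r - 1) * p ** r * (1 - p) ** (n - r)

    prob = 1 - sum(pascal_pmf(n, 2) for n in range(2, 5))
    print(prob)        # 5/16 = 0.3125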