Flipping an unfair coin three times Consider an unfair coin with P(H) = 1/3 and P(T) = 2/3. If we flip this coin three times, the sample space S is the following set of ordered triples: S = {HHH, HHT, HTH, THH, TTH, THT, HTT, TTT}. How should we assign probabilities to each of the points of S? Using independence, we assign the probability value $(1/3)(2/3)(2/3) = (1/3)(2/3)^2$ to the point HTT. The same value should be assigned to any point with 1 H and 2 Ts, and there are $\binom{3}{1} = 3$ such points. Thus, P(1 H and 2 Ts, in any order) $= 3(1/3)(2/3)^2 = 4/9$. More generally, P(k Hs and 3 − k Ts, in any order) $= \binom{3}{k}(1/3)^k(2/3)^{3-k}$.
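The counting argument above can be checked by brute force. A minimal Python sketch (the helper names are illustrative, not from the slides): enumerate all eight ordered triples, sum the probabilities of those with exactly k heads, and compare with the closed form $\binom{3}{k}(1/3)^k(2/3)^{3-k}$.

```python
from itertools import product
from math import comb

p_h, p_t = 1/3, 2/3

def prob_k_heads_enumerated(k):
    """Sum the probabilities of all ordered triples with exactly k heads."""
    total = 0.0
    for outcome in product("HT", repeat=3):
        if outcome.count("H") == k:
            prob = 1.0
            for side in outcome:
                prob *= p_h if side == "H" else p_t
            total += prob
    return total

def prob_k_heads_formula(k):
    """Closed form: C(3, k) * (1/3)^k * (2/3)^(3-k)."""
    return comb(3, k) * p_h**k * p_t**(3 - k)

for k in range(4):
    print(k, prob_k_heads_enumerated(k), prob_k_heads_formula(k))
```

Both columns agree; for example, k = 1 gives 4/9 ≈ 0.444.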

Bernoulli trials If we have a two-outcome experiment that can be repeated in such a way that the outcome of one experiment does not affect the outcome of subsequent experiments, we call each repetition a Bernoulli trial (and the indicator of its outcome a Bernoulli random variable). We call the two outcomes of a Bernoulli trial “success” and “failure”. We suppose that the probability of “success” is p and the probability of “failure” is q, where p and q are positive and p + q = 1. If n Bernoulli trials are carried out, then probabilities can be assigned in the fashion previously used for the unfair coin. This results in: P(k “successes” and n − k “failures”, in any order) $= \binom{n}{k} p^k q^{n-k}$.

Binomial random variable The random variable X whose probability mass function is given by $p(i) = \binom{n}{i} p^i (1-p)^{n-i}$, $i = 0, 1, \ldots, n$, is said to be a binomial random variable with parameters n and p. A binomial random variable gives the number of “successes” that occur when n independent trials, each of which results in a “success” with probability p, are performed. Example. Let X be the number of girls born to a family with 5 children. X is a binomial r. v. with n = 5, p = 0.5. Theorem. For a binomial r. v. X with parameters n and p, $E[X] = np$ and $\mathrm{Var}(X) = np(1-p)$.
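A short sketch of the binomial pmf with a numerical check of the theorem, using the slide’s example (n = 5, p = 0.5); the function name is made up for illustration:

```python
from math import comb

def binomial_pmf(i, n, p):
    """P(X = i) for a binomial random variable with parameters n, p."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

n, p = 5, 0.5  # number of girls among 5 children
pmf = [binomial_pmf(i, n, p) for i in range(n + 1)]
mean = sum(i * pmf[i] for i in range(n + 1))
var = sum((i - mean)**2 * pmf[i] for i in range(n + 1))
print(mean, n * p)           # both 2.5: E[X] = np
print(var, n * p * (1 - p))  # both 1.25: Var(X) = np(1-p)
```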

Expected value for a binomial random variable: parameters n, p
$$E[X] = \sum_{i=0}^{n} i \binom{n}{i} p^i (1-p)^{n-i} = \sum_{i=1}^{n} n \binom{n-1}{i-1} p^i (1-p)^{n-i} = np \sum_{j=0}^{n-1} \binom{n-1}{j} p^j (1-p)^{(n-1)-j} = np.$$
The second equality uses the identity $i\binom{n}{i} = n\binom{n-1}{i-1}$, and the third substitutes j = i − 1. The last summation equals 1, by the binomial theorem. Make sure you can justify all the steps shown above.
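The two identities used in the derivation can be spot-checked numerically; a sketch (the choice n = 12, p = 0.3 is arbitrary):

```python
from math import comb

n, p = 12, 0.3

# The identity i * C(n, i) == n * C(n-1, i-1) behind the second equality.
assert all(i * comb(n, i) == n * comb(n - 1, i - 1) for i in range(1, n + 1))

# The final sum is (p + (1-p))^(n-1) = 1 by the binomial theorem.
print(sum(comb(n - 1, j) * p**j * (1 - p)**(n - 1 - j) for j in range(n)))  # 1.0
```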

The maximum value of p(i) for a binomial r.v. Let [t] denote the greatest integer less than or equal to t. Theorem. For a binomial random variable with parameters n and p, 0 < p < 1, the maximum value of the probability mass function p(i) occurs when $i = [(n+1)p]$. Example. Let n = 10, p = 0.5. Then the maximum of the probability mass function occurs at [11(0.5)] = 5. Example. Let n = 11, p = 0.5. Then the maximum of the probability mass function occurs at [12(0.5)] = 6. By symmetry, the maximum also occurs at 11 − 6 = 5.
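A quick empirical check of the theorem (an illustrative sketch): compute the full pmf, take the argmax, and compare with [(n + 1)p].

```python
from math import comb, floor

def binomial_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p)**(n - i)

for n, p in [(10, 0.5), (11, 0.5), (20, 0.3)]:
    pmf = [binomial_pmf(i, n, p) for i in range(n + 1)]
    argmax = max(range(n + 1), key=lambda i: pmf[i])
    print(n, p, argmax, floor((n + 1) * p))
# For n = 11, p = 0.5 the pmf ties at i = 5 and i = 6 (the symmetry
# noted on the slide), so argmax reports the smaller of the two.
```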

Error detection using a parity bit ASCII code uses 7 bits and 1 parity bit. If an odd number of bits are flipped in transmission, the parity will be wrong and the error will be detected. If an even number of bits are flipped, however, the error will not be detected. Assume the probability of an error in transmission is 0.01, both for a 1 changing to 0 and for a 0 changing to 1, and that this probability is the same regardless of the location of the error. We let a “success” be flipping a bit and a “failure” be flipping no bit. The parity-checking situation is then modeled as 8 Bernoulli trials with p = 0.01. We have P(exactly one error) $= \binom{8}{1}(0.01)(0.99)^7 \approx 0.0746$ and P(exactly two errors) $= \binom{8}{2}(0.01)^2(0.99)^6 \approx 0.0026$, which is quite small; the probabilities of four, six, and eight errors are even smaller. We conclude that the probability of an error going undetected by the parity method is small.
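The parity computation as a sketch (bit flips modeled as 8 independent Bernoulli trials with p = 0.01, as on the slide):

```python
from math import comb

p, n = 0.01, 8

def flips(k):
    """P(exactly k of the 8 transmitted bits are flipped)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(flips(1))  # ~0.0746: one flip, caught by the parity check
print(flips(2))  # ~0.0026: two flips, NOT caught
undetected = sum(flips(k) for k in range(2, n + 1, 2))
print(undetected)  # ~0.0027: total probability of an undetected error
```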

Poisson random variable The random variable X whose probability mass function is given by $p(i) = e^{-\lambda}\dfrac{\lambda^i}{i!}$, $i = 0, 1, 2, \ldots$, is said to be a Poisson random variable with parameter $\lambda > 0$, and $E[X] = \mathrm{Var}(X) = \lambda$. A Poisson r. v. may be used as an approximation for a binomial r. v. with parameters (n, p) provided n is large and p is small enough that np has a moderate size (that is, $np = \lambda$ for some constant $\lambda$). The Poisson approximation for a binomial is generally good if p is small and n is large; if np > 10, use the normal approximation in Chapter 7 of the textbook. Example. The number of misprints on a page in a book is Poisson.
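A sketch comparing the two pmfs for large n and small p (the values n = 1000, p = 0.003 are an arbitrary choice; λ = np = 3):

```python
from math import comb, exp, factorial

def binomial_pmf(i, n, p):
    return comb(n, i) * p**i * (1 - p)**(n - i)

def poisson_pmf(i, lam):
    return exp(-lam) * lam**i / factorial(i)

n, p = 1000, 0.003
lam = n * p
for i in range(6):
    print(i, binomial_pmf(i, n, p), poisson_pmf(i, lam))  # nearly equal
```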

Example of Poisson Random Variable Problem. Suppose that, on average, in every three pages of a book there is one typographical error. If the number of typographical errors on a single page is approximately a Poisson random variable, what is the probability of at least one error on a specific page of the book? Solution. Let X be the number of errors on the page we are interested in. Then X is Poisson with $E(X) = \lambda = 1/3$, and $P(X = 0) = e^{-1/3}$. Therefore, $P(X \geq 1) = 1 - e^{-1/3} \approx 0.28$.
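The computation in one line (a sketch):

```python
from math import exp

lam = 1/3             # expected errors per page
print(1 - exp(-lam))  # P(X >= 1) ~ 0.2835
```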

Poisson Processes Suppose that in the interval of time [0, t], we have a number of random events N(t) occurring. We note that for each t, N(t) is a discrete r. v. with values in the nonnegative integers. We make the following assumptions for a Poisson process: Stationarity: the probability of n events in a time interval depends only on the length of the interval. Independent increments: the numbers of events in nonoverlapping intervals are independent. Orderliness: events occur one at a time; formally, $P(N(h) \geq 2)/h \to 0$ as $h \to 0$.

Theorem on existence of Poisson Process Suppose that stationarity, independent increments, and orderliness hold, N(0) = 0, and $0 < P(N(t) > 0) < 1$ for all t > 0. Then there exists a positive number $\lambda$ such that $P(N(t) = n) = e^{-\lambda t}\dfrac{(\lambda t)^n}{n!}$, $n = 0, 1, 2, \ldots$ That is, for all t > 0, N(t) is a Poisson r. v. with parameter $\lambda t$. Hence, $E[N(t)] = \lambda t$ and therefore $\lambda = E[N(1)]$. We say that the process described in the theorem is a Poisson process with rate $\lambda$. Exercise. For a Poisson process, suppose $\lambda = 3$. Evaluate $P(N(t) = n)$ for values of t and n of your choosing.

Probability that a car passes your house Suppose that you check the traffic on the street in front of your house every day after lunch, and you find that about five vehicles pass by each hour. Tomorrow after lunch you sit on a chair in front of your house at 1 pm. What is the probability that at least one vehicle passes in the next 15 minutes? Solution. Use a Poisson process with $\lambda = 5$ and t in hours. Then N(1/4) is Poisson with parameter $\lambda t = 5/4$, so the probability that at least one vehicle passes is $1 - P(N(1/4) = 0) = 1 - e^{-5/4} \approx 0.7135$.
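The same computation as a sketch:

```python
from math import exp

rate, t = 5.0, 0.25   # 5 vehicles/hour, 15 minutes = 1/4 hour
lam = rate * t        # N(t) is Poisson with parameter rate * t
print(1 - exp(-lam))  # P(at least one vehicle) ~ 0.7135
```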

Fishing example A fisherman catches fish at a Poisson rate of 2 per hour. Yesterday, he started fishing at 10 am and had caught 1 fish by 10:30, and a total of 3 by noon. What is the probability that he can do this again tomorrow? Let the fishing tomorrow start at t = 0 (in hours), so that we want $P(N(1/2) = 1 \text{ and } N(2) = 3)$. By independent increments, $P(N(1/2) = 1,\ N(2) - N(1/2) = 2) = P(N(1/2) = 1)\,P(N(2) - N(1/2) = 2) = \left(e^{-1}\dfrac{1^1}{1!}\right)\left(e^{-3}\dfrac{3^2}{2!}\right) = \dfrac{9}{2}e^{-4} \approx 0.082.$ Make sure you can justify these steps.
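A sketch of the product of the two increment probabilities:

```python
from math import exp, factorial

def poisson_pmf(n, lam):
    return exp(-lam) * lam**n / factorial(n)

rate = 2.0  # fish per hour; t = 0 at 10 am

# Independent increments: P(N(0.5) = 1) * P(N(2) - N(0.5) = 2)
p = poisson_pmf(1, rate * 0.5) * poisson_pmf(2, rate * 1.5)
print(p)  # (9/2) * e^-4 ~ 0.0824
```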

Geometric random variable The random variable X whose probability mass function is given by $p(n) = (1-p)^{n-1}p$, $n = 1, 2, 3, \ldots$, is said to be a geometric random variable with parameter p. Such a random variable represents the trial number of the first success when each trial is independently a success with probability p. Its mean and variance are given by $E[X] = 1/p$ and $\mathrm{Var}(X) = (1-p)/p^2$. Example. Draw a ball, with replacement, from an urn containing N white and M black balls. The number of draws required until a black ball is selected is a geometric random variable. What is p in this case?
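A sketch that checks E[X] = 1/p by simulation (p = 0.25 is an arbitrary choice):

```python
import random

p = 0.25

def sample_geometric():
    """Trial number of the first success in independent Bernoulli trials."""
    n = 1
    while random.random() >= p:  # fail with probability 1 - p
        n += 1
    return n

draws = [sample_geometric() for _ in range(100_000)]
print(sum(draws) / len(draws), 1 / p)  # both ~4
```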

Memoryless property of a geometric random variable A discrete random variable X with values {1, 2, 3, …} is called memoryless in case, for all positive integers m and n, $P(X > m + n \mid X > m) = P(X > n)$. Theorem. A geometric random variable is memoryless. Proof. For a geometric random variable, $P(X > n) = (1-p)^n$. Hence $P(X > m + n \mid X > m) = \dfrac{P(X > m + n)}{P(X > m)} = \dfrac{(1-p)^{m+n}}{(1-p)^m} = (1-p)^n = P(X > n).$ Interpretation of theorem: In successive independent Bernoulli trials, the probability that the next n outcomes are all failures does not change if we are given that the previous m successive outcomes were all failures.
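The proof can be mirrored numerically (a sketch with arbitrary p, m, n):

```python
p = 0.3

def tail(n):
    """P(X > n) = (1-p)^n for a geometric random variable."""
    return (1 - p)**n

m, n = 4, 7
print(tail(m + n) / tail(m), tail(n))  # equal: the conditioning drops out
```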

Negative binomial random variable The random variable X whose probability mass function is given by $P(X = n) = \binom{n-1}{r-1} p^r (1-p)^{n-r}$, $n = r, r+1, r+2, \ldots$, is said to be a negative binomial random variable with parameters r and p. Such a random variable represents the trial number of the r-th success when each trial is independently a success with probability p. Its mean and variance are given by $E[X] = r/p$ and $\mathrm{Var}(X) = r(1-p)/p^2$. Example. Let X be the number of times one must throw a die until the outcome 1 has occurred 4 times. Then X is a negative binomial random variable with parameters r = 4 and p = 1/6.
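A sketch of the pmf with a check of the mean for the die example (the sum is truncated at 2000 trials, far beyond where the pmf is negligible):

```python
from math import comb

def neg_binomial_pmf(n, r, p):
    """P(the r-th success occurs on trial n)."""
    return comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)

r, p = 4, 1/6  # four 1's when throwing a die
mean = sum(n * neg_binomial_pmf(n, r, p) for n in range(r, 2000))
print(mean, r / p)  # both ~24: E[X] = r/p
```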

Hypergeometric random variable The random variable X whose probability mass function is given by $P(X = i) = \dfrac{\binom{m}{i}\binom{N-m}{n-i}}{\binom{N}{n}}$, $i = 0, 1, \ldots, n$, is said to be a hypergeometric random variable with parameters n, N, and m. Note: we assume n ≤ min(m, N − m), so that every value i = 0, 1, …, n has positive probability. Such a random variable represents the number of white balls selected when n balls are randomly chosen (without replacement) from an urn that contains N balls, of which m are white. With p = m/N, its mean and variance are $E[X] = np$ and $\mathrm{Var}(X) = np(1-p)\left(1 - \dfrac{n-1}{N-1}\right)$. Problem. Suppose N = 10, n = 4, and m = 5. Let X be the number of white balls. What is P(X = 4)?
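A sketch answering the slide’s problem:

```python
from math import comb

def hypergeometric_pmf(i, n, N, m):
    """P(X = i): i white balls among n drawn without replacement
    from N balls of which m are white."""
    return comb(m, i) * comb(N - m, n - i) / comb(N, n)

print(hypergeometric_pmf(4, 4, 10, 5))  # C(5,4)*C(5,0)/C(10,4) = 5/210 ~ 0.0238
```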