The Bernoulli distribution


Discrete distributions: The Bernoulli distribution. X = 1 (success) with probability p and X = 0 (failure) with probability 1 − p, i.e. p(x) = p^x (1 − p)^(1 − x) for x = 0, 1.

The Binomial distribution. X = the number of successes in n repetitions of a Bernoulli trial, p = the probability of success. P[X = x] = p(x) = C(n, x) p^x (1 − p)^(n − x), x = 0, 1, …, n.
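As a minimal sketch (standard library only), the binomial pmf can be computed directly from C(n, x) p^x (1 − p)^(n − x); the parameters n = 4 and p = 0.5 are illustrative:

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P[X = x] for X ~ Binomial(n, p): C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# Example: 4 Bernoulli trials, each with success probability 0.5
probs = [binomial_pmf(x, n=4, p=0.5) for x in range(5)]
print(probs)       # [0.0625, 0.25, 0.375, 0.25, 0.0625]
print(sum(probs))  # 1.0 (the pmf sums to 1)
```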

The Poisson distribution. Events are occurring randomly and uniformly in time. X = the number of events occurring in a fixed period of time. P[X = x] = p(x) = (λ^x / x!) e^(−λ), x = 0, 1, 2, …, where λ = the mean number of events in the period.
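The Poisson pmf p(x) = (λ^x / x!) e^(−λ) can be checked numerically; the rate λ = 2 here is an illustrative choice:

```python
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    """P[X = x] = (lam^x / x!) * e^(-lam) for X ~ Poisson(lam)."""
    return lam**x / factorial(x) * exp(-lam)

# Mean rate of 2 events per period; summing a long prefix of the pmf
probs = [poisson_pmf(x, lam=2.0) for x in range(20)]
print(round(sum(probs), 6))  # 1.0 (the tail beyond x = 19 is negligible)
```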

The Geometric distribution: the Bernoulli trials are repeated independently until the first success occurs, and X = the trial on which the 1st success occurred. P[X = x] = p(x) = p(1 − p)^(x − 1) = pq^(x − 1), x = 1, 2, …. The Negative Binomial distribution: the Bernoulli trials are repeated independently until a fixed number, k, of successes has occurred, and X = the trial on which the kth success occurred. P[X = x] = p(x) = C(x − 1, k − 1) p^k (1 − p)^(x − k), x = k, k + 1, …. Geometric ≡ Negative Binomial with k = 1.
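A short sketch confirming that the geometric pmf p(1 − p)^(x − 1) is the k = 1 case of the negative binomial pmf C(x − 1, k − 1) p^k (1 − p)^(x − k); p = 0.3 is an illustrative value:

```python
from math import comb

def geometric_pmf(x: int, p: float) -> float:
    """P[X = x] = p * (1 - p)^(x - 1): first success on trial x."""
    return p * (1 - p) ** (x - 1)

def neg_binomial_pmf(x: int, k: int, p: float) -> float:
    """P[X = x] = C(x - 1, k - 1) * p^k * (1 - p)^(x - k): k-th success on trial x."""
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

# The geometric distribution is the negative binomial with k = 1
for x in range(1, 6):
    assert geometric_pmf(x, 0.3) == neg_binomial_pmf(x, k=1, p=0.3)
print(geometric_pmf(1, 0.3))  # 0.3: success on the very first trial
```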

The Hypergeometric distribution. Suppose we have a population containing N objects, partitioned into two groups. a = the number of elements in group A, b = the number of elements in the other group (group B). Note N = a + b. n elements are selected from the population at random without replacement. X = the number of elements in the sample from group A. (n − X will be the number of elements from group B.)

Example: Hyper-geometric distribution. Suppose that N = 10 automobiles have just come off the production line. Also assume that a = 3 are defective (have serious defects). Thus b = 7 are defect-free. A sample of n = 4 are selected and tested to see if they are defective. Let X = the number in the sample that are defective. Find the probability function of X. From the above discussion X will have a hyper-geometric distribution, i.e. p(x) = C(3, x) C(7, 4 − x) / C(10, 4), x = 0, 1, 2, 3.

Table and Graph of p(x)
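The table of p(x) for the example above (N = 10, a = 3, n = 4) can be reproduced with a short stdlib-only sketch:

```python
from math import comb

def hypergeom_pmf(x: int, N: int, a: int, n: int) -> float:
    """P[X = x] = C(a, x) * C(N - a, n - x) / C(N, n)."""
    return comb(a, x) * comb(N - a, n - x) / comb(N, n)

# N = 10 automobiles, a = 3 defective, sample of n = 4
for x in range(0, 4):
    print(x, round(hypergeom_pmf(x, N=10, a=3, n=4), 4))
# 0 0.1667  (= 1/6)
# 1 0.5     (= 1/2)
# 2 0.3     (= 3/10)
# 3 0.0333  (= 1/30)
```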

Sampling with and without replacement. Suppose we have a population containing N objects, partitioned into two groups. Let a = the number of elements in group A and let b = the number of elements in the other group (group B). Note N = a + b. Now suppose that n elements are selected from the population at random. Let X denote the number of elements in the sample from group A. (n − X will be the number of elements from group B.) Find the probability distribution of X: 1. if the sampling was done with replacement; 2. if the sampling was done without replacement.

Solution: If the sampling was done with replacement, then the distribution of X is the Binomial distribution with parameters n and p = a/N: p(x) = C(n, x) (a/N)^x (b/N)^(n − x), x = 0, 1, …, n.

If the sampling was done without replacement, then the distribution of X is the hyper-geometric distribution: p(x) = C(a, x) C(b, n − x) / C(N, n).

Note: for large values of N, a and b (with n fixed), C(a, x) C(b, n − x) / C(N, n) ≈ C(n, x) (a/N)^x (b/N)^(n − x). Thus for large values of N, a and b, sampling with replacement is equivalent to sampling without replacement.
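A numerical sketch of this convergence: hold the sample size n and the proportion a/N fixed, let N grow, and compare the hypergeometric pmf with the binomial pmf (the values n = 5 and a/N = 0.3 are illustrative):

```python
from math import comb

def hypergeom_pmf(x, N, a, n):
    """Sampling without replacement: C(a, x) * C(N - a, n - x) / C(N, n)."""
    return comb(a, x) * comb(N - a, n - x) / comb(N, n)

def binomial_pmf(x, n, p):
    """Sampling with replacement: C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# Fix n and the proportion a/N, let the population size N grow
n, frac = 5, 0.3
for N in (20, 200, 2000):
    a = int(frac * N)
    gap = max(abs(hypergeom_pmf(x, N, a, n) - binomial_pmf(x, n, a / N))
              for x in range(n + 1))
    print(N, round(gap, 5))  # the largest pmf gap shrinks as N grows
```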

Continuous Distributions

Continuous random variables. For a continuous random variable X the probability distribution is described by the probability density function f(x), which has the following properties: 1. f(x) ≥ 0; 2. the total area under f(x), i.e. ∫ f(x) dx over (−∞, ∞), is 1; 3. P[a ≤ X ≤ b] = the area under f(x) from a to b.

Graph: Continuous Random Variable probability density function, f(x)

The Uniform distribution from a to b

Definition: A random variable, X, is said to have a Uniform distribution from a to b if X is a continuous random variable with probability density function f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise.

Graph: the Uniform Distribution (from a to b)

The Cumulative Distribution function, F(x) = P[X ≤ x] (Uniform Distribution from a to b): F(x) = 0 for x < a, F(x) = (x − a)/(b − a) for a ≤ x ≤ b, and F(x) = 1 for x > b.

Cumulative Distribution function, F(x)
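A minimal sketch of the piecewise uniform CDF F(x) = 0 for x < a, (x − a)/(b − a) for a ≤ x ≤ b, and 1 for x > b (the interval [0, 10] is illustrative):

```python
def uniform_cdf(x: float, a: float, b: float) -> float:
    """F(x) = P[X <= x] for X ~ Uniform(a, b)."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

print(uniform_cdf(5.0, a=0.0, b=10.0))   # 0.5: halfway between a and b
print(uniform_cdf(-1.0, a=0.0, b=10.0))  # 0.0: below the support
print(uniform_cdf(12.0, a=0.0, b=10.0))  # 1.0: above the support
```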

The Normal distribution

Definition: A random variable, X, is said to have a Normal distribution with mean μ and standard deviation σ if X is a continuous random variable with probability density function f(x) = (1/(σ√(2π))) e^(−(x − μ)^2/(2σ^2)), −∞ < x < ∞.

Graph: the Normal Distribution (mean μ, standard deviation σ)

Note: f′(x) = −((x − μ)/σ^2) f(x), which is zero only at x = μ. Thus the point μ is an extremum point of f(x). (In this case a maximum.)

Similarly f″(x) = 0 at x = μ − σ and x = μ + σ. Thus the points μ − σ, μ + σ are points of inflection of f(x).

Also ∫ f(x) dx = 1 (integrating over −∞ < x < ∞). Proof: To evaluate ∫ (1/(σ√(2π))) e^(−(x − μ)^2/(2σ^2)) dx, make the substitution z = (x − μ)/σ. Then dx = σ dz and the integral becomes (1/√(2π)) ∫ e^(−z^2/2) dz.

Consider evaluating I = ∫ e^(−z^2/2) dz. Note: I^2 = ∫∫ e^(−(z^2 + u^2)/2) dz du. Make the change to polar coordinates (R, θ): z = R sin(θ) and u = R cos(θ).

Hence z^2 + u^2 = R^2 and dz du = R dR dθ, so I^2 = ∫ (θ from 0 to 2π) ∫ (R from 0 to ∞) e^(−R^2/2) R dR dθ = 2π, and therefore I = √(2π).

Using this, ∫ f(x) dx = (1/√(2π)) √(2π) = 1.
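The conclusion that the normal density integrates to 1 can be checked numerically with a simple trapezoid rule; the standard normal (μ = 0, σ = 1) and the truncation range [−10, 10] are illustrative choices (the tails beyond are negligible):

```python
from math import exp, pi, sqrt

def normal_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Normal density: (1 / (sigma * sqrt(2*pi))) * e^(-(x - mu)^2 / (2*sigma^2))."""
    return exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))

# Trapezoid rule over [-10, 10] with a fine grid
lo, hi, steps = -10.0, 10.0, 20000
h = (hi - lo) / steps
total = 0.5 * (normal_pdf(lo) + normal_pdf(hi))
total += sum(normal_pdf(lo + i * h) for i in range(1, steps))
total *= h
print(round(total, 6))  # 1.0
```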

The Exponential distribution

Consider a continuous random variable, X, with the following properties: 1. P[X ≥ 0] = 1, and 2. P[X ≥ a + b] = P[X ≥ a] P[X ≥ b] for all a > 0, b > 0. These two properties are reasonable to assume if X = lifetime of an object that doesn't age. The second property implies: P[X ≥ a + b | X ≥ a] = P[X ≥ a + b] / P[X ≥ a] = P[X ≥ b].

The property P[X ≥ a + b | X ≥ a] = P[X ≥ b] models the non-aging property, i.e. given the object has lived to age a, the probability that it lives a further b units is the same as for a new object.

Let F(x) = P[X ≤ x] and G(x) = P[X ≥ x]. Since X is a continuous RV, G(x) = 1 − F(x) (P[X = x] = 0 for all x). The two properties can be written in terms of G(x): 1. G(0) = 1, and 2. G(a + b) = G(a) G(b) for all a > 0, b > 0. We can show that the only continuous function, G(x), that satisfies 1. and 2. is an exponential function.

From property 2 we can conclude G(2a) = G(a + a) = G(a)^2, and using induction, G(na) = G(a)^n for any positive integer n. Hence, putting a = 1: G(n) = G(1)^n. Also, putting a = 1/n: G(1) = G(1/n)^n, so G(1/n) = G(1)^(1/n). Finally, putting a = 1/m: G(n/m) = G(1/m)^n = G(1)^(n/m). Thus G(x) = G(1)^x for all rational x > 0.

Since G(x) is continuous, G(x) = G(1)^x for all x ≥ 0. If G(1) = 0, then G(x) = 0 for all x > 0 and, by continuity, G(0) = 0, contradicting G(0) = 1. If G(1) = 1, then G(x) = 1 for all x > 0, contradicting G(x) = P[X ≥ x] → 0 as x → ∞. Thus G(1) ≠ 0, 1, so 0 < G(1) < 1. Let λ = −ln(G(1)); then G(1) = e^(−λ) and G(x) = e^(−λx).

To find the density of X we use: f(x) = F′(x) = −G′(x) = λe^(−λx) for x ≥ 0, and f(x) = 0 for x < 0. A continuous random variable with this density function is said to have the exponential distribution with parameter λ.
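A short sketch verifying the defining non-aging property G(a + b) = G(a) G(b) for the exponential survival function G(x) = e^(−λx); the values λ = 0.5, a = 2, b = 3 are illustrative:

```python
from math import exp

def survival(x: float, lam: float) -> float:
    """G(x) = P[X >= x] = e^(-lam * x) for the exponential distribution."""
    return exp(-lam * x)

# Non-aging (memoryless) property: G(a + b) = G(a) * G(b)
lam, a, b = 0.5, 2.0, 3.0
print(survival(a + b, lam))                 # e^(-2.5)
print(survival(a, lam) * survival(b, lam))  # the same value (up to rounding)
```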

Graphs of f(x) and F(x)

Another derivation of the Exponential distribution. Consider a continuous random variable, X, with the following properties: 1. P[X ≥ 0] = 1, and 2. P[x ≤ X ≤ x + dx | X ≥ x] = λ dx for all x > 0 and small dx. These two properties are reasonable to assume if X = lifetime of an object that doesn't age. The second property implies that if the object has lived up to time x, the chance that it dies in the small interval x to x + dx depends only on the length of that interval, dx, and not on its age x.

Determination of the distribution of X. Let F(x) = P[X ≤ x] = the cumulative distribution function of the random variable X. Then P[X ≥ 0] = 1 implies that F(0) = 0. Also P[x ≤ X ≤ x + dx | X ≥ x] = λ dx implies (F(x + dx) − F(x)) / (1 − F(x)) = λ dx, i.e. F′(x) = λ(1 − F(x)).

We can now solve the differential equation F′(x) = λ(1 − F(x)) for the unknown F: F′(x) / (1 − F(x)) = λ, so −ln(1 − F(x)) = λx + c, giving F(x) = 1 − Ce^(−λx).

Now using the fact that F(0) = 0, we get C = 1, so F(x) = 1 − e^(−λx) and f(x) = F′(x) = λe^(−λx) for x ≥ 0. This shows that X has an exponential distribution with parameter λ.