Copyright © Cengage Learning. All rights reserved. 3 Discrete Random Variables and Probability Distributions

Copyright © Cengage Learning. All rights reserved. 3.2 Probability Distributions for Discrete Random Variables

3 Probability Distributions for Discrete Random Variables Probabilities assigned to the various outcomes in the sample space in turn determine probabilities associated with the values of any particular rv X. The probability distribution of X says how the total probability of 1 is distributed among (allocated to) the various possible X values. Suppose, for example, that a business has just purchased four laser printers, and let X be the number among these that require service during the warranty period.

4 Probability Distributions for Discrete Random Variables Possible X values are then 0, 1, 2, 3, and 4. The probability distribution will tell us how the total probability of 1 is subdivided among these five possible values—how much probability is associated with the X value 0, how much is apportioned to the X value 1, and so on. We will use the following notation for the probabilities in the distribution: p(0) = the probability of the X value 0 = P(X = 0), p(1) = the probability of the X value 1 = P(X = 1), and so on. In general, p(x) will denote the probability assigned to the value x.

5 Example 3.7 The Cal Poly Department of Statistics has a lab with six computers reserved for statistics majors. Let X denote the number of these computers that are in use at a particular time of day. Suppose that the probability distribution of X is as given in the following table; the first row of the table lists the possible X values and the second row gives the probability of each such value.

x       0     1     2     3     4     5     6
p(x)   .05   .10   .15   .25   .20   .15   .10

6 Example 3.7 We can now use elementary probability properties to calculate other probabilities of interest. For example, the probability that at most 2 computers are in use is P(X ≤ 2) = P(X = 0 or 1 or 2) = p(0) + p(1) + p(2) = .05 + .10 + .15 = .30 cont’d

7 Example 3.7 Since the event "at least 3 computers are in use" is complementary to "at most 2 computers are in use", P(X ≥ 3) = 1 – P(X ≤ 2) = 1 – .30 = .70 which can, of course, also be obtained by adding together the probabilities for the values 3, 4, 5, and 6. cont’d

8 Example 3.7 The probability that between 2 and 5 computers inclusive are in use is P(2 ≤ X ≤ 5) = P(X = 2, 3, 4, or 5) = .15 + .25 + .20 + .15 = .75 whereas the probability that the number of computers in use is strictly between 2 and 5 is P(2 < X < 5) = P(X = 3 or 4) = .25 + .20 = .45 cont’d
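These event probabilities are just sums of pmf values, so they are easy to check numerically. A minimal Python sketch (variable and function names are illustrative, not from the text), using the pmf from the table above:

# pmf of X = number of lab computers in use (Example 3.7)
pmf = {0: .05, 1: .10, 2: .15, 3: .25, 4: .20, 5: .15, 6: .10}

def prob(event):
    """Sum pmf values over the X values that satisfy the event predicate."""
    return sum(p for x, p in pmf.items() if event(x))

print(prob(lambda x: x <= 2))        # P(X <= 2), .30
print(prob(lambda x: x >= 3))        # P(X >= 3), .70
print(prob(lambda x: 2 <= x <= 5))   # P(2 <= X <= 5), .75
print(prob(lambda x: 2 < x < 5))     # P(2 < X < 5), .45

(The printed values may show small floating-point rounding, e.g. 0.30000000000000004.)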

9 Probability Distributions for Discrete Random Variables Definition The probability distribution or probability mass function (pmf) of a discrete rv X is defined for every number x by p(x) = P(X = x), the probability that X takes on the value x.

10 Probability Distributions for Discrete Random Variables In words, for every possible value x of the random variable, the pmf specifies the probability of observing that value when the experiment is performed. The conditions p(x) ≥ 0 and Σ (over all possible x) p(x) = 1 are required of any pmf. The pmf of X in the previous example was simply given in the problem description. We now consider several examples in which various probability properties are exploited to obtain the desired distribution.

11 A Parameter of a Probability Distribution

12 A Parameter of a Probability Distribution The pmf of the Bernoulli rv X in Example 3.9 was p(0) = .8 and p(1) = .2 because 20% of all purchasers selected a desktop computer. At another store, it may be the case that p(0) = .9 and p(1) = .1. More generally, the pmf of any Bernoulli rv can be expressed in the form p(1) = α and p(0) = 1 – α, where 0 < α < 1. Because the pmf depends on the particular value of α, we often write p(x; α) rather than just p(x):

p(x; α) = 1 – α   if x = 0
          α       if x = 1
          0       otherwise          (3.1)

13 A Parameter of a Probability Distribution Then each choice of α in Expression (3.1) yields a different pmf. Definition Suppose p(x) depends on a quantity that can be assigned any one of a number of possible values, with each different value determining a different probability distribution. Such a quantity is called a parameter of the distribution. The collection of all probability distributions for different values of the parameter is called a family of probability distributions.

14 A Parameter of a Probability Distribution The quantity α in Expression (3.1) is a parameter. Each different number α between 0 and 1 determines a different member of the Bernoulli family of distributions.
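Expression (3.1) is a single function of the value x and the parameter α; fixing α picks out one member of the family. A short Python sketch of this idea (names are illustrative):

def bernoulli_pmf(x, alpha):
    """p(x; alpha) for a Bernoulli rv, mirroring Expression (3.1)."""
    if x == 1:
        return alpha
    if x == 0:
        return 1 - alpha
    return 0.0

# Two members of the family: alpha = .2 (first store) and alpha = .1 (second store)
print(bernoulli_pmf(0, 0.2), bernoulli_pmf(1, 0.2))   # 0.8 0.2
print(bernoulli_pmf(0, 0.1), bernoulli_pmf(1, 0.1))   # 0.9 0.1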

15 Example 3.12 Starting at a fixed time, we observe the gender of each newborn child at a certain hospital until a boy (B) is born. Let p = P(B), assume that successive births are independent, and define the rv X by X = number of births observed. Then p(1) = P(X = 1) = P(B) = p

16 Example 3.12 p(2) = P(X = 2) = P(GB) = P(G) · P(B) = (1 – p)p and p(3) = P(X = 3) = P(GGB) = P(G) · P(G) · P(B) = (1 – p)²p cont’d

17 Example 3.12 Continuing in this way, a general formula emerges:

p(x) = (1 – p)^(x – 1) p   for x = 1, 2, 3, ...
       0                    otherwise               (3.2)

The parameter p can assume any value between 0 and 1. Expression (3.2) describes the family of geometric distributions. In the gender example, p = .51 might be appropriate, but if we were looking for the first child with Rh-positive blood, then we might have p = .85. cont’d
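Expression (3.2) translates directly into code. The following Python sketch (illustrative names; a sketch, not textbook code) evaluates the geometric pmf for a chosen p:

def geometric_pmf(x, p):
    """P(X = x), X = number of births observed until the first boy, per Expression (3.2)."""
    if isinstance(x, int) and x >= 1:
        return (1 - p) ** (x - 1) * p
    return 0.0

# With p = .51, the probability that the first boy arrives on the third birth:
print(geometric_pmf(3, 0.51))   # (1 - .51)**2 * .51 = 0.122451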

18 The Cumulative Distribution Function

19 The Cumulative Distribution Function For some fixed value x, we often wish to compute the probability that the observed value of X will be at most x. For example, let X be the number of beds occupied in a hospital’s emergency room at a certain time of day; suppose the pmf of X is given in an accompanying table. Then the probability that at most two beds are occupied is P(X ≤ 2) = p(0) + p(1) + p(2)

20 The Cumulative Distribution Function

21 The Cumulative Distribution Function

22 The Cumulative Distribution Function Definition The cumulative distribution function (cdf) F(x) of a discrete rv X with pmf p(x) is defined for every number x by F(x) = P(X ≤ x) = Σ (over all y ≤ x) p(y). For any number x, F(x) is the probability that the observed value of X will be at most x.

23 Example 3.13 A store carries flash drives with either 1 GB, 2 GB, 4 GB, 8 GB, or 16 GB of memory. The accompanying table gives the distribution of Y = the amount of memory in a purchased drive:

y       1     2     4     8     16
p(y)   .05   .10   .35   .40   .10

24 Example 3.13 Let’s first determine F(y) for each of the five possible values of Y: F(1) = P(Y ≤ 1) = P(Y = 1) = p(1) = .05 F(2) = P(Y ≤ 2) = P(Y = 1 or 2) = p(1) + p(2) = .15 cont’d

25 Example 3.13 F(4) = P(Y ≤ 4) = P(Y = 1 or 2 or 4) = p(1) + p(2) + p(4) = .50 F(8) = P(Y ≤ 8) = p(1) + p(2) + p(4) + p(8) = .90 F(16) = P(Y ≤ 16) = 1 cont’d

26 Example 3.13 Now for any other number y, F(y) will equal the value of F at the closest possible value of Y to the left of y. For example, F(2.7) = P(Y ≤ 2.7) = P(Y ≤ 2) = F(2) = .15 F(7.999) = P(Y ≤ 7.999) = P(Y ≤ 4) = F(4) = .50 cont’d

27 Example 3.13 If y is less than 1, F(y) = 0 [e.g., F(.58) = 0], and if y is at least 16, F(y) = 1 [e.g., F(25) = 1]. The cdf is thus

F(y) = 0     for y < 1
       .05   for 1 ≤ y < 2
       .15   for 2 ≤ y < 4
       .50   for 4 ≤ y < 8
       .90   for 8 ≤ y < 16
       1     for y ≥ 16

cont’d
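This step-function behavior is easy to mirror in code: F(y) is the total pmf mass at support points not exceeding y. A minimal Python sketch using the flash-drive pmf above (names are illustrative):

# pmf of Y = amount of memory in a purchased drive (Example 3.13)
pmf_Y = {1: .05, 2: .10, 4: .35, 8: .40, 16: .10}

def cdf(y):
    """F(y) = P(Y <= y): sum the pmf over support points at or below y."""
    return sum(p for val, p in pmf_Y.items() if val <= y)

print(cdf(2.7))     # between 2 and 4, so F(2.7) = F(2) = .15
print(cdf(7.999))   # F(7.999) = F(4) = .50
print(cdf(0.58))    # below the smallest possible value, so 0
print(cdf(25))      # at or above 16, so 1

(Printed values may show small floating-point rounding.)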

28 Example 3.13 A graph of this cdf is shown in Figure 3.5. Figure 3.5 A graph of the cdf of Example 3.13 cont’d

29 The Cumulative Distribution Function For X a discrete rv, the graph of F(x) will have a jump at every possible value of X and will be flat between possible values. Such a graph is called a step function. Proposition For any two numbers a and b with a ≤ b, P(a ≤ X ≤ b) = F(b) – F(a–), where F(a–) denotes the value of F just to the left of a (i.e., at the largest possible X value strictly less than a). In particular, if the only possible values are integers and if a and b are integers, then P(a ≤ X ≤ b) = P(X = a or a + 1 or ... or b) = F(b) – F(a – 1). Taking a = b yields P(X = a) = F(a) – F(a – 1) in this case.

30 The Cumulative Distribution Function The reason for subtracting F(a–) rather than F(a) is that we want to include P(X = a); F(b) – F(a) gives P(a < X ≤ b). This proposition will be used extensively when computing binomial and Poisson probabilities in Sections 3.4 and 3.6.

31 Example 3.15 Let X = the number of days of sick leave taken by a randomly selected employee of a large company during a particular year. If the maximum number of allowable sick days per year is 14, possible values of X are 0, 1,..., 14.

32 Example 3.15 With F(0) = .58, F(1) = .72, F(2) = .76, F(3) = .81, F(4) = .88, F(5) = .94, P(2 ≤ X ≤ 5) = P(X = 2, 3, 4, or 5) = F(5) – F(1) = .22 and P(X = 3) = F(3) – F(2) = .05 cont’d
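These cdf-difference calculations can be checked directly; a short Python sketch with the F values above (names are illustrative):

# cdf values for X = sick-leave days, Example 3.15
F = {0: .58, 1: .72, 2: .76, 3: .81, 4: .88, 5: .94}

# For integer-valued X, P(a <= X <= b) = F(b) - F(a - 1)
print(round(F[5] - F[1], 2))   # P(2 <= X <= 5) = .94 - .72 = 0.22
print(round(F[3] - F[2], 2))   # P(X = 3)       = .81 - .76 = 0.05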