1
Discrete Probability Distributions
This lecture was organized by Dr. Farrokh Alemi, Ph.D.
2
Discrete Probability Distributions
Bernoulli Geometric Binomial Poisson This lecture introduces you to various probability mass functions, including the Bernoulli, Binomial, Geometric, and Poisson distributions. We show how these are related to each other and how they can be used to describe real processes in health care settings.
3
Definitions Density function
A function assigns numbers to events. A probability density function gives the probability of a group of events or a single event. On the X-axis are the categories of events of interest, arranged from low to high values. On the Y-axis is the probability of cases that fall within each category.
4
Definitions Cumulative probability function
In contrast, a cumulative function gives the probability of all events less than or equal to a specific level. Both functions require us to sort events from a low value to a high value.
5
Definitions In this example, the numbers of medication errors are sorted and listed to the left. The first column shows the probability density function. At each value, it provides the probability of the event listed. For example, 90% of patients have no medication errors, 6% of patients have 1 medication error, and 4% have 2 medication errors. The cumulative distribution function is also given in the right column. It starts at 90% and increases or stays the same thereafter. The cumulative distribution function gives the probability of occurrence of all of the events with values less than or equal to the event. For example, 96% is the probability of having patients with 1 or 0 medication errors. The step in the cumulative distribution function at each event is equal to the probability of that event.
6
Expected Value The probability density function can be used to calculate the expected value of an uncertain event. The expected value is calculated by multiplying the probability of each event by its value and summing across all events.
7
Expected Value The probability of event i occurring is multiplied by
8
Expected Value The value of event i.
9
Expected Value And summed over all possible values of the event. Here we show this as a sum over event i going from 1 to n.
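Putting the last three slides together, the calculation the narration describes can be written in standard notation (the symbols below are added here for clarity and do not appear on the slides):

E[X] = \sum_{i=1}^{n} p_i \, x_i

where p_i is the probability of event i and x_i is its value.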
10
Example Here is an application of the formula to data. In this example, the numbers of medication errors are sorted and listed to the left. The first column shows the probability density function. At each value, it provides the probability of the event listed. For example, 90% of patients have no medication errors, 6% of patients have 1 medication error, and 4% have 2 medication errors. To calculate the expected value, first we multiply the value of each event by its probability. In the first row, 0 medication errors is multiplied by a 90% chance to obtain 0.
11
Example In the second row, 6% is multiplied by 1 to obtain 0.06 and 4% is multiplied by 2 to obtain 0.08.
12
Example The expected number of medication errors is the sum of the products of the event values and their probabilities. In this example we expect 0 + 0.06 + 0.08 = 0.14 medication errors.
13
Example Let us say that we want to chart the density and cumulative distribution functions of the following data and calculate the expected number of medication errors.
14
Example We begin with calculating the cumulative distribution function from the density function. The cumulative distribution function shows the probability of events less than or equal to a particular value. It never decreases. At each step we add the value of the density function to the previous value of the cumulative function.
15
Example The expected number of events is calculated as the sum of each event value times its probability in the density function.
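As a concrete illustration of these two calculations, here is a short sketch in Python. The probabilities are the hypothetical medication-error values used earlier in this lecture, not the data table referenced on this slide, which is not reproduced here.

# Hypothetical probability density function: number of medication errors -> probability
pmf = {0: 0.90, 1: 0.06, 2: 0.04}

cumulative = {}
running_total = 0.0
for errors in sorted(pmf):
    # add the density value to the previous value of the cumulative function
    running_total += pmf[errors]
    cumulative[errors] = running_total

# expected value: sum of each event value times its probability
expected_value = sum(errors * p for errors, p in pmf.items())

print(cumulative)      # roughly {0: 0.9, 1: 0.96, 2: 1.0}
print(expected_value)  # approximately 0.14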
16
Density & Cumulative Distributions
This chart shows the probability density and the cumulative distribution functions. The size of the step in the cumulative distribution is the same as the probability of the event. For example, the rise from 0 to 1 medication error is equal to the probability of 1 medication error. The shape of a distribution function tells a great deal about the process it describes. For discrete variables, the type of data we focus on in this course, there are some distinctive distribution functions, each with a signature shape.
17
Typical Probability Density Functions
Bernoulli Geometric Binomial Poisson Knowing the probability density function is a very important step in deciding what to expect. Naturally, a great deal of thought has gone into recognizing different probability distributions. The most common probability density functions for discrete variables are the Bernoulli, Binomial, Geometric, and Poisson functions. We will describe each of these functions and explain the relationships among them. These functions are useful in describing a large number of events, for example the probability of wrong side surgery, the time to the next gunshot in a hospital, the probability of medication errors over a large number of visits, the arrival rate of security incidents, and so on. If our focus remains on events that either occur or do not occur, then these four distributions are sufficient to describe many aspects of these events.
18
Bernoulli Probability Density Function
Mutually exclusive The Bernoulli density function is the most common discrete density function. It assumes that two outcomes are possible: either the event occurs or it does not. In other words, the events of interest are mutually exclusive.
19
Bernoulli Probability Density Function
Exhaustive It also assumes that the possible outcomes are exhaustive, meaning at least one of the outcomes must occur.
20
Bernoulli Probability Density Function
In a Bernoulli density function, the event occurs with a constant probability of p. Typically, it is assumed that the probability is calculated for a specific number of trials or a specific period of time.
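In standard notation (added here; the slide states this in words), the Bernoulli density function for a single trial is

f(k; p) = p^k (1-p)^{1-k}, \qquad k \in \{0, 1\},

so that the probability of occurrence is f(1; p) = p and the probability of non-occurrence is f(0; p) = 1 - p.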
21
Bernoulli Probability Density Function
For example, we can say the probability of patient elopement in our facility is 0.05 in a month. 95% of patients do not elope in any single month, but 5% do. This chart shows both the density and cumulative probability distributions for these data. The red line shows the cumulative function and the bar graph shows the density function.
22
Independent Bernoulli Trials
Let's now think through a situation where a Bernoulli event is repeatedly tried. Let us assume that in these trials the probability of occurrence of the event is not affected by its past occurrence, in other words that each trial is independent of all previous ones. As we will see shortly, when we are dealing with independent repeated trials of Bernoulli events, a number of common density functions may describe various events.
23
Independent Bernoulli Trials
Let us start by looking at an example of repeated independent trials. In this situation we have 3 repeated trials for tracking patient elopement over time. We are assuming that the probability of elopement does not change if a patient has eloped in prior months. In month 1, the patient may elope or not. In month 2, the same event may repeat. The process continues until month 3. As you can see, the patient may elope in different months, and in each month the probability of elopement is constant and equal to that of prior months. Independence cannot be assumed for all repeated trials. For example, the probability of a contagious infection changes if there was an infection in prior time periods, so independent trials cannot be assumed in that situation. But in many situations independence can be assumed, and when it can, there is a lot we can tell about the probability function.
24
Geometric Density Function
If a Bernoulli event is repeated until the event occurs, then the number of trials to the occurrence of the event has a Geometric distribution. The Geometric density function is given by multiplying the probability of one occurrence by the probability of the k-1 non-occurrences that precede it.
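Written out in standard notation (added here), the Geometric density function for the event first occurring on trial k is

f(k; p) = p \, (1-p)^{k-1}, \qquad k = 1, 2, 3, \ldots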
25
Geometric Density Function
An interesting property of the Geometric probability density function is that the expected number of trials until the event occurs is given by dividing 1 by the probability of occurrence of the event in each trial.
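In symbols, this property is E[K] = 1/p. For example, if the daily probability of an event is 0.05, we expect 1/0.05 = 20 days until its first occurrence (an illustrative number, not one taken from the slides).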
26
Geometric Density Function
If we know that wrong side surgery has occurred only once in the last decade, how many surgical days before we can expect another wrong side surgery?
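A rough answer, under the assumption that one event per decade is our best estimate of the daily rate: p is approximately 1 divided by the number of surgical days in a decade, so the expected number of trials is E[K] = 1/p, which is approximately the number of surgical days in a decade. In other words, we expect roughly another decade of surgical days before the next wrong side surgery.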
27
Do One No medication errors have occurred in the past 90 days. What is the maximum daily probability of medication error in our facility? See if you can calculate these probabilities.
28
Do One The time between patient falls was calculated to be 3 days, 60 days and 15 days. What is the daily probability of patient falls? See if you can calculate these probabilities.
29
Binomial Probability Distribution
Number of occurrences k of the event in n independent trials Assume that we repeat the Bernoulli trials and that each time we do so the probability of occurrence of the event stays the same and is independent of what has happened in the past. The Binomial density function gives the probability of having k occurrences of the event in n Bernoulli trials.
30
Independent Bernoulli Trials
For example, having 2 patients elope in 3 months is possible if patients elope in months 1 and 2 but not 3, or if they elope in months 1 and 3 but not 2. The third possibility is that patients elope in months 2 and 3 but not 1.
31
Independent Bernoulli Trials
The probability of elopement is P. The probability of no elopement is one minus P. The probability of exactly two patients eloping in the first two months is P squared times one minus P.
32
Independent Bernoulli Trials
The probability of exactly two patients eloping in the last two months is given again as one minus P times P squared.
33
Independent Bernoulli Trials
The probability of two patients eloping in the first and last months is given again as P squared times one minus P: P × (1 − P) × P.
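Combining the three equally likely arrangements from the last three slides, the probability of exactly two elopements in three months is 3 × P² × (1 − P), which matches what the Binomial formula described on the following slides gives for n = 3 and k = 2 (this combined expression is added here; it is not spelled out on the slides).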
34
Binomial Probability Distribution
Different combinations Success probabilities Failure probabilities The exercise with elopement has shown how the Binomial distribution is constructed. There are three components.
35
Binomial Probability Distribution
The first component is the count of the different combinations of getting exactly a certain number of successes and failures. In the Binomial formula, n! is referred to as n factorial. The ratio n!/(k!(n-k)!) counts the number of different ways k occurrences of the event might be arranged in n trials.
36
Binomial Probability Distribution
The second component is the probability of the event raised to the power of the number of trials in which the event occurs, that is, p to the power of k.
37
Binomial Probability Distribution
The third component is the probability of failure raised to the power of the number of failures. The term (1-p) to the power of n-k measures the probability of the n-k non-occurrences of the event.
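Putting the three components together gives the Binomial density function, written here in standard notation:

P(K = k) = \frac{n!}{k!\,(n-k)!} \, p^k \, (1-p)^{n-k}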
38
6 Trials of Binomial p=1/2 Let us look at some examples so you can have an intuitive understanding of the Binomial distribution. Here you see a Binomial distribution with 6 trials. There are seven possibilities: either the event never occurs, or it occurs once, twice, three, four, five, or six times. The probability density function shows how likely each count is. The most likely outcome is for the event to occur three times in the 6 trials.
39
6 Trials of Binomial p=1/2 This is also the expected value and can be obtained by multiplying the number of trials by the probability of occurrence of the Bernoulli event.
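Numerically, the expected value here is E[K] = n × p = 6 × 0.5 = 3 occurrences.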
40
6 Trials of Binomial p=0.05 Let us now go back to the patient elopement example, where the monthly probability of elopement was 5%. Over a 6 month period, we are most likely to see no patients elope. There is a 23% chance of seeing one elopement and there is a 3% chance of seeing two elopements.
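These percentages can be checked directly with the Binomial formula; the short sketch below is an illustration added here, not part of the original slides.

from math import comb

n, p = 6, 0.05  # six monthly trials, 5% monthly probability of elopement
for k in range(3):
    # Binomial probability of exactly k elopements in n months
    probability = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(probability, 3))

# Output:
# 0 0.735  -> most likely outcome: no elopements
# 1 0.232  -> about a 23% chance of one elopement
# 2 0.031  -> about a 3% chance of two elopements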
41
Example If the monthly probability of elopement is 0.05, how many patients will elope in 2 years? Now we can answer this exercise more fully. If the monthly probability of elopement is 0.05,
42
Example If the monthly probability of elopement is 0.05, how many patients will elope in 2 years? the expected number of elopements over two years is the probability of elopement times the number of trials, in this case 24 months.
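Numerically: expected elopements = 24 × 0.05 = 1.2 patients over the two years.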
43
Example If the daily probability of death due to injury from a ventilation machine is 0.002, what is the probability of having 1 or more deaths in 30 days? What is the probability of 1 or more deaths in 4 months?
44
Example If the daily probability of death due to injury from a ventilation machine is 0.002, what is the probability of having 1 or more deaths in 30 days? Here the number of trials is 30 days. There is only one combination of getting 0 deaths, so the factorial term is equal to 1. The probability of survival is 0.998, and we expect 30 days of survival. The probability of death is 0.002, but we expect no one to die. So the probability of zero deaths is given as 0.998 raised to the power of 30, which is approximately 0.94. So the probability of having 1 or more deaths is 1 minus 0.94, or about 0.06.
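The same arithmetic, extended to the 4-month question on the previous slide, can be sketched as follows (the 30-day month used for the 4-month case is an assumption added here, not stated on the slides):

p_death = 0.002                   # daily probability of death

# probability of zero deaths in 30 days, and of 1 or more deaths
p_none_30 = (1 - p_death) ** 30
print(round(1 - p_none_30, 3))    # about 0.058, roughly a 6% chance

# 4 months, assuming 30-day months
days = 4 * 30
p_none_120 = (1 - p_death) ** days
print(round(1 - p_none_120, 3))   # about 0.214, roughly a 21% chance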
45
Do One Which is more likely: 2 patients failing to comply with medication orders in 15 days, or 3 patients failing to comply with medication orders in 30 days? See if you can answer this question.
46
Poisson Density Function
Large number of trials Small probabilities of occurrence As the number of trials increases and the probability of occurrence drops, the Poisson distribution approximates the Binomial distribution. In healthcare management, this occurs often. Typically we are looking at a large number of visits or days, and the sentinel event has a very low probability of occurrence. In these circumstances, the number of occurrences of the event can be estimated by the Poisson distribution.
47
Poisson Density Function
In this formula, lambda is the expected number of occurrences of the event and is equal to n times p, where n is the number of trials and p is the probability of the occurrence of the event in one trial.
48
Poisson Density Function
k is the number of occurrences of the sentinel event, and e is approximately 2.718, the base of natural logarithms.
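Putting these pieces together, the Poisson density function is, in standard notation added here:

P(K = k) = \frac{\lambda^k e^{-\lambda}}{k!}

As a quick check of the approximation described two slides earlier: for the elopement example with n = 6 and p = 0.05, lambda = 0.3, and the Poisson probability of zero elopements is e^(-0.3), roughly 0.741, close to the Binomial value of about 0.735.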
49
Take Home Lesson Repeated independent Bernoulli trials are the foundation of many distributions
50
Do One What is the probability of observing one or more security violations when the daily probability of violations is 0.01 and we are monitoring the organization for 4 months? See if you can answer this question.
51
Do Another How many visits will it take to have at least one medication error if the estimated probability of medication error in a visit is 0.03? See if you can answer this question.