
1 Virtual University of Pakistan
Lecture No. 22 of the course on Statistics and Probability by Miss Saleha Naghmi Habibullah

2 IN THE LAST LECTURE, YOU LEARNT
Independent and Dependent Events
Multiplication Theorem of Probability for Independent Events
Marginal Probability

3 TOPICS FOR TODAY
Bayes’ Theorem
Discrete Random Variable
Discrete Probability Distribution
Graphical Representation of a Discrete Probability Distribution
Mean, Standard Deviation and Coefficient of Variation of a Discrete Probability Distribution
Distribution Function of a Discrete Random Variable

4 First of all, let us discuss the BAYES’ THEOREM.
This theorem deals with conditional probabilities in an interesting way:

5 BAYES’ THEOREM If the events A1, A2, …, Ak form a PARTITION of a sample space S (that is, the events Ai are mutually exclusive and exhaustive, i.e. their union is S), and if B is any other event of S such that it can occur ONLY IF ONE OF THE Ai OCCURS, then, for any i,
P(Ai|B) = P(Ai) P(B|Ai) / [P(A1) P(B|A1) + P(A2) P(B|A2) + … + P(Ak) P(B|Ak)]

6 BAYES’ THEOREM If A1, A2, …, and Ak are mutually exclusive events of which one must occur, then
P(Ai|B) = P(Ai) P(B|Ai) / [P(A1) P(B|A1) + P(A2) P(B|A2) + … + P(Ak) P(B|Ak)], for i = 1, 2, …, k.
If k = 2, we obtain:

7 Bayes’ Theorem for two mutually exclusive events A1 and A2:
P(Ai|B) = P(Ai) P(B|Ai) / [P(A1) P(B|A1) + P(A2) P(B|A2)], where i = 1, 2.

8 In other words,
P(A1|B) = P(A1) P(B|A1) / [P(A1) P(B|A1) + P(A2) P(B|A2)]
And:
P(A2|B) = P(A2) P(B|A2) / [P(A1) P(B|A1) + P(A2) P(B|A2)]

9 EXAMPLE In a developed country where cars are tested for the emission of pollutants, 25 percent of all cars emit excessive amounts of pollutants. When tested, 99 percent of all cars that emit excessive amounts of pollutants will fail, but 17 percent of the cars that do not emit excessive amounts of pollutants will also fail.

10 What is the probability that a car that fails the test actually emits excessive amounts of pollutants?

11 SOLUTION 1. Let A1 denote the event that a car emits EXCESSIVE amounts of pollutants. 2. Let A2 denote the event that a car does NOT emit excessive amounts of pollutants. (In other words, A2 is the complement of A1.) 3. Let B denote the event that a car FAILS the test. The first thing to note is that any car will either emit or not emit excessive amounts of pollutants. In other words, A1 and A2 are mutually exclusive and exhaustive events, i.e. A1 and A2 form a PARTITION of the sample space S. Hence, we are in a position to apply the Bayes’ theorem.

12 4. We need to calculate P(A1|B), and, according to the Bayes’ theorem:
P(A1|B) = P(A1) P(B|A1) / [P(A1) P(B|A1) + P(A2) P(B|A2)]

13 P(A1) = 0.25, P(A2) = 0.75, P(B|A1) = 0.99, and P(B|A2) = 0.17

14 Substituting the above values in the Bayes’ theorem, we obtain:

15 P(A1|B) = (0.25)(0.99) / [(0.25)(0.99) + (0.75)(0.17)] = 0.2475 / (0.2475 + 0.1275) = 0.2475 / 0.375 = 0.66

16 This is the probability that a car which fails the test ACTUALLY emits excessive amounts of pollutants.
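To verify the arithmetic above, here is a minimal Python sketch (not part of the original lecture); the variable names are our own.

```python
# Bayes' theorem check for the emissions-test example.
p_a1 = 0.25          # P(A1): car emits excessive pollutants
p_a2 = 0.75          # P(A2): car does not emit excessive pollutants
p_b_given_a1 = 0.99  # P(B|A1): a polluting car fails the test
p_b_given_a2 = 0.17  # P(B|A2): a clean car nevertheless fails the test

# Total probability of failing the test (the denominator in Bayes' theorem).
p_b = p_a1 * p_b_given_a1 + p_a2 * p_b_given_a2

# P(A1|B) = P(A1)P(B|A1) / P(B)
p_a1_given_b = p_a1 * p_b_given_a1 / p_b

print(p_b)           # 0.375
print(p_a1_given_b)  # approximately 0.66
```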

17 The example that we just considered pertained to the simplest case when we have only two mutually exclusive and exhaustive events A1 and A2. As stated earlier, the Bayes’ theorem can be extended to the case of three, four, five or more mutually exclusive and exhaustive events.

18 Let us consider another example:

19 EXAMPLE In a bolt factory, 25% of the bolts are produced by machine A, 35% are produced by machine B, and the remaining 40% are produced by machine C. Of their outputs, 2%, 4% and 5% respectively are defective bolts. If a bolt is selected at random and found to be defective, what is the probability that it came from machine A?

20 In this example, we realize that “a bolt is produced by machine A”, “a bolt is produced by machine B” and “a bolt is produced by machine C” represent three mutually exclusive and exhaustive events i.e. we can regard them as A1, A2 and A3. The event “defective bolt” represents the event B.

21 Hence, in this example, we need to determine P(A1|B).
The students are encouraged to work on this problem on their own, in order to understand the application and significance of the Bayes’ Theorem.
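For students who wish to check their answer afterwards, the following Python sketch (not part of the original lecture) applies Bayes’ theorem for k mutually exclusive and exhaustive events; the function name `bayes` and the list layout are illustrative choices.

```python
# General Bayes' theorem: P(A_i | B) for k mutually exclusive, exhaustive events.
def bayes(priors, likelihoods, i):
    """priors[j] = P(A_j), likelihoods[j] = P(B | A_j); returns P(A_i | B)."""
    p_b = sum(p * l for p, l in zip(priors, likelihoods))  # total probability of B
    return priors[i] * likelihoods[i] / p_b

# Bolt-factory data: machines A, B, C.
priors = [0.25, 0.35, 0.40]       # share of output from each machine
likelihoods = [0.02, 0.04, 0.05]  # probability of a defective bolt from each machine

print(bayes(priors, likelihoods, 0))  # P(made by A | defective), roughly 0.128
```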

22 This brings us to the END of the discussion of various basic concepts of probability.
We now begin the discussion of a very important concept in mathematical statistics, i.e., the concept of PROBABILITY DISTRIBUTIONS. As stated in the very beginning of this course, there are two types of quantitative variables --- the discrete variable, and the continuous variable. Accordingly, we have the discrete probability distribution as well as the continuous probability distribution. We begin with the discussion of the discrete probability distribution. In this regard, the first concept that we need to consider is the concept of RANDOM VARIABLE.

23 RANDOM VARIABLE A numerical quantity whose value is determined by the outcome of a random experiment is called a random variable.

24 For example, if we toss three coins together and let X denote the number of heads, then the random variable X can take the values 0, 1, 2, and 3. Obviously, in this example, X is a discrete random variable.
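As an aside, the probability of each value of X can be found by simply listing the eight equally likely outcomes; the short Python sketch below (added here for illustration, not part of the lecture) does exactly that.

```python
# Enumerate the outcomes of tossing three fair coins and tabulate X = number of heads.
from itertools import product
from collections import Counter

outcomes = list(product("HT", repeat=3))          # ('H','H','H'), ('H','H','T'), ...
counts = Counter(o.count("H") for o in outcomes)  # how many outcomes give each value of X

for x in sorted(counts):
    print(x, counts[x], "/", len(outcomes))       # P(X=0)=1/8, P(X=1)=3/8, P(X=2)=3/8, P(X=3)=1/8
```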

25 Let us now discuss the concept of discrete probability distribution in detail with the help of the following example: EXAMPLE: If a biologist is interested in the number of petals on the flowers of a particular species, this number may take the values 3, 4, 5, 6, 7, 8, 9, and each one of these numbers will have its own probability.

26 Suppose that upon observing a large number of flowers, say 1000 flowers, of that particular species, the following results are obtained:

27 (Table: the observed number of petals X = 3, 4, 5, 6, 7, 8, 9 and the corresponding frequencies f out of the 1000 flowers.)

28 Since 1000 is quite a large number, the relative frequencies f/Σf can be regarded as probabilities, and hence we can write:

29 (Table: the probability distribution of X, with P(X) = f/Σf for X = 3, 4, 5, 6, 7, 8, 9.)

30 Properties of a Discrete Probability Distribution:
(1) 0 ≤ P(Xi) ≤ 1 for each Xi (i = 1, 2, …, 7), and (2) Σ P(Xi) = 1

31 And, since the number of petals on a flower can only be a whole number, the variable X is known as a discrete random variable, and the probability distribution of this variable is known as a DISCRETE probability distribution. In other words: any discrete variable that is associated with a random experiment, and to whose various values various probabilities are attached (such that Σ P(Xi) = 1), is known as a Discrete Random Variable, and its probability distribution is known as a Discrete Probability Distribution.

32 Just as we can depict a frequency distribution graphically, we can draw the GRAPH of a probability distribution.

33 EXAMPLE Going back to the probability distribution of the number of petals on the flowers of a particular species: this distribution can be represented in the form of a line chart.

34 Line Chart Representation of the Discrete Probability Distribution
(Line chart with No. of Petals (x) on the horizontal axis and Probability P(x) on the vertical axis.)

35 Evidently, this particular probability distribution is approximately symmetric.

36 In addition, this graph clearly shows that, just as in the case of a frequency distribution, every discrete probability distribution has a CENTRAL point and a SPREAD. Hence, similar to a frequency distribution, the discrete probability distribution has a MEAN and a STANDARD DEVIATION.

37 How do we calculate the mean and the standard deviation of a probability distribution?
Let us first consider the computation of the MEAN:

38 We know that in the case of a frequency distribution, the mean is given by
X̄ = Σfx / Σf

39 In the case of a discrete probability distribution, such as the one that we have been considering, the mean is given by
μ = E(X) = Σ X P(X)

40 Hence we construct the column of XP(X); summing this column, we obtain
μ = E(X) = Σ X P(X) = 5.925
i.e. the mean of the given probability distribution is 5.925.

41 In other words, considering a very large number of flowers of that particular species, we would expect that, on the average, a flower contains 5.925 petals --- or, rounding this number, 6 petals.

42 This interpretation points to the reason why the mean of the probability distribution of a random variable X is technically called the EXPECTED VALUE of the random variable X. (“Given that the probability that the flower has 3 petals is 5%, the probability that the flower has 4 petals is 10%, and so on, we EXPECT that, on the average, a flower contains 5.925 petals.”)

43 Next, let us consider the computation of the STANDARD DEVIATION:

44 COMPUTATION OF THE STANDARD DEVIATION:
Just as in the case of a frequency distribution, we have
S = √[ Σf x² / Σf − (Σf x / Σf)² ]

45 Similarly, in the case of a probability distribution, we have
σ = √[ Σ X² P(X) − (Σ X P(X))² ] = √[ Σ X² P(X) − μ² ]

46 In the above example

47 Hence: σ = √[ Σ X² P(X) − μ² ] = 1.3 (approximately)

48 Graphical Representation:
(Line chart of the distribution, with No. of Petals (x) on the horizontal axis and Probability P(x) on the vertical axis, marking μ = 5.925 and σ = 1.3.)

49 Now that we have both the mean and the standard deviation, we are in a position to compute the coefficient of variation of this distribution:

50 Coefficient of Variation: C.V. = (σ / μ) × 100 = (1.3 / 5.925) × 100 ≈ 21.9%
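The three formulas just used can be collected in a short Python sketch. Note that the probabilities below are assumed placeholder values chosen only to illustrate the calculation; they are not the lecture’s actual petal table.

```python
# Mean, standard deviation and coefficient of variation of a discrete distribution.
from math import sqrt

x = [3, 4, 5, 6, 7, 8, 9]
p = [0.05, 0.10, 0.20, 0.30, 0.20, 0.10, 0.05]  # ASSUMED probabilities, summing to 1

mu = sum(xi * pi for xi, pi in zip(x, p))                      # mu = sum of x*P(x)
sigma = sqrt(sum(xi**2 * pi for xi, pi in zip(x, p)) - mu**2)  # sigma = sqrt(E(X^2) - mu^2)
cv = sigma / mu * 100                                          # C.V. = sigma/mu * 100

print(mu, sigma, cv)
```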

51 Let us consider another example to understand the concept of discrete probability distribution:

52 EXAMPLE a) Find the probability distribution of the sum of the dots when two fair dice are thrown. b) Use the probability distribution to find the probabilities of obtaining (i) a sum that is greater than 8, and (ii) a sum that is greater than 5 but less than or equal to 10.

53 SOLUTION a) The sample space S is represented by the following 36 outcomes: S = {(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6); (2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6); (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6); (4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6); (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6); (6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}

54 Since each of the 36 outcomes is equally likely to occur, therefore each outcome has probability 1/36.

55 Let X be the random variable representing the sum of dots which appear on the dice.
Then the values of the r.v. are 2, 3, 4, …, 12.

56 The probabilities of these values are computed as below:

57 P(X = 2) = P{(1, 1)} = 1/36, P(X = 3) = P{(1, 2), (2, 1)} = 2/36, P(X = 4) = P{(1, 3), (2, 2), (3, 1)} = 3/36, and so on, up to P(X = 7) = 6/36; thereafter the probabilities decrease symmetrically down to P(X = 12) = 1/36.

58 Therefore the desired probability distribution of the r.v. X is:
x : 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
P(x) : 1/36, 2/36, 3/36, 4/36, 5/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36

59 The probabilities in the above table clearly indicate that if we draw the line chart of this distribution, we will obtain a triangular-shaped graph. The students are encouraged to draw the graph of this probability distribution, in order to be able to develop a visual picture in their minds.

60 b) Using the probability distribution, we get the required probabilities as follows:

61 (i) P(X > 8) = P(9) + P(10) + P(11) + P(12) = 4/36 + 3/36 + 2/36 + 1/36 = 10/36

62 (ii) P(5 < X ≤ 10) = P(6) + P(7) + P(8) + P(9) + P(10) = 5/36 + 6/36 + 5/36 + 4/36 + 3/36 = 23/36
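The whole exercise can also be checked by enumeration. The Python sketch below (not part of the original lecture) lists the 36 outcomes, builds the distribution of the sum, and reads off the two required probabilities.

```python
# Distribution of the sum of two fair dice, with exact fractions.
from fractions import Fraction
from itertools import product
from collections import Counter

counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}  # P(X = x)

p_greater_8 = sum(pmf[x] for x in pmf if x > 8)        # P(X > 8)
p_6_to_10 = sum(pmf[x] for x in pmf if 5 < x <= 10)    # P(5 < X <= 10)

print(p_greater_8)  # 5/18, i.e. 10/36
print(p_6_to_10)    # 23/36
```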

63 Next, we consider the concept of the DISTRIBUTION FUNCTION of a discrete random variable:

64 DISTRIBUTION FUNCTION
The distribution function of a random variable X, denoted by F(x), is defined by F(x) = P(X ≤ x). The function F(x) gives the probability of the event that X takes a value LESS THAN OR EQUAL TO a specified value x.

65 The distribution function is abbreviated to d.f. and is also called the cumulative distribution function (cdf), as it is the cumulative probability function of the random variable X from the smallest value up to a specific value x.

66 Let us illustrate this concept with the help of the same example that we have been considering --- that of the probability distribution of the sum of the dots when two fair dice are thrown. As explained earlier, the probability distribution of this example is:

67 For example, F(5) = P(X ≤ 5) = P(X = 2) + P(X = 3) + P(X = 4) + P(X = 5) = 1/36 + 2/36 + 3/36 + 4/36 = 10/36

68 The term ‘distribution function’ implies the cumulation of the probabilities similar to the cumulation of frequencies in the case of the frequency distribution of a discrete variable.

69 x : 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
P(x) : 1/36, 2/36, 3/36, 4/36, 5/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36
F(x) : 1/36, 3/36, 6/36, 10/36, 15/36, 21/36, 26/36, 30/36, 33/36, 35/36, 36/36

70 If we are interested in finding the probability that we obtain a sum of five or less, the column of cumulative probabilities immediately indicates that this probability is F(5) = 10/36.
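A minimal Python sketch (added for illustration) showing how the cumulative probabilities in the column above are obtained by accumulating the probability distribution:

```python
# Distribution function F(x) = P(X <= x) for the sum of two fair dice.
from fractions import Fraction
from itertools import product, accumulate
from collections import Counter

counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
xs = sorted(counts)                               # 2, 3, ..., 12
probs = [Fraction(counts[x], 36) for x in xs]     # P(X = x)
cdf = dict(zip(xs, accumulate(probs)))            # F(x) = running total of P(X = x)

print(cdf[5])  # 5/18, i.e. 10/36: probability of a sum of five or less
```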

71 In the next lecture, we will discuss the concept of distribution function in further detail, after which we will begin in a formal manner the concept of MATHEMATICAL EXPECTATION.

72 IN TODAY’S LECTURE, YOU LEARNT
Bayes’ Theorem
Discrete Random Variable
Discrete Probability Distribution
Graphical Representation of a Discrete Probability Distribution
Mean, Standard Deviation and Coefficient of Variation of a Discrete Probability Distribution
Distribution Function of a Discrete Random Variable

73 IN THE NEXT LECTURE, YOU WILL LEARN
Graphical Representation of the Distribution Function of a Discrete Random Variable
Mathematical Expectation
Mean, Variance and Moments of a Discrete Probability Distribution
Chebychev’s Inequality

