Continuous Distributions (presentation transcript)
1 Continuous Distributions
The Uniform distribution from a to b: f(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

2 The Normal distribution (mean μ, standard deviation σ): f(x) = (1/(σ√(2π))) e^{−(x − μ)²/(2σ²)}, −∞ < x < ∞.

3 The Exponential distribution: f(x) = λe^{−λx} for x ≥ 0, and 0 otherwise.

4 Weibull distribution with parameters α and β.

5 The Weibull density, f(x)
[Plot of three Weibull densities: (α = 0.9, β = 2), (α = 0.7, β = 2), (α = 0.5, β = 2)]
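The curves on this slide can be sketched numerically. The parameterization f(x) = αβx^{β−1}e^{−αx^β} is an assumption (Weibull conventions vary); it is one common α/β form and integrates to 1 for the slide's parameter pairs:

```python
import math

def weibull_pdf(x, alpha, beta):
    # Assumed parameterization: f(x) = alpha*beta*x^(beta-1)*exp(-alpha*x^beta), x > 0
    if x <= 0:
        return 0.0
    return alpha * beta * x ** (beta - 1) * math.exp(-alpha * x ** beta)

# The three curves on the slide share beta = 2 while alpha varies.
dx = 0.001
for alpha in (0.9, 0.7, 0.5):
    # midpoint-rule check that each density integrates to ~1 over (0, 20)
    total = sum(weibull_pdf((i + 0.5) * dx, alpha, 2.0) * dx for i in range(20000))
    print(f"alpha={alpha}: integral = {total:.4f}")
```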

6 The Gamma distribution
Let the continuous random variable X have density function: f(x) = (λ^α/Γ(α)) x^{α−1} e^{−λx} for x ≥ 0, and 0 otherwise. Then X is said to have a Gamma distribution with parameters α and λ.

7 Expectation

8 Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous). The expected value of X, E(X), is defined to be E(X) = Σ_x x·p(x), and if X is continuous with probability density function f(x), E(X) = ∫ x·f(x) dx.

9 Example: Suppose we are observing a seven-game series where the teams are evenly matched and the games are independent. Let X denote the length of the series. Find: (a) the distribution of X; (b) the expected value of X, E(X).

10 Solution: Let A denote the event that team A wins a game and B the event that team B wins. Then the sample space for this experiment (together with probabilities and values of X) is shown on the next slide:

11 Outcomes, probabilities, and values of X:
X = 4, prob (½)^4 each: AAAA, BBBB
X = 5, prob (½)^5 each: BAAAA, ABAAA, AABAA, AAABA; ABBBB, BABBB, BBABB, BBBAB
X = 6, prob (½)^6 each: BBAAAA, BABAAA, BAABAA, BAAABA, ABBAAA, ABABAA, ABAABA, AABBAA, AABABA, AAABBA; AABBBB, ABABBB, ABBABB, … (continued)
At this stage it is recognized that it might be easier to determine the distribution of X using counting techniques.

12 The possible values of X are {4, 5, 6, 7}.
The probability of any particular sequence of length x is (½)^x. The series can be won by either A or B. In a series that lasts x games, the winning team wins 4 games and the losing team wins x − 4 games, and the winning team has to win the last game. The number of ways of choosing which of the first x − 1 games the losing team wins is C(x − 1, x − 4).

13 Thus p(x) = (no. of ways of choosing the winning team) × (no. of ways of choosing the games that the losing team wins) × (probability of a series of length x) = 2 × C(x − 1, x − 4) × (½)^x, giving:
x: 4, 5, 6, 7
p(x): 2/16, 4/16, 5/16, 5/16

14 E(X) = 4(2/16) + 5(4/16) + 6(5/16) + 7(5/16) = 93/16 ≈ 5.81.
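The series-length distribution can be checked by brute force: enumerate all 2^7 equally likely win/loss strings of length 7, truncate each at the point one team reaches 4 wins, and accumulate exact probabilities by series length.

```python
from fractions import Fraction
from itertools import product

# p(x) for x = 4..7, accumulated as exact fractions
p = {4: Fraction(0), 5: Fraction(0), 6: Fraction(0), 7: Fraction(0)}
seen = set()
for games in product("AB", repeat=7):
    wins = {"A": 0, "B": 0}
    for i, g in enumerate(games):
        wins[g] += 1
        if wins[g] == 4:          # series ends as soon as a team has 4 wins
            series = games[:i + 1]
            break
    if series not in seen:        # count each distinct series once
        seen.add(series)
        p[len(series)] += Fraction(1, 2) ** len(series)

ex = sum(x * px for x, px in p.items())
print(p)   # p(4)=1/8, p(5)=1/4, p(6)=5/16, p(7)=5/16
print(ex)  # 93/16 = 5.8125
```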

15 Interpretation of E(X)
The expected value of X, E(X), is the centre of gravity of the probability distribution of X. It is also the long-run average value of X (shown later: the Law of Large Numbers).
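The long-run-average reading can be illustrated with a quick simulation (a sketch, not from the slides): roll a fair die, whose expected value is 3.5, many times and compare the sample mean.

```python
import random

random.seed(1)          # fixed seed for reproducibility
n = 100000
total = sum(random.randint(1, 6) for _ in range(n))
running_mean = total / n
print(running_mean)     # close to E(X) = 3.5
```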

16 Example: The Binomial distribution
Let X be a discrete random variable having the Binomial distribution, i.e. X = the number of successes in n independent repetitions of a Bernoulli trial with success probability p. Find the expected value of X, E(X).

17 Solution:

18
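Slides 17 and 18 presumably carried the standard derivation of E(X) = np, using the identity x·C(n, x) = n·C(n − 1, x − 1); a reconstruction (with q = 1 − p):

```latex
\begin{aligned}
E(X) &= \sum_{x=0}^{n} x \binom{n}{x} p^x q^{n-x}
      = \sum_{x=1}^{n} n \binom{n-1}{x-1} p^x q^{n-x} \\
     &= np \sum_{x=1}^{n} \binom{n-1}{x-1} p^{x-1} q^{(n-1)-(x-1)}
      = np\,(p+q)^{n-1} = np .
\end{aligned}
```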

19 Example: A continuous random variable
The Exponential distribution. Let X have an exponential distribution with parameter λ. This will be the case if: P[X ≥ 0] = 1, and P[x ≤ X ≤ x + dx | X ≥ x] = λ dx. The probability density function of X is f(x) = λe^{−λx} for x ≥ 0. The expected value of X is E(X) = ∫₀^∞ x·λe^{−λx} dx.

20 We will determine E(X) = ∫₀^∞ x·λe^{−λx} dx using integration by parts.
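The integration-by-parts step, with u = x and dv = λe^{−λx} dx (so v = −e^{−λx}), works out as:

```latex
E(X) = \int_0^\infty x\,\lambda e^{-\lambda x}\,dx
     = \Bigl[-x e^{-\lambda x}\Bigr]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx
     = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}.
```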

21 Summary: If X has an exponential distribution with parameter λ, then E(X) = 1/λ.

22 Example: The Uniform distribution. Suppose X has a uniform distribution from a to b. Then f(x) = 1/(b − a) for a ≤ x ≤ b. The expected value of X is E(X) = ∫_a^b x/(b − a) dx = (a + b)/2.

23 Example: The Normal distribution. Suppose X has a Normal distribution with parameters μ and σ. Then f(x) = (1/(σ√(2π))) e^{−(x − μ)²/(2σ²)}. The expected value of X is E(X) = ∫ x·f(x) dx. Make the substitution z = (x − μ)/σ, so x = μ + σz.

24 Hence E(X) = ∫ (μ + σz)·φ(z) dz = μ ∫ φ(z) dz + σ ∫ z·φ(z) dz, where φ(z) = (1/√(2π)) e^{−z²/2} is the standard normal density. Now the first integral is 1 and the second is 0 (by symmetry), so E(X) = μ.

25 Example: The Gamma distribution. Suppose X has a Gamma distribution with parameters α and λ. Then f(x) = (λ^α/Γ(α)) x^{α−1} e^{−λx} for x ≥ 0. Since the density integrates to 1, ∫₀^∞ x^{α−1} e^{−λx} dx = Γ(α)/λ^α. Note: This is a very useful formula when working with the Gamma distribution.

26 The expected value of X is:
E(X) = ∫₀^∞ x·(λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ^α/Γ(α))·(Γ(α + 1)/λ^{α+1}) = α/λ.
(The integral is evaluated by rewriting the integrand as a constant times a Gamma(α + 1, λ) density; this is now equal to 1.)

27 Thus if X has a Gamma(α, λ) distribution then the expected value of X is E(X) = α/λ.
Special cases: Exponential(λ) distribution: α = 1, λ arbitrary, so E(X) = 1/λ. Chi-square(n) distribution: α = n/2, λ = ½, so E(X) = n.

28 The Gamma distribution

29 The Exponential distribution

30 The Chi-square (χ²) distribution

31 Expectation of functions of Random Variables

32 Definition. Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous). The expected value of g(X), E[g(X)], is defined to be E[g(X)] = Σ_x g(x)·p(x), and if X is continuous with probability density function f(x), E[g(X)] = ∫ g(x)·f(x) dx.

33 Example: The Uniform distribution. Suppose X has a uniform distribution from 0 to b. Then f(x) = 1/b for 0 ≤ x ≤ b. Find the expected value of A = X². If X is the length of a side of a square (chosen at random from 0 to b), then A is the area of the square: E(A) = E(X²) = ∫₀^b x²/b dx = b²/3 = 1/3 the maximum area of the square.
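A Monte Carlo sketch of this claim (illustrative, not from the slides): for a side X uniform on (0, b), the average area E(X²) comes out near b²/3, one third of the maximum area b².

```python
import random

random.seed(0)    # fixed seed for reproducibility
b = 2.0
n = 200000
est = sum(random.uniform(0, b) ** 2 for _ in range(n)) / n
print(est, b * b / 3)   # the estimate should be close to 4/3
```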

34 Example: The Geometric distribution. Suppose X (discrete) has a geometric distribution with parameter p: p(x) = p(1 − p)^{x−1} for x = 1, 2, …. Find the expected value of X and the expected value of X².

35 Recall: The sum of a geometric series
Σ_{x=0}^∞ r^x = 1/(1 − r) for |r| < 1. Differentiating both sides with respect to r we get: Σ_{x=1}^∞ x·r^{x−1} = 1/(1 − r)².

36 Thus E(X) = Σ_{x=1}^∞ x·p·q^{x−1} = p/(1 − q)² = p/p² = 1/p, where q = 1 − p. This formula could also be developed by noting:

37 This formula can be used to calculate:

38 To compute the expected value of X²,
we need to find a formula for Σ_{x=1}^∞ x²·r^{x−1}. Note Σ_{x=0}^∞ r^x = 1/(1 − r). Differentiating with respect to r we get Σ_{x=1}^∞ x·r^{x−1} = 1/(1 − r)².

39 Differentiating again with respect to r we get Σ_{x=2}^∞ x(x − 1)·r^{x−2} = 2/(1 − r)³.
Thus Σ_{x=1}^∞ x²·r^{x−1} = Σ_{x=2}^∞ x(x − 1)·r^{x−1} + Σ_{x=1}^∞ x·r^{x−1} = 2r/(1 − r)³ + 1/(1 − r)².

40 Setting r = q = 1 − p, this implies Σ_{x=1}^∞ x²·q^{x−1} = 2q/(1 − q)³ + 1/(1 − q)². Thus E(X²) = p·Σ_{x=1}^∞ x²·q^{x−1} = 2q/p² + 1/p.

41 Thus E(X²) = 2q/p² + 1/p = (2 − 2p + p)/p² = (2 − p)/p².
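Both moment formulas can be checked by summing the series numerically (a quick sketch with an arbitrary choice p = 0.3):

```python
# Geometric distribution p(x) = p*(1-p)^(x-1), x = 1, 2, ...
p = 0.3
q = 1 - p
# truncated series sums; terms beyond x = 2000 are negligibly small
m1 = sum(x * p * q ** (x - 1) for x in range(1, 2001))      # E(X)
m2 = sum(x * x * p * q ** (x - 1) for x in range(1, 2001))  # E(X^2)
print(m1, 1 / p)              # both ~ 3.3333
print(m2, (2 - p) / p ** 2)   # both ~ 18.8889
```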

42 Moments of Random Variables

43 Definition. Let X be a random variable (discrete or continuous); then the kth moment of X is defined to be μ_k = E(X^k). The first moment of X, μ = μ₁ = E(X), is the center of gravity of the distribution of X. The higher moments give different information regarding the distribution of X.

44 Definition. Let X be a random variable (discrete or continuous); then the kth central moment of X is defined to be E[(X − μ)^k], where μ = μ₁ = E(X) = the first moment of X.

45 The central moments describe how the probability distribution is distributed about the centre of gravity, μ. The 2nd central moment, E[(X − μ)²], depends on the spread of the probability distribution of X about μ. It is called the variance of X and is denoted by the symbol var(X).

The square root of the variance, σ = √var(X), is called the standard deviation of X.

47 The third central moment
E[(X − μ)³] contains information about the skewness of a distribution. Measure of skewness: E[(X − μ)³]/σ³.

48 Positively skewed distribution

49 Negatively skewed distribution

50 Symmetric distribution

51 The fourth central moment
E[(X − μ)⁴] also contains information about the shape of a distribution. The property of shape that is measured by the fourth central moment is called kurtosis. The measure of kurtosis: E[(X − μ)⁴]/σ⁴.

52 Mesokurtic distribution

53 Platykurtic distribution

54 Leptokurtic distribution

55 Example: The uniform distribution from 0 to 1
Finding the moments: μ_k = E(X^k) = ∫₀¹ x^k dx = 1/(k + 1).

56 Finding the central moments: E[(X − ½)^k] = ∫₀¹ (x − ½)^k dx = [(x − ½)^{k+1}/(k + 1)]₀¹, which is 0 for odd k and 1/(2^k(k + 1)) for even k. In particular, the 2nd central moment is 1/12 and the 4th is 1/80.

57 Thus the variance is 1/12 and the standard deviation is σ = 1/√12 ≈ 0.2887. The measure of skewness is 0 (the 3rd central moment vanishes), and the measure of kurtosis is (1/80)/(1/12)² = 9/5.
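These uniform(0,1) values (variance 1/12, skewness measure 0, kurtosis measure 9/5) can be reproduced exactly with rational arithmetic, expanding E[(X − μ)^k] by the binomial theorem over the raw moments E(X^k) = 1/(k + 1):

```python
from fractions import Fraction
from math import comb

def raw(k):
    # E(X^k) for X uniform on (0, 1)
    return Fraction(1, k + 1)

mu = raw(1)   # mean = 1/2

def central(k):
    # E[(X - mu)^k] expanded via the binomial theorem
    return sum(comb(k, j) * raw(j) * (-mu) ** (k - j) for j in range(k + 1))

var = central(2)              # 1/12
sigma = float(var) ** 0.5     # standard deviation ~ 0.2887
skew = central(3)             # 0, so the skewness measure is 0
kurt = central(4) / var ** 2  # 9/5
print(var, sigma, skew, kurt)
```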

58 Rules for expectation

59 Rules: Proof The proof for discrete random variables is similar.

60 Proof The proof for discrete random variables is similar.

61 Proof The proof for discrete random variables is similar.

62 Proof

63 Moment generating functions

64 Recall Definition: Let X denote a random variable. Then the moment generating function of X, m_X(t), is defined by m_X(t) = E(e^{tX}).

65 Examples: The Binomial distribution (parameters p, n)
The moment generating function of X, m_X(t), is: m_X(t) = Σ_{x=0}^n e^{tx} C(n, x) p^x q^{n−x} = (pe^t + q)^n, where q = 1 − p.

66 The Poisson distribution (parameter λ)
The moment generating function of X, m_X(t), is: m_X(t) = Σ_{x=0}^∞ e^{tx} e^{−λ} λ^x/x! = e^{λ(e^t − 1)}.

67 The Exponential distribution (parameter λ)
The moment generating function of X, m_X(t), is: m_X(t) = ∫₀^∞ e^{tx} λe^{−λx} dx = λ/(λ − t) for t < λ.

68 The Standard Normal distribution (μ = 0, σ = 1)
The moment generating function of X, m_X(t), is: m_X(t) = ∫ e^{tx} (1/√(2π)) e^{−x²/2} dx = e^{t²/2}.

69 We will now use the fact that the standard normal density integrates to 1.
Since tx − x²/2 = t²/2 − (x − t)²/2 (we have completed the square), m_X(t) = e^{t²/2} ∫ (1/√(2π)) e^{−(x−t)²/2} dx = e^{t²/2}, because the remaining integral is 1.

70 The Gamma distribution (parameters α, λ)
The moment generating function of X, m_X(t), is: m_X(t) = ∫₀^∞ e^{tx} (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ/(λ − t))^α for t < λ.

71 We use the fact that ∫₀^∞ ((λ − t)^α/Γ(α)) x^{α−1} e^{−(λ−t)x} dx is a Gamma(α, λ − t) density, so it is equal to 1. Hence m_X(t) = (λ^α/(λ − t)^α) · 1 = (λ/(λ − t))^α.
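The Gamma MGF m_X(t) = (λ/(λ − t))^α can be sanity-checked against the defining integral E(e^{tX}) by numerical integration (illustrative values α = 2.5, λ = 1.5, t = 0.4 are my choice; any t < λ works):

```python
import math

alpha, lam, t = 2.5, 1.5, 0.4   # requires t < lam

def gamma_pdf(x):
    # Gamma(alpha, lam) density: lam^alpha x^(alpha-1) e^(-lam x) / Gamma(alpha)
    return lam ** alpha * x ** (alpha - 1) * math.exp(-lam * x) / math.gamma(alpha)

# midpoint-rule integral of e^(t x) f(x) over (0, 50)
dx = 0.0005
mgf_numeric = sum(math.exp(t * (i + 0.5) * dx) * gamma_pdf((i + 0.5) * dx) * dx
                  for i in range(100000))
mgf_formula = (lam / (lam - t)) ** alpha
print(mgf_numeric, mgf_formula)
```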

72 Properties of Moment Generating Functions

73 Property 1: m_X(0) = 1, since m_X(0) = E(e^{0·X}) = E(1) = 1. Note: the moment generating functions of the following distributions satisfy the property m_X(0) = 1.

74 We use the expansion of the exponential function: e^u = 1 + u + u²/2! + u³/3! + ⋯ + u^k/k! + ⋯

75 Now m_X(t) = E(e^{tX}) = E(1 + tX + (tX)²/2! + ⋯) = 1 + μ₁t + μ₂t²/2! + μ₃t³/3! + ⋯, where μ_k = E(X^k).

76 Property 3, that the kth derivative of m_X(t) at t = 0 equals the kth moment (m_X^{(k)}(0) = μ_k = E(X^k)), is very useful in determining the moments of a random variable X.
Examples
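A numerical sketch of this property using the exponential MGF m(t) = λ/(λ − t), with finite differences standing in for the exact derivatives at t = 0:

```python
lam = 2.0

def m(t):
    # exponential-distribution MGF, valid for t < lam
    return lam / (lam - t)

h = 1e-4
m1 = (m(h) - m(-h)) / (2 * h)            # ~ m'(0)  = E(X)   = 1/lam
m2 = (m(h) - 2 * m(0) + m(-h)) / h ** 2  # ~ m''(0) = E(X^2) = 2/lam^2
print(m1, m2)  # both approximately 0.5 here
```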

77

78 To find the moments we set t = 0.

79

80

81 The moments for the exponential distribution can be calculated in an alternative way. This is done by expanding m_X(t) in powers of t and equating the coefficients of t^k to the coefficients in m_X(t) = 1 + μ₁t + μ₂t²/2! + ⋯: m_X(t) = λ/(λ − t) = 1/(1 − t/λ) = 1 + t/λ + (t/λ)² + ⋯. Equating the coefficients of t^k we get μ_k/k! = 1/λ^k, i.e. μ_k = k!/λ^k.

82 The moments for the standard normal distribution
We use the expansion of e^u: m_X(t) = e^{t²/2} = 1 + (t²/2) + (t²/2)²/2! + (t²/2)³/3! + ⋯. We now equate the coefficients of t^k in m_X(t) = 1 + μ₁t + μ₂t²/2! + ⋯.

83 If k is odd: μ_k = 0. For even k = 2j: μ_{2j}/(2j)! = (1/2)^j/j!, i.e. μ_{2j} = (2j)!/(2^j j!).
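The even-moment formula μ_{2j} = (2j)!/(2^j j!) (so μ₂ = 1 and μ₄ = 3) can be checked by integrating against the standard normal density (a numerical sketch):

```python
import math

def normal_moment(k, half=6.0, n=120000):
    # midpoint-rule integral of x^k * phi(x) over [-half, half];
    # the tails beyond +/-6 are negligible
    dx = 2 * half / n
    total = 0.0
    for i in range(n):
        x = -half + (i + 0.5) * dx
        total += x ** k * math.exp(-x * x / 2) / math.sqrt(2 * math.pi) * dx
    return total

for k in (1, 2, 3, 4):
    print(k, normal_moment(k))
# odd moments ~ 0; mu_2 ~ 1 = 2!/(2*1!); mu_4 ~ 3 = 4!/(2^2*2!)
```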

