Continuous Distributions The Uniform distribution from a to b.

The Normal distribution (mean μ, standard deviation σ), with density
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}, \quad -\infty < x < \infty$$

The Exponential distribution

The Weibull distribution with parameters α and β.

The Weibull density, f(x), for (α = 0.5, β = 2), (α = 0.7, β = 2), and (α = 0.9, β = 2).

The Gamma distribution Let the continuous random variable X have density function:
$$f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \quad x \ge 0$$
Then X is said to have a Gamma distribution with parameters α and λ.

Expectation

Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of X, E(X), is defined to be:
$$E(X) = \sum_{x} x\, p(x)$$
and if X is continuous with probability density function f(x):
$$E(X) = \int_{-\infty}^{\infty} x\, f(x)\, dx$$

Interpretation of E(X)
1. The expected value of X, E(X), is the centre of gravity of the probability distribution of X.
2. The expected value of X, E(X), is the long-run average value of X (shown later: the Law of Large Numbers).

Example: The Uniform distribution Suppose X has a uniform distribution from a to b. Then:
$$f(x) = \frac{1}{b-a}, \quad a \le x \le b$$
The expected value of X is:
$$E(X) = \int_a^b x \cdot \frac{1}{b-a}\, dx = \frac{x^2}{2(b-a)}\Big|_a^b = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}$$
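As a quick numerical sanity check (a sketch of mine, not part of the original slides; the helper names are illustrative), the integral above can be approximated by a midpoint Riemann sum and compared with (a + b)/2:

```python
# Midpoint-rule check that E(X) = (a + b)/2 for X ~ Uniform(a, b).
# All names here are illustrative, not from any particular library.

def expected_value(pdf, lo, hi, n=100_000):
    """Approximate E(X) = integral of x * f(x) dx over [lo, hi]."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h      # midpoint of the i-th subinterval
        total += x * pdf(x) * h
    return total

a, b = 2.0, 10.0
uniform_pdf = lambda x: 1.0 / (b - a)   # f(x) = 1/(b - a) on [a, b]

approx = expected_value(uniform_pdf, a, b)
exact = (a + b) / 2                      # the formula derived above
```

Since the integrand x/(b - a) is linear, the midpoint rule reproduces the exact answer up to floating-point error.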

Example: The Normal distribution Suppose X has a Normal distribution with parameters μ and σ. Then:
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}$$
The expected value of X is:
$$E(X) = \int_{-\infty}^{\infty} x \cdot \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}\, dx$$
Make the substitution: $z = \dfrac{x-\mu}{\sigma}$, so that $x = \mu + \sigma z$ and $dx = \sigma\, dz$.

Hence
$$E(X) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} (\mu + \sigma z)\, e^{-z^2/2}\, dz = \mu \cdot \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{-z^2/2}\, dz + \sigma \cdot \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} z\, e^{-z^2/2}\, dz$$
Now the first integral equals 1 (the standard normal density integrates to 1) and the second equals 0 (odd integrand), so E(X) = μ.

Example: The Gamma distribution Suppose X has a Gamma distribution with parameters α and λ. Then:
$$f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \quad x \ge 0$$
Note: since the density integrates to 1,
$$\int_0^{\infty} x^{\alpha-1} e^{-\lambda x}\, dx = \frac{\Gamma(\alpha)}{\lambda^{\alpha}}$$
This is a very useful formula when working with the Gamma distribution.

The expected value of X is:
$$E(X) = \int_0^{\infty} x \cdot \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}\, dx = \frac{\alpha}{\lambda} \int_0^{\infty} \frac{\lambda^{\alpha+1}}{\Gamma(\alpha+1)}\, x^{\alpha} e^{-\lambda x}\, dx$$
The remaining integral is now equal to 1 (it is a Gamma(α + 1, λ) density), so $E(X) = \dfrac{\alpha}{\lambda}$.

Thus if X has a Gamma(α, λ) distribution then the expected value of X is E(X) = α/λ. Special cases:
1. Exponential(λ) distribution: α = 1, λ arbitrary, so E(X) = 1/λ.
2. Chi-square (χ²) distribution with ν degrees of freedom: α = ν/2, λ = ½, so E(X) = ν.

The Gamma distribution

The Exponential distribution

The Chi-square (χ²) distribution

Expectation of functions of Random Variables

Definition Let X denote a discrete random variable with probability function p(x) (probability density function f(x) if X is continuous); then the expected value of g(X), E[g(X)], is defined to be:
$$E[g(X)] = \sum_{x} g(x)\, p(x)$$
and if X is continuous with probability density function f(x):
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx$$

Example: The Uniform distribution Suppose X has a uniform distribution from 0 to b. Then:
$$f(x) = \frac{1}{b}, \quad 0 \le x \le b$$
Find the expected value of A = X²:
$$E(A) = E(X^2) = \int_0^b x^2 \cdot \frac{1}{b}\, dx = \frac{x^3}{3b}\Big|_0^b = \frac{b^2}{3}$$
If X is the length of a side of a square (chosen at random from 0 to b) then A is the area of the square, and E(A) = b²/3 = 1/3 the maximum area of the square.
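A short Monte Carlo sketch (my own illustration, with made-up variable names) of the area interpretation: average many squared side lengths and compare with b²/3:

```python
import random

# Monte Carlo check that E(X^2) = b^2 / 3 for X ~ Uniform(0, b).
random.seed(42)                    # fixed seed so the run is reproducible

b = 1.0
n = 200_000
mean_area = sum(random.uniform(0, b) ** 2 for _ in range(n)) / n

exact = b ** 2 / 3                 # one third of the maximum area b^2
```

With 200,000 samples the Monte Carlo standard error is well below 0.01, so the estimate lands close to 1/3.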

Example: The Geometric distribution Suppose X (discrete) has a geometric distribution with parameter p. Then:
$$p(x) = p\,q^{x-1}, \quad x = 1, 2, 3, \ldots \quad (q = 1 - p)$$
Find the expected value of X and the expected value of X².

Recall: the sum of a geometric series,
$$a + ar + ar^2 + ar^3 + \cdots = \frac{a}{1-r}, \quad |r| < 1$$
Differentiating both sides with respect to r we get:
$$a + 2ar + 3ar^2 + \cdots = \frac{a}{(1-r)^2}$$

Thus
$$E(X) = \sum_{x=1}^{\infty} x\,p\,q^{x-1} = p\left(1 + 2q + 3q^2 + \cdots\right) = \frac{p}{(1-q)^2} = \frac{p}{p^2} = \frac{1}{p}$$
This formula could also be developed by noting that, for a random variable taking values 1, 2, 3, …,
$$E(X) = \sum_{x=1}^{\infty} P(X \ge x) = \sum_{x=1}^{\infty} q^{x-1} = \frac{1}{1-q} = \frac{1}{p}$$

The same differentiation technique can be used to calculate higher moments.

To compute the expected value of X² we need a formula for $\sum_{x=1}^{\infty} x^2 r^{x-1}$. Note
$$\sum_{x=1}^{\infty} r^x = \frac{r}{1-r}$$
Differentiating with respect to r we get
$$\sum_{x=1}^{\infty} x\,r^{x-1} = \frac{1}{(1-r)^2}$$

Differentiating again with respect to r we get
$$\sum_{x=1}^{\infty} x(x-1)\,r^{x-2} = \frac{2}{(1-r)^3}$$
Thus
$$\sum_{x=1}^{\infty} x^2 r^{x-1} = r \cdot \frac{2}{(1-r)^3} + \frac{1}{(1-r)^2} = \frac{2r + (1-r)}{(1-r)^3} = \frac{1+r}{(1-r)^3}$$

This implies
$$E(X^2) = \sum_{x=1}^{\infty} x^2\,p\,q^{x-1} = p \cdot \frac{1+q}{(1-q)^3} = \frac{1+q}{p^2} = \frac{2-p}{p^2}$$
Thus
$$\mathrm{var}(X) = E(X^2) - [E(X)]^2 = \frac{2-p}{p^2} - \frac{1}{p^2} = \frac{1-p}{p^2} = \frac{q}{p^2}$$
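These closed forms are easy to confirm numerically. The sketch below (assuming the "number of trials to the first success" convention, p(x) = p q^{x-1} for x = 1, 2, …) sums the series far enough out that the truncation error is negligible:

```python
# Check E(X) = 1/p and E(X^2) = (2 - p)/p^2 for a geometric random variable
# with pmf p(x) = p * q**(x-1), x = 1, 2, 3, ...  (q = 1 - p).
p = 0.3
q = 1.0 - p
N = 2_000                          # q**N is vanishingly small here

ex  = sum(x * p * q ** (x - 1) for x in range(1, N + 1))
ex2 = sum(x * x * p * q ** (x - 1) for x in range(1, N + 1))
var = ex2 - ex ** 2
```

All three values match the formulas 1/p, (2 - p)/p², and q/p² derived above.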

Moments of Random Variables

Definition Let X be a random variable (discrete or continuous); then the k-th moment of X is defined to be:
$$\mu_k = E(X^k)$$
The first moment of X, μ = μ₁ = E(X), is the center of gravity of the distribution of X. The higher moments give different information regarding the distribution of X.

Definition Let X be a random variable (discrete or continuous); then the k-th central moment of X is defined to be:
$$E\left[(X-\mu)^k\right]$$
where μ = μ₁ = E(X) = the first moment of X.

The central moments describe how the probability distribution is distributed about the centre of gravity, μ. The 2nd central moment, E[(X − μ)²], depends on the spread of the probability distribution of X about μ. It is called the variance of X and is denoted by the symbol var(X):
$$\mathrm{var}(X) = E\left[(X-\mu)^2\right]$$

The square root of the variance is called the standard deviation of X and is denoted by the symbol σ:
$$\sigma = \sqrt{\mathrm{var}(X)} = \sqrt{E\left[(X-\mu)^2\right]}$$

The third central moment, E[(X − μ)³], contains information about the skewness of a distribution. Measure of skewness:
$$\gamma_1 = \frac{E\left[(X-\mu)^3\right]}{\sigma^3}$$

Positively skewed distribution

Negatively skewed distribution

Symmetric distribution

The fourth central moment, E[(X − μ)⁴], also contains information about the shape of a distribution. The property of shape that is measured by the fourth central moment is called kurtosis. The measure of kurtosis:
$$\frac{E\left[(X-\mu)^4\right]}{\sigma^4}$$
(sometimes reported as this ratio minus 3, the excess kurtosis, which is 0 for the normal distribution).

Mesokurtic distribution

Platykurtic distribution

Leptokurtic distribution

Example: The uniform distribution from 0 to 1. Finding the moments:
$$\mu_k = E(X^k) = \int_0^1 x^k\, dx = \frac{1}{k+1}$$

Finding the central moments:
$$E\left[\left(X - \tfrac{1}{2}\right)^k\right] = \int_0^1 \left(x - \tfrac{1}{2}\right)^k dx = \frac{(1/2)^{k+1} - (-1/2)^{k+1}}{k+1}$$
For k even this equals $\dfrac{1}{2^k (k+1)}$; for k odd it equals 0 by symmetry.

Thus var(X) = 1/12. The standard deviation: σ = 1/√12 ≈ 0.289. The measure of skewness: γ₁ = 0 (the distribution is symmetric). The measure of kurtosis: (1/80)/(1/144) = 1.8.
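These values can be double-checked numerically. The sketch below (an illustration of mine) approximates the central moments of Uniform(0, 1) by a midpoint sum and recovers the variance 1/12, skewness 0, and the kurtosis ratio μ₄/σ⁴ = 1.8 (so the excess kurtosis would be −1.2):

```python
# Numerical check of the central moments of X ~ Uniform(0, 1).
def central_moment(k, n=100_000):
    """Midpoint-rule approximation of E[(X - 1/2)**k] for Uniform(0, 1)."""
    h = 1.0 / n
    return sum(((i + 0.5) * h - 0.5) ** k * h for i in range(n))

variance = central_moment(2)                      # should be 1/12
skewness = central_moment(3) / variance ** 1.5    # 0: symmetric distribution
kurtosis = central_moment(4) / variance ** 2      # (1/80) / (1/144) = 1.8
```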

Rules for expectation

Rules: 1. E(c) = c, where c is a constant. Proof:
$$E(c) = \int_{-\infty}^{\infty} c\, f(x)\, dx = c \int_{-\infty}^{\infty} f(x)\, dx = c$$
The proof for discrete random variables is similar.

2. E(aX + b) = aE(X) + b. Proof:
$$E(aX + b) = \int_{-\infty}^{\infty} (ax + b)\, f(x)\, dx = a\int_{-\infty}^{\infty} x\, f(x)\, dx + b\int_{-\infty}^{\infty} f(x)\, dx = aE(X) + b$$
The proof for discrete random variables is similar.

3. E[g(X) + h(X)] = E[g(X)] + E[h(X)]. Proof:
$$E[g(X) + h(X)] = \int_{-\infty}^{\infty} \left(g(x) + h(x)\right) f(x)\, dx = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx + \int_{-\infty}^{\infty} h(x)\, f(x)\, dx$$
The proof for discrete random variables is similar.

4. var(X) = E(X²) − [E(X)]². Proof:
$$E\left[(X-\mu)^2\right] = E\left[X^2 - 2\mu X + \mu^2\right] = E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - \mu^2$$

Moment generating functions

Definition Let X denote a random variable. Then the moment generating function of X, m_X(t), is defined by:
$$m_X(t) = E\left(e^{tX}\right) = \begin{cases} \displaystyle\sum_{x} e^{tx}\, p(x) & \text{if } X \text{ is discrete} \\[2ex] \displaystyle\int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx & \text{if } X \text{ is continuous} \end{cases}$$

Examples 1. The Binomial distribution (parameters p, n):
$$p(x) = \binom{n}{x} p^x q^{n-x}, \quad q = 1 - p$$
The moment generating function of X, m_X(t), is:
$$m_X(t) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x q^{n-x} = \sum_{x=0}^{n} \binom{n}{x} (p e^t)^x q^{n-x} = \left(p e^t + q\right)^n$$
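As a sanity check (an illustration of mine, not from the slides), the closed form (pe^t + q)^n can be compared against the defining sum Σ e^{tx} p(x):

```python
from math import comb, exp

# Verify the binomial mgf: sum_x e^{tx} C(n,x) p^x q^{n-x} == (p e^t + q)^n.
n, p, t = 10, 0.3, 0.5
q = 1.0 - p

direct = sum(exp(t * x) * comb(n, x) * p ** x * q ** (n - x)
             for x in range(n + 1))
closed_form = (p * exp(t) + q) ** n
```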

2. The Poisson distribution (parameter λ):
$$p(x) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots$$
The moment generating function of X, m_X(t), is:
$$m_X(t) = \sum_{x=0}^{\infty} e^{tx}\, \frac{\lambda^x e^{-\lambda}}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda (e^t - 1)}$$

3. The Exponential distribution (parameter λ):
$$f(x) = \lambda e^{-\lambda x}, \quad x \ge 0$$
The moment generating function of X, m_X(t), is:
$$m_X(t) = \int_0^{\infty} e^{tx}\, \lambda e^{-\lambda x}\, dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\, dx = \frac{\lambda}{\lambda - t}, \quad t < \lambda$$
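The same kind of check works in the continuous case; here a midpoint sum over a long truncated range stands in for the integral (a sketch with t < λ; the names are mine):

```python
from math import exp

# Verify the exponential mgf: the integral of e^{tx} * lam * e^{-lam*x}
# over [0, infinity) equals lam / (lam - t) for t < lam.
lam, t = 2.0, 0.5
hi, n = 30.0, 200_000               # the tail beyond x = 30 is negligible
h = hi / n

direct = 0.0
for i in range(n):
    x = (i + 0.5) * h               # midpoint rule
    direct += exp(t * x) * lam * exp(-lam * x) * h

closed_form = lam / (lam - t)       # = 4/3 for these values
```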

4. The Standard Normal distribution (μ = 0, σ = 1):
$$f(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}$$
The moment generating function of X, m_X(t), is:
$$m_X(t) = \int_{-\infty}^{\infty} e^{tx}\, \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}\, dx$$

We will now use the fact that
$$\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(x-b)^2/2}\, dx = 1 \quad \text{for any } b$$
We complete the square: $tx - \tfrac{x^2}{2} = \tfrac{t^2}{2} - \tfrac{(x-t)^2}{2}$, so
$$m_X(t) = e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(x-t)^2/2}\, dx = e^{t^2/2}$$
since the remaining integral is 1.

5. The Gamma distribution (parameters α, λ):
$$f(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \quad x \ge 0$$
The moment generating function of X, m_X(t), is:
$$m_X(t) = \int_0^{\infty} e^{tx}\, \frac{\lambda^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}\, dx = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha-1} e^{-(\lambda - t)x}\, dx$$

We use the fact that the Gamma(α, λ − t) density integrates to 1:
$$m_X(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha} \int_0^{\infty} \frac{(\lambda - t)^{\alpha}}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-(\lambda - t)x}\, dx = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}, \quad t < \lambda$$
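A Monte Carlo sketch using the standard library's gamma sampler gives the same closed form. Note that `random.gammavariate(alpha, beta)` takes a shape and a scale, so the rate-λ parametrization above corresponds to `beta = 1/lam` (this translation, and the parameter values, are my own illustration):

```python
import random
from math import exp

# Monte Carlo check that E[e^{tX}] = (lam / (lam - t))**alpha
# for X ~ Gamma(alpha, lam) in the rate parametrization used above.
random.seed(7)

alpha, lam, t = 2.0, 2.0, 0.5
n = 200_000
# random.gammavariate takes (shape, scale); scale = 1 / rate.
mgf_estimate = sum(exp(t * random.gammavariate(alpha, 1.0 / lam))
                   for _ in range(n)) / n
closed_form = (lam / (lam - t)) ** alpha     # = (2 / 1.5)**2 = 16/9
```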

Properties of Moment Generating Functions

1. m_X(0) = 1, since $m_X(0) = E(e^{0 \cdot X}) = E(1) = 1$. Note: the moment generating functions of the distributions above all satisfy this property; for example $(pe^0 + q)^n = (p + q)^n = 1$ and $\lambda/(\lambda - 0) = 1$.

2.
$$m_X(t) = 1 + \mu_1 t + \mu_2 \frac{t^2}{2!} + \mu_3 \frac{t^3}{3!} + \cdots + \mu_k \frac{t^k}{k!} + \cdots$$
We use the expansion of the exponential function:
$$e^u = 1 + u + \frac{u^2}{2!} + \frac{u^3}{3!} + \cdots$$
so that
$$m_X(t) = E\left(e^{tX}\right) = E\left(1 + tX + \frac{t^2 X^2}{2!} + \cdots\right) = 1 + t\,E(X) + \frac{t^2}{2!}\,E(X^2) + \cdots$$

3. $m_X^{(k)}(0) = \mu_k = E(X^k)$. Now, differentiating the series in property 2 term by term,
$$m_X'(t) = \mu_1 + \mu_2 t + \mu_3 \frac{t^2}{2!} + \cdots \quad\Rightarrow\quad m_X'(0) = \mu_1$$
and in general $m_X^{(k)}(0) = \mu_k$.

Property 3 is very useful in determining the moments of a random variable X. Examples

Example: the Exponential distribution, $m_X(t) = \dfrac{\lambda}{\lambda - t} = \lambda(\lambda - t)^{-1}$. Then
$$m_X'(t) = \lambda(\lambda - t)^{-2}, \quad m_X''(t) = 2\lambda(\lambda - t)^{-3}, \quad m_X^{(k)}(t) = k!\,\lambda(\lambda - t)^{-(k+1)}$$
To find the moments we set t = 0:
$$\mu_k = m_X^{(k)}(0) = \frac{k!\,\lambda}{\lambda^{k+1}} = \frac{k!}{\lambda^k}$$
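Property 3 can also be checked numerically: finite-difference derivatives of the exponential mgf at t = 0 reproduce the moments 1/λ and 2/λ² (a sketch of mine; the step size and names are illustrative):

```python
# Finite-difference check that m'(0) = E(X) and m''(0) = E(X^2)
# for the exponential mgf m(t) = lam / (lam - t).
lam = 2.0
m = lambda t: lam / (lam - t)

h = 1e-4
first  = (m(h) - m(-h)) / (2 * h)              # central difference for m'(0)
second = (m(h) - 2 * m(0.0) + m(-h)) / h ** 2  # central difference for m''(0)

mu1 = 1 / lam                                  # k!/lam**k with k = 1
mu2 = 2 / lam ** 2                             # k!/lam**k with k = 2
```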

The moments for the exponential distribution can be calculated in an alternative way. This is done by expanding m_X(t) in powers of t and equating the coefficients of t^k to the coefficients in the series $m_X(t) = \sum_k \mu_k t^k / k!$:
$$m_X(t) = \frac{\lambda}{\lambda - t} = \frac{1}{1 - t/\lambda} = 1 + \frac{t}{\lambda} + \left(\frac{t}{\lambda}\right)^2 + \left(\frac{t}{\lambda}\right)^3 + \cdots$$
Equating the coefficients of t^k we get:
$$\frac{\mu_k}{k!} = \frac{1}{\lambda^k} \quad\Rightarrow\quad \mu_k = \frac{k!}{\lambda^k}$$

The moments for the standard normal distribution: we use the expansion of e^u applied to $m_X(t) = e^{t^2/2}$:
$$m_X(t) = e^{t^2/2} = 1 + \frac{t^2}{2} + \frac{1}{2!}\left(\frac{t^2}{2}\right)^2 + \frac{1}{3!}\left(\frac{t^2}{2}\right)^3 + \cdots = \sum_{j=0}^{\infty} \frac{t^{2j}}{2^j\, j!}$$
We now equate the coefficients of t^k in:
$$m_X(t) = \sum_{k} \mu_k \frac{t^k}{k!}$$

If k is odd: μ_k = 0 (no odd powers of t appear in the expansion). For even k = 2j:
$$\frac{\mu_{2j}}{(2j)!} = \frac{1}{2^j\, j!} \quad\Rightarrow\quad \mu_{2j} = \frac{(2j)!}{2^j\, j!}$$
For example, μ₂ = 1, μ₄ = 3, μ₆ = 15.
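A final check (my illustration): the formula (2j)!/(2^j j!) gives 1, 3, 15 for the 2nd, 4th, and 6th moments, and matches numerical integration of x^{2j} against the standard normal density; an odd moment integrates to 0:

```python
from math import exp, factorial, pi, sqrt

# Check E[X^(2j)] = (2j)! / (2**j * j!) for X standard normal.
def normal_moment(k, lo=-10.0, hi=10.0, n=200_000):
    """Midpoint-rule approximation of E[X**k] for the standard normal."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        total += x ** k * exp(-x * x / 2) / sqrt(2 * pi) * h
    return total

closed = [factorial(2 * j) // (2 ** j * factorial(j)) for j in (1, 2, 3)]
numeric = [normal_moment(2 * j) for j in (1, 2, 3)]
odd = normal_moment(3)          # should be 0 by symmetry
```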