Week 6: Poisson Processes

A model for the times of occurrences ("arrivals") of rare phenomena, where λ is the average number of arrivals per time period and X is the number of arrivals in a time period. In t time periods, the average number of arrivals is λt, so the number of arrivals in t periods is Poisson(λt).

How long do I have to wait until the first arrival? Let Y = waiting time for the first arrival (a continuous r.v.). Then we have

P(Y > t) = P(no arrivals in [0, t]) = e^{−λt}, t ≥ 0.

Therefore

F_Y(t) = P(Y ≤ t) = 1 − e^{−λt}, t ≥ 0,

which is the exponential cdf. The waiting time for the first occurrence of an event, when the number of events follows a Poisson distribution, is exponentially distributed.
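A quick numerical check of this link (a sketch using NumPy; the values λ = 2 and t = 1.5 are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 1.5            # illustrative rate per period and elapsed time
n_sims = 200_000

# Number of arrivals in t periods is Poisson(lam * t); the first arrival
# takes longer than t exactly when this count is zero.
counts = rng.poisson(lam * t, size=n_sims)

print(np.mean(counts == 0))  # empirical P(Y > t)
print(np.exp(-lam * t))      # exponential tail 1 - F_Y(t) = e^{-lam*t} ≈ 0.0498
```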
Expectation

In the long run, rolling a die repeatedly, what average result do you expect? In 6,000,000 rolls we expect about 1,000,000 1's, 1,000,000 2's, etc. The average is

(1,000,000·1 + 1,000,000·2 + … + 1,000,000·6) / 6,000,000 = (1 + 2 + … + 6) / 6 = 3.5.

For a random variable X, the expectation (or expected value, or mean) of X is the expected average value of X in the long run. Symbols: μ, μ_X, E(X) and EX.
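A simulation sketch of this long-run average (assuming a fair die, using NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=6_000_000)  # 6,000,000 rolls of a fair die (values 1..6)
print(rolls.mean())                         # close to 3.5
```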
Expectation of a discrete random variable

For a discrete random variable X with pmf p_X(x), the expectation is

E(X) = Σ_x x·p_X(x),

whenever the sum converges absolutely.
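A minimal sketch of this definition in code (the helper name `expectation` and the dictionary representation of the pmf are my own choices for illustration, not from the slides):

```python
def expectation(pmf):
    """Expected value of a discrete r.v. whose pmf is given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}  # pmf of one fair die roll
print(expectation(die))                # 3.5
```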
Examples

1) Roll a die. Let X = outcome of one roll. Then E(X) = (1 + 2 + … + 6)/6 = 3.5.
2) Bernoulli trials: X = 1 with probability p and X = 0 with probability 1 − p. Then E(X) = 1·p + 0·(1 − p) = p.
3) X ~ Binomial(n, p). Then E(X) = np.
4) X ~ Geometric(p), the number of trials until the first success. Then E(X) = 1/p.
5) X ~ Poisson(λ). Then E(X) = λ.
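A simulation sketch checking examples 3–5 (the parameter values n = 10, p = 0.3, λ = 4 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 500_000
n, p, lam = 10, 0.3, 4.0                              # illustrative parameters

print(rng.binomial(n, p, n_sims).mean(), n * p)       # Binomial mean ≈ np
print(rng.geometric(p, n_sims).mean(), 1 / p)         # Geometric mean ≈ 1/p (trials to 1st success)
print(rng.poisson(lam, n_sims).mean(), lam)           # Poisson mean ≈ λ
```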
Expectation of a continuous random variable

For a continuous random variable X with density f_X(x), the expectation is

E(X) = ∫ x·f_X(x) dx  (integrating over the whole real line),

whenever this integral converges absolutely.
Examples

1) X ~ Uniform(a, b). Then E(X) = ∫_a^b x/(b − a) dx = (a + b)/2.
2) X ~ Exponential(λ). Then E(X) = ∫_0^∞ x·λe^{−λx} dx = 1/λ.
3) X is a random variable with a given density. (i) Check that it is a valid density. (ii) Find E(X).
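A numerical check of examples 1 and 2 using SciPy's quad (a sketch; the values a = 2, b = 5, λ = 3 are illustrative assumptions):

```python
import numpy as np
from scipy.integrate import quad

a, b, lam = 2.0, 5.0, 3.0                                  # illustrative parameters

# Uniform(a, b): integrate x * f(x) over [a, b]
uniform_mean, _ = quad(lambda x: x / (b - a), a, b)
print(uniform_mean, (a + b) / 2)                           # both 3.5

# Exponential(lam): integrate x * f(x) over [0, infinity)
exp_mean, _ = quad(lambda x: x * lam * np.exp(-lam * x), 0, np.inf)
print(exp_mean, 1 / lam)                                   # both ≈ 0.3333
```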
4) X ~ Gamma(α, λ). Then E(X) = ∫_0^∞ x · λ^α x^{α−1} e^{−λx} / Γ(α) dx = α/λ.
5) X ~ Beta(α, β). Then E(X) = ∫_0^1 x · x^{α−1}(1 − x)^{β−1} / B(α, β) dx = α/(α + β).
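A check of these two means using scipy.stats (a sketch; α = 3, λ = 2, β = 5 are illustrative assumptions; note that SciPy parameterizes the gamma by a scale equal to 1/λ):

```python
from scipy import stats

alpha, lam, beta = 3.0, 2.0, 5.0                                   # illustrative parameters

# Gamma(alpha, lam) with rate lam corresponds to scale = 1/lam in SciPy
print(stats.gamma(a=alpha, scale=1 / lam).mean(), alpha / lam)     # 1.5, 1.5
print(stats.beta(a=alpha, b=beta).mean(), alpha / (alpha + beta))  # 0.375, 0.375
```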
Theorem

For g: R → R:
If X is a discrete random variable, then E[g(X)] = Σ_x g(x)·p_X(x).
If X is a continuous random variable, then E[g(X)] = ∫ g(x)·f_X(x) dx.

Proof: We prove it for the discrete case. Let Y = g(X). Then

E(Y) = Σ_y y·P(Y = y) = Σ_y y · Σ_{x: g(x)=y} p_X(x) = Σ_y Σ_{x: g(x)=y} g(x)·p_X(x) = Σ_x g(x)·p_X(x).
Example to illustrate the steps in the proof

Take a specific function g and list the possible values of X; the possible values of Y = g(X) are the distinct values of g(x). Writing E(Y) = Σ_y y·P(Y = y) and expanding each P(Y = y) as the sum of p_X(x) over the x's with g(x) = y recovers Σ_x g(x)·p_X(x). A worked instance is sketched below.
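A concrete instance coded up (the choice g(x) = x² and a uniform pmf on {−1, 0, 1, 2} are my own illustrative assumptions, not the values used on the original slide):

```python
# pmf of X: uniform on {-1, 0, 1, 2} (illustrative choice)
p_X = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}
g = lambda x: x ** 2

# Direct formula: E[g(X)] = sum over x of g(x) * p_X(x)
lhs = sum(g(x) * p for x, p in p_X.items())

# Via the pmf of Y = g(X): group the x's that map to the same y
p_Y = {}
for x, p in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), 0) + p
rhs = sum(y * p for y, p in p_Y.items())

print(lhs, rhs)   # both 1.5
```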
Examples

1. Suppose X ~ Uniform(0, 1) and let Y = g(X). Then E(Y) = ∫_0^1 g(x) dx.
2. Suppose X ~ Poisson(λ) and let Y = g(X). Then E(Y) = Σ_{k=0}^∞ g(k)·e^{−λ}λ^k / k!.
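For the continuous case, a sketch using a hypothetical g(x) = e^x with X ~ Uniform(0, 1) (this particular g is an assumption for illustration only):

```python
import numpy as np
from scipy.integrate import quad

g = np.exp                                        # hypothetical choice of g

# E[g(X)] for X ~ Uniform(0, 1): integrate g(x) * f(x) with f(x) = 1 on (0, 1)
value, _ = quad(g, 0.0, 1.0)
print(value, np.e - 1)                            # both ≈ 1.71828
```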
Properties of Expectation

For X, Y random variables and a, b constants:

E(aX + b) = aE(X) + b.
Proof (continuous case): E(aX + b) = ∫ (ax + b)·f_X(x) dx = a∫ x·f_X(x) dx + b∫ f_X(x) dx = aE(X) + b.

E(aX + bY) = aE(X) + bE(Y). Proof to come…

If X is a non-negative random variable, then E(X) = 0 if and only if X = 0 with probability 1.

If X is a non-negative random variable, then E(X) ≥ 0.

E(a) = a.
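A quick simulation check of linearity (a sketch; the choices X ~ Exponential(2), Y ~ Uniform(0, 1), a = 3, b = 5 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 1_000_000
a, b = 3.0, 5.0                                   # illustrative constants

x = rng.exponential(scale=0.5, size=n_sims)       # X ~ Exponential(2), E(X) = 0.5
y = rng.uniform(0.0, 1.0, size=n_sims)            # Y ~ Uniform(0, 1), E(Y) = 0.5

print((a * x + b).mean(), a * x.mean() + b)       # E(aX + b) ≈ aE(X) + b = 6.5
print((a * x + b * y).mean(), a * 0.5 + b * 0.5)  # E(aX + bY) ≈ aE(X) + bE(Y) = 4.0
```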
Moments

The kth moment of a distribution is E(X^k). We are usually interested in the 1st and 2nd moments (and sometimes in the 3rd and 4th).

Some second moments:
1. Suppose X ~ Uniform(0, 1); then E(X²) = ∫_0^1 x² dx = 1/3.
2. Suppose X ~ Geometric(p); then E(X²) = (2 − p)/p².
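A simulation sketch of these two second moments (the value p = 0.3 is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, p = 1_000_000, 0.3                         # illustrative sample size and p

u = rng.uniform(0.0, 1.0, n_sims)
print((u ** 2).mean(), 1 / 3)                      # Uniform(0,1): E(X^2) = 1/3

g = rng.geometric(p, n_sims)                       # trials until the first success
print((g ** 2).mean(), (2 - p) / p ** 2)           # Geometric(p): E(X^2) = (2 - p)/p^2 ≈ 18.9
```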
Variance

The expected value E(X) of a random variable is a measure of the "center" of its distribution. The variance is a measure of how closely concentrated around the center (μ) the probability is; it is also called the 2nd central moment.

Definition: The variance of a random variable X is Var(X) = E[(X − μ)²], where μ = E(X).

Claim: Var(X) = E(X²) − (E(X))².
Proof: E[(X − μ)²] = E[X² − 2μX + μ²] = E(X²) − 2μE(X) + μ² = E(X²) − μ².

We can use this formula for convenience of calculation. The standard deviation of a random variable X is denoted by σ_X; it is the square root of the variance, i.e. σ_X = √Var(X).
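A minimal numeric check of the identity Var(X) = E(X²) − (E(X))² (a sketch; the Poisson(4) choice is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.poisson(4.0, size=1_000_000)               # X ~ Poisson(4), illustrative choice

mu = x.mean()
print(((x - mu) ** 2).mean())                      # E[(X - mu)^2]
print((x ** 2).mean() - mu ** 2)                   # E(X^2) - (E X)^2, same value (≈ 4)
```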
Properties of Variance

For X, Y random variables and a, b constants:

Var(aX + b) = a²Var(X).
Proof: Var(aX + b) = E[(aX + b − aE(X) − b)²] = E[a²(X − E(X))²] = a²Var(X).

Var(aX + bY) = a²Var(X) + b²Var(Y) + 2abE[(X − E(X))(Y − E(Y))].
Proof: expand E[(a(X − E(X)) + b(Y − E(Y)))²] and use linearity of expectation.

Var(X) ≥ 0.

Var(X) = 0 if and only if X = E(X) with probability 1.

Var(a) = 0.
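A simulation check of the first two properties (a sketch; independent X ~ Exponential(1), Y ~ Uniform(0, 1), a = 2, b = −3 are illustrative assumptions; for independent X and Y the cross term is ≈ 0):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims = 1_000_000
a, b = 2.0, -3.0                                       # illustrative constants

x = rng.exponential(1.0, n_sims)                       # Var(X) = 1
y = rng.uniform(0.0, 1.0, n_sims)                      # Var(Y) = 1/12

print(np.var(a * x + b), a ** 2 * np.var(x))           # Var(aX + b) = a^2 Var(X) ≈ 4
cross = np.mean((x - x.mean()) * (y - y.mean()))       # ≈ 0 since X, Y are independent
print(np.var(a * x + b * y),
      a ** 2 * np.var(x) + b ** 2 * np.var(y) + 2 * a * b * cross)
```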
Examples

1. Suppose X ~ Uniform(0, 1); then E(X) = 1/2 and E(X²) = 1/3, and therefore Var(X) = 1/3 − (1/2)² = 1/12.
2. Suppose X ~ Geometric(p); then E(X) = 1/p and E(X²) = (2 − p)/p², and therefore Var(X) = (2 − p)/p² − 1/p² = (1 − p)/p².
3. Suppose X ~ Bernoulli(p); then E(X) = p and E(X²) = p, and therefore Var(X) = p − p² = p(1 − p).
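A simulation check of these variances (the value p = 0.3 is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, p = 1_000_000, 0.3                               # illustrative values

print(np.var(rng.uniform(0, 1, n_sims)), 1 / 12)         # Uniform(0,1)
print(np.var(rng.geometric(p, n_sims)), (1 - p) / p**2)  # Geometric(p) ≈ 7.78
print(np.var(rng.binomial(1, p, n_sims)), p * (1 - p))   # Bernoulli(p) = 0.21
```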
Example

Suppose X ~ Uniform(2, 4) and let Y = g(X) for a given function g. Find E(Y). What changes if instead X ~ Uniform(−4, 4)?
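One way to work such an example numerically (a sketch; the choice g(x) = x² is my own illustrative assumption, not necessarily the function used on the slide):

```python
from scipy.integrate import quad

g = lambda x: x ** 2                                     # hypothetical choice of g

def expected_g_uniform(a, b):
    """E[g(X)] for X ~ Uniform(a, b): integrate g(x)/(b - a) over [a, b]."""
    value, _ = quad(lambda x: g(x) / (b - a), a, b)
    return value

print(expected_g_uniform(2, 4))    # (b^3 - a^3) / (3(b - a)) = 28/3 ≈ 9.33
print(expected_g_uniform(-4, 4))   # 16/3 ≈ 5.33
```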