lecture 3 1 Recap: Random variables
Continuous random variable: the sample space has infinitely many elements, and the density function f(x) is a continuous function. Calculation of probabilities: P(a < X < b) = ∫_a^b f(t) dt.
Discrete random variable: the sample space has finitely or countably many elements, and the probability function f(x) is often tabulated. Calculation of probabilities: P(a < X < b) = Σ_{a < t < b} f(t).
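As a quick illustration of the two calculation rules above, here is a small Python sketch (my own example, not from the slides). The tabulated probability function and the density f(t) = 2t on [0, 1] are assumed for illustration; scipy is used for the numerical integral.

# Assumed example distributions, illustrating the discrete sum and the continuous integral.
from scipy.integrate import quad

# Discrete: tabulated probability function f(x); P(a < X < b) sums f(t) for a < t < b.
f = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}                     # assumed table
p_discrete = sum(p for x, p in f.items() if 0 < x < 3)   # P(0 < X < 3) = 0.2 + 0.3 = 0.5

# Continuous: assumed density f(t) = 2t on [0, 1]; P(a < X < b) integrates the density.
density = lambda t: 2 * t
p_continuous, _ = quad(density, 0.2, 0.7)                # = 0.7**2 - 0.2**2 = 0.45

print(p_discrete, p_continuous)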
lecture 3 2 Mean / Expected value: Definition
Definition: Let X be a random variable with probability / density function f(x). The mean or expected value of X is given by μ = E(X) = Σ_x x·f(x) if X is discrete, and μ = E(X) = ∫_{−∞}^{∞} x·f(x) dx if X is continuous.
lecture 3 3 Mean / Expected value: Interpretation
Interpretation: the total contribution of each value multiplied by the probability of that value – a weighted average.
Example: [Figure: bar chart of a probability function f(x) over x = 0, 1, 2, 3; mean value = 1.5]
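To make the weighted-average reading concrete, here is a small Python sketch. The exact bar heights in the figure are not recoverable from the slide, so the probabilities below are an assumed distribution on {0, 1, 2, 3} chosen to have mean 1.5.

# Assumed probability function on x = 0, 1, 2, 3 (bar heights chosen so the mean is 1.5).
f = {0: 0.2, 1: 0.3, 2: 0.3, 3: 0.2}
mean = sum(x * p for x, p in f.items())   # 0*0.2 + 1*0.3 + 2*0.3 + 3*0.2 = 1.5
print(mean)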
lecture 3 4 Mean / Expected value: Example
Problem: A private pilot wishes to insure his plane, valued at 1 million kr. The insurance company expects a loss with the following probabilities: total loss with probability 0.001, 50% loss with probability 0.01, 25% loss with probability 0.1.
1. What is the expected loss in kroner?
2. What premium should the insurance company ask if they want an expected profit of 3000 kr?
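A worked sketch of the two questions in Python, using only the values stated on the slide (each loss fraction is applied to the insured value of 1 million kr):

value = 1_000_000                           # insured value in kr
losses = [(1.00, 0.001),                    # total loss
          (0.50, 0.01),                     # 50% loss
          (0.25, 0.1)]                      # 25% loss

expected_loss = sum(frac * value * p for frac, p in losses)
# = 1000 + 5000 + 25000 = 31000 kr

premium = expected_loss + 3000              # expected profit of 3000 kr
print(expected_loss, premium)               # 31000 34000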
lecture 3 5 Mean / Expected value: Function of a random variable
Theorem: Let X be a random variable with probability / density function f(x). The expected value of g(X) is E[g(X)] = Σ_x g(x)·f(x) if X is discrete, and E[g(X)] = ∫_{−∞}^{∞} g(x)·f(x) dx if X is continuous.
lecture 3 6 Expected value: Linear combination
Theorem (linear combination): Let X be a random variable (discrete or continuous), and let a and b be constants. For the random variable aX + b we have E(aX + b) = aE(X) + b.
lecture 3 7 Mean / Expected value: Example
Problem: The pilot from before buys a new plane valued at 2 million kr. The insurance company expects losses with the same probabilities as before: total loss with probability 0.001, 50% loss with probability 0.01, 25% loss with probability 0.1.
1. What is the expected loss for the new plane?
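A sketch of the calculation for the new plane. Doubling the insured value simply scales the loss, which illustrates the linear-combination rule E(2X) = 2E(X):

value = 2_000_000                           # new insured value in kr
losses = [(1.00, 0.001), (0.50, 0.01), (0.25, 0.1)]
expected_loss = sum(frac * value * p for frac, p in losses)
print(expected_loss)                        # 62000 kr = 2 * 31000 kr, i.e. E(2X) = 2 E(X)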
lecture 3 8 Mean / Expected value: Function of two random variables
Definition: Let X and Y be random variables with joint probability / density function f(x,y). The expected value of g(X,Y) is E[g(X,Y)] = Σ_x Σ_y g(x,y)·f(x,y) if X and Y are discrete, and E[g(X,Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x,y)·f(x,y) dx dy if X and Y are continuous.
lecture 3 9 Mean / Expected value: Function of two random variables
Problem: Burger King sells both via “drive-in” and “walk-in”. Let X and Y be the fractions of the opening hours that “drive-in” and “walk-in” are busy. Assume that the joint density of X and Y is given by
f(x,y) = 4xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x,y) = 0 otherwise.
The turnover g(X,Y) on a single day is given by g(X,Y) = 6000X + 9000Y. What is the expected turnover on a single day?
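A sketch of the solution, evaluating the double integrals with sympy (sympy is my choice of tool here, not part of the slides):

import sympy as sp

x, y = sp.symbols('x y')
f = 4 * x * y                                        # joint density on the unit square

Ex = sp.integrate(x * f, (x, 0, 1), (y, 0, 1))       # E[X] = 2/3
Ey = sp.integrate(y * f, (x, 0, 1), (y, 0, 1))       # E[Y] = 2/3
turnover = sp.integrate((6000 * x + 9000 * y) * f, (x, 0, 1), (y, 0, 1))
print(Ex, Ey, turnover)                              # 2/3 2/3 10000

Equivalently, by linearity the expected turnover is 6000·E[X] + 9000·E[Y] = 6000·(2/3) + 9000·(2/3) = 10000 kr.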
lecture 3 10 Mean / Expected value: Sums and products
Theorem (sum/product): Let X and Y be random variables; then E[X + Y] = E[X] + E[Y]. If X and Y are independent, then E[XY] = E[X] E[Y].
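A quick Monte Carlo check of the two rules, with two independent random variables chosen for illustration (an exponential X with mean 2 and a uniform Y with mean 0.5):

import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=2.0, size=1_000_000)   # E[X] = 2
Y = rng.uniform(0, 1, size=1_000_000)            # E[Y] = 0.5

print(np.mean(X + Y), np.mean(X) + np.mean(Y))   # both approx. 2.5
print(np.mean(X * Y), np.mean(X) * np.mean(Y))   # both approx. 1.0, since X and Y are independent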
lecture 3 11 Variance: Definition
Definition: Let X be a random variable with probability / density function f(x) and expected value μ. The variance of X is then given by Var(X) = σ² = E[(X − μ)²] = Σ_x (x − μ)² f(x) if X is discrete, and σ² = ∫_{−∞}^{∞} (x − μ)² f(x) dx if X is continuous. The standard deviation is the positive root of the variance: σ = √(σ²).
lecture 3 12 Variance: Interpretation
The variance expresses how dispersed the density / probability function is around the mean.
[Figure: two bar charts of probability functions f(x), one with variance 0.5 and one with variance 2]
Rewrite of the variance: σ² = E(X²) − μ².
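A small sketch checking the rewrite σ² = E(X²) − μ² against the definition, using an assumed discrete distribution (the same illustrative one as before, not taken from the figure):

# Assumed probability function, for illustration only.
f = {0: 0.2, 1: 0.3, 2: 0.3, 3: 0.2}

mu = sum(x * p for x, p in f.items())                           # mean = 1.5
var_def = sum((x - mu) ** 2 * p for x, p in f.items())          # E[(X - mu)^2]
var_rewrite = sum(x ** 2 * p for x, p in f.items()) - mu ** 2   # E[X^2] - mu^2
print(var_def, var_rewrite)                                     # both 1.05 (up to rounding)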
lecture 3 13 Variance: Linear combinations
Theorem (linear combination): Let X be a random variable, and let a and b be constants. For the random variable aX + b the variance is Var(aX + b) = a² Var(X).
Examples: Var(X + 7) = Var(X), Var(−X) = Var(X), Var(2X) = 4 Var(X).
lecture 3 14 Covariance: Definition
Definition: Let X and Y be two random variables with joint probability / density function f(x,y). The covariance between X and Y is σ_XY = Cov(X,Y) = E[(X − μ_X)(Y − μ_Y)] = Σ_x Σ_y (x − μ_X)(y − μ_Y) f(x,y) if X and Y are discrete, and σ_XY = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − μ_X)(y − μ_Y) f(x,y) dx dy if X and Y are continuous.
lecture 3 15 Covariance: Interpretation
The covariance between X and Y expresses how X and Y influence each other.
Examples: the covariance between X = sales of bicycles and Y = sales of bicycle pumps is positive; between X = trips booked to Spain and Y = outdoor temperature it is negative; between X = the number shown on a red die and Y = the number shown on a green die it is zero.
lecture 3 16 Covariance: Properties
Theorem: The covariance between two random variables X and Y with means μ_X and μ_Y, respectively, is Cov(X,Y) = E(XY) − μ_X μ_Y.
Notice: Cov(X,X) = Var(X).
If X and Y are independent random variables, then Cov(X,Y) = 0.
Notice: Cov(X,Y) = 0 does not imply independence!
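A small sketch of the last point, using an assumed example where Cov(X,Y) = 0 although X and Y are clearly dependent (Y = X² with X uniform on {−1, 0, 1}):

# X takes -1, 0, 1 with probability 1/3 each; Y = X^2 is completely determined by X.
xs = [-1, 0, 1]
p = 1 / 3

mu_x = sum(x * p for x in xs)                 # 0
mu_y = sum((x ** 2) * p for x in xs)          # 2/3
cov = sum((x - mu_x) * (x ** 2 - mu_y) * p for x in xs)
print(cov)                                    # 0.0, yet Y is a function of X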
lecture 3 17 Variance/Covariance: Linear combinations
Theorem (linear combination): Let X and Y be random variables, and let a and b be constants. For the random variable aX + bY the variance is Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X,Y).
In particular: Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X,Y).
If X and Y are independent, the variance is Var(X + Y) = Var(X) + Var(Y).
lecture 3 18 Correlation: Definition
Definition: Let X and Y be two random variables with covariance Cov(X,Y) and standard deviations σ_X and σ_Y, respectively. The correlation coefficient of X and Y is ρ_XY = Cov(X,Y) / (σ_X σ_Y).
It holds that −1 ≤ ρ_XY ≤ 1. If X and Y are independent, then ρ_XY = 0.
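A short numpy sketch of the correlation coefficient on simulated data (my own illustration, not from the slides): independent samples give ρ near 0, while an exact linear relation gives ρ equal to 1.

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y_indep = rng.normal(size=100_000)            # independent of x
y_linear = 2 * x + 3                          # perfectly linear in x

print(np.corrcoef(x, y_indep)[0, 1])          # approx. 0
print(np.corrcoef(x, y_linear)[0, 1])         # 1.0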
lecture 3 19 Mean, variance, covariance: Collection of rules
Sums and multiplication by constants: E(aX) = aE(X), Var(aX) = a² Var(X), Cov(aX, bY) = ab Cov(X,Y), E(aX + b) = aE(X) + b, Var(aX + b) = a² Var(X).
Sum: E(X + Y) = E(X) + E(Y), Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X,Y).
If X and Y are independent: E(XY) = E(X) E(Y), Var(X + Y) = Var(X) + Var(Y).