Chapter Eight: Expectation of Discrete Random Variables


STAT 111, Chapter Eight: Expectation of Discrete Random Variables

Expectation of a Discrete Random Variable
One of the important concepts in probability theory is the expectation of a random variable. The expected value of a random variable X, denoted by E(X) or μ, measures where the probability distribution is centered.
Definition: Let X be a discrete random variable having probability mass function f(x). If Σ |x| f(x) < ∞, then the expected value (or mean) of X exists and is defined as E(X) = Σ x f(x).

In words, the expected value of X is the weighted average of the possible values that X can take on, each value being weighted by the probability that X assumes it.

Example
The probability mass function of the random variable X is given by the table below. Find the expected value of X.

x       1     2     3
f(x)    1/2   1/3   1/6

Solution:

x         1     2     3     sum
f(x)      1/2   1/3   1/6   1
x f(x)    1/2   2/3   3/6   10/6

Then E(X) = Σ x f(x) = 10/6.
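Below is a minimal Python sketch (not from the original slides) that computes E(X) = Σ x f(x) for the pmf of this example; the variable names are illustrative only.

```python
# Compute E(X) = sum of x * f(x) for the pmf of the example above.
from fractions import Fraction as F

pmf = {1: F(1, 2), 2: F(1, 3), 3: F(1, 6)}   # f(x) from the example

E_X = sum(x * p for x, p in pmf.items())
print(E_X)   # 5/3, which equals 10/6
```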

Example
The probability distribution of the discrete random variable Y is f(y) = C(3, y) (1/4)^y (3/4)^(3-y), y = 0, 1, 2, 3. Find the mean of Y.
Solution:
First get the values of f(y):
When y = 0, f(0) = (3/4)^3 = 27/64.
When y = 1, f(1) = 3 (1/4)(3/4)^2 = 27/64.
And so on.

Example (continued)
Then we can form the following table:

y         0      1      2      3     sum
f(y)      27/64  27/64  9/64   1/64  1
y f(y)    0      27/64  18/64  3/64  48/64

So E(Y) = 48/64 = 3/4.

Example
A pair of fair dice is tossed. Let X assign to each point (a, b) in S the maximum of its two numbers, i.e. X(a, b) = max(a, b). Find the probability mass function of X and the mean of X.
Solution:
When a pair of fair dice is tossed, S = {(1,1), (1,2), (1,3), ..., (6,5), (6,6)}, with 36 equally likely outcomes. The event X = x occurs for the 2x - 1 pairs whose maximum equals x, so f(x) = (2x - 1)/36 for x = 1, 2, ..., 6.

x         1      2      3      4      5      6      sum
f(x)      1/36   3/36   5/36   7/36   9/36   11/36  1
x f(x)    1/36   6/36   15/36  28/36  45/36  66/36  161/36

E(X) = 161/36
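A small Python sketch (illustrative, not part of the slides) that enumerates the 36 outcomes, builds the pmf of X = max(a, b), and confirms E(X) = 161/36:

```python
# Enumerate two fair dice, tabulate X = max(a, b), and compute E(X).
from fractions import Fraction as F
from collections import Counter

counts = Counter(max(a, b) for a in range(1, 7) for b in range(1, 7))
pmf = {x: F(c, 36) for x, c in counts.items()}   # f(x) = (2x - 1)/36
E_X = sum(x * p for x, p in pmf.items())
print(E_X)   # 161/36
```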

Example
Find the expected number of chemists on a committee of 3 selected at random from 4 chemists and 3 biologists.
Solution:
1. Let X = the number of chemists in the committee, so x = 0, 1, 2, 3.
2. Get the values of the probability mass function f(x); for example,
x = 0: f(0) = P(X = 0) = C(4,0) C(3,3) / C(7,3) = 1/35
x = 2: f(2) = P(X = 2) = C(4,2) C(3,1) / C(7,3) = 18/35
and similarly f(1) = 12/35 and f(3) = 4/35.

Example (continued)

x         0     1      2      3      sum
f(x)      1/35  12/35  18/35  4/35   1
x f(x)    0     12/35  36/35  12/35  60/35

E(X) = 60/35 ≈ 1.7. Note: E(X) need not be an integer.
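As a check, here is a Python sketch (not part of the slides) that rebuilds the pmf used in the solution, f(x) = C(4,x) C(3,3-x) / C(7,3), and recomputes E(X); the names are illustrative.

```python
# Hypergeometric pmf for the number of chemists on the committee.
from fractions import Fraction as F
from math import comb

pmf = {x: F(comb(4, x) * comb(3, 3 - x), comb(7, 3)) for x in range(4)}
# pmf values: f(0) = 1/35, f(1) = 12/35, f(2) = 18/35, f(3) = 4/35
E_X = sum(x * p for x, p in pmf.items())
print(E_X)   # 12/7, i.e. 60/35, approximately 1.7
```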

Example
Let X have the probability mass function f(x) = x/6, x = 1, 2, 3. Find E(X^3).
Solution:
E(X^3) = Σ x^3 f(x) = 1^3 (1/6) + 2^3 (2/6) + 3^3 (3/6) = 1/6 + 16/6 + 81/6 = 98/6

Expected value (mean) of some distributions
Binomial dist.: E(X) = np
Hypergeometric dist.: E(X) = nM/N
Geometric dist.: E(X) = 1/p
Poisson dist.: E(X) = λ
Discrete uniform dist. on 1, ..., N: E(X) = (N + 1)/2

Examples
Example 1: A fair die is tossed 1620 times. Find the expected number of times the face 6 occurs.
Solution: Let X = the number of times face 6 occurs. Then X ~ Bin(1620, 1/6), so E(X) = np = 1620 × 1/6 = 270.
Example 2: If the probability of engine malfunction during any 1-hour period is p = 0.02 and X is the number of 1-hour intervals until the first malfunction, find the mean of X.
Solution: X ~ g(0.02) (geometric), so E(X) = 1/p = 1/0.02 = 50.
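Purely as an illustration (not in the original slides), a quick Monte Carlo sketch for Example 1: the average count of sixes over many simulated runs of 1620 tosses should be close to np = 270. The function and variable names here are hypothetical.

```python
# Monte Carlo check that the expected number of sixes in 1620 tosses is about 270.
import random

def count_sixes(n_tosses=1620):
    return sum(1 for _ in range(n_tosses) if random.randint(1, 6) == 6)

runs = [count_sixes() for _ in range(200)]
print(sum(runs) / len(runs))   # approximately 270
```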

Example 3: A coin is biased so that a head is three times as likely to occur as a tail. Find the expected number of tails when this coin is tossed twice.
Solution: Since the coin is biased, P(H) = 3 P(T). Since P(H) + P(T) = 1, we get 3 P(T) + P(T) = 1, so 4 P(T) = 1 and P(T) = 1/4.
Let X = the number of tails (T). Then X ~ Bin(2, 1/4), so E(X) = np = 2 × 1/4 = 1/2.

Example 4: If X has a Poisson distribution with mean 3, find the expected value of X.
Solution: X ~ Poisson(3), so E(X) = λ = 3.

Properties of Expectation:
1. If X is a random variable with probability distribution f(x), the mean or expected value of the random variable g(X) is E[g(X)] = Σ g(x) f(x) (the law of the unconscious statistician).
2. If a and b are constants, then
(I) E(a) = a
(II) E(aX) = a E(X)
(III) E(aX ± b) = E(aX) ± E(b) = a E(X) ± b

Example
If X is the number of points rolled with a balanced die, find the expected value of the random variable g(X) = 2X^2 + 1.
Solution: S = {1, 2, 3, 4, 5, 6}, each value with probability 1/6.

x           1     2     3     4      5      6      sum
f(x)        1/6   1/6   1/6   1/6    1/6    1/6    1
x f(x)      1/6   2/6   3/6   4/6    5/6    6/6    21/6
x^2 f(x)    1/6   4/6   9/6   16/6   25/6   36/6   91/6

E[g(X)] = E(2X^2 + 1) = 2 E(X^2) + E(1) = 2 × 91/6 + 1 = 188/6 ≈ 31.3
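A short Python sketch (illustrative only) of the law of the unconscious statistician for this example, g(X) = 2X^2 + 1 on a fair die:

```python
# E[g(X)] = sum of g(x) * f(x) with g(x) = 2x^2 + 1 and f(x) = 1/6.
from fractions import Fraction as F

pmf = {x: F(1, 6) for x in range(1, 7)}
E_g = sum((2 * x**2 + 1) * p for x, p in pmf.items())
print(E_g, float(E_g))   # 94/3, approximately 31.33 (equals 188/6)
```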

Expectation and Moments for Bivariate Distributions
We now extend the concept of mathematical expectation to the case of n random variables X1, X2, ..., Xn with joint probability distribution f(x1, x2, ..., xn).
Definition: Let X1, X2, ..., Xn be a discrete random vector with joint probability distribution f(x1, x2, ..., xn) and let g be a real-valued function. Then the random variable Z = g(X1, X2, ..., Xn) has finite expectation if and only if Σ |g(x1, x2, ..., xn)| f(x1, x2, ..., xn) < ∞,

and in this case the expected value of Z, denoted by E(Z), is E(Z) = Σ g(x1, x2, ..., xn) f(x1, x2, ..., xn).
Example: Let X and Y be random variables with the following joint probability function. Find the expected value of g(X, Y) = XY.

y \ x   0      1      2
0       3/28   9/28   3/28
1       3/14   3/14   0
2       1/28   0      0

Solution:
E(XY) = 0 × 0 × f(0,0) + 0 × 1 × f(0,1) + ... + 1 × 1 × f(1,1) + ... + 2 × 0 × f(2,0) = f(1,1) = 3/14
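A minimal Python sketch of E[g(X, Y)] = Σ g(x, y) f(x, y), assuming the joint table as reconstructed above (the dictionary layout and names are illustrative):

```python
# E(XY) over a discrete joint pmf stored as {(x, y): f(x, y)}.
from fractions import Fraction as F

joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
         (0, 1): F(3, 14), (1, 1): F(3, 14),
         (0, 2): F(1, 28)}

E_XY = sum(x * y * p for (x, y), p in joint.items())
print(E_XY)   # 3/14
```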

Theorem 1:
The expected value of the sum or difference of two or more functions of the random variables X, Y is the sum or difference of the expected values of the functions. That is,
E[g(X, Y) ± h(X, Y)] = E[g(X, Y)] ± E[h(X, Y)].
The generalization of this theorem to n random variables is straightforward.
Corollary: Setting g(x, y) = g(x) and h(x, y) = h(y), we see that E[g(X) ± h(Y)] = E[g(X)] ± E[h(Y)].
Corollary: Setting g(x, y) = x and h(x, y) = y, we see that E(X ± Y) = E(X) ± E(Y), and in general E(X1 + X2 + ... + Xn) = E(X1) + E(X2) + ... + E(Xn).

Theorem 2 (Independence):
If X and Y are two independent random variables having finite expectations, then XY has finite expectation and E(XY) = E(X)E(Y).
Note: the converse is not true; E(XY) = E(X)E(Y) does not imply that X and Y are independent.
In general, if X1, X2, ..., Xn are n independent random variables such that each expectation E(Xi) exists (i = 1, 2, ..., n), then E(X1 X2 ... Xn) = E(X1) E(X2) ... E(Xn).

Example 1:
Let (X, Y) assume the values (1,0), (0,1), (-1,0), (0,-1) with equal probabilities (each with probability 1/n = 1/4). Show that E(XY) = E(X)E(Y) is satisfied even though the random variables X and Y are not independent.
Solution:
The joint and marginal distributions are:

y \ x   -1    0     1     f(y)
-1      0     1/4   0     1/4
0       1/4   0     1/4   2/4
1       0     1/4   0     1/4
f(x)    1/4   2/4   1/4   1

Example 1 (continued)
E(X) = -1 × 1/4 + 0 × 2/4 + 1 × 1/4 = -1/4 + 0 + 1/4 = 0
E(Y) = -1 × 1/4 + 0 × 2/4 + 1 × 1/4 = -1/4 + 0 + 1/4 = 0
So E(X)E(Y) = 0.
Now, E(XY) = (-1 × -1 × 0) + (-1 × 0 × 1/4) + ... + (1 × 0 × 1/4) + (1 × 1 × 0) = 0 + 0 + ... + 0 + 0 = 0
Then E(XY) = E(X)E(Y), i.e. 0 = 0, so the equation is satisfied.
However, X and Y are not independent, since for example f(0, 0) = 0 while f(x = 0) f(y = 0) = 2/4 × 2/4 = 1/4 ≠ 0.
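The same counterexample can be checked mechanically; this is an illustrative Python sketch, not part of the slides:

```python
# (X, Y) uniform on {(1,0), (0,1), (-1,0), (0,-1)}: uncorrelated but dependent.
from fractions import Fraction as F

joint = {(1, 0): F(1, 4), (0, 1): F(1, 4), (-1, 0): F(1, 4), (0, -1): F(1, 4)}

fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0) + p
    fy[y] = fy.get(y, 0) + p

E_X = sum(x * p for x, p in fx.items())
E_Y = sum(y * p for y, p in fy.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())
print(E_XY == E_X * E_Y)                    # True: E(XY) = E(X)E(Y)
print(joint.get((0, 0), 0), fx[0] * fy[0])  # 0 vs 1/4: joint != product, not independent
```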

Example
Suppose that X1, X2 and X3 are independent random variables such that E(Xi) = 0 and E(Xi^2) = 1 for i = 1, 2, 3. Find E[X1^2 (X2 - 4X3)^2].
Solution:
Since X1, X2 and X3 are independent, X1^2 and (X2 - 4X3)^2 are also independent. Remember that if X, Y are independent, then E(XY) = E(X)E(Y). Hence
E[X1^2 (X2 - 4X3)^2] = E(X1^2) E[(X2 - 4X3)^2]
= 1 × E(X2^2 - 8 X2 X3 + 16 X3^2)
= E(X2^2) - 8 E(X2 X3) + 16 E(X3^2)
= 1 - 8 E(X2)E(X3) + 16 × 1
= 1 - (8 × 0 × 0) + 16 = 17

Conditional Expectation
Definition: Let X and Y be two random variables with joint probability distribution f(x, y). The conditional expectation of X, given Y = y, is defined as
E(X | Y = y) = Σ x f(x | y),
where f(x | y) = f(x, y) / f(y) is the conditional distribution of X given Y = y.

Example
The joint probability distribution function of X and Y is as shown in the following table:

y \ x   -1    1     f(y)
-1      1/8   1/2   5/8
1       1/4   1/8   3/8

Find:
1. The conditional distribution of X given Y = -1, that is, f(x | y = -1) for every x.
When X = -1: f(-1 | y = -1) = f(-1, -1) / f(y = -1) = (1/8) / (5/8) = 1/5
When X = 1: f(1 | y = -1) = f(1, -1) / f(y = -1) = (1/2) / (5/8) = 4/5

x               -1    1
f(x | y = -1)   1/5   4/5

Example (continued)
2. The conditional mean of X given Y = -1:

x                  -1    1     sum
f(x | y = -1)      1/5   4/5   1
x f(x | y = -1)    -1/5  4/5   3/5

So E(X | Y = -1) = 3/5.
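A hedged Python sketch of the conditional expectation calculation, assuming the joint table as reconstructed above:

```python
# f(x | Y = -1) = f(x, -1) / f_Y(-1), then E(X | Y = -1) = sum of x * f(x | y).
from fractions import Fraction as F

joint = {(-1, -1): F(1, 8), (1, -1): F(1, 2),
         (-1, 1): F(1, 4), (1, 1): F(1, 8)}

fY_m1 = sum(p for (x, y), p in joint.items() if y == -1)          # 5/8
cond = {x: p / fY_m1 for (x, y), p in joint.items() if y == -1}   # conditional pmf
E_cond = sum(x * p for x, p in cond.items())
print(cond[-1], cond[1], E_cond)   # 1/5 4/5 3/5
```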

Variance
The variance measures the degree to which a distribution is concentrated around its mean; it is a measure of spread (or dispersion).
Definition: The variance of a random variable X, denoted by Var(X) or σx^2, is
Var(X) = E[(X - μx)^2] = Σ (x - μx)^2 f(x).
In other words,
Var(X) = E(X^2) - [E(X)]^2.
Since the variance is the expected value of the nonnegative random variable (X - μx)^2, it has some useful properties.

Properties of variance:
1. Var(X) ≥ 0. The quantity σx = √Var(X) is called the standard deviation of X. The variance of a distribution provides a measure of the spread or dispersion of the distribution around its mean μx.
2. If a, b are constants, then
(i) Var(a) = 0
(ii) Var(aX) = a^2 Var(X)
(iii) Var(aX ± b) = Var(aX) + Var(b) = a^2 Var(X)

Variances of some distributions:
Binomial dist.: Var(X) = npq
Geometric dist.: Var(X) = q/p^2
Poisson dist.: Var(X) = λ

Example
Let X be a random variable which takes each of the five values -2, 0, 1, 3 and 4 with equal probabilities. Find the standard deviation of Y = 4X - 7.
Solution:
Equal probabilities means each value has probability 1/5.

x           -2    0     1     3     4      sum
f(x)        1/5   1/5   1/5   1/5   1/5    1
x f(x)      -2/5  0     1/5   3/5   4/5    6/5
x^2 f(x)    4/5   0     1/5   9/5   16/5   30/5

E(X) = 6/5, E(X^2) = 30/5
Var(X) = E(X^2) - [E(X)]^2 = 30/5 - (6/5)^2 = 4.56
Var(Y) = Var(4X - 7) = Var(4X) + Var(7) = 4^2 Var(X) + 0 = 16 × 4.56 = 72.96
Standard deviation of Y = √Var(Y) = √72.96 ≈ 8.542
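For illustration (not in the slides), a Python sketch of the same variance and standard deviation computation:

```python
# Var(X) = E(X^2) - [E(X)]^2 for X uniform on {-2, 0, 1, 3, 4}, then sd of Y = 4X - 7.
from fractions import Fraction as F
from math import sqrt

pmf = {x: F(1, 5) for x in [-2, 0, 1, 3, 4]}

E_X = sum(x * p for x, p in pmf.items())         # 6/5
E_X2 = sum(x**2 * p for x, p in pmf.items())     # 30/5
var_X = E_X2 - E_X**2                            # 4.56
var_Y = 16 * var_X                               # Var(4X - 7) = 16 Var(X)
print(float(var_X), float(var_Y), sqrt(var_Y))   # 4.56 72.96 approximately 8.542
```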

Example
If E(X) = 2 and Var(X) = 5, find
1. E[(2 + X)^2]
2. Var(4 + 3X)
Solution:
1. E[(2 + X)^2] = E(4 + 4X + X^2) = 4 + 4 E(X) + E(X^2)
To get the value of E(X^2), use Var(X) = 5:
Var(X) = E(X^2) - [E(X)]^2, so 5 = E(X^2) - 2^2 and E(X^2) = 5 + 4 = 9.
So E[(2 + X)^2] = 4 + 4 E(X) + E(X^2) = 4 + (4 × 2) + 9 = 4 + 8 + 9 = 21.
2. Var(4 + 3X) = Var(4) + 3^2 Var(X) = 0 + (9 × 5) = 45

Variance of the sum:
Let X and Y be two random variables each having finite second moments. Then X + Y has finite second moments and hence finite variance, and
Var(X + Y) = Var(X) + Var(Y) + 2 E[(X - E(X))(Y - E(Y))].
Thus, unlike the mean, the variance of a sum of two random variables is in general not the sum of the variances. The quantity E[(X - E(X))(Y - E(Y))] is called the covariance of X and Y and is written Cov(X, Y). Thus we have the formula
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
Note that:
Cov(X, Y) = E[(X - E(X))(Y - E(Y))]
= E[XY - Y E(X) - X E(Y) + E(X)E(Y)]
= E(XY) - E(X)E(Y)

Corollary:
If X and Y are independent, then Cov(X, Y) = 0 and hence Var(X + Y) = Var(X) + Var(Y).
In general, if X1, X2, ..., Xn are independent random variables each having a finite second moment, then
Var(X1 + X2 + ... + Xn) = Var(X1) + Var(X2) + ... + Var(Xn).
Properties of Cov(X, Y):
Let X and Y be two random variables. Then Cov(X, Y) has the following properties:
1. Symmetry, i.e. Cov(X, Y) = Cov(Y, X)
2. Cov(a1X1 + a2X2, b1Y1 + b2Y2) = a1b1 Cov(X1, Y1) + a1b2 Cov(X1, Y2) + a2b1 Cov(X2, Y1) + a2b2 Cov(X2, Y2)
3. If X and Y are independent, then Cov(X, Y) = 0

4. Note that Cov(X, X) = Var(X).
5. Cov(a, X) = 0, where a is a constant.
Also,
Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
In general, if X1, X2, ..., Xn are random variables and Y = a1X1 + a2X2 + ... + anXn, where a1, a2, ..., an are constants, then
Var(Y) = Σ ai^2 Var(Xi) + 2 ΣΣ ai aj Cov(Xi, Xj),
where the double sum extends over all values of i and j, from 1 to n, for which i < j.

Example
The random variables X, Y, Z have means 2, -3, 4 respectively, variances 1, 5, 2, and Cov(X, Y) = -2, Cov(X, Z) = -1, Cov(Y, Z) = 1. Find the mean and the variance of W = 3X - Y + 2Z.
Solution:
E(W) = E(3X - Y + 2Z) = 3 E(X) - E(Y) + 2 E(Z) = (3 × 2) - (-3) + (2 × 4) = 6 + 3 + 8 = 17
Var(W) = Var(3X - Y + 2Z)
= 9 Var(X) + Var(Y) + 4 Var(Z) + (2 × 3 × -1) Cov(X, Y) + (2 × 3 × 2) Cov(X, Z) + (2 × -1 × 2) Cov(Y, Z)
= (9 × 1) + 5 + (4 × 2) + (-6 × -2) + (12 × -1) + (-4 × 1)
= 9 + 5 + 8 + 12 - 12 - 4 = 18
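The same expansion of Var(aX + bY + cZ) can be written out in a few lines of Python (an illustrative sketch using the numbers of this example):

```python
# Var(3X - Y + 2Z) from the given variances and pairwise covariances.
a, b, c = 3, -1, 2
var_X, var_Y, var_Z = 1, 5, 2
cov_XY, cov_XZ, cov_YZ = -2, -1, 1

var_W = (a**2 * var_X + b**2 * var_Y + c**2 * var_Z
         + 2 * a * b * cov_XY + 2 * a * c * cov_XZ + 2 * b * c * cov_YZ)
print(var_W)   # 18
```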

Example
Let X and Y be two independent random variables having finite second moments. Compute the mean and variance of 2X + 3Y in terms of those of X and Y.
Solution:
E(2X + 3Y) = 2 E(X) + 3 E(Y)
Var(2X + 3Y) = 4 Var(X) + 9 Var(Y)
(Remember that if X, Y are independent, then Cov(X, Y) = 0.)

Example
If X and Y are random variables with variances 2 and 4 respectively and Cov(X, Y) = -2, find the variance of the random variable Z = 3X - 4Y + 8.
Solution:
Var(Z) = Var(3X - 4Y + 8) = 9 Var(X) + 16 Var(Y) + Var(8) + 2 Cov(3X, -4Y)
= (9 × 2) + (16 × 4) + 0 + (2 × 3 × -4 × -2)
= 18 + 64 + 48 = 130

Example
If X and Y are independent random variables with variances 5 and 7 respectively, find
1. The variance of T = X - 2Y:
Var(T) = Var(X - 2Y) = Var(X) + 4 Var(Y) = 5 + (4 × 7) = 33
2. The variance of Z = -2X + 3Y:
Var(Z) = Var(-2X + 3Y) = 4 Var(X) + 9 Var(Y) = 83
3. Cov(T, Z):
Cov(T, Z) = Cov(X - 2Y, -2X + 3Y)
= Cov(X, -2X) + Cov(X, 3Y) + Cov(-2Y, -2X) + Cov(-2Y, 3Y)
= -2 Cov(X, X) + 3 Cov(X, Y) + (-2 × -2) Cov(Y, X) + (-2 × 3) Cov(Y, Y)
= -2 Var(X) + (3 × 0) + (4 × 0) - 6 Var(Y)
= (-2 × 5) + 0 + 0 - (6 × 7) = -10 - 42 = -52
(Note: Cov(X, X) = Var(X) and Cov(Y, Y) = Var(Y).)

Correlation Coefficient:
Let X and Y be two random variables having finite variances. One measure of the degree of dependence between the two random variables is the correlation coefficient ρ(X, Y), defined by
ρ(X, Y) = Cov(X, Y) / (σx σy).
The random variables are said to be uncorrelated if ρ = 0 (equivalently, Cov(X, Y) = 0). If X and Y are independent, then they are uncorrelated. The converse is not always true: it is possible for dependent random variables to be uncorrelated.

Theorem: If Y = a + bX with b ≠ 0, then ρ(X, Y) = 1 if b > 0 and ρ(X, Y) = -1 if b < 0.

Example:
Let X and Y be random variables with the following joint probability function:

X \ Y   -3    2     4     Sum
1       0.1   0.2   0.2   0.5
3       0.3   0.1   0.1   0.5
Sum     0.4   0.3   0.3   1

Find:
1. E(XY) = (1 × -3 × 0.1) + (1 × 2 × 0.2) + (1 × 4 × 0.2) + (3 × -3 × 0.3) + (3 × 2 × 0.1) + (3 × 4 × 0.1)
= -0.3 + 0.4 + 0.8 - 2.7 + 0.6 + 1.2 = 0

Example (continued)
From the marginal tables:

x           1     3     sum
f(x)        0.5   0.5   1
x f(x)      0.5   1.5   2
x^2 f(x)    0.5   4.5   5

y           -3    2     4     sum
f(y)        0.4   0.3   0.3   1
y f(y)      -1.2  0.6   1.2   0.6
y^2 f(y)    3.6   1.2   4.8   9.6

2. E(X) = 2, E(X^2) = 5, so Var(X) = E(X^2) - [E(X)]^2 = 5 - 4 = 1
3. E(Y) = 0.6, E(Y^2) = 9.6, so Var(Y) = E(Y^2) - [E(Y)]^2 = 9.6 - 0.36 = 9.24
4. E(X + Y) = E(X) + E(Y) = 2 + 0.6 = 2.6
5. Cov(X, Y) = E(XY) - E(X)E(Y) = 0 - (2 × 0.6) = -1.2

Example (continued)
6. Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = 1 + 9.24 + 2 × (-1.2) = 7.84
7. The correlation coefficient: ρ = Cov(X, Y) / (σx σy) = -1.2 / (√1 × √9.24) ≈ -0.39
8. Are X and Y independent? No, since Cov(X, Y) = -1.2 ≠ 0. (Equivalently, ρ ≠ 0, so X is related to Y.)
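To tie the whole example together, here is a Python sketch (illustrative, using floating-point values, so outputs are approximate) that recomputes the marginal moments, Cov(X, Y) and ρ directly from the joint table:

```python
# Marginal moments, covariance and correlation from the joint pmf of the example.
from math import sqrt

joint = {(1, -3): 0.1, (1, 2): 0.2, (1, 4): 0.2,
         (3, -3): 0.3, (3, 2): 0.1, (3, 4): 0.1}

E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
E_X2 = sum(x**2 * p for (x, y), p in joint.items())
E_Y2 = sum(y**2 * p for (x, y), p in joint.items())
E_XY = sum(x * y * p for (x, y), p in joint.items())

var_X, var_Y = E_X2 - E_X**2, E_Y2 - E_Y**2
cov = E_XY - E_X * E_Y
rho = cov / sqrt(var_X * var_Y)
print(E_X, E_Y, var_X, var_Y, cov, rho)   # approximately 2, 0.6, 1, 9.24, -1.2, -0.39
```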

Moment Generating Function
In the following we concentrate on applications of moment generating functions. The obvious purpose of the moment generating function is to determine the moments of a distribution. Its most important use, however, is to establish the distributions of functions of random variables.
Definition: The moment generating function of the random variable X is given by E(e^{tX}) and denoted by Mx(t). Hence
Mx(t) = E(e^{tX}) = Σ e^{tx} f(x)

Example:
Given the probability distribution f(x) = (x + 2)/25, x = 1, 2, 3, 4, 5, find the moment generating function of this random variable.
Mx(t) = E(e^{tX}) = Σ e^{tx} f(x) = [3e^t + 4e^{2t} + 5e^{3t} + 6e^{4t} + 7e^{5t}] / 25
Some properties of moment generating functions:
M_{aX+b}(t) = e^{bt} Mx(at), where a, b are constants.
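As a numerical sanity check (assuming the pmf f(x) = (x + 2)/25 read off from the MGF above), a short Python sketch that builds Mx(t) and verifies that its derivative at t = 0 is close to E(X); the names are illustrative.

```python
# Mx(t) = sum of e^{tx} f(x); a central difference at t = 0 should approximate E(X).
from math import exp

pmf = {x: (x + 2) / 25 for x in range(1, 6)}

def M(t):
    return sum(exp(t * x) * p for x, p in pmf.items())

h = 1e-6
deriv_at_0 = (M(h) - M(-h)) / (2 * h)
E_X = sum(x * p for x, p in pmf.items())
print(deriv_at_0, E_X)   # both approximately 3.4
```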

Moment generating functions Mx(t) of some distributions:

Distribution           mean              Var(X)    Mx(t)
Binomial dist.         E(X) = np         npq       (pe^t + q)^n
Geometric dist.        E(X) = 1/p        q/p^2     pe^t / (1 - qe^t)
Poisson dist.          E(X) = λ          λ         e^{λ(e^t - 1)}
Hypergeometric dist.   E(X) = nM/N       --        --
Uniform dist.          E(X) = (N+1)/2    --        --

Example
For each of the following moment generating functions, find the mean and the variance of X.
1. Mx(t) = (0.4e^t + 0.6)^4: the distribution is Binomial with n = 4, p = 0.4, so
E(X) = np = 4 × 0.4 = 1.6
Var(X) = npq = 4 × 0.4 × 0.6 = 0.96
2. Mx(t) = e^{6(e^t - 1)}: the distribution is Poisson with λ = 6, so
E(X) = λ = 6
Var(X) = λ = 6

Example (continued)
3. The distribution is geometric with p = 0.2, so
E(X) = 1/p = 1/0.2 = 5
Var(X) = q/p^2 = 0.8/0.2^2 = 20
P(X = 1) = p q^{x-1} = p q^0 = 0.2 × (0.8)^0 = 0.2

Example
The moment generating functions of the random variables X and Y are Mx(t) = e^{2(e^t - 1)} and My(t) = (0.75e^t + 0.25)^{10}. If X and Y are independent, find
1. E(XY)
2. Var(X + Y)
3. Cov(X + 2, Y - 3)
Solution:
X has a Poisson distribution with λ = 2, so E(X) = Var(X) = λ = 2.
Y has a Binomial distribution with n = 10, p = 0.75, so E(Y) = 10 × 0.75 = 7.5 and Var(Y) = 10 × 0.75 × 0.25 = 1.875.

Example (continued)
Since X and Y are independent:
1. E(XY) = E(X)E(Y) = 2 × 7.5 = 15
2. Var(X + Y) = Var(X) + Var(Y) = 2 + 1.875 = 3.875
3. Cov(X + 2, Y - 3) = Cov(X, Y) + Cov(X, -3) + Cov(2, Y) + Cov(2, -3) = 0 + 0 + 0 + 0 = 0