ELEC 303 – Random Signals
Lecture 7 – Discrete Random Variables: Conditioning and Independence
Farinaz Koushanfar, ECE Dept., Rice University
Sept 15, 2009

Lecture outline
Reading: finish Chapter 2
Review
Joint PMFs
Conditioning
Independence

Random Variables
A random variable is a real-valued function of the outcome of an experiment.
A function of a random variable defines another random variable.
We associate with each random variable some averages of interest, such as the mean and the variance.
A random variable can be conditioned on an event or on another random variable.
There is a notion of independence of a random variable from an event or from another random variable.

Discrete random variables
A discrete random variable is a real-valued function of the outcome of the experiment that can take a finite or countably infinite number of values.
A discrete random variable has an associated probability mass function (PMF)
– It gives the probability of each numerical value that the random variable can take.
A function of a discrete random variable defines another discrete random variable (RV)
– Its PMF can be found from the PMF of the original RV.

Probability mass function (PMF)
Notation
– Random variable: X
– Experimental value: x
– $p_X(x) = P(\{X = x\})$
The PMF mathematically defines a probability law.
Probability axiom: $\sum_x p_X(x) = 1$
Example: coin toss
– Define X(H) = 1, X(T) = 0 (indicator RV)
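
To make the last bullet concrete, here is a minimal Python sketch (not part of the original slides) that builds the PMF of the coin-toss indicator RV and checks the normalization axiom; the coin bias is an assumed example value.

```python
# PMF of the indicator RV for a coin toss: X(H) = 1, X(T) = 0.
# The coin bias q is an assumed example value (a fair coin here).
q = 0.5                      # P(heads), assumed for illustration
p_X = {1: q, 0: 1 - q}       # p_X(x) = P(X = x)

# Probability axiom: the PMF values must sum to 1.
assert abs(sum(p_X.values()) - 1.0) < 1e-12
print(p_X)                   # {1: 0.5, 0: 0.5}
```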

Review: discrete random variable PMF, expectation, variance
Probability mass function (PMF): $p_X(x) = P(X = x)$, with $\sum_x p_X(x) = 1$
Expectation: $E[X] = \sum_x x\,p_X(x)$
Variance: $\mathrm{var}(X) = E[(X - E[X])^2]$

Expected value for functions of RV
Let X be a random variable with PMF $p_X$, and let g(X) be a function of X. Then the expected value of the random variable g(X) is given by
$E[g(X)] = \sum_x g(x)\,p_X(x)$
$\mathrm{var}(X) = E[(X - E[X])^2] = \sum_x (x - E[X])^2\,p_X(x)$
Similarly, the nth moment is given by
– $E[X^n] = \sum_x x^n\,p_X(x)$
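
A short sketch of these three formulas applied to an assumed example PMF (the numbers are illustrative, not from the lecture):

```python
# Expected value rule, variance, and nth moment for a discrete RV,
# computed directly from an assumed example PMF.
p_X = {1: 0.2, 2: 0.5, 3: 0.3}          # p_X(x), sums to 1

def expect(g, pmf):
    """E[g(X)] = sum_x g(x) p_X(x)."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, p_X)                  # E[X]
var = expect(lambda x: (x - mean) ** 2, p_X)     # var(X) = E[(X - E[X])^2]
second_moment = expect(lambda x: x ** 2, p_X)    # E[X^2], the 2nd moment

print(mean, var, second_moment)   # var(X) = E[X^2] - (E[X])^2 also holds
```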

Properties of variance

Joint PMFs of multiple random variables
Joint PMF of two random variables: $p_{X,Y}(x,y) = P(X = x, Y = y)$
Calculate the PMFs of X and Y by the formulas
– $p_X(x) = \sum_y p_{X,Y}(x,y)$
– $p_Y(y) = \sum_x p_{X,Y}(x,y)$
We refer to $p_X$ and $p_Y$ as the marginal PMFs.
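
The marginalization formulas can be checked numerically; the sketch below sums an assumed joint PMF over one variable at a time to obtain both marginals.

```python
# Marginal PMFs from a joint PMF p_{X,Y}(x,y); the joint values are
# assumed example numbers, not taken from the lecture.
from collections import defaultdict

p_XY = {(1, 1): 0.10, (1, 2): 0.20,
        (2, 1): 0.30, (2, 2): 0.40}

p_X, p_Y = defaultdict(float), defaultdict(float)
for (x, y), p in p_XY.items():
    p_X[x] += p        # p_X(x) = sum_y p_{X,Y}(x,y)
    p_Y[y] += p        # p_Y(y) = sum_x p_{X,Y}(x,y)

print(dict(p_X))       # {1: 0.3, 2: 0.7}
print(dict(p_Y))       # {1: 0.4, 2: 0.6}
```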

Tabular method
For computing marginal PMFs.
Assume Z = X + 2Y; find E[Z].
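
The slide's joint PMF table did not survive extraction, so the sketch below uses assumed table values to show how E[Z] for Z = X + 2Y follows from the expected value rule over the joint PMF.

```python
# E[Z] for Z = X + 2Y via the expected value rule over the joint PMF:
# E[Z] = sum_{x,y} (x + 2y) p_{X,Y}(x,y).
# The joint values below are assumed, standing in for the slide's table.
p_XY = {(1, 1): 0.10, (1, 2): 0.20,
        (2, 1): 0.30, (2, 2): 0.40}

E_Z = sum((x + 2 * y) * p for (x, y), p in p_XY.items())
print(E_Z)   # 4.9 for these assumed values
```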

Expectation

Variances

Example: Binomial mean and variance

More than two variables
$p_{X,Y,Z}(x,y,z) = P(X = x, Y = y, Z = z)$
$p_{X,Y}(x,y) = \sum_z p_{X,Y,Z}(x,y,z)$
$p_X(x) = \sum_y \sum_z p_{X,Y,Z}(x,y,z)$
The expected value rule: $E[g(X,Y,Z)] = \sum_x \sum_y \sum_z g(x,y,z)\,p_{X,Y,Z}(x,y,z)$
(image: http://www.coventry.ac.uk)
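
The triple-sum formulas translate directly into code; this sketch marginalizes an assumed three-variable joint PMF down to $p_{X,Y}$ and applies the expected value rule to the (also assumed) function g(x,y,z) = x + y + z.

```python
# Marginalization and the expected value rule for three discrete RVs,
# using an assumed joint PMF p_{X,Y,Z} (values are illustrative only).
from collections import defaultdict

p_XYZ = {(0, 0, 1): 0.2, (0, 1, 0): 0.1, (1, 0, 0): 0.3, (1, 1, 1): 0.4}

p_XY = defaultdict(float)
for (x, y, z), p in p_XYZ.items():
    p_XY[(x, y)] += p              # p_{X,Y}(x,y) = sum_z p_{X,Y,Z}(x,y,z)

def g(x, y, z):                    # example function of (X, Y, Z)
    return x + y + z

E_g = sum(g(x, y, z) * p for (x, y, z), p in p_XYZ.items())
print(dict(p_XY), E_g)
```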

Conditioning
Conditional PMF of a RV X on an event A:
$p_{X|A}(x) = P(X = x \mid A) = \frac{P(\{X = x\} \cap A)}{P(A)}$
$P(A) = \sum_x P(\{X = x\} \cap A)$, so $\sum_x p_{X|A}(x) = 1$
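
A minimal sketch of conditioning on an event, using an assumed PMF and the assumed event A = {X is even}: divide by P(A) and check that the conditional PMF sums to 1.

```python
# Conditional PMF p_{X|A}(x) = P({X=x} ∩ A) / P(A) for an assumed example
# PMF and the (assumed) event A = {X is even}.
p_X = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

A = {x for x in p_X if x % 2 == 0}          # event A as a set of values
P_A = sum(p_X[x] for x in A)                # P(A) = sum_x P({X=x} ∩ A)

p_X_given_A = {x: p_X[x] / P_A for x in A}  # zero outside A, omitted here
assert abs(sum(p_X_given_A.values()) - 1.0) < 1e-12
print(p_X_given_A)                          # {2: 0.333..., 4: 0.666...}
```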

Example
A student will take a certain test up to a maximum of n times, each time passing with probability p, independently of the number of previous attempts.
Find the PMF of the number of attempts, given that the student passes the test.
Let A = {the student passes}. X is a geometric RV with parameter p, and A = {X ≤ n}.
$P(A) = \sum_{m=1}^{n} (1-p)^{m-1} p$
So, for k = 1, …, n, $p_{X|A}(k) = \frac{(1-p)^{k-1} p}{P(A)}$ (and 0 otherwise).
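
A numerical version of this example, with assumed values p = 0.3 and n = 5:

```python
# Conditional PMF of a geometric(p) RV given A = {X <= n};
# p and n are assumed example values.
p, n = 0.3, 5

P_A = sum((1 - p) ** (m - 1) * p for m in range(1, n + 1))   # P(A)
p_X_given_A = {k: (1 - p) ** (k - 1) * p / P_A for k in range(1, n + 1)}

assert abs(sum(p_X_given_A.values()) - 1.0) < 1e-12
print(P_A)             # equals 1 - (1-p)**n = 0.83193
print(p_X_given_A)
```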

Conditioning a RV on another
$p_{X|Y}(x|y) = P(X = x \mid Y = y) = \frac{P(X = x, Y = y)}{P(Y = y)} = \frac{p_{X,Y}(x,y)}{p_Y(y)}$
The conditional PMF is often used to compute the joint PMF sequentially (multiplication rule):
$p_{X,Y}(x,y) = p_Y(y)\,p_{X|Y}(x|y)$
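
The sequential (multiplication-rule) construction in code, with assumed values for $p_Y$ and the conditional PMF $p_{X|Y}$:

```python
# Sequential construction of a joint PMF,
# p_{X,Y}(x,y) = p_Y(y) p_{X|Y}(x|y), using assumed example numbers.
p_Y = {0: 0.4, 1: 0.6}
p_X_given_Y = {0: {0: 0.5, 1: 0.5},       # p_{X|Y}(x | y=0)
               1: {0: 0.2, 1: 0.8}}       # p_{X|Y}(x | y=1)

p_XY = {(x, y): p_Y[y] * p_X_given_Y[y][x]
        for y in p_Y for x in p_X_given_Y[y]}

# Going back: p_{X|Y}(x|y) = p_{X,Y}(x,y) / p_Y(y)
assert abs(p_XY[(1, 1)] / p_Y[1] - p_X_given_Y[1][1]) < 1e-12
print(p_XY)
```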

Conditional expectation
Conditional expectation of X given an event A with P(A) > 0:
$E[X \mid A] = \sum_x x\,p_{X|A}(x)$
$E[g(X) \mid A] = \sum_x g(x)\,p_{X|A}(x)$
If $A_1, \dots, A_n$ are disjoint events partitioning the sample space, then
$E[X] = \sum_i P(A_i)\,E[X \mid A_i]$ (total expectation theorem)
For any event B with $P(A_i \cap B) > 0$ for all i,
$E[X \mid B] = \sum_i P(A_i \mid B)\,E[X \mid A_i \cap B]$
In particular, $E[X] = \sum_y p_Y(y)\,E[X \mid Y = y]$
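
A quick check of the last identity, $E[X] = \sum_y p_Y(y) E[X \mid Y = y]$, on an assumed joint PMF of (X, Y):

```python
# Total expectation theorem check, E[X] = sum_y p_Y(y) E[X | Y=y],
# for an assumed joint PMF of (X, Y).
p_XY = {(1, 0): 0.1, (2, 0): 0.3, (1, 1): 0.4, (2, 1): 0.2}

p_Y = {}
for (x, y), p in p_XY.items():
    p_Y[y] = p_Y.get(y, 0.0) + p

def E_X_given_Y(y):
    # E[X | Y=y] = sum_x x p_{X,Y}(x,y) / p_Y(y)
    return sum(x * p for (x, yy), p in p_XY.items() if yy == y) / p_Y[y]

lhs = sum(x * p for (x, y), p in p_XY.items())        # E[X] directly
rhs = sum(p_Y[y] * E_X_given_Y(y) for y in p_Y)       # E[X] via conditioning
assert abs(lhs - rhs) < 1e-12
print(lhs, rhs)                                       # 1.5 1.5
```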

Mean and variance of Geometric
Assume there is a probability p that your program works correctly (independent of how many times you write it). Find the mean and variance of X, the number of tries until it works correctly.
$p_X(k) = (1-p)^{k-1} p$, k = 1, 2, …
$E[X] = \sum_k k (1-p)^{k-1} p$
$\mathrm{var}(X) = \sum_k (k - E[X])^2 (1-p)^{k-1} p$

Mean and variance of Geometric (cont.)
Condition on the outcome of the first try and use the total expectation theorem:
$E[X \mid X = 1] = 1$, $E[X \mid X > 1] = 1 + E[X]$
$E[X] = p \cdot 1 + (1-p)(1 + E[X]) \Rightarrow E[X] = 1/p$
$E[X^2 \mid X = 1] = 1$, $E[X^2 \mid X > 1] = E[(1+X)^2] = 1 + 2E[X] + E[X^2]$
$E[X^2] = \frac{1 + 2(1-p)E[X]}{p} \Rightarrow \mathrm{var}(X) = E[X^2] - (E[X])^2 = \frac{1-p}{p^2}$
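
The closed forms E[X] = 1/p and var(X) = (1-p)/p² can be sanity-checked by truncating the defining sums; p and the truncation point are assumed example values.

```python
# Numerical check of E[X] = 1/p and var(X) = (1-p)/p^2 for a geometric RV,
# truncating the infinite sums at a large K; p and K are assumed values.
p, K = 0.3, 10_000

pmf = [(1 - p) ** (k - 1) * p for k in range(1, K + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf, start=1))
var = sum((k - mean) ** 2 * pk for k, pk in enumerate(pmf, start=1))

print(mean, 1 / p)                 # ~3.3333 vs 3.3333
print(var, (1 - p) / p ** 2)       # ~7.7778 vs 7.7778
```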

Independence
Independence from an event A:
$P(\{X = x\} \cap A) = P(X = x)\,P(A) = p_X(x)\,P(A)$ for all x
Since $P(\{X = x\} \cap A) = p_{X|A}(x)\,P(A)$, this is the same as $p_{X|A}(x) = p_X(x)$ for all x.
Independence of two random variables: $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$ for all x and y
(conditioned on an event A: $p_{X,Y|A}(x,y) = p_{X|A}(x)\,p_{Y|A}(y)$ for all x and y)
For two independent RVs: $E[XY] = E[X]\,E[Y]$
Also, $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$
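
A sketch verifying E[XY] = E[X]E[Y] for independent RVs, building the joint PMF as the product of two assumed marginals:

```python
# For independent X and Y, p_{X,Y}(x,y) = p_X(x) p_Y(y) and E[XY] = E[X]E[Y].
# The marginals below are assumed example values.
import itertools

p_X = {0: 0.3, 1: 0.7}
p_Y = {1: 0.4, 2: 0.6}

# Build the joint PMF of independent X, Y as the product of the marginals.
p_XY = {(x, y): p_X[x] * p_Y[y] for x, y in itertools.product(p_X, p_Y)}

E_X = sum(x * p for x, p in p_X.items())
E_Y = sum(y * p for y, p in p_Y.items())
E_XY = sum(x * y * p for (x, y), p in p_XY.items())
assert abs(E_XY - E_X * E_Y) < 1e-12
print(E_XY, E_X * E_Y)     # 1.12 1.12
```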

Multiple RVs, sum of RVs
Three RVs X, Y, and Z are said to be independent if
$p_{X,Y,Z}(x,y,z) = p_X(x)\,p_Y(y)\,p_Z(z)$ for all x, y, z
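
The factorization condition doubles as an independence test: compute the marginals from the joint PMF and compare their product against each joint value. The joint PMF below is an assumed example.

```python
# Independence test for three discrete RVs: X, Y, Z are independent iff
# p_{X,Y,Z}(x,y,z) = p_X(x) p_Y(y) p_Z(z) for every (x, y, z).
# The joint PMF is an assumed example (it happens to factorize).
from collections import defaultdict

p_XYZ = {(x, y, z): 0.5 * (0.2 if y == 0 else 0.8) * (0.9 if z == 0 else 0.1)
         for x in (0, 1) for y in (0, 1) for z in (0, 1)}

p_X, p_Y, p_Z = defaultdict(float), defaultdict(float), defaultdict(float)
for (x, y, z), p in p_XYZ.items():
    p_X[x] += p          # marginals obtained by summing out the other RVs
    p_Y[y] += p
    p_Z[z] += p

independent = all(abs(p - p_X[x] * p_Y[y] * p_Z[z]) < 1e-12
                  for (x, y, z), p in p_XYZ.items())
print(independent)       # True for this example joint PMF
```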