ELEC 303, Koushanfar, Fall’09
ELEC 303 – Random Signals
Lecture 10 – Conditioning, Continuous Bayes Rule
Farinaz Koushanfar, ECE Dept., Rice University
Sept 24, 2009
Lecture outline
Reading:
– Conditioning
– Independence
– Continuous Bayes rule
Properties of the CDF (review)
Defined by: F_X(x) = P(X ≤ x), for all x
F_X(x) is monotonically nondecreasing
– If x < y, then F_X(x) ≤ F_X(y)
– F_X(x) tends to 0 as x → −∞, and tends to 1 as x → +∞
– For discrete X, F_X(x) is piecewise constant
– For continuous X, F_X(x) is a continuous function
– The PMF and PDF are obtained by summing and differentiating, respectively
Properties of distribution (review)
Conditioning (review): a RV on an event
If we condition on an event of the form {X ∈ A}, with P(X ∈ A) > 0, then
P(X ∈ B | X ∈ A) = ∫_{A∩B} f_X(x) dx / P(X ∈ A)
By comparing with P(X ∈ B | X ∈ A) = ∫_B f_{X|{X∈A}}(x) dx, we get
f_{X|{X∈A}}(x) = f_X(x) / P(X ∈ A) if x ∈ A, and 0 otherwise
Example: the exponential RV
The time T until a light bulb dies is an exponential RV with parameter λ
One turns on the light, leaves the room, and returns t seconds later (A = {T > t})
X = T − t is the additional time until the bulb burns out
What is the conditional CDF of X, given A?
P(X > x | A) = P(T > t + x) / P(T > t) = e^{−λ(t+x)} / e^{−λt} = e^{−λx}
Memoryless property of the exponential CDF!
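The memoryless property can be checked empirically. A minimal sketch; the values of λ, t, and x are illustrative choices, not from the slides:

```python
import random
import math

# Empirical check of the memoryless property:
# P(T - t > x | T > t) = P(T > x) = e^(-lam * x).
random.seed(0)
lam, t, x = 0.5, 2.0, 1.0                              # illustrative values
samples = [random.expovariate(lam) for _ in range(200_000)]

survivors = [s - t for s in samples if s > t]          # additional lifetime X
cond_prob = sum(1 for s in survivors if s > x) / len(survivors)
uncond_prob = math.exp(-lam * x)                       # P(T > x)

print(cond_prob, uncond_prob)                          # the two agree closely
```

The conditional survival probability of the remaining lifetime matches the unconditional one, regardless of how long the bulb has already been on.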
Example: total probability theorem
A train arrives every 15 minutes starting at 6am
You walk to the station at a time uniformly distributed between 7:10 and 7:30am
Find the PDF of the time Y you have to wait for the first train
Let A = {arrive between 7:10 and 7:15} and B = {arrive between 7:15 and 7:30}
Given A, the wait is uniform: f_{Y|A}(y) = 1/5 for 0 ≤ y ≤ 5
Given B, the wait is uniform: f_{Y|B}(y) = 1/15 for 0 ≤ y ≤ 15
By total probability, with P(A) = 1/4 and P(B) = 3/4:
f_Y(y) = (1/4)(1/5) + (3/4)(1/15) = 1/10 for 0 ≤ y ≤ 5, and (3/4)(1/15) = 1/20 for 5 < y ≤ 15
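The train example is easy to verify by simulation. A minimal sketch of the setup, measuring time in minutes after 7:00:

```python
import random

# Arrival is uniform on [7:10, 7:30]; the next trains are at 7:15 and 7:30.
# The wait Y should have density 1/10 on (0, 5) and 1/20 on (5, 15),
# so P(Y <= 5) = 5 * (1/10) = 1/2.
random.seed(1)
n = 100_000
waits = []
for _ in range(n):
    arrival = random.uniform(10.0, 30.0)           # minutes after 7:00
    next_train = 15.0 if arrival <= 15.0 else 30.0
    waits.append(next_train - arrival)

p_short = sum(1 for w in waits if w <= 5.0) / n
print(p_short)                                     # ≈ 0.5
```

Waits under 5 minutes come from two sources: everyone who arrives in [7:10, 7:15), plus the quarter of the [7:15, 7:30) arrivals who show up after 7:25, which is why the density doubles on (0, 5).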
Conditioning a RV on another
The conditional PDF: f_{X|Y}(x|y) = f_{X,Y}(x,y) / f_Y(y)
The marginal can be used to compute f_Y(y) = ∫ f_{X,Y}(x,y) dx
Note that we have f_{X,Y}(x,y) = f_Y(y) f_{X|Y}(x|y)
Summary of concepts
Courtesy of Prof. Dahleh, MIT
Conditional expectation
Definitions: E[X|A] = ∫ x f_{X|A}(x) dx and E[X|Y=y] = ∫ x f_{X|Y}(x|y) dx
The expected value rule: E[g(X)|Y=y] = ∫ g(x) f_{X|Y}(x|y) dx
Total expectation theorem: E[X] = ∫ E[X|Y=y] f_Y(y) dy
Mean and variance of a piecewise constant PDF
Consider the events
– A1 = {X is in the first interval [0,1]}
– A2 = {X is in the second interval (1,2]}
Find P(A1) and P(A2)
Use the total expectation theorem to find E[X] and Var(X)
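The slide does not show the heights of the two pieces, so the sketch below assumes f_X(x) = 1/3 on [0,1] and 2/3 on (1,2], which integrates to 1. The computation is the total expectation theorem applied to the two conditioning events:

```python
# Assumed PDF (heights are illustrative, not from the slide):
# f_X(x) = 1/3 on [0,1] and 2/3 on (1,2].
# E[X] = P(A1) E[X|A1] + P(A2) E[X|A2];
# Var(X) = E[X^2] - E[X]^2, with E[X^2] computed the same way.
p1, p2 = 1/3, 2/3                # P(A1), P(A2): area under each piece
m1, m2 = 0.5, 1.5                # conditional means (uniform on each piece)
s1 = s2 = 1/12                   # conditional variance of a unit-length uniform

ex = p1 * m1 + p2 * m2                           # E[X]
ex2 = p1 * (s1 + m1**2) + p2 * (s2 + m2**2)      # E[X^2] via E[X^2|Ai]
var = ex2 - ex**2
print(round(ex, 4), round(var, 4))               # 1.1667 0.3056
```

Conditioned on either event, X is uniform on an interval of length 1, which is why both conditional variances equal 1/12.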
Example: stick breaking (1)
Example: stick breaking (2)
Example: stick breaking (3)
Independence
Two RVs X and Y are independent if f_{X,Y}(x,y) = f_X(x) f_Y(y) for all x, y
This is the same as f_{X|Y}(x|y) = f_X(x) (for all y with f_Y(y) > 0)
Can be easily generalized to multiple RVs
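One consequence of the factorization is that joint probabilities of independent RVs are products of marginals. A minimal empirical sketch with illustrative distributions (uniform X, standard normal Y):

```python
import random

# For independent X and Y, P(X <= a, Y <= b) = P(X <= a) * P(Y <= b).
random.seed(2)
n = 200_000
xs = [random.random() for _ in range(n)]       # X ~ Uniform[0,1]
ys = [random.gauss(0, 1) for _ in range(n)]    # Y ~ N(0,1), drawn independently
a, b = 0.3, 0.5

joint = sum(1 for x, y in zip(xs, ys) if x <= a and y <= b) / n
prod = (sum(1 for x in xs if x <= a) / n) * (sum(1 for y in ys if y <= b) / n)
print(joint, prod)                             # the two agree closely
```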
The continuous Bayes rule
X: unobserved RV with PDF f_X
A noisy measurement Y is available, related to X by a conditional PDF f_{Y|X}
Once Y is measured, what do we know about X?
The inference problem is to evaluate f_{X|Y}(x|y)
[Diagram: X with prior f_X(x) → measurement f_{Y|X}(y|x) → Y → inference f_{X|Y}(x|y)]
Continuous Bayes rule
f_X f_{Y|X} = f_{X,Y} = f_Y f_{X|Y}
It follows that f_{X|Y}(x|y) = f_X(x) f_{Y|X}(y|x) / f_Y(y)
Using the normalization property, f_Y(y) = ∫ f_X(t) f_{Y|X}(y|t) dt
This is equivalent to: f_{X|Y}(x|y) = f_X(x) f_{Y|X}(y|x) / ∫ f_X(t) f_{Y|X}(y|t) dt
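The rule can be evaluated numerically on a grid: multiply prior by likelihood, then normalize by a Riemann-sum approximation of f_Y(y). The prior, noise level, and observation below are illustrative choices:

```python
import math

# Continuous Bayes rule on a grid.
# Prior X ~ N(0,1); measurement Y | X=x ~ N(x, 0.5^2); observed y = 1.0.
def normal_pdf(z, mu, sigma):
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

y, sigma = 1.0, 0.5
step = 0.001
xs = [-5 + step * i for i in range(10001)]               # grid over [-5, 5]
unnorm = [normal_pdf(x, 0, 1) * normal_pdf(y, x, sigma) for x in xs]
z = sum(unnorm) * step                                   # Riemann sum for f_Y(y)
posterior = [u / z for u in unnorm]                      # f_{X|Y}(x|y) on the grid

# Posterior mean; the conjugate-normal formula gives y / (1 + sigma^2) = 0.8.
mean = sum(x * p for x, p in zip(xs, posterior)) * step
print(round(mean, 2))                                    # ≈ 0.8
```

For this normal prior/normal noise pair the posterior is available in closed form, which gives an exact value to check the grid computation against.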
Example: light bulbs
The lifetime Y of a light bulb is an exponential RV whose parameter Λ is itself a RV, uniformly distributed in the interval [1, 3/2] on any given day
We test a light bulb and record its lifetime
What can be said about the parameter Λ?
Solution: model Λ in [1, 3/2] as f_Λ(λ) = 2
Simply write the conditional PDF: f_{Λ|Y}(λ|y) = 2 λ e^{−λy} / ∫_1^{3/2} 2 t e^{−ty} dt, for 1 ≤ λ ≤ 3/2
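A sketch of this posterior evaluated on a grid; the observed lifetime y = 1.0 is an illustrative choice:

```python
import math

# Light-bulb posterior: prior f(lam) = 2 on [1, 3/2],
# likelihood f(y | lam) = lam * exp(-lam * y), observed y = 1.0.
y = 1.0
step = 0.0005
lams = [1.0 + step * i for i in range(1001)]       # grid over [1, 1.5]
unnorm = [2 * lam * math.exp(-lam * y) for lam in lams]
z = sum(unnorm) * step                             # normalizing constant f_Y(y)
posterior = [u / z for u in unnorm]                # f_{Lam|Y}(lam | y)

total = sum(posterior) * step                      # should integrate to 1
print(round(total, 3))
```

The posterior stays supported on [1, 3/2]: observing one lifetime reweights the prior toward the λ values most consistent with it but cannot move mass outside the prior's support.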
Conditioning slices the joint PDF
Inference about a discrete RV
Let P(A) be the probability of event A
The conditional PDFs f_{Y|A} and f_{Y|A'} are known
Estimate P(A|Y=y) given the value y of Y
Instead of conditioning on the event {Y=y} (which has zero probability), condition on {y ≤ Y ≤ y+δ}
– δ is a small positive number
Take the limit as δ goes to zero
Inference about a discrete RV (cont’d)
Total probability theorem: f_Y(y) = P(A) f_{Y|A}(y) + P(A') f_{Y|A'}(y)
So P(A|Y=y) = P(A) f_{Y|A}(y) / f_Y(y)
More generally, if N is discrete with PMF p_N, then f_Y(y) = Σ_i p_N(i) f_{Y|N}(y|i)
Thus, P(N=n|Y=y) = p_N(n) f_{Y|N}(y|n) / f_Y(y)
Example: signal detection
For a binary transmitted signal S, we are given P(S=1) = p and P(S=−1) = 1−p
The received signal has noise added: Y = S + N
N is standard normal, N(0,1), independent of S
Find the probability that S=1, as a function of the observed value y of Y:
P(S=1|Y=y) = p φ(y−1) / (p φ(y−1) + (1−p) φ(y+1)), where φ is the standard normal PDF
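The posterior from the previous slide's formula can be sketched directly, with the two conditional densities φ(y−1) and φ(y+1) weighted by the prior:

```python
import math

# Posterior P(S=1 | Y=y) for Y = S + N with N ~ N(0,1),
# using the discrete-RV Bayes formula with the standard normal PDF phi.
def phi(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def posterior_s1(y, p):
    num = p * phi(y - 1)                    # prior times f_{Y|S}(y | S=1)
    den = num + (1 - p) * phi(y + 1)        # total probability f_Y(y)
    return num / den

# With p = 1/2 the posterior simplifies to a logistic function of y:
# P(S=1 | Y=y) = 1 / (1 + exp(-2y)).
print(posterior_s1(0.0, 0.5))               # 0.5: y = 0 is uninformative
```

The logistic form follows because the ratio φ(y+1)/φ(y−1) equals e^{−2y}, so the observation enters the posterior only through that exponent.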
Inference based on discrete observations
The formula can be turned around: f_{X|N}(x|n) = f_X(x) p_{N|X}(n|x) / p_N(n)
Based on the normalization property: p_N(n) = ∫ f_X(t) p_{N|X}(n|t) dt
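A sketch of this turned-around formula under an assumed model (not from the slides): X ~ Uniform[0,1] is a coin's unknown bias and N | X=x is Bernoulli(x). After observing N=1, the posterior is f_{X|N}(x|1) = x / ∫₀¹ t dt = 2x:

```python
# Inference about a continuous X from a discrete observation N.
# Assumed model: X ~ Uniform[0,1], N | X=x ~ Bernoulli(x), observed N = 1.
step = 0.001
xs = [step * i for i in range(1001)]             # grid over [0, 1]
prior = [1.0] * len(xs)                          # f_X(x) = 1 on [0,1]
lik = [x for x in xs]                            # p_{N|X}(1 | x) = x

unnorm = [p * l for p, l in zip(prior, lik)]
pn = sum(unnorm) * step                          # p_N(1) = 1/2
posterior = [u / pn for u in unnorm]             # f_{X|N}(x|1) = 2x

print(round(pn, 2), round(posterior[500], 2))    # ≈ 0.5 and ≈ 1.0 (= 2 * 0.5)
```

Seeing one head tilts the posterior linearly toward large biases, exactly as the closed form 2x predicts.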