Lectures 5, 6, 7: Random Variables and Signals
Aliazam Abbasfar
Outline
- Random variables overview
- Random signals
- Signal correlation
- Power spectral density
Random variables (RVs)
- PDF and CDF: f_X(x) = d/dx F_X(x)
- Mean, variance, moments: E[X], Var[X], E[X^n]
- Functions of RVs: Y = g(X)
- Several RVs: joint PDF/CDF, conditional probability, sums
- Independent RVs
- Correlation of two RVs: E[XY]
- Example: binary communication with noise
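The "binary communication with noise" example can be sketched as a small Monte Carlo simulation. The signal amplitude, noise level, and sign-based decision rule below are illustrative assumptions, not values from the lecture; the theoretical error probability is the Gaussian tail Q(A/σ):

```python
import math
import random

def q_func(x):
    # Gaussian tail probability Q(x) = P(Z > x) for Z ~ N(0, 1)
    return 0.5 * math.erfc(x / math.sqrt(2))

def simulate_ber(amp, sigma, n_bits, seed=0):
    # Transmit +amp / -amp with equal probability, add N(0, sigma^2)
    # noise, and decide by the sign of the received value.
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice([-1, 1])
        received = amp * bit + rng.gauss(0, sigma)
        if (received > 0) != (bit > 0):
            errors += 1
    return errors / n_bits

theory = q_func(2.0)                        # amp/sigma = 2 -> Q(2) ~ 0.0228
estimate = simulate_ber(2.0, 1.0, 100_000)  # empirical bit error rate
print(theory, estimate)
```

With 100 000 bits the empirical error rate should match Q(2) to a few parts in a thousand.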
Binomial distribution
- X = number of successes in N independent trials
- p: success probability (q = 1 - p: failure probability)
- Sum of N binary RVs: X = Σ x_i
- For large N the PDF approaches a Gaussian: μ_X = Np, σ_X² = Npq
- Example: error probability in binary packets
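A quick check of the binomial moments μ_X = Np and σ_X² = Npq, built directly from N Bernoulli trials (the values N = 100, p = 0.3 are arbitrary for illustration):

```python
import random

def binomial_sample(n, p, rng):
    # X = number of successes in n independent Bernoulli(p) trials
    return sum(rng.random() < p for _ in range(n))

rng = random.Random(1)
n, p = 100, 0.3
samples = [binomial_sample(n, p, rng) for _ in range(20_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # close to Np = 30 and Npq = 21
```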
Gaussian RVs and the CLT
- PDF specified by mean and variance: X ~ N(μ_X, σ_X²)
- CDF expressed via the error function erf()
- Tails decrease exponentially
- Central Limit Theorem: X_1, …, X_n i.i.d.; let Y = Σ_i X_i and Z = (Y - μ_Y)/σ_Y. As n → ∞, Z becomes Gaussian with μ_Z = 0, σ_Z² = 1, i.e. Z ~ N(0, 1)
- Uncorrelated Gaussian RVs are independent
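The CLT statement can be demonstrated numerically: sum n i.i.d. Uniform(0, 1) variables (mean n/2, variance n/12, by independence), standardize, and compare the empirical CDF at one point against Φ(1). The choice of uniform summands and n = 30 is illustrative:

```python
import math
import random

def standardized_sum(n, rng):
    # Y = sum of n i.i.d. Uniform(0, 1); E[Y] = n/2, Var[Y] = n/12
    y = sum(rng.random() for _ in range(n))
    return (y - n / 2) / math.sqrt(n / 12)

rng = random.Random(2)
zs = [standardized_sum(30, rng) for _ in range(50_000)]

frac = sum(z < 1.0 for z in zs) / len(zs)       # empirical P(Z < 1)
phi1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))   # Phi(1) ~ 0.8413
print(frac, phi1)
```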
Random processes
- Ensemble of random signals (sample functions)
- Deterministic signals with embedded RVs
- Examples: voltage waveforms, message signals, thermal noise
- A sample x(t) of a random signal is a random variable: E[x(t)], Var[x(t)]
- x(t_1), x(t_2) are jointly distributed random variables
Correlation
- Correlation measures statistical similarity
- Cross-correlation of two random signals: R_XY(t_1, t_2) = E[x(t_1) y(t_2)]
- Uncorrelated vs. independent random signals
- Autocorrelation: R_X(t_1, t_2) = E[x(t_1) x(t_2)]
- R_X(t, t) = E[x²(t)] = Var[x(t)] + E[x(t)]²
- Average power: P = E[P_i] = E[<x²(t)>]
- Most random signals are power signals (0 < P < ∞)
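As an ensemble-average illustration of R_X(t_1, t_2) = E[x(t_1) x(t_2)], take a hypothetical process (not from the lecture): a random DC level A ~ N(2, 1) plus independent noise at each sample. The noise terms average out, leaving E[A²] = Var[A] + E[A]² = 1 + 4 = 5:

```python
import random

rng = random.Random(3)

def sample_path(rng):
    # One sample function observed at two times: x(t) = A + n(t),
    # A ~ N(2, 1) is a random level, n(t) ~ N(0, 0.25) is noise.
    a = rng.gauss(2.0, 1.0)
    return (a + rng.gauss(0, 0.5), a + rng.gauss(0, 0.5))

paths = [sample_path(rng) for _ in range(100_000)]
r12 = sum(x1 * x2 for x1, x2 in paths) / len(paths)  # E[x(t1) x(t2)]
# Theory: E[(A + n1)(A + n2)] = E[A^2] = Var[A] + E[A]^2 = 1 + 4 = 5
print(r12)
```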
Wide-sense stationary (WSS) processes
- A process is WSS if E[x(t)] = μ_X (constant) and R_X(t_1, t_2) = E[x(t_1) x(t_2)] = R_X(t_2 - t_1) = R_X(τ)
- R_X(0) = E[x²(t)] < ∞
- Stationary in 1st and 2nd moments
- Autocorrelation properties: R_X(τ) = R_X(-τ), |R_X(τ)| ≤ R_X(0)
- R_X(τ₀) = 0: samples separated by τ₀ are uncorrelated
- Average power: P = <x²(t)> = R_X(0)
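The WSS autocorrelation properties can be checked on a concrete process. The two-tap moving average of white noise below is my own example (not from the lecture): it is WSS with R(0) = 1/2, R(±1) = 1/4, and R(τ) = 0 for |τ| ≥ 2, so samples two or more steps apart are uncorrelated:

```python
import random

rng = random.Random(4)
# White Gaussian noise through a 2-tap average: x[n] = (w[n] + w[n-1]) / 2
w = [rng.gauss(0, 1) for _ in range(200_001)]
x = [(w[n] + w[n - 1]) / 2 for n in range(1, len(w))]

def autocorr(x, lag):
    # Time-average estimate of R_X(lag); depends only on |lag| (WSS)
    n = len(x) - abs(lag)
    return sum(x[i] * x[i + abs(lag)] for i in range(n)) / n

r0, r1, r2 = autocorr(x, 0), autocorr(x, 1), autocorr(x, 2)
print(r0, r1, r2)  # close to 0.5, 0.25, 0.0
```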
Ergodic processes
- Time average of any sample function equals the ensemble average: <g(x_i(t))> = E[g(x(t))] for any sample function i and any g
- Ensemble averages are time-independent
- DC value: <x(t)> = E[x(t)] = m_X
- Total power: <x²(t)> = E[x²(t)] = σ_X² + m_X²
- Average power: P = E[<x²(t)>] = P_i
- One sample function suffices to estimate signal statistics: use time averages instead of ensemble averages
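For an ergodic process, one long sample function is enough. Taking a constant level plus white Gaussian noise as an assumed example (m_X = 1.5, σ_X = 1), the time averages should reproduce the ensemble values <x> = m_X and <x²> = σ_X² + m_X²:

```python
import random

rng = random.Random(5)
mu = 1.5  # example process: x(t) = mu + white N(0, 1) noise (ergodic)
path = [mu + rng.gauss(0, 1) for _ in range(100_000)]

time_mean = sum(path) / len(path)                  # <x(t)> over one path
time_power = sum(v * v for v in path) / len(path)  # <x^2(t)>
# Ensemble theory: E[x] = 1.5 and E[x^2] = sigma^2 + mu^2 = 1 + 2.25 = 3.25
print(time_mean, time_power)
```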
Examples
- Sinusoid with random phase
- DC signal with random level
- Binary NRZ signaling
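The random-phase sinusoid is the classic case: for x(t) = A cos(ωt + θ) with θ ~ Uniform(0, 2π), the ensemble autocorrelation is R_X(t_1, t_2) = (A²/2) cos(ω(t_2 − t_1)), a function of τ = t_2 − t_1 only, so the process is WSS. A numerical check (A, ω, and the sample times are arbitrary choices):

```python
import math
import random

rng = random.Random(9)
A, w = 2.0, 2 * math.pi * 5.0
t1, t2 = 0.3, 0.4

# Ensemble average over random phases theta ~ Uniform(0, 2pi)
pairs = [(A * math.cos(w * t1 + th), A * math.cos(w * t2 + th))
         for th in (rng.uniform(0, 2 * math.pi) for _ in range(100_000))]
r12 = sum(v1 * v2 for v1, v2 in pairs) / len(pairs)

# Theory: R_X(t1, t2) = (A^2 / 2) cos(w (t2 - t1)) -- depends only on tau
theory = (A ** 2 / 2) * math.cos(w * (t2 - t1))
print(r12, theory)
```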
Power spectral density
- Time-averaged autocorrelation: R_X(τ) = <R_X(t + τ, t)>
- Power spectral density: G_X(f) = F{R_X(τ)} (Fourier transform of the autocorrelation)
- Average power: P = ∫ G_X(f) df = R_X(0)
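The relation "average power = integral of the PSD" has a discrete counterpart (Parseval's theorem): summing the periodogram |X[k]|²/N over all frequency bins and normalizing recovers the time-average power. A minimal sketch with a naive DFT (white-noise input and N = 256 chosen for illustration):

```python
import cmath
import random

rng = random.Random(6)
N = 256
x = [rng.gauss(0, 1) for _ in range(N)]

def dft(x):
    # Naive O(N^2) discrete Fourier transform; fine for small N
    n_pts = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / n_pts)
                for n in range(n_pts))
            for k in range(n_pts)]

X = dft(x)
periodogram = [abs(Xk) ** 2 / N for Xk in X]  # PSD estimate per frequency bin
power_freq = sum(periodogram) / N             # "integral" of the PSD
power_time = sum(v * v for v in x) / N        # time-average power <x^2>
print(power_time, power_freq)                 # equal by Parseval's theorem
```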
Examples
- Y(t) = X(t) cos(ω_c t), with X(t) WSS
- Is Y(t) WSS?
- Find R_Y(τ) and G_Y(f)
Correlations for LTI systems
- If x(t) is WSS, then x(t) and y(t) = h(t) * x(t) are jointly WSS
- m_Y = H(0) m_X
- R_YX(τ) = h(τ) * R_X(τ)
- R_XY(τ) = R_YX(-τ) = h(-τ) * R_X(τ)
- R_Y(τ) = h(τ) * h(-τ) * R_X(τ)
- G_Y(f) = |H(f)|² G_X(f)
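Two of these relations are easy to verify numerically with an assumed FIR filter h = [1, 0.5] driven by white noise with mean 1 and unit variance: the output mean is m_Y = H(0) m_X = 1.5, and the output power is R_Y(0) = (1² + 0.5²)σ² + m_Y² = 1.25 + 2.25 = 3.5:

```python
import random

rng = random.Random(7)
mu, sigma = 1.0, 1.0
# White input with nonzero mean through y[n] = x[n] + 0.5 x[n-1]
x = [mu + rng.gauss(0, sigma) for _ in range(200_001)]
y = [x[n] + 0.5 * x[n - 1] for n in range(1, len(x))]

m_y = sum(y) / len(y)                # theory: H(0) * m_X = 1.5 * 1 = 1.5
p_y = sum(v * v for v in y) / len(y) # theory: R_Y(0) = 1.25 + 1.5^2 = 3.5
print(m_y, p_y)
```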
Sum process
- z(t) = x(t) + y(t)
- R_Z(τ) = R_X(τ) + R_Y(τ) + R_XY(τ) + R_XY(-τ)
- G_Z(f) = G_X(f) + G_Y(f) + 2 Re[G_XY(f)]
- If x and y are uncorrelated: R_XY(τ) = m_X m_Y and G_Z(f) = G_X(f) + G_Y(f) + 2 m_X m_Y δ(f)
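For independent zero-mean processes, the cross terms vanish and powers simply add: R_Z(0) = R_X(0) + R_Y(0). A minimal check with two independent white Gaussian signals (variances 1 and 4, chosen arbitrarily):

```python
import random

rng = random.Random(8)
N = 100_000
x = [rng.gauss(0, 1) for _ in range(N)]  # variance 1
y = [rng.gauss(0, 2) for _ in range(N)]  # variance 4, independent of x
z = [a + b for a, b in zip(x, y)]        # sum process z(t) = x(t) + y(t)

p_z = sum(v * v for v in z) / N
# Independent, zero-mean => R_Z(0) = R_X(0) + R_Y(0) = 1 + 4 = 5
print(p_z)
```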
Reading
- Carlson: Ch. 9.1, 9.2
- Proakis & Salehi: 4.1, 4.2, 4.3, 4.4