
1 EE571 PART 4 Classification of Random Processes Huseyin Bilgekul EENG571 Probability and Stochastic Processes Department of Electrical and Electronic Engineering Eastern Mediterranean University

2 EE571 1.7 Statistics of Stochastic Processes: n-th Order Distribution (Density), Expected Value, Autocorrelation, Cross-correlation, Autocovariance, Cross-covariance

3 EE571 First-Order and Second-Order Distributions. First-Order Distribution: for a specific t, X(t) is a random variable with first-order distribution function F_X(t)(x) = P{X(t) ≤ x}. The first-order density of X(t) is defined as f_X(t)(x) = ∂F_X(t)(x)/∂x. Second-Order Distribution: F_X(t1)X(t2)(x1, x2) = P{X(t1) ≤ x1, X(t2) ≤ x2}.
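A minimal numeric sketch of the first-order distribution above (not from the slides): the CDF F_X(t)(x) = P{X(t) ≤ x} can be estimated by averaging over an ensemble of sample paths. The process X(t) = W·t with W uniform on (0, 1) is an assumed toy example.

```python
# Hypothetical process: X(t) = W * t, with W uniform on (0, 1).
def first_order_cdf(t, x, n=100_000):
    """Empirical F_X(t)(x) over a deterministic grid of W values."""
    count = 0
    for i in range(n):
        w = (i + 0.5) / n          # grid approximation of W ~ uniform(0, 1)
        if w * t <= x:
            count += 1
    return count / n

# At t = 2, X(2) is uniform on (0, 2), so F_X(2)(1) = 0.5.
print(first_order_cdf(2.0, 1.0))   # close to 0.5
```

For a fixed t, X(t) is just a random variable, so the usual CDF machinery applies pointwise in t.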

4 EE571 nth-Order Distribution. Definition: the nth-order distribution of X(t) is the joint distribution of the random variables X(t1), X(t2), …, X(tn), i.e., F_X(t1)…X(tn)(x1, …, xn) = P{X(t1) ≤ x1, …, X(tn) ≤ xn}. Properties.

5 EE571 Expected Value. Definition: the expected value of a stochastic process X(t) is the deterministic function μ_X(t) = E[X(t)]. Example 1.10: if R is a nonnegative random variable, find the expected value of X(t) = R|cos 2πft|. Sol: since |cos 2πft| is deterministic, E[X(t)] = E[R]|cos 2πft|.
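A quick check of Example 1.10's structure under an assumed distribution for R (the slide leaves R general): taking R uniform on (0, 2), so E[R] = 1, the ensemble mean should match E[R]·|cos 2πft|.

```python
import math

# Assumed: R ~ uniform(0, 2), so E[R] = 1; f = 1 Hz is also an assumption.
def mean_X(t, f=1.0, n=100_000):
    """Ensemble-average estimate of E[X(t)] = E[R |cos(2*pi*f*t)|]."""
    total = 0.0
    for i in range(n):
        r = 2.0 * (i + 0.5) / n        # deterministic grid over uniform(0, 2)
        total += r * abs(math.cos(2 * math.pi * f * t))
    return total / n

t = 0.1
analytic = 1.0 * abs(math.cos(2 * math.pi * 1.0 * t))   # E[R]*|cos(2*pi*f*t)|
print(mean_X(t), analytic)
```

Note the expected value of a process is a deterministic function of t, not a number.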

6 EE571 Autocorrelation. Definition: the autocorrelation function of the stochastic process X(t) is R_X(t, τ) = E[X(t)X(t+τ)]. The autocorrelation function of the random sequence X_n is R_X[m, k] = E[X_m X_{m+k}]. P.S. If X(t) and X_n are complex, then R_X(t, τ) = E[X(t)X*(t+τ)] and R_X[m, k] = E[X_m X*_{m+k}], where X*(t+τ) and X*_{m+k} are the complex conjugates.

7 EE571 Complex Processes and Vector Processes. Definitions: the complex process Z(t) = X(t) + jY(t) is specified in terms of the joint statistics of the real processes X(t) and Y(t). The vector process (n-dimensional process) is a family of n stochastic processes.

8 EE571 Example 1.11. Find the autocorrelation R_X(t, τ) of the process X(t) = r cos(ωt + φ), where the R.V.s r and φ are independent and φ is uniform in the interval (−π, π). Sol: R_X(t, τ) = (E[r²]/2) cos(ωτ), since the double-frequency term averages to zero over the uniform phase.
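A numeric sketch of Example 1.11 under an assumed amplitude law (the slide leaves r general): take r uniform on (0, 1), so E[r²] = 1/3, with φ uniform on (−π, π) and independent of r. The ensemble average E[X(t)X(t+τ)] should come out as (E[r²]/2)·cos(ωτ), independent of t.

```python
import math

def autocorr(t, tau, w=2.0, n=200, m=200):
    """Grid estimate of R_X(t, tau) = E[r cos(w t + phi) * r cos(w (t+tau) + phi)]."""
    total = 0.0
    for i in range(n):                      # grid over r ~ uniform(0, 1)
        r = (i + 0.5) / n
        for j in range(m):                  # grid over phi ~ uniform(-pi, pi)
            phi = -math.pi + 2 * math.pi * (j + 0.5) / m
            total += r * math.cos(w * t + phi) * r * math.cos(w * (t + tau) + phi)
    return total / (n * m)

analytic = (1.0 / 3.0) / 2.0 * math.cos(2.0 * 0.4)
print(autocorr(1.0, 0.4), analytic)        # both approx (1/6)*cos(0.8)
```

The t-independence of the result is a first hint of wide-sense stationarity, which the later slides formalize.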

9 EE571 Cross-correlation. Definition: the cross-correlation function of two stochastic processes X(t) and Y(t) is R_XY(t, τ) = E[X(t)Y*(t+τ)]. The cross-correlation function of two random sequences X_n and Y_n is R_XY[m, k] = E[X_m Y*_{m+k}]. If R_XY(t, τ) = 0, then X(t) and Y(t) are called orthogonal.

10 EE571 Autocovariance. Definition: the autocovariance function of the stochastic process X(t) is C_X(t, τ) = Cov[X(t), X(t+τ)] = E{[X(t) − μ_X(t)][X*(t+τ) − μ*_X(t+τ)]}. The autocovariance function of the random sequence X_n is C_X[m, k] = Cov[X_m, X_{m+k}] = E{[X_m − μ_X(m)][X*_{m+k} − μ*_X(m+k)]}.

11 EE571 Example 1.12. The input to a digital filter is an iid random sequence …, X_{−1}, X_0, X_1, X_2, … with E[X_i] = 0 and Var[X_i] = 1. The output is a random sequence …, Y_{−1}, Y_0, Y_1, Y_2, … related to the input sequence by the formula Y_n = X_n + X_{n−1} for all integers n. Find the expected value E[Y_n] and autocovariance C_Y[m, k]. Sol: E[Y_n] = 0; C_Y[m, k] = 2 for k = 0, 1 for |k| = 1, and 0 otherwise.
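The expansion behind Example 1.12 can be written out directly: with all means zero, C_Y[m, k] = E[Y_m Y_{m+k}] splits into four C_X terms, and for an iid unit-variance sequence C_X[j] = 1 only at j = 0. This is a sketch of that bookkeeping, not a simulation.

```python
# For the iid input of Example 1.12: C_X[j] = 1 if j == 0, else 0.
def C_X(j):
    return 1.0 if j == 0 else 0.0

def C_Y(m, k):
    # E[(X_m + X_{m-1})(X_{m+k} + X_{m+k-1})] expanded term by term:
    # E[X_m X_{m+k}] + E[X_m X_{m+k-1}] + E[X_{m-1} X_{m+k}] + E[X_{m-1} X_{m+k-1}]
    return C_X(k) + C_X(k - 1) + C_X(k + 1) + C_X(k)

print([C_Y(0, k) for k in (-2, -1, 0, 1, 2)])   # [0.0, 1.0, 2.0, 1.0, 0.0]
```

The result does not depend on m, so the output sequence is (wide-sense) stationary even though the filter couples neighboring samples.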

12 EE571 Example 1.13. Suppose that X(t) is a process with E[X(t)] = 3 and R_X(t, τ) = 9 + 4e^(−0.2|τ|). Find the mean, variance, and covariance of the R.V.s Z = X(5) and W = X(8). Sol: E[Z] = E[W] = 3; Var[Z] = Var[W] = C_X(0) = R_X(0) − 9 = 4; Cov[Z, W] = C_X(3) = 4e^(−0.6).
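The arithmetic of Example 1.13 as a worked check, using only the given mean and autocorrelation:

```python
import math

# Given: E[X(t)] = 3 and R_X(t, tau) = 9 + 4*exp(-0.2*|tau|).
mu = 3.0
def R_X(tau):
    return 9.0 + 4.0 * math.exp(-0.2 * abs(tau))

# Z = X(5), W = X(8): the relevant lag is tau = 8 - 5 = 3.
mean_Z = mean_W = mu                       # constant mean
var_Z = R_X(0.0) - mu * mu                 # C_X(0) = R_X(0) - mu^2 = 13 - 9 = 4
cov_ZW = R_X(3.0) - mu * mu                # C_X(3) = 4*exp(-0.6)

print(mean_Z, var_Z, cov_ZW)
```

The key step is the covariance identity C_X(t, τ) = R_X(t, τ) − μ_X(t)μ_X(t+τ), which the next theorem states in general.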

13 EE571 Cross-covariance. Definition: the cross-covariance function of two stochastic processes X(t) and Y(t) is C_XY(t, τ) = Cov[X(t), Y(t+τ)] = E{[X(t) − μ_X(t)][Y*(t+τ) − μ*_Y(t+τ)]}. The cross-covariance function of two random sequences X_n and Y_n is C_XY[m, k] = Cov[X_m, Y_{m+k}] = E{[X_m − μ_X(m)][Y*_{m+k} − μ*_Y(m+k)]}. Uncorrelated: C_XY(t, τ) = 0, C_XY[m, k] = 0.

14 EE571 Correlation Coefficient, a-Dependence. Definitions: the correlation coefficient of the process X(t) is ρ_X(t, τ) = C_X(t, τ)/√(C_X(t, 0) C_X(t+τ, 0)). The process X(t) is called a-dependent if X(t) for t < t_0 and X(t) for t > t_0 + a are mutually independent; then C_X(t, τ) = 0 for |τ| > a. The process X(t) is called correlation a-dependent if C_X(t, τ) satisfies C_X(t, τ) = 0 for |τ| > a.

15 EE571 Example 1.14. (a) Find the autocorrelation R_X(t, τ) if X(t) = a e^(jωt). (b) Find the autocorrelation R_X(t, τ) and autocovariance C_X(t, τ) for a process built from random variables a_i that are uncorrelated with zero mean and variance σ_i². Sol:

16 EE571 Theorem 1.2. The autocorrelation and autocovariance functions of a stochastic process X(t) satisfy C_X(t, τ) = R_X(t, τ) − μ_X(t) μ*_X(t+τ). The autocorrelation and autocovariance functions of the random sequence X_n satisfy C_X[m, k] = R_X[m, k] − μ_X(m) μ*_X(m+k). The cross-correlation and cross-covariance functions of stochastic processes X(t) and Y(t) satisfy C_XY(t, τ) = R_XY(t, τ) − μ_X(t) μ*_Y(t+τ). The cross-correlation and cross-covariance functions of two random sequences X_n and Y_n satisfy C_XY[m, k] = R_XY[m, k] − μ_X(m) μ*_Y(m+k).
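A sanity check of Theorem 1.2 on made-up real-valued data: when the mean, autocorrelation, and autocovariance are all estimated consistently over the same ensemble, the identity C_X[m, k] = R_X[m, k] − μ_X(m)μ_X(m+k) holds exactly (the conjugates drop out for real data).

```python
# Three hypothetical sample paths of a length-4 random sequence.
paths = [
    [1.0, 2.0, 0.0, -1.0],
    [0.5, 1.5, 2.5, 0.5],
    [-1.0, 0.0, 1.0, 2.0],
]
N = len(paths)

def mu(m):                                   # ensemble mean at index m
    return sum(p[m] for p in paths) / N

def R(m, k):                                 # ensemble autocorrelation
    return sum(p[m] * p[m + k] for p in paths) / N

def C(m, k):                                 # ensemble autocovariance
    return sum((p[m] - mu(m)) * (p[m + k] - mu(m + k)) for p in paths) / N

m, k = 1, 2
print(C(m, k), R(m, k) - mu(m) * mu(m + k))  # identical, per Theorem 1.2
```

The identity is pure algebra (expand the product inside C and distribute the expectation), so it holds for any ensemble, not just this toy one.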

17 EE571 1.8 Stationary Processes. Definition: a stochastic process X(t) is stationary if and only if, for all sets of time instants t_1, t_2, …, t_m and any time difference τ, F_X(t1)…X(tm)(x1, …, xm) = F_X(t1+τ)…X(tm+τ)(x1, …, xm). A random sequence X_n is stationary if and only if, for any set of integer time instants n_1, n_2, …, n_m and integer time difference k, F_X(n1)…X(nm)(x1, …, xm) = F_X(n1+k)…X(nm+k)(x1, …, xm). Also called Strict-Sense Stationary (SSS).

18 EE571 Theorem 1.3. Let X(t) be a stationary random process. For constants a > 0 and b, Y(t) = aX(t) + b is also a stationary process. Pf: since a > 0, the joint CDF of Y(t_1+τ), …, Y(t_m+τ) equals the joint CDF of X(t_1+τ), …, X(t_m+τ) evaluated at (y_i − b)/a, which is invariant to τ because X(t) is stationary; hence Y(t) is stationary.

19 EE571 Theorem 1.4(a). For a stationary process X(t), the expected value, the autocorrelation, and the autocovariance have the following properties for all t: (a) μ_X(t) = μ_X, (b) R_X(t, τ) = R_X(0, τ) = R_X(τ), (c) C_X(t, τ) = R_X(τ) − μ_X² = C_X(τ). Pf:

20 EE571 Theorem 1.4(b). For a stationary random sequence X_n, the expected value, the autocorrelation, and the autocovariance satisfy for all n: (a) E[X_n] = μ_X, (b) R_X[n, k] = R_X[0, k] = R_X[k], (c) C_X[n, k] = R_X[k] − μ_X² = C_X[k]. Pf: D.I.Y.

21 EE571 Example 1.15. At the receiver of an AM radio, the received signal contains a cosine carrier signal at the carrier frequency f_c with a random phase Θ that is a sample value of the uniform (0, 2π) random variable. The received carrier signal is X(t) = A cos(2πf_c t + Θ). What are the expected value and autocorrelation of the process X(t)? Sol: E[X(t)] = 0 and R_X(t, τ) = (A²/2) cos(2πf_c τ).

22 EE571 Wide Sense Stationary Processes. Definition: X(t) is a wide sense stationary (WSS) stochastic process if and only if for all t, E[X(t)] = μ_X and R_X(t, τ) = R_X(0, τ) = R_X(τ). (Ref: Example 1.15.) X_n is a wide sense stationary random sequence if and only if for all n, E[X_n] = μ_X and R_X[n, k] = R_X[0, k] = R_X[k]. Q: Does SSS imply WSS? Does WSS imply SSS?

23 EE571 Theorem 1.5. For a wide sense stationary stochastic process X(t), the autocorrelation function R_X(τ) has the following properties: R_X(0) ≥ 0, R_X(τ) = R_X(−τ), and R_X(0) ≥ |R_X(τ)|. If X_n is a wide sense stationary random sequence: R_X[0] ≥ 0, R_X[k] = R_X[−k], and R_X[0] ≥ |R_X[k]|. Pf:
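Theorem 1.5's properties can be observed on a sample-path estimate as well (a sketch, not part of the slides): the biased estimator R̂[k] = (1/N)·Σₙ x[n]x[n+k] over a single made-up path is even in k by construction of the sum, and satisfies |R̂[k]| ≤ R̂[0] by the Cauchy–Schwarz inequality.

```python
import math

# Made-up deterministic sample path standing in for one realization.
x = [math.cos(0.3 * n) + 0.2 * math.sin(1.1 * n) for n in range(500)]
N = len(x)

def R_hat(k):
    """Biased sample autocorrelation with implicit zero padding outside 0..N-1."""
    return sum(x[n] * x[n + k] for n in range(N) if 0 <= n + k < N) / N

print(R_hat(0), R_hat(7), R_hat(-7))   # R_hat(7) == R_hat(-7), both below R_hat(0)
```

The same inequalities hold for the true R_X because R_X is a positive semidefinite function; the sample estimate simply inherits them.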

24 EE571 Average Power. Definition: the average power of a wide sense stationary process X(t) is R_X(0) = E[X²(t)]. The average power of a wide sense stationary random sequence X_n is R_X[0] = E[X_n²]. Example: use x(t) = v²(t)/R = i²(t)R to model the instantaneous power.
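A small sketch tying average power to a concrete waveform (the carrier of Example 1.15, with assumed values A = 3 and f_c = 5): R_X(0) = E[X²(t)] = A²/2, and for this process a time average of x²(t) over whole periods of one sample path gives the same number.

```python
import math

A, fc = 3.0, 5.0                           # assumed amplitude and carrier frequency
n = 100_000
T = 1.0 / fc                               # one carrier period
total = 0.0
for i in range(n):
    t = (i + 0.5) / n * T                  # midpoint grid over one full period
    total += (A * math.cos(2 * math.pi * fc * t)) ** 2
avg_power = total / n

print(avg_power, A * A / 2)                # both ~ 4.5
```

Averaging over whole periods matters: over a fractional period the time average of cos² would not equal 1/2.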

25 EE571 Ergodic Processes. Definition: a stationary random process X(t) is called ergodic if the ensemble average equals the time average. For a stationary process X(t), we can view the time average X̄(T) = (1/(2T)) ∫_{−T}^{T} X(t) dt as an estimate of μ_X.

26 EE571 Theorem 1.6. Let X(t) be a stationary process with expected value μ_X and autocovariance C_X(τ). If ∫_{−∞}^{∞} |C_X(τ)| dτ < ∞, then X̄(T_1), X̄(T_2), … is an unbiased, consistent sequence of estimates of μ_X. Pf: unbiased: E[X̄(T)] = μ_X. Consistent sequence: we must show that lim_{T→∞} Var[X̄(T)] = 0.
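A toy illustration of Theorem 1.6's message (assumed setup, discrete-time for simplicity): for an iid sequence with mean μ = 2, the autocovariance is absolutely summable, so the time average is an unbiased estimate whose error shrinks as the observation window grows.

```python
import random

rng = random.Random(571)                   # fixed seed for reproducibility
mu = 2.0

def time_average(N):
    """Time average of N samples of X_n = mu + iid uniform(-1, 1) noise."""
    return sum(mu + rng.uniform(-1.0, 1.0) for _ in range(N)) / N

est_small = time_average(100)
est_big = time_average(100_000)
print(abs(est_small - mu), abs(est_big - mu))   # error shrinks with window size
```

Consistency is exactly this shrinkage: Var[X̄(T)] → 0 as the window grows, which the absolute integrability of C_X guarantees.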

27 EE571 Jointly Wide Sense Stationary Processes. Continuous-time random processes X(t) and Y(t) are jointly wide sense stationary if X(t) and Y(t) are both wide sense stationary and the cross-correlation depends only on the time difference between the two random variables: R_XY(t, τ) = R_XY(τ). Random sequences X_n and Y_n are jointly wide sense stationary if X_n and Y_n are both wide sense stationary and the cross-correlation depends only on the index difference between the two random variables: R_XY[m, k] = R_XY[k].

28 EE571 Theorem 1.7. If X(t) and Y(t) are jointly wide sense stationary continuous-time processes, then R_XY(τ) = R_YX(−τ). If X_n and Y_n are jointly wide sense stationary random sequences, then R_XY[k] = R_YX[−k]. Pf:
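The mechanism behind Theorem 1.7 can be seen on made-up real ensemble data (a sketch, not the slides' proof): E[X_m Y_{m+k}] is the same expectation as E[Y_{m'} X_{m'+k'}] with m' = m+k and k' = −k, so the consistently estimated cross-correlations agree exactly.

```python
# Two hypothetical ensembles of paired sample paths (real-valued, so no conjugates).
x_paths = [[1.0, 2.0, 3.0, 4.0], [0.0, -1.0, 1.0, 2.0]]
y_paths = [[2.0, 0.0, 1.0, -1.0], [1.0, 1.0, 0.0, 3.0]]
N = len(x_paths)

def R_xy(m, k):                            # estimate of E[X_m Y_{m+k}]
    return sum(xp[m] * yp[m + k] for xp, yp in zip(x_paths, y_paths)) / N

def R_yx(m, k):                            # estimate of E[Y_m X_{m+k}]
    return sum(yp[m] * xp[m + k] for xp, yp in zip(x_paths, y_paths)) / N

m, k = 2, 1
print(R_xy(m, k), R_yx(m + k, -k))         # equal: same products, reindexed
```

When the processes are jointly WSS, these values no longer depend on m, and the reindexing collapses to the slide's statement R_XY[k] = R_YX[−k].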

29 EE571 Example 1.16. Suppose we are interested in X(t) but we can observe only Y(t) = X(t) + N(t), where N(t) is a noise process. Assume X(t) and N(t) are independent WSS with E[X(t)] = μ_X and E[N(t)] = μ_N = 0. Is Y(t) WSS? Are X(t) and Y(t) jointly WSS? Are Y(t) and N(t) jointly WSS? Sol:

30 EE571 Example 1.17. X_n is a WSS random sequence with autocorrelation R_X[k]. The random sequence Y_n is obtained from X_n by reversing the sign of every other random variable in X_n: Y_n = (−1)^n X_n. (a) Find R_Y[n, k] in terms of R_X[k]. (b) Find R_XY[n, k] in terms of R_X[k]. (c) Is Y_n WSS? (d) Are X_n and Y_n jointly WSS? Sol:
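A sketch of Example 1.17 with an assumed autocorrelation R_X[k] = 0.5^|k| (the slide leaves R_X general). Since Y_n = (−1)^n X_n, pulling the signs out of the expectations gives R_Y[n, k] = (−1)^(2n+k) R_X[k] = (−1)^k R_X[k], which is free of n (Y_n is WSS), while R_XY[n, k] = (−1)^(n+k) R_X[k] still depends on n (X_n and Y_n are not jointly WSS).

```python
def R_X(k):
    return 0.5 ** abs(k)                   # assumed autocorrelation of X_n

def R_Y(n, k):
    # E[Y_n Y_{n+k}] = (-1)^n (-1)^(n+k) E[X_n X_{n+k}] = (-1)^k R_X[k]
    return (-1) ** n * (-1) ** (n + k) * R_X(k)

def R_XY(n, k):
    # E[X_n Y_{n+k}] = (-1)^(n+k) R_X[k]  -- depends on n
    return (-1) ** (n + k) * R_X(k)

print(R_Y(0, 1), R_Y(5, 1))                # equal: Y_n is WSS
print(R_XY(0, 1), R_XY(1, 1))              # differ in sign: not jointly WSS
```

This is the slide deck's standing example of two individually WSS sequences that fail to be jointly WSS.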

