1 EE571 PART 4 Classification of Random Processes Huseyin Bilgekul Eeng571 Probability and Stochastic Processes Department of Electrical and Electronic Engineering Eastern Mediterranean University

2 EE571 Statistics of Stochastic Processes: n-th Order Distribution (Density), Expected Value, Autocorrelation, Cross-correlation, Autocovariance, Cross-covariance

3 EE571 First-Order and 2nd-Order Distribution. First-Order Distribution: for a specific t, X(t) is a random variable with first-order distribution function F_{X(t)}(x) = P{X(t) ≤ x}. The first-order density of X(t) is defined as f_{X(t)}(x) = dF_{X(t)}(x)/dx. 2nd-Order Distribution: F_{X(t_1)X(t_2)}(x_1, x_2) = P{X(t_1) ≤ x_1, X(t_2) ≤ x_2}.

4 EE571 nth-Order Distribution. Definition: the nth-order distribution of X(t) is the joint distribution of the random variables X(t_1), X(t_2), …, X(t_n), i.e., F_{X(t_1)…X(t_n)}(x_1, …, x_n) = P{X(t_1) ≤ x_1, …, X(t_n) ≤ x_n}. Properties.

5 EE571 Expected Value. Definition: the expected value of a stochastic process X(t) is the deterministic function μ_X(t) = E[X(t)]. Example 1.10: if R is a nonnegative random variable, find the expected value of X(t) = R|cos 2πft|. Sol:
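The expected value here is an ensemble average: an average across realizations at each fixed t, not an average along time. A minimal numerical sketch of Example 1.10, assuming for illustration that R is exponential with mean 1 and f = 2 Hz (neither value is given on the slide), so that μ_X(t) = E[R]|cos 2πft| = |cos 2πft|:

```python
import numpy as np

# Sketch: ensemble average of X(t) = R*|cos(2*pi*f*t)| over many realizations.
# Assumptions (not from the slide): R ~ Exponential(mean=1), f = 2 Hz.
rng = np.random.default_rng(0)
f, n_realizations = 2.0, 20_000
t = np.linspace(0.0, 1.0, 201)

R = rng.exponential(scale=1.0, size=(n_realizations, 1))  # one R per sample function
X = R * np.abs(np.cos(2 * np.pi * f * t))                  # ensemble of realizations X(t)

mu_hat = X.mean(axis=0)                                    # ensemble average at each t
mu_theory = np.abs(np.cos(2 * np.pi * f * t))              # E[R] * |cos(2*pi*f*t)|
print(np.max(np.abs(mu_hat - mu_theory)))                  # small for a large ensemble
```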

6 EE571 Autocorrelation. Definition: the autocorrelation function of the stochastic process X(t) is R_X(t, τ) = E[X(t)X(t+τ)]. The autocorrelation function of the random sequence X_n is R_X[m, k] = E[X_m X_{m+k}]. P.S. If X(t) and X_n are complex-valued, then R_X(t, τ) = E[X(t)X*(t+τ)] and R_X[m, k] = E[X_m X*_{m+k}], where X*(t+τ) and X*_{m+k} denote the complex conjugates.
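In the same spirit as the previous sketch, R_X(t, τ) can be estimated by averaging the product of two samples across an ensemble of realizations recorded on a common time grid. A small helper for real and complex data (array shapes and names are illustrative, not from the slides):

```python
import numpy as np

def autocorr_estimate(ensemble: np.ndarray, i: int, j: int) -> float:
    """Estimate R_X(t_i, t_j - t_i) = E[X(t_i) X(t_j)] for a real-valued process.

    ensemble: array of shape (n_realizations, n_times), one realization per row;
    column i holds the ensemble of samples X(t_i).
    """
    return float(np.mean(ensemble[:, i] * ensemble[:, j]))

def autocorr_estimate_complex(ensemble: np.ndarray, i: int, j: int) -> complex:
    """Same estimate with the slide's conjugation convention E[X(t_i) X*(t_j)]."""
    return complex(np.mean(ensemble[:, i] * np.conj(ensemble[:, j])))
```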

7 EE571 Complex Processes and Vector Processes. Definitions: the complex process Z(t) = X(t) + jY(t) is specified in terms of the joint statistics of the real processes X(t) and Y(t). A vector process (n-dimensional process) is a family of n stochastic processes.

8 EE571 Example 1.11 Find the autocorrelation R_X(t, τ) of the process X(t) = r cos(ωt + θ), where the random variables r and θ are independent and θ is uniform in the interval (−π, π). Sol:
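A sketch of the calculation (not the slide's worked solution), using only the independence of r and θ and the identity cos A cos B = ½[cos(A − B) + cos(A + B)]:
R_X(t, τ) = E[r²] E[cos(ωt + θ) cos(ω(t + τ) + θ)]
= (E[r²]/2) E[cos(ωτ) + cos(ω(2t + τ) + 2θ)]
= (E[r²]/2) cos(ωτ),
since cos(ω(2t + τ) + 2θ) averages to zero when θ is uniform over a full period. The result depends only on the lag τ.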

9 EE571 Cross-correlation. Definition: the cross-correlation function of two stochastic processes X(t) and Y(t) is R_XY(t, τ) = E[X(t)Y*(t+τ)]. The cross-correlation function of two random sequences X_n and Y_n is R_XY[m, k] = E[X_m Y*_{m+k}]. If R_XY(t, τ) = 0, then X(t) and Y(t) are called orthogonal.

10 EE571 Autocovariance. Definition: the autocovariance function of the stochastic process X(t) is C_X(t, τ) = Cov[X(t), X(t+τ)] = E{[X(t) − μ_X(t)][X*(t+τ) − μ*_X(t+τ)]}. The autocovariance function of the random sequence X_n is C_X[m, k] = Cov[X_m, X_{m+k}] = E{[X_m − μ_X(m)][X*_{m+k} − μ*_X(m+k)]}.

11 EE571 Example 1.12 The input to a digital filter is an iid random sequence …, X_{−1}, X_0, X_1, X_2, … with E[X_i] = 0 and Var[X_i] = 1. The output is a random sequence …, Y_{−1}, Y_0, Y_1, Y_2, … related to the input sequence by the formula Y_n = X_n + X_{n−1} for all integers n. Find the expected value E[Y_n] and the autocovariance C_Y[m, k]. Sol:
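A quick numerical check (a sketch, not the slide's solution): with iid zero-mean, unit-variance inputs one expects E[Y_n] = 0, C_Y[m, 0] = 2, C_Y[m, ±1] = 1 and C_Y[m, k] = 0 for |k| > 1. The time average over one long realization stands in for the ensemble average, which is reasonable for this stationary sequence (ergodicity is discussed later in this part); the Gaussian draw for X_n is only for convenience.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal(1_000_000)   # iid, E[X_i] = 0, Var[X_i] = 1 (Gaussian for convenience)
Y = X[1:] + X[:-1]                   # Y_n = X_n + X_{n-1}

print(Y.mean())                      # ~ 0, the expected value E[Y_n]
for k in range(4):                   # since E[Y_n] = 0, C_Y[m, k] = E[Y_m Y_{m+k}]
    print(k, np.mean(Y[: len(Y) - k] * Y[k:]))   # ~ 2, 1, 0, 0
```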

12 EE571 Example 1.13 Suppose that X(t) is a process with E[X(t)] = 3 and R_X(t, τ) = 9 + 4e^{−0.2|τ|}. Find the mean, variance, and covariance of the random variables Z = X(5) and W = X(8). Sol:
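A sketch of the arithmetic (using the relation C_X(t, τ) = R_X(t, τ) − μ_X(t)μ_X(t+τ) of Theorem 1.2 below): E[Z] = E[W] = 3; Var[Z] = Var[W] = C_X(t, 0) = (9 + 4) − 9 = 4; Cov[Z, W] = C_X(5, 3) = 9 + 4e^{−0.2·3} − 9 = 4e^{−0.6} ≈ 2.195.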

13 EE571 Cross-covariance. Definition: the cross-covariance function of two stochastic processes X(t) and Y(t) is C_XY(t, τ) = Cov[X(t), Y(t+τ)] = E{[X(t) − μ_X(t)][Y*(t+τ) − μ*_Y(t+τ)]}. The cross-covariance function of two random sequences X_n and Y_n is C_XY[m, k] = Cov[X_m, Y_{m+k}] = E{[X_m − μ_X(m)][Y*_{m+k} − μ*_Y(m+k)]}. Uncorrelated: C_XY(t, τ) = 0 and C_XY[m, k] = 0.

14 EE571 Correlation Coefficient, a-Dependent. Definitions: the correlation coefficient of the process X(t) is ρ_X(t, τ) = C_X(t, τ) / √(C_X(t, 0) C_X(t+τ, 0)). The process X(t) is called a-dependent if the values X(t) for t < t_0 and those for t > t_0 + a are mutually independent for every t_0; then C_X(t, τ) = 0 for |τ| > a. The process X(t) is called correlation a-dependent if C_X(t, τ) satisfies C_X(t, τ) = 0 for |τ| > a.

15 EE571 Example 1.14 (a) Find the autocorrelation R_X(t, τ) if X(t) = a e^{jωt}. (b) Find the autocorrelation R_X(t, τ) and the autocovariance C_X(t, τ) if X(t) = Σ_i a_i e^{jω_i t}, where the random variables a_i are uncorrelated with zero mean and variance σ_i². Sol:

16 EE571 Theorem 1.2 The autocorrelation and autocovariance functions of a stochastic process X(t) satisfy C_X(t, τ) = R_X(t, τ) − μ_X(t) μ*_X(t+τ). The autocorrelation and autocovariance functions of the random sequence X_n satisfy C_X[m, k] = R_X[m, k] − μ_X(m) μ*_X(m+k). The cross-correlation and cross-covariance functions of two stochastic processes X(t) and Y(t) satisfy C_XY(t, τ) = R_XY(t, τ) − μ_X(t) μ*_Y(t+τ). The cross-correlation and cross-covariance functions of two random sequences X_n and Y_n satisfy C_XY[m, k] = R_XY[m, k] − μ_X(m) μ*_Y(m+k).

17 EE571 Stationary Processes. Definition: a stochastic process X(t) is stationary if and only if, for all sets of time instants t_1, t_2, …, t_m and any time difference τ, f_{X(t_1)…X(t_m)}(x_1, …, x_m) = f_{X(t_1+τ)…X(t_m+τ)}(x_1, …, x_m). A random sequence X_n is stationary if and only if, for any set of integer time instants n_1, n_2, …, n_m and integer time difference k, f_{X_{n_1}…X_{n_m}}(x_1, …, x_m) = f_{X_{n_1+k}…X_{n_m+k}}(x_1, …, x_m). Also called Strict-Sense Stationary (SSS).

18 EE571 Theorem 1.3 Let X(t) be a stationary random process. For constants a > 0 and b, Y(t) = aX(t) + b is also a stationary process. Pf: X(t) is stationary, so its joint distributions are shift-invariant; the same then holds for Y(t), hence Y(t) is stationary.
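One way to fill in the step (a sketch, for a > 0): for any y_1, …, y_m,
F_{Y(t_1+τ)…Y(t_m+τ)}(y_1, …, y_m) = F_{X(t_1+τ)…X(t_m+τ)}((y_1 − b)/a, …, (y_m − b)/a)
= F_{X(t_1)…X(t_m)}((y_1 − b)/a, …, (y_m − b)/a)   (stationarity of X(t))
= F_{Y(t_1)…Y(t_m)}(y_1, …, y_m),
so all joint distributions of Y(t) are shift-invariant and Y(t) is stationary.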

19 EE571 Theorem 1.4(a) For a stationary process X(t), the expected value, the autocorrelation, and the autocovariance have the following properties for all t: (a) μ_X(t) = μ_X; (b) R_X(t, τ) = R_X(0, τ) = R_X(τ); (c) C_X(t, τ) = R_X(τ) − μ_X² = C_X(τ). Pf:

20 EE571 Theorem 1.4(b) For a stationary random sequence X_n, the expected value, the autocorrelation, and the autocovariance satisfy for all n: (a) E[X_n] = μ_X; (b) R_X[n, k] = R_X[0, k] = R_X[k]; (c) C_X[n, k] = R_X[k] − μ_X² = C_X[k]. Pf: D.I.Y.

21 EE571 Example 1.15 At the receiver of an AM radio, the received signal contains a cosine carrier signal at the carrier frequency f_c with a random phase Θ that is a sample value of the uniform (0, 2π) random variable. The received carrier signal is X(t) = A cos(2πf_c t + Θ). What are the expected value and autocorrelation of the process X(t)? Sol:
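A numerical sanity check of this example (a sketch; the parameter values below are arbitrary): for Θ uniform on (0, 2π) one expects E[X(t)] = 0 and R_X(t, τ) = (A²/2) cos(2πf_c τ), independent of t, which is exactly what the wide sense stationarity definition on the next slide requires.

```python
import numpy as np

rng = np.random.default_rng(2)
A, fc, n = 1.5, 5.0, 200_000          # arbitrary amplitude, carrier frequency, ensemble size
theta = rng.uniform(0.0, 2 * np.pi, size=n)

t, tau = 0.3, 0.07                    # arbitrary test instants
x_t = A * np.cos(2 * np.pi * fc * t + theta)
x_t_tau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)

print(x_t.mean())                                                       # ~ 0
print(np.mean(x_t * x_t_tau), A**2 / 2 * np.cos(2 * np.pi * fc * tau))  # agree up to Monte Carlo error
```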

22 EE571 Wide Sense Stationary Processes. Definition: X(t) is a wide sense stationary (WSS) stochastic process if and only if, for all t, E[X(t)] = μ_X and R_X(t, τ) = R_X(0, τ) = R_X(τ) (ref: Example 1.15). X_n is a wide sense stationary random sequence if and only if, for all n, E[X_n] = μ_X and R_X[n, k] = R_X[0, k] = R_X[k]. Q: Does SSS imply WSS? Does WSS imply SSS?

23 EE571 Theorem 1.5 For a wide sense stationary stochastic process X(t), the autocorrelation function R_X(τ) has the following properties: R_X(0) ≥ 0, R_X(τ) = R_X(−τ), and R_X(0) ≥ |R_X(τ)|. If X_n is a wide sense stationary random sequence: R_X[0] ≥ 0, R_X[k] = R_X[−k], and R_X[0] ≥ |R_X[k]|. Pf:

24 EE571 Average Power. Definition: the average power of a wide sense stationary process X(t) is R_X(0) = E[X²(t)]. The average power of a wide sense stationary random sequence X_n is R_X[0] = E[X_n²]. Example: x(t) = v²(t)/R = i²(t)R models the instantaneous power dissipated in a resistance R.
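As a consistency check (not on the slide): for the carrier X(t) = A cos(2πf_c t + Θ) of Example 1.15, R_X(0) = A²/2, the familiar average power of a sinusoid of amplitude A in a 1 Ω normalization.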

25 EE571 Ergodic Processes. Definition: a stationary random process X(t) is called ergodic if its ensemble averages equal the corresponding time averages. For a stationary process X(t), we can view the time average X̄(T) = (1/2T) ∫_{−T}^{T} X(t) dt as an estimate of μ_X.

26 EE571 Theorem 1.6 Let X(t) be a stationary process with expected value μ_X and autocovariance C_X(τ). If ∫_{−∞}^{∞} |C_X(τ)| dτ < ∞, then X̄(T) is an unbiased, consistent sequence of estimates of μ_X. Pf: Unbiased: E[X̄(T)] = μ_X. Consistent sequence: we must show that lim_{T→∞} Var[X̄(T)] = 0.
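A sketch of the time-average estimator in action, under an assumed model (not from the slides): a discrete-time sequence X_n = μ + V_n, where V_n is zero-mean AR(1) noise, has an absolutely summable autocovariance, so its running time average converges to μ_X, mirroring the continuous-time statement of Theorem 1.6.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, a, n = 3.0, 0.9, 200_000
w = rng.standard_normal(n)

v = np.empty(n)
v[0] = w[0]
for i in range(1, n):
    v[i] = a * v[i - 1] + w[i]        # zero-mean AR(1) noise with summable autocovariance
x = mu + v                            # stationary (in steady state) with mu_X = 3

print(x.mean())                       # time average -> mu_X = 3 for large n
```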

27 EE571 Jointly Wide Sense Stationary Processes. Continuous-time random processes X(t) and Y(t) are jointly wide sense stationary if X(t) and Y(t) are both wide sense stationary and the cross-correlation depends only on the time difference between the two random variables: R_XY(t, τ) = R_XY(τ). Random sequences X_n and Y_n are jointly wide sense stationary if X_n and Y_n are both wide sense stationary and the cross-correlation depends only on the index difference between the two random variables: R_XY[m, k] = R_XY[k].

28 EE571 Theorem 1.7 If X(t) and Y(t) are jointly wide sense stationary continuous-time processes, then R_XY(τ) = R_YX(−τ). If X_n and Y_n are jointly wide sense stationary random sequences, then R_XY[k] = R_YX[−k]. Pf:
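A sketch of the argument for real-valued processes: R_XY(τ) = E[X(t)Y(t+τ)]; substituting s = t + τ gives E[Y(s)X(s − τ)] = R_YX(s, −τ) = R_YX(−τ), the last step using joint wide sense stationarity. The sequence case is identical with the substitution m' = m + k.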

29 EE571 Example 1.16 Suppose we are interested in X(t) but we can observe only Y(t) = X(t) + N(t), where N(t) is a noise process. Assume X(t) and N(t) are independent WSS processes with E[X(t)] = μ_X and E[N(t)] = μ_N = 0. Is Y(t) WSS? Are X(t) and Y(t) jointly WSS? Are Y(t) and N(t) jointly WSS? Sol:
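A sketch of the reasoning for real-valued processes (not the slide's worked solution): E[Y(t)] = μ_X + μ_N = μ_X, a constant, and by independence R_Y(t, τ) = R_X(τ) + μ_X μ_N + μ_N μ_X + R_N(τ) = R_X(τ) + R_N(τ), which depends only on τ, so Y(t) is WSS. Likewise R_XY(t, τ) = R_X(τ) + μ_X μ_N = R_X(τ) and R_YN(t, τ) = μ_X μ_N + R_N(τ) = R_N(τ), so both pairs are jointly WSS.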

30 EE571 Example 1.17 X_n is a WSS random sequence with autocorrelation function R_X[k]. The random sequence Y_n is obtained from X_n by reversing the sign of every other random variable in X_n: Y_n = (−1)^n X_n. (a) Find R_Y[n, k] in terms of R_X[k]. (b) Find R_XY[n, k] in terms of R_X[k]. (c) Is Y_n WSS? (d) Are X_n and Y_n jointly WSS? Sol:
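A sketch of the reasoning (the slide's own solution is not shown): (a) R_Y[n, k] = E[(−1)^n X_n (−1)^{n+k} X_{n+k}] = (−1)^k R_X[k]. (b) R_XY[n, k] = E[X_n (−1)^{n+k} X_{n+k}] = (−1)^{n+k} R_X[k]. (c) The autocorrelation of Y_n depends only on k, and E[Y_n] = (−1)^n μ_X is constant whenever μ_X = 0, so Y_n is WSS at least in the zero-mean case. (d) R_XY[n, k] depends on n as well as k, so X_n and Y_n are not jointly WSS.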