ELEC 303 – Random Signals Lecture 20 – Random processes


ELEC 303 – Random Signals Lecture 20 – Random processes Dr. Farinaz Koushanfar ECE Dept., Rice University Nov 11, 2010

Lecture outline
- Basic concepts
- Random processes and linear systems
- Power spectral density of stationary processes
- Power spectra in LTI systems
- Power spectral density of a sum process
- Gaussian processes

RP and linear systems
- When a RP passes through a linear time-invariant (LTI) system, the output is also a RP.
- Assume a stationary process X(t) is the input to an LTI system with impulse response h(t), producing the output process Y(t).
- Under what conditions is the output process stationary?
- Under what conditions are the input and output jointly stationary?
- Find the output mean, the autocorrelation, and the cross correlation.
(Block diagram: X(t) → h(t) → Y(t))

Linear time-invariant systems
- Let X(t) be a stationary RP with mean mX and autocorrelation function RX(τ), applied to an LTI system with impulse response h(t).
- Then the input and output processes X(t) and Y(t) are jointly stationary, with
  mY = mX H(0)
  RXY(τ) = RX(τ) ⋆ h(−τ)
  RY(τ) = RX(τ) ⋆ h(τ) ⋆ h(−τ)

The response mean
- Using the convolution integral to relate the output Y(t) to the input X(t):
  Y(t) = ∫ X(τ) h(t−τ) dτ
- Taking expectations,
  mY = E[Y(t)] = ∫ E[X(τ)] h(t−τ) dτ = mX ∫ h(u) du = mX H(0)
- This proves that mY is independent of t.
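As a numerical sanity check (a sketch, not from the slides), the same result holds for a discrete-time LTI filter: the output mean equals the input mean times the DC gain H(0) = Σ h[n]. The filter coefficients and input mean below are hypothetical.

```python
import numpy as np

# Check m_Y = m_X * H(0) for a discrete moving-average filter
# applied to a stationary input with mean m_X (values are hypothetical).
rng = np.random.default_rng(0)
h = np.array([0.5, 0.3, 0.2])            # impulse response (hypothetical)
m_X = 2.0
x = m_X + rng.standard_normal(200_000)   # stationary input with mean m_X
y = np.convolve(x, h, mode="valid")      # Y = X * h

H0 = h.sum()                             # DC gain H(0) = sum of h[n]
print(abs(y.mean() - m_X * H0) < 0.01)   # → True
```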

Cross correlation
- The cross correlation function between the output and the input is
  RXY(t1, t2) = E[X(t1)Y(t2)] = ∫ h(u) RX(t1 − t2 + u) du
- This shows that RXY(t1, t2) depends only on τ = t1 − t2:
  RXY(τ) = ∫ h(u) RX(τ + u) du = RX(τ) ⋆ h(−τ)

Output autocorrelation
- The autocorrelation function of the output is
  RY(t1, t2) = E[Y(t1)Y(t2)] = ∫ h(u) RXY(t1 − t2 − u) du, i.e.,
  RY(τ) = RXY(τ) ⋆ h(τ) = RX(τ) ⋆ h(τ) ⋆ h(−τ)
- RY and RXY depend only on τ = t1 − t2, so the output process is stationary and the input and output are jointly stationary.

Power spectral density of a stationary process
- If the signals in the RP vary slowly, the power of the RP is concentrated mainly at low frequencies.
- If the signals change very fast, most of the power is concentrated at high frequencies.
- The power spectral density of a RP X(t), denoted SX(f), shows the strength of the power in the RP as a function of frequency.
- The unit of SX(f) is Watts/Hz.

Wiener-Khinchin theorem
- For a stationary RP X(t), the power spectral density is the Fourier transform of the autocorrelation function, i.e.,
  SX(f) = F{RX(τ)} = ∫ RX(τ) e^(−j2πfτ) dτ
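A quick numerical illustration (a sketch, not from the slides): for the discrete-time autocorrelation RX[k] = a^|k| (an AR(1)-type example of my choosing), the Wiener-Khinchin transform has the closed form SX(f) = (1 − a²)/(1 − 2a·cos(2πf) + a²), and summing the (truncated) transform reproduces it.

```python
import numpy as np

# Wiener-Khinchin numerically: DTFT of R_X[k] = a^|k| vs. its closed form
# S_X(f) = (1 - a^2) / (1 - 2a cos(2*pi*f) + a^2).  Example is hypothetical.
a = 0.5
k = np.arange(-200, 201)                 # lags (truncated; a^200 is negligible)
R = a ** np.abs(k)                       # autocorrelation sequence
f = 0.1                                  # evaluate the PSD at one frequency
S_dtft = np.sum(R * np.exp(-2j * np.pi * f * k)).real   # transform of R_X
S_closed = (1 - a**2) / (1 - 2*a*np.cos(2*np.pi*f) + a**2)
print(abs(S_dtft - S_closed) < 1e-10)    # → True
```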

Example 2
- Randomly choose a phase Θ ~ U[0, 2π].
- Generate a sinusoid with fixed amplitude A and fixed frequency f0, but the random phase Θ.
- The RP is X(t) = A cos(2πf0t + Θ).
- From the previous lecture, we know RX(τ) = (A²/2) cos(2πf0τ), so
  SX(f) = (A²/4) [δ(f − f0) + δ(f + f0)]
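This autocorrelation can be checked by Monte Carlo (a sketch, not part of the slides; the test point values are arbitrary): averaging X(t)X(t+τ) over many random phases should approach (A²/2)cos(2πf0τ).

```python
import numpy as np

# Monte-Carlo check that the random-phase sinusoid
# X(t) = A cos(2*pi*f0*t + Theta), Theta ~ U[0, 2*pi],
# has autocorrelation R_X(tau) = (A^2/2) cos(2*pi*f0*tau).
rng = np.random.default_rng(1)
A, f0, t, tau = 2.0, 5.0, 0.3, 0.07       # arbitrary test point
theta = rng.uniform(0, 2*np.pi, 500_000)  # one phase per realization
x1 = A * np.cos(2*np.pi*f0*t + theta)
x2 = A * np.cos(2*np.pi*f0*(t + tau) + theta)
R_est = np.mean(x1 * x2)                  # sample E[X(t) X(t+tau)]
R_true = (A**2 / 2) * np.cos(2*np.pi*f0*tau)
print(abs(R_est - R_true) < 0.02)         # → True
```

Note that the estimate does not depend on the choice of t, consistent with stationarity.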

Example 3
- X(t) = X, where X ~ U[−1, 1] is a random variable.
- In this case, RX(τ) = E[X²] = 1/3 for all τ.
- Thus, SX(f) = (1/3) δ(f).
- For each realization of the RP, we have a different power spectrum.

Power spectral density
- The power content of a RP is the sum of the powers at all frequencies in that RP.
- To find the total power, integrate the power spectral density over all frequencies.
- Since SX(f) is the Fourier transform of RX(τ), RX(τ) is the inverse Fourier transform of SX(f):
  RX(τ) = ∫ SX(f) e^(j2πfτ) df
- Substituting τ = 0, we get
  PX = RX(0) = ∫ SX(f) df

Example 4
- Find the power in the process of Example 2:
  PX = ∫ (A²/4) [δ(f − f0) + δ(f + f0)] df = A²/2

Translation to frequency domain
- For an LTI system with a stationary input, translate the input/output relationships to the frequency domain.
- Taking the Fourier transform of both sides of the time-domain relations gives
  mY = mX H(0)
  SY(f) = |H(f)|² SX(f)
- The first relation says the output mean is the input mean scaled by the DC gain of the system.
- Phase is irrelevant for power: only the magnitude of H(f) affects the power spectrum, i.e., power depends on amplitude, not phase.
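The relation SY(f) = |H(f)|² SX(f) can be sanity-checked in discrete time (a sketch, not from the slides; filter and noise level are hypothetical): for white input with flat SX(f) = σ², the output power is σ² ∫ |H(f)|² df, which by Parseval equals σ² Σ h[n]².

```python
import numpy as np

# For white input (flat S_X = sigma^2), S_Y(f) = |H(f)|^2 sigma^2, so the
# output power is sigma^2 * sum(h[n]^2) by Parseval.  Values hypothetical.
rng = np.random.default_rng(2)
sigma = 1.5
h = np.array([0.4, -0.2, 0.1])                # impulse response (hypothetical)
x = sigma * rng.standard_normal(400_000)      # zero-mean white noise
y = np.convolve(x, h, mode="valid")
P_meas = np.mean(y**2)                        # measured output power
P_theory = sigma**2 * np.sum(h**2)            # sigma^2 * sum h^2
print(abs(P_meas - P_theory) < 0.01)          # → True
```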

Example 5
- If a RP passes through a differentiator, H(f) = j2πf.
- Then mY = mX H(0) = 0.
- Also, SY(f) = 4π²f² SX(f).

Cross correlation in frequency domain
- Define the cross spectral density SXY(f) as the Fourier transform of the cross correlation:
  SXY(f) = F{RXY(τ)} = SX(f) H*(f)
- Since RYX(τ) = RXY(−τ), we have
  SYX(f) = SXY*(f) = SX(f) H(f)
- Although SX(f) and SY(f) are real nonnegative functions, SXY(f) and SYX(f) can in general be complex.
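A discrete-time sanity check of RXY(τ) = RX(τ) ⋆ h(−τ) (a sketch, not from the slides): for white input with RX[k] = σ²δ[k], the relation collapses to E[X[n]Y[n+k]] = σ² h[k], which is easy to measure. The filter is hypothetical.

```python
import numpy as np

# For white input, R_X[k] = sigma^2 * delta[k], so the input/output
# cross correlation E[x[n] y[n+k]] reduces to sigma^2 * h[k].
rng = np.random.default_rng(5)
sigma = 1.0
h = np.array([0.7, 0.2, 0.1])                # impulse response (hypothetical)
x = sigma * rng.standard_normal(400_000)
y = np.convolve(x, h)[:len(x)]               # causal filter output
k = 1
c = np.mean(x[:-k] * y[k:])                  # sample E[x[n] y[n+k]]
print(abs(c - sigma**2 * h[k]) < 0.01)       # → True
```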

Example 6
- Randomly choose a phase Θ ~ U[0, 2π].
- Generate a sinusoid with fixed amplitude A and fixed frequency f0, but the random phase Θ: X(t) = A cos(2πf0t + Θ).
- X(t) goes through a differentiator, H(f) = j2πf.
- Then SY(f) = 4π²f² SX(f) = A²π²f0² [δ(f − f0) + δ(f + f0)], and
  SXY(f) = SX(f) H*(f) = (jA²πf0/2) [δ(f + f0) − δ(f − f0)]

Example 7
- X(t) = X, where X ~ U[−1, 1] is a random variable.
- If this process goes through a differentiator, then
  SY(f) = 4π²f² (δ(f)/3) = 0
  SXY(f) = −j2πf (δ(f)/3) = 0

Power spectral density of a sum process
- Let Z(t) = X(t) + Y(t), where X(t) and Y(t) are jointly stationary RPs.
- Z(t) is a stationary process with
  RZ(τ) = RX(τ) + RY(τ) + RXY(τ) + RYX(τ)
- Taking the Fourier transform of both sides:
  SZ(f) = SX(f) + SY(f) + 2 Re[SXY(f)]
- The power spectral density of the sum process is the sum of the power spectral densities of the individual processes, plus a term that depends on the cross correlation.
- If X(t) and Y(t) are uncorrelated, then RXY(τ) = mX mY.
- If at least one of the processes is zero mean, RXY(τ) = 0, and we get SZ(f) = SX(f) + SY(f).
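Integrating SZ(f) = SX(f) + SY(f) + 2 Re[SXY(f)] over all frequencies gives the power identity RZ(0) = RX(0) + RY(0) + 2 RXY(0), which is easy to verify on sample data (a sketch with hypothetical correlated sequences, not from the slides):

```python
import numpy as np

# Power of a sum: E[Z^2] = E[X^2] + E[Y^2] + 2 E[XY], the tau = 0 slice of
# R_Z = R_X + R_Y + R_XY + R_YX.  Data below are hypothetical.
rng = np.random.default_rng(3)
x = rng.standard_normal(100_000)
y = 0.5 * x + rng.standard_normal(100_000)    # correlated with x
z = x + y
P_z = np.mean(z**2)
P_sum = np.mean(x**2) + np.mean(y**2) + 2*np.mean(x*y)
print(abs(P_z - P_sum) < 1e-9)                # → True (exact up to rounding)
```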

Example 8
- Let X(t) = A cos(2πf0t + Θ) be the random-phase sinusoid of Example 2, and let Z(t) = X(t) + dX(t)/dt.
- Here SXY(f) = SX(f) H*(f) = (jA²πf0/2) [δ(f + f0) − δ(f − f0)], so Re[SXY(f)] = 0.
- Thus SZ(f) = SX(f) + SY(f) = A² (1/4 + π²f0²) [δ(f + f0) + δ(f − f0)].

Gaussian processes
- Widely used in communications, because thermal noise in electronics, produced by the random movement of electrons, is closely modeled by a Gaussian RP.
- In a Gaussian RP, the RVs obtained by sampling the process at different instants of time are jointly Gaussian.
- Definition 1: A random process X(t) is a Gaussian process if for all n and all (t1, t2, …, tn), the RVs {X(ti)}, i = 1, …, n, have a jointly Gaussian density function.

Gaussian processes (cont’d)
- Definition 2: The random processes X(t) and Y(t) are jointly Gaussian if for all n, m and all (t1, t2, …, tn) and (τ1, τ2, …, τm), the random vector ({X(ti)}, i = 1, …, n; {Y(τj)}, j = 1, …, m) has an (n+m)-dimensional jointly Gaussian density function.
- If X(t) and Y(t) are jointly Gaussian, then each of them is individually Gaussian.
- The reverse is not always true.
- Gaussian processes have important and unique properties.

Important properties of Gaussian processes
- Property 1: If a Gaussian process X(t) is passed through an LTI system, then the output process Y(t) is also a Gaussian process; moreover, Y(t) and X(t) are jointly Gaussian processes.
- Property 2: For jointly Gaussian processes, uncorrelatedness and independence are equivalent.
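Property 1 can be illustrated numerically (a sketch, not from the slides): filtering Gaussian white noise through a hypothetical LTI filter and checking that the output's excess kurtosis stays near 0, as it must for a Gaussian marginal. (This checks a necessary condition only, not joint Gaussianity.)

```python
import numpy as np

# Filter Gaussian white noise with an LTI system; the output should remain
# Gaussian, so its excess kurtosis E[Y_n^4] - 3 should be near 0.
rng = np.random.default_rng(4)
x = rng.standard_normal(500_000)         # Gaussian input
h = np.array([0.6, 0.3, 0.1])            # impulse response (hypothetical)
y = np.convolve(x, h, mode="valid")
yn = (y - y.mean()) / y.std()            # standardize the output
excess_kurt = np.mean(yn**4) - 3.0       # 0 for a Gaussian
print(abs(excess_kurt) < 0.05)           # → True, consistent with Gaussianity
```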