Discrete-time Random Signals


Discrete-time Random Signals Until now, we have assumed that the signals are deterministic, i.e., each value of a sequence is uniquely determined. In many situations, the processes that generate signals are so complex as to make precise description of a signal extremely difficult or undesirable. A random or stochastic signal is considered to be characterized by a set of probability density functions.

Stochastic Processes A random (or stochastic) process (or signal) is an indexed family of random variables characterized by a set of probability distribution functions: a sequence x[n], −∞ < n < ∞. Each individual sample x[n] is assumed to be an outcome of some underlying random variable Xn. The difference between a single random variable and a random process is that for a random variable the outcome of a random-sampling experiment is mapped into a number, whereas for a random process the outcome is mapped into a sequence.

Stochastic Processes (continued) Probability density function of x[n]: Joint distribution of x[n] and x[m]: E.g., x1[n] = An cos(ωn + θn), where An and θn are random variables for all −∞ < n < ∞; then x1[n] is a random process.
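The two densities referred to above are not shown in the transcript; in the standard notation they are p_{x[n]}(x, n) for the individual sample x[n], and p_{x[n],x[m]}(x_n, n, x_m, m) for the pair (x[n], x[m]).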

Independence and Stationarity x[n] and x[m] are independent iff their joint density factors into the product of the individual densities. x is a stationary process iff its joint distributions are invariant to a time shift k, for all k. That is, the joint distribution of x[n] and x[m] depends only on the time difference m − n.
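The defining equations are omitted from the transcript; they are presumably the standard ones:
Independence:  p_{x[n],x[m]}(x_n, n, x_m, m) = p_{x[n]}(x_n, n) \, p_{x[m]}(x_m, m)
Stationarity:  p_{x[n+k],x[m+k]}(x, n+k, y, m+k) = p_{x[n],x[m]}(x, n, y, m)  for all k.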

Stationarity (continued) In particular, when m = n for a stationary process, p_{x[n]}(x, n) = p_{x[n+k]}(x, n+k) for all k. This implies that the first-order density of x[n] is shift-invariant.

Stochastic Processes vs. Deterministic Signals In many of the applications of discrete-time signal processing, random processes serve as models for signals in the sense that a particular signal can be considered a sample sequence of a random process. Although such signals are unpredictable – making a deterministic approach to signal representation inappropriate – certain average properties of the ensemble can be determined, given the probability law of the process.

Expectation Mean (or average): E{·} denotes the expectation operator. For independent random variables, the expectation of a product equals the product of the expectations (see below).
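In the usual notation:
m_{x[n]} = E\{x[n]\} = \int_{-\infty}^{\infty} x \, p_{x[n]}(x, n) \, dx
E\{x[n]\, y[n]\} = E\{x[n]\}\, E\{y[n]\}   (for independent x[n] and y[n]).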

Mean Square Value and Variance Mean squared value Variance
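The standard definitions, not shown in the transcript, are:
E\{|x[n]|^2\} = \int_{-\infty}^{\infty} |x|^2 \, p_{x[n]}(x, n)\, dx   (mean-square value)
\sigma_{x[n]}^2 = E\{|x[n] - m_{x[n]}|^2\} = E\{|x[n]|^2\} - |m_{x[n]}|^2   (variance).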

Autocorrelation and Autocovariance
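In the usual notation:
\varphi_{xx}[n, m] = E\{x[n]\, x^*[m]\}   (autocorrelation)
\gamma_{xx}[n, m] = E\{(x[n] - m_{x[n]})(x[m] - m_{x[m]})^*\} = \varphi_{xx}[n, m] - m_{x[n]}\, m_{x[m]}^*   (autocovariance).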

Stationary Process For a stationary process, the autocorrelation depends only on the time difference m − n. Thus, for a stationary process, the mean is a constant and the autocorrelation is a function of m − n alone. If we denote the time difference by k, we have the single-index autocorrelation sequence shown below.
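In equations:
m_{x[n]} = m_x   for all n,   and   \varphi_{xx}[n+k, n] = \varphi_{xx}[k] = E\{x[n+k]\, x^*[n]\}.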

Wide-sense Stationary In many instances, we encounter random processes that are not stationary in the strict sense. If the following equations hold, we call the process wide-sense stationary (w. s. s.).
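The w.s.s. conditions are:
m_{x[n]} = m_x   (constant, independent of n)
\varphi_{xx}[n+k, n] = \varphi_{xx}[k]   (dependent only on the lag k).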

Time Averages For any single sample sequence x[n], its time average is defined below; the time-average autocorrelation is defined analogously.
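In the usual notation:
\langle x[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n]
\langle x[n+k]\, x^*[n] \rangle = \lim_{L \to \infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n+k]\, x^*[n].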

Ergodic Process A stationary random process for which time averages equal ensemble averages is called an ergodic process:
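That is,
\langle x[n] \rangle = E\{x[n]\} = m_x   and   \langle x[n+k]\, x^*[n] \rangle = \varphi_{xx}[k].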

Ergodic Process (continued) It is common to assume that a given sequence is a sample sequence of an ergodic random process, so that averages can be computed from a single sequence. In practice, we cannot compute the limits; instead we compute finite-length averages over the L available samples. Such quantities are often computed as estimates of the mean, variance, and autocorrelation.
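For a record of L samples, the usual finite-length estimates are
\hat{m}_x = \frac{1}{L}\sum_{n=0}^{L-1} x[n], \qquad
\hat{\sigma}_x^2 = \frac{1}{L}\sum_{n=0}^{L-1} |x[n] - \hat{m}_x|^2, \qquad
\hat{\varphi}_{xx}[k] = \frac{1}{L}\sum_{n=0}^{L-1-k} x[n+k]\, x^*[n].
A minimal NumPy sketch of these estimators for a real-valued sequence; the function name and the white-noise test record are illustrative, not from the original slides:

import numpy as np

def sample_estimates(x, max_lag):
    """Finite-length estimates of mean, variance, and autocorrelation
    from a single sample sequence x (ergodicity assumed)."""
    x = np.asarray(x, dtype=float)
    L = len(x)
    mean_hat = x.mean()
    var_hat = np.mean((x - mean_hat) ** 2)
    # Biased autocorrelation estimate: divide by L at every lag k.
    acorr_hat = np.array([np.sum(x[k:] * x[:L - k]) / L for k in range(max_lag + 1)])
    return mean_hat, var_hat, acorr_hat

# Example: a white-noise record; the lag-0 estimate approximates the
# mean-square value and the other lags come out near zero.
rng = np.random.default_rng(0)
m_hat, v_hat, phi_hat = sample_estimates(rng.normal(0.0, 1.0, 10_000), max_lag=5)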

Properties of correlation and covariance sequences Property 1:
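The equation is omitted from the transcript; Property 1 is presumably the standard relation between covariance and correlation:
\gamma_{xx}[k] = \varphi_{xx}[k] - |m_x|^2, \qquad \gamma_{xy}[k] = \varphi_{xy}[k] - m_x\, m_y^*.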

Properties of correlation and covariance sequences (continued) Property 2: Property 3:
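Properties 2 and 3 are presumably the standard ones:
Property 2:  \varphi_{xx}[0] = E\{|x[n]|^2\}  (mean-square value),  \gamma_{xx}[0] = \sigma_x^2  (variance)
Property 3:  \varphi_{xx}[-k] = \varphi_{xx}^*[k],  \gamma_{xx}[-k] = \gamma_{xx}^*[k],  \varphi_{xy}[-k] = \varphi_{yx}^*[k].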

Properties of correlation and covariance sequences (continued) Property 4:
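Property 4 is presumably the boundedness property:
|\varphi_{xy}[k]|^2 \le \varphi_{xx}[0]\, \varphi_{yy}[0], \qquad |\varphi_{xx}[k]| \le \varphi_{xx}[0], \qquad |\gamma_{xx}[k]| \le \gamma_{xx}[0].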

Properties of correlation and covariance sequences (continued) Property 5:
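Property 5 presumably states the shift invariance of the correlation sequences: if y[n] = x[n - n_0] for a fixed shift n_0, then
\varphi_{yy}[k] = \varphi_{xx}[k], \qquad \gamma_{yy}[k] = \gamma_{xx}[k].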

Fourier Transform Representation of Random Signals Since autocorrelation and autocovariance sequences are (aperiodic) one-dimensional sequences, their Fourier transforms exist and are bounded for |ω| ≤ π. Let the Fourier transforms of the autocorrelation and autocovariance sequences be as defined below.
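In the usual notation:
\Phi_{xx}(e^{j\omega}) = \sum_{k=-\infty}^{\infty} \varphi_{xx}[k]\, e^{-j\omega k}, \qquad
\Gamma_{xx}(e^{j\omega}) = \sum_{k=-\infty}^{\infty} \gamma_{xx}[k]\, e^{-j\omega k}.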

Fourier Transform Representation of Random Signals (continued) Consider the inverse Fourier transforms:
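\varphi_{xx}[k] = \frac{1}{2\pi}\int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, e^{j\omega k}\, d\omega, \qquad
\gamma_{xx}[k] = \frac{1}{2\pi}\int_{-\pi}^{\pi} \Gamma_{xx}(e^{j\omega})\, e^{j\omega k}\, d\omega.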

Fourier Transform Representation of Random Signals (continued) Consequently, setting k = 0 relates the mean-square value to the area under Φxx(e^{jω}) (see below). We denote Pxx(ω) = Φxx(e^{jω}) to be the power density spectrum (or power spectrum) of the random process x.
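Explicitly, setting k = 0:
E\{|x[n]|^2\} = \varphi_{xx}[0] = \frac{1}{2\pi}\int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega, \qquad P_{xx}(\omega) = \Phi_{xx}(e^{j\omega}).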

Power Density Spectrum The total area under the power density over [−π, π], divided by 2π, is the total power (mean-square value) of the signal. Pxx(ω) is always real-valued since φxx[k] is conjugate symmetric. For real-valued random processes, Pxx(ω) = Φxx(e^{jω}) is both real and even.

Mean and Linear System Consider a linear time-invariant system with impulse response h[n]. If x[n] is a stationary random signal with mean mx, then the output y[n] is also a stationary random signal whose mean my is obtained by taking the expectation of the convolution sum. Since the input is stationary, mx[n−k] = mx, and consequently the output mean is mx times the sum of the impulse-response samples (see below).
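Written out:
m_y = E\{y[n]\} = \sum_{k=-\infty}^{\infty} h[k]\, E\{x[n-k]\} = m_x \sum_{k=-\infty}^{\infty} h[k] = m_x\, H(e^{j0}).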

Stationary and Linear System If x[n] is a real and stationary random signal, the autocorrelation function of the output process is obtained by expanding the convolution sum (see below). Since x[n] is stationary, E{x[n−k]x[n+m−r]} depends only on the time difference m + k − r.
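The derivation, in standard notation:
\varphi_{yy}[n, n+m] = E\{y[n]\, y[n+m]\} = \sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, E\{x[n-k]\, x[n+m-r]\}
= \sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, \varphi_{xx}[m+k-r] \equiv \varphi_{yy}[m].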

Stationary and Linear System (continued) Therefore, the output autocorrelation depends only on the lag m, and the output process is also stationary. Generally, for an LTI system having a wide-sense stationary input, the output is also wide-sense stationary.

Power Density Spectrum and Linear System By substituting l = r − k, the double sum becomes a convolution of φxx[m] with chh[l], where chh[l] is the correlation of the impulse response with itself. A sequence of the form of chh[l] is called a deterministic autocorrelation sequence.
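In equations:
\varphi_{yy}[m] = \sum_{l=-\infty}^{\infty} c_{hh}[l]\, \varphi_{xx}[m-l], \qquad
c_{hh}[l] = \sum_{k=-\infty}^{\infty} h[k]\, h[l+k].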

Power Density Spectrum and Linear System (continued) Taking Fourier transforms of the convolution gives Φyy(e^{jω}) = Chh(e^{jω}) Φxx(e^{jω}), where Chh(e^{jω}) is the Fourier transform of chh[l]. For real h[n], Chh(e^{jω}) = H(e^{jω}) H*(e^{jω}) = |H(e^{jω})|². Thus:

Power Density Spectrum and Linear System (continued) We thus have the following relation between the input and output power spectra:
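P_{yy}(\omega) = \Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega}) = |H(e^{j\omega})|^2\, P_{xx}(\omega).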

Power Density Property Key property: The area under Pxx(ω) over a band of frequencies, ωa < |ω| < ωb, is proportional to the power of the signal in that band. To show this, consider an ideal band-pass filter. Let H(e^{jω}) be the frequency response of the ideal band-pass filter for the band ωa < |ω| < ωb. Note that |H(e^{jω})|² and Φxx(e^{jω}) are both even functions. Hence, the output power equals the integral of Pxx(ω) over the passband (see below).
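Explicitly, for a unit-gain ideal band-pass filter:
E\{y^2[n]\} = \varphi_{yy}[0] = \frac{1}{2\pi}\int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, P_{xx}(\omega)\, d\omega
= \frac{1}{\pi}\int_{\omega_a}^{\omega_b} P_{xx}(\omega)\, d\omega \;\ge 0,
which also shows that P_{xx}(\omega) must be nonnegative.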

White Noise (or White Gaussian Noise) A white-noise signal is a signal whose autocorrelation is an impulse at lag zero (see below); hence, its samples at different instants of time are uncorrelated. The power spectrum of a white-noise signal is a constant. The concept of white noise is very useful in quantization-error analysis.
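In equations, assuming zero mean and variance \sigma_x^2:
\varphi_{xx}[m] = \sigma_x^2\, \delta[m], \qquad P_{xx}(\omega) = \Phi_{xx}(e^{j\omega}) = \sigma_x^2 \quad \text{for all } \omega.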

White Noise (continued) The average power of a white-noise signal is therefore σx² (see below). White noise is also useful in the representation of random signals whose power spectra are not constant with frequency: a random signal y[n] with power spectrum Φyy(e^{jω}) can be modeled as the output of a linear time-invariant system with a white-noise input.
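Explicitly:
E\{|x[n]|^2\} = \varphi_{xx}[0] = \frac{1}{2\pi}\int_{-\pi}^{\pi} \sigma_x^2\, d\omega = \sigma_x^2, \qquad
\Phi_{yy}(e^{j\omega}) = \sigma_x^2\, |H(e^{j\omega})|^2.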

Cross-correlation The cross-correlation between the input and output of an LTI system is given below. That is, the cross-correlation between the input and the output is the convolution of the impulse response with the input autocorrelation sequence.
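In equations:
\varphi_{yx}[m] = E\{x[n]\, y[n+m]\} = \sum_{k=-\infty}^{\infty} h[k]\, \varphi_{xx}[m-k] = h[m] * \varphi_{xx}[m].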

Cross-correlation (continued) By further taking the Fourier transform of both sides of the above equation, we have Φyx(e^{jω}) = H(e^{jω}) Φxx(e^{jω}). This result has a useful application when the input is white noise with variance σx², in which case φyx[m] = σx² h[m] and Φyx(e^{jω}) = σx² H(e^{jω}). These equations serve as the basis for estimating the impulse or frequency response of an LTI system if it is possible to observe the output of the system in response to a white-noise input.
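A minimal numerical sketch of this identification idea, using NumPy; the example filter h, the record length, and the variable names are illustrative assumptions, not taken from the original slides:

import numpy as np

rng = np.random.default_rng(0)
sigma_x = 1.0
L = 200_000                      # long record so the time average converges well
x = rng.normal(0.0, sigma_x, L)  # white-noise input with variance sigma_x**2

h = np.array([1.0, 0.5, 0.25, 0.125])  # hypothetical "unknown" impulse response
y = np.convolve(x, h)[:L]              # observed output of the LTI system

# Estimate phi_yx[m] = E{ x[n] y[n+m] } by a finite time average, then divide
# by sigma_x**2 to recover h[m], since phi_yx[m] = sigma_x**2 * h[m].
max_lag = 8
phi_yx = np.array([np.mean(y[m:] * x[:L - m]) for m in range(max_lag)])
h_hat = phi_yx / sigma_x**2

print(np.round(h_hat, 3))  # first four values approximate h; the rest are near zero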

Remaining Materials Not Included The remaining materials from Chap. 4 will be taught in class without using slides.