OPTIMUM FILTERING

WIENER FILTER

The Wiener filter is an optimum filter based on minimizing the mean-square error between the filter output and a desired signal.

[Block diagram: the desired sequence s_d(n) plus interference due to noise v(n) forms the observed input x(n) = s(n) + v(n); the filter impulse response h(n) produces the output y(n); e(n) = s_d(n) - y(n) is the error signal. Ideally y(n) should equal s_d(n).]

Assuming that h(n) is of length N, the mean-square error is defined as

$$\mathcal{E} = E\{|e(n)|^2\} = E\left\{\left|s_d(n) - \sum_{k=0}^{N-1} h(k)\,x(n-k)\right|^2\right\}.$$

Taking the derivative of the mean-square error with respect to each coefficient and setting it equal to zero, h(n) is solved from

$$\sum_{k=0}^{N-1} h(k)\,E\{x(n-k)\,x(n-l)\} = E\{s_d(n)\,x(n-l)\}, \qquad l = 0, 1, \ldots, N-1.$$

WIENER FILTER

When expressed in terms of the autocorrelation and cross-correlation functions, this becomes

$$\sum_{k=0}^{N-1} h(k)\,R_{xx}(l-k) = R_{s_d x}(l), \qquad l = 0, 1, \ldots, N-1.$$

Since x(n) is the sum of the true signal s(n) and the noise v(n), and s_d(n) is not correlated with the noise v(n),

$$R_{xx}(l) = R_{ss}(l) + R_{vv}(l), \qquad R_{s_d x}(l) = R_{s_d s}(l).$$

WIENER FILTER

The matrix representation is

$$\mathbf{R}_{xx}\,\mathbf{h} = \mathbf{r}_{s_d x},$$

where R_xx is the N x N autocorrelation matrix, h the vector of filter coefficients, and r_{s_d x} the cross-correlation vector. The coefficients h(n) are obtained by taking the inverse of the autocorrelation matrix and multiplying it with the cross-correlation vector, h = R_xx^{-1} r_{s_d x}. The minimum mean-square error is

$$\mathcal{E}_{\min} = R_{s_d s_d}(0) - \sum_{k=0}^{N-1} h(k)\,R_{s_d x}(k).$$
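This solution is a direct linear solve. A minimal NumPy sketch (the function names are mine, and numpy.linalg.solve is used in place of an explicit matrix inverse):

```python
import numpy as np

def wiener_fir(Rxx, rsdx):
    """Solve the normal equations R_xx h = r_sdx for an N-tap FIR Wiener filter.

    Rxx  : autocorrelation lags R_xx(0)..R_xx(N-1) of the input x(n)
    rsdx : cross-correlation lags R_sdx(0)..R_sdx(N-1) of s_d(n) with x(n)
    """
    N = len(rsdx)
    # Symmetric Toeplitz autocorrelation matrix, entry (i, j) = R_xx(|i - j|).
    R = np.array([[Rxx[abs(i - j)] for j in range(N)] for i in range(N)])
    return np.linalg.solve(R, rsdx)  # equivalent to h = R^{-1} r, better conditioned

def wiener_mmse(Rsd0, h, rsdx):
    """Minimum mean-square error: R_sdsd(0) - sum_k h(k) R_sdx(k)."""
    return Rsd0 - np.dot(h, rsdx)
```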

WIENER FILTER EXAMPLE

A signal x(n) = s(n) + v(n) is observed, where v(n) is additive white Gaussian noise with zero mean and variance 0.1. The Wiener filter length is N = 4.

WIENER FILTER EXAMPLE

Writing the normal equations in matrix form and taking the inverse of the autocorrelation matrix gives the filter coefficients, and from these the minimum mean-square error. [The numerical matrix, coefficients, and error value appeared on the original slide.]
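Since the slide's numerical values are not reproduced here, the sketch below builds a comparable example under an assumed signal (a cosine, which is my assumption, not the slide's s(n)), with noise variance 0.1 and filter length 4, estimating the correlations by time averaging:

```python
import numpy as np

rng = np.random.default_rng(0)
n = np.arange(10_000)
s = np.cos(0.2 * np.pi * n)                  # assumed signal, not from the slide
v = rng.normal(0.0, np.sqrt(0.1), n.size)    # AWGN: zero mean, variance 0.1
x = s + v                                    # observed input x(n) = s(n) + v(n)
N = 4                                        # Wiener filter length

# Biased time-average correlation estimates for lags 0..N-1.
Rxx  = [np.mean(x[l:] * x[:x.size - l]) for l in range(N)]
rsdx = np.array([np.mean(s[l:] * x[:x.size - l]) for l in range(N)])

R = np.array([[Rxx[abs(i - j)] for j in range(N)] for i in range(N)])
h = np.linalg.solve(R, rsdx)                 # filter coefficients
e_min = np.mean(s * s) - h @ rsdx            # minimum mean-square error
print("h =", h, "MMSE ~", e_min)
```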

WIENER FILTER CONFIGURATION

The various configurations of the Wiener filter are often referred to as the linear estimation problem:

- s_d(n) = s(n): filtering
- s_d(n) = s(n + D), D > 0: signal prediction
- s_d(n) = s(n - D), D > 0: signal smoothing

The material presented here focuses only on filtering and prediction.
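All three configurations solve the same normal equations; only the cross-correlation side changes, since E{s(n + D) x(n - l)} = R_sx(l + D). A sketch of the lag bookkeeping (the helper name is illustrative):

```python
import numpy as np

def crosscorr(s, x, lags):
    """Biased time-average estimate of R_sx(l) = E{s(n) x(n - l)} for the
    given lags, which may be negative (as needed for smoothing)."""
    n = len(x)
    vals = []
    for l in lags:
        if l >= 0:
            vals.append(np.mean(s[l:] * x[:n - l]))
        else:
            vals.append(np.mean(s[:n + l] * x[-l:]))
    return np.array(vals)

# For an N-tap filter: filtering uses lags l = 0..N-1; prediction of
# s(n + D) uses lags l + D; smoothing of s(n - D) uses lags l - D.
```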

WOLD REPRESENTATION

[Block diagram: white noise V(n) drives a filter H(z), producing the random process X(n); conversely, the inverse filter 1/H(z) turns X(n) back into white noise V(n).]

- H(z) all-pole: AR process
- H(z) all-zero: MA process
- H(z) pole-zero: ARMA process
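A small sketch of this modeling view, using scipy.signal.lfilter with illustrative coefficients of my own choosing:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)
v = rng.normal(size=50_000)                   # white noise v(n)

x_ar   = lfilter([1.0], [1.0, -0.9], v)       # all-pole H(z): AR process
x_ma   = lfilter([1.0, 0.5], [1.0], v)        # all-zero H(z): MA process
x_arma = lfilter([1.0, 0.5], [1.0, -0.9], v)  # pole-zero H(z): ARMA process

# Passing x(n) through the inverse filter 1/H(z) whitens it again:
v_back = lfilter([1.0, -0.9], [1.0], x_ar)    # recovers v(n) from the AR case
```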

AUTOREGRESSIVE PROCESS

The difference equation for an M-th order AR process is

$$x(n) = -\sum_{k=1}^{M} a_k\,x(n-k) + v(n),$$

where v(n) is white noise with zero mean and variance \sigma_v^2. Multiplying by x(n-m) and taking expectations, the autocorrelation function is

$$R_{xx}(m) = -\sum_{k=1}^{M} a_k\,R_{xx}(m-k) + E\{v(n)\,x(n-m)\}.$$

Since R_{vv}(m) = \sigma_v^2\,\delta(m), the autocorrelation function for the AR process is (using the symmetry R_xx(-k) = R_xx(k))

$$R_{xx}(m) = \begin{cases} -\sum_{k=1}^{M} a_k\,R_{xx}(m-k), & m > 0, \\[4pt] -\sum_{k=1}^{M} a_k\,R_{xx}(k) + \sigma_v^2, & m = 0. \end{cases}$$

AUTOREGRESSIVE PROCESS

Expanding the autocorrelation recursion for m = 1, 2, ..., M results in a set of linear equations. The matrix representation is

$$\begin{bmatrix} R_{xx}(0) & R_{xx}(1) & \cdots & R_{xx}(M-1) \\ R_{xx}(1) & R_{xx}(0) & \cdots & R_{xx}(M-2) \\ \vdots & \vdots & \ddots & \vdots \\ R_{xx}(M-1) & R_{xx}(M-2) & \cdots & R_{xx}(0) \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_M \end{bmatrix} = -\begin{bmatrix} R_{xx}(1) \\ R_{xx}(2) \\ \vdots \\ R_{xx}(M) \end{bmatrix}.$$

This result is known as the Yule-Walker equations.
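A minimal NumPy sketch of solving the Yule-Walker equations directly (library routines such as scipy.linalg.solve_toeplitz exploit the Toeplitz structure; the sign convention follows the difference equation above):

```python
import numpy as np

def yule_walker(Rxx, M):
    """Solve the Yule-Walker equations for an AR(M) model given the
    autocorrelation lags Rxx[0..M].  Returns the coefficients a_1..a_M
    and the driving-noise variance sigma_v^2."""
    R = np.array([[Rxx[abs(i - j)] for j in range(M)] for i in range(M)])
    r = np.array(Rxx[1:M + 1])
    a = np.linalg.solve(R, -r)   # R a = -r
    sigma_v2 = Rxx[0] + a @ r    # from the m = 0 equation
    return a, sigma_v2
```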

LINEAR PREDICTION

The linear predictive filter is used to estimate a model of the underlying random process.

[Block diagram: past samples of X(n) drive the predictor filter; its output should ideally equal the desired sequence X_d(n) = X(n), and e(n) is the error signal.]

Assuming the predictor is of order P, the one-step forward predictor is defined as

$$\hat{x}(n) = -\sum_{k=1}^{P} a_p(k)\,x(n-k).$$

LINEAR PREDICTION

The forward prediction error is

$$e_p(n) = x(n) - \hat{x}(n) = \sum_{k=0}^{P} a_p(k)\,x(n-k), \qquad a_p(0) = 1.$$

The mean-square prediction error is

$$\mathcal{E}_p = E\{|e_p(n)|^2\}.$$

Taking the derivative of the mean-square error with respect to each coefficient and setting it equal to zero, the a_p(k) are solved from

$$\sum_{k=1}^{P} a_p(k)\,E\{x(n-k)\,x(n-l)\} = -E\{x(n)\,x(n-l)\}, \qquad l = 1, 2, \ldots, P.$$
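Given the coefficients, the prediction error sequence is simply x(n) filtered by the error filter A(z) = 1 + a_p(1) z^{-1} + ... + a_p(P) z^{-P}; a one-line sketch:

```python
import numpy as np
from scipy.signal import lfilter

def prediction_error(x, a):
    """Forward prediction error e_p(n) = sum_{k=0}^{P} a_p(k) x(n-k),
    with a_p(0) = 1, i.e. x(n) filtered by A(z)."""
    return lfilter(np.concatenate(([1.0], np.asarray(a))), [1.0], x)
```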

LINEAR PREDICTION

When expressed in terms of the autocorrelation function, the normal equations are

$$\sum_{k=1}^{P} a_p(k)\,R_{xx}(l-k) = -R_{xx}(l), \qquad l = 1, 2, \ldots, P.$$

The minimum mean-square prediction error is

$$\mathcal{E}_p^{\min} = R_{xx}(0) + \sum_{k=1}^{P} a_p(k)\,R_{xx}(k).$$

Combining the above two equations results in the augmented normal equations

$$\sum_{k=0}^{P} a_p(k)\,R_{xx}(l-k) = \begin{cases} \mathcal{E}_p^{\min}, & l = 0, \\ 0, & l = 1, 2, \ldots, P, \end{cases}$$

from which the solution for the coefficients is derived.

LINEAR PREDICTION

The matrix representation of the augmented normal equations is

$$\begin{bmatrix} R_{xx}(0) & R_{xx}(1) & \cdots & R_{xx}(P) \\ R_{xx}(1) & R_{xx}(0) & \cdots & R_{xx}(P-1) \\ \vdots & \vdots & \ddots & \vdots \\ R_{xx}(P) & R_{xx}(P-1) & \cdots & R_{xx}(0) \end{bmatrix} \begin{bmatrix} 1 \\ a_p(1) \\ \vdots \\ a_p(P) \end{bmatrix} = \begin{bmatrix} \mathcal{E}_p^{\min} \\ 0 \\ \vdots \\ 0 \end{bmatrix},$$

where a_p(0) = 1. For sample functions, the time-averaged autocorrelation function is used. The solution is calculated by taking the matrix inverse.
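A sketch of that solve (the Levinson-Durbin recursion or scipy.linalg.solve_toeplitz would exploit the Toeplitz structure more efficiently; the helper name is mine):

```python
import numpy as np

def forward_predictor(Rxx, P):
    """Solve the augmented normal equations for a P-th order one-step
    forward predictor from lags Rxx[0..P].  Returns the augmented vector
    [1, a_p(1), ..., a_p(P)] and the minimum prediction error."""
    R = np.array([[Rxx[abs(i - j)] for j in range(P)] for i in range(P)])
    r = np.array(Rxx[1:P + 1])
    a = np.linalg.solve(R, -r)   # normal equations for l = 1..P
    eps_min = Rxx[0] + a @ r     # the l = 0 row of the augmented system
    return np.concatenate(([1.0], a)), eps_min
```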

LINEAR PREDICTION EXAMPLE

A sample function x(n) is defined as follows:

x(n) = [7.0718, 0.3251, -6.5641, 1.3673, 7.1554, 2.011, -6.775, 1.001, 6.7555, -1.050]

The biased time-average autocorrelation function is

Rxx(l) = [24.39, -0.6904, -19.30, 1.662, 14.624, -2.127, -9.336, 1.617, 4.7433, -0.7425]

For a 4th-order predictor, the augmented normal equation is

$$\begin{bmatrix} 24.39 & -0.6904 & -19.30 & 1.662 & 14.624 \\ -0.6904 & 24.39 & -0.6904 & -19.30 & 1.662 \\ -19.30 & -0.6904 & 24.39 & -0.6904 & -19.30 \\ 1.662 & -19.30 & -0.6904 & 24.39 & -0.6904 \\ 14.624 & 1.662 & -19.30 & -0.6904 & 24.39 \end{bmatrix} \begin{bmatrix} 1 \\ a_p(1) \\ a_p(2) \\ a_p(3) \\ a_p(4) \end{bmatrix} = \begin{bmatrix} \mathcal{E}_p^{\min} \\ 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}.$$

LINEAR PREDICTION EXAMPLE

The solution for the predictor filter is

a_p(n) = [0.1104, 0.0429, 0.0875, -0.0016]

The normalized solution is

a_p(l) = [1, 0.0388, 0.7924, -0.095]
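As a usage sketch, the predictor can be re-solved from the printed autocorrelation lags with the forward_predictor helper above (the printed lags are rounded, so the result need not match the slide's figures exactly):

```python
# Autocorrelation lags Rxx(0)..Rxx(4) from the slide.
Rxx = [24.39, -0.6904, -19.30, 1.662, 14.624]
a_aug, eps_min = forward_predictor(Rxx, P=4)   # uses the sketch above
print("[1, a_p(1..4)] =", a_aug, " eps_min =", eps_min)
```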