By Asst. Prof. Dr. Thamer M. Jamel
Department of Electrical Engineering, University of Technology, Baghdad, Iraq
Introduction
Linear filters: the filter output is a linear function of the filter input.
Design methods:
- The classical approach: frequency-selective filters such as low-pass, band-pass, and notch filters.
- Optimal filter design: mostly based on minimizing the mean-square value of the error signal.
Wiener filter
Based on the work of Wiener in 1942 and Kolmogorov in 1939. The design relies on a priori statistical information about the signal and noise. When such a priori information is not available, which is usually the case, a Wiener filter cannot be designed in the first place.
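To make the reliance on a priori statistics concrete, here is a minimal sketch (not from the slides): the Wiener solution solves R w = p, where R is the input autocorrelation matrix and p the cross-correlation with the desired signal, both of which must be known in advance. The numeric values below are illustrative assumptions.

```python
import numpy as np

# Wiener filter sketch: the optimum weights solve R w = p, so both R and p
# (the a priori statistics) must be known. The values are made up purely
# for illustration.
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])          # autocorrelation matrix of the filter input
p = np.array([0.7, 0.3])            # cross-correlation between input and desired signal

w_opt = np.linalg.solve(R, p)       # Wiener-Hopf solution
print("Wiener weights:", w_opt)
```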
Adaptive filter
In practice the signal and/or noise characteristics are often nonstationary, and the statistical parameters vary with time. An adaptive filter has an adaptation algorithm that monitors the environment and varies the filter transfer function accordingly. Based on the actual signals received, it attempts to find the optimum filter design.
Adaptive filter
The basic operation involves two processes:
1. A filtering process, which produces an output signal in response to a given input signal.
2. An adaptation process, which adjusts the filter parameters (the filter transfer function) to the possibly time-varying environment.
Often the (average) squared value of the error signal is used as the optimization criterion; a minimal code skeleton of this loop is sketched below.
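As a rough illustration of these two processes, here is a minimal Python skeleton (an assumed structure, not the deck's code): a tapped-delay-line FIR filter produces the output, and a pluggable `update` rule adjusts the tap weights from the error.

```python
import numpy as np

def run_adaptive_filter(x, d, update, n_taps=8):
    """Two-process skeleton: 1) filtering, 2) adaptation via `update`.
    x is the input signal, d the desired signal; names are illustrative."""
    w = np.zeros(n_taps)                      # adjustable tap weights
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # current tap-input vector
        y[n] = w @ u                          # 1) filtering process
        e[n] = d[n] - y[n]                    # error signal (its squared value is the criterion)
        w = update(w, u, e[n])                # 2) adaptation process
    return y, e, w
```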
Adaptive filter
Because of the complexity of the optimizing algorithms, most adaptive filters are digital filters that perform digital signal processing. When processing analog signals, the adaptive filter is preceded by an A/D converter and followed by a D/A converter.
Adaptive filter
Generalizing to adaptive IIR filters leads to stability problems, so it is common to use an FIR digital filter with adjustable coefficients.
Applications of Adaptive Filters: Identification
Used to provide a linear model of an unknown plant.
Applications: system identification.
Applications of Adaptive Filters: Inverse Modeling
Used to provide an inverse model of an unknown plant.
Applications: equalization (communications channels).
Applications of Adaptive Filters: Prediction
Used to provide a prediction of the present value of a random signal.
Applications: linear predictive coding.
Applications of Adaptive Filters: Interference Cancellation
Used to cancel unknown interference from a primary signal.
Applications: echo and noise cancellation (hands-free car phones, aircraft headsets, etc.).
Example: Acoustic Echo Cancellation
LMS Algorithm
The most popular adaptation algorithm is LMS.
- Define the cost function as the mean-squared error.
- Based on the method of steepest descent: move towards the minimum on the error surface to reach the minimum of the cost function.
- The gradient of the error surface is estimated at every iteration.
LMS Adaptive Algorithm
- Introduced by Widrow & Hoff in 1959.
- Simple; no matrix calculations are involved in the adaptation.
- A member of the family of stochastic gradient algorithms; an approximation of the steepest-descent method.
- Based on the MMSE (minimum mean-square error) criterion.
- The adaptive process uses two input signals: 1.) the filtering process, producing the output signal; 2.) the desired signal (training sequence).
- Adaptation is a recursive adjustment of the filter tap weights.
LMS Algorithm Steps
- Filter output: y(n) = w(n)^T x(n)
- Estimation error: e(n) = d(n) - y(n)
- Tap-weight adaptation: w(n+1) = w(n) + mu e(n) x(n)
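The three steps map directly onto code. The sketch below is illustrative, not the deck's implementation; it follows the standard real-valued LMS recursion with the step size mu as a free parameter, and it is reused in the later examples.

```python
import numpy as np

def lms(x, d, n_taps, mu):
    """Standard real-valued LMS: filter output, estimation error,
    tap-weight adaptation, computed sample by sample."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # tap-input vector u(n)
        y[n] = w @ u                        # filter output:     y(n) = w(n)^T u(n)
        e[n] = d[n] - y[n]                  # estimation error:  e(n) = d(n) - y(n)
        w = w + mu * e[n] * u               # tap-weight update: w(n+1) = w(n) + mu e(n) u(n)
    return y, e, w
```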
Stability of LMS
The LMS algorithm is convergent in the mean square if and only if the step-size parameter satisfies 0 < mu < 2/lambda_max, where lambda_max is the largest eigenvalue of the correlation matrix of the input data.
A more practical test for stability is 0 < mu < 2/(tap-input power), where the tap-input power is the sum of the mean-square values of the tap inputs.
Larger values of the step size increase the adaptation rate (faster adaptation) but also increase the residual mean-squared error.
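A small sketch of the practical test, assuming a measured input signal and a chosen filter length: the bound uses the total tap-input power (filter length times the mean-square input).

```python
import numpy as np

# Practical step-size bound sketch: 0 < mu < 2 / (total tap-input power).
# The signal and filter length here are illustrative assumptions.
rng = np.random.default_rng(1)
x = rng.standard_normal(10000)            # example input signal
n_taps = 16                               # assumed filter length
tap_input_power = n_taps * np.mean(x**2)  # sum of mean-square tap inputs
mu_max = 2.0 / tap_input_power
print("keep mu well below", mu_max)       # e.g. mu = mu_max / 10 for a safe margin
```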
STEEPEST DESCENT EXAMPLE
Given a quadratic cost function (the quadratic bowl shown in the figure), we need to obtain the coefficient vector that gives its absolute minimum. The minimizing vector can be read off directly, but let us instead find the solution by the steepest-descent method.
STEEPEST DESCENT EXAMPLE
We start from the initial guess (C1 = 5, C2 = 7) and select the step-size constant mu. If it is too big, we overshoot the minimum; if it is too small, it takes a long time to reach the minimum. Here mu = 0.1 is selected. From the gradient vector of the cost function, the iterative equation is c(k+1) = c(k) - mu * grad J(c(k)).
STEEPEST DESCENT EXAMPLE
As the iterations proceed, the vector [c1, c2] converges from the initial guess to the value that yields the function minimum; the speed of this convergence depends on mu.
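Since the slide's exact cost function is not reproduced here, the sketch below runs steepest descent on an assumed quadratic bowl J(c) = c1^2 + c2^2, starting from the slide's initial guess (5, 7) with mu = 0.1.

```python
import numpy as np

def grad_J(c):
    # Gradient of the assumed quadratic bowl J(c) = c1^2 + c2^2
    return 2 * c

c = np.array([5.0, 7.0])    # initial guess from the slide
mu = 0.1                    # step size from the slide
for k in range(50):
    c = c - mu * grad_J(c)  # steepest descent: c(k+1) = c(k) - mu * grad J(c(k))
print("after 50 iterations:", np.round(c, 4))   # converges toward the minimum at (0, 0)
```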
LMS – CONVERGENCE GRAPH
This graph illustrates the LMS algorithm. We start from an initial guess of the tap weights, then move opposite to the gradient vector to calculate the next taps, and so on, until we reach the MMSE, meaning the MSE is zero or very close to it. (In practice we cannot reach an error of exactly zero, because the noise is a random process; we can only decrease the error below a desired level.)
Example: an unknown channel of 2nd order (the figure shows the desired response and the combination of taps).
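As a usage sketch of the `lms` function given earlier: identifying an assumed 2nd-order unknown channel from a white training input. The channel taps, noise level, and mu are illustrative assumptions; the small measurement noise is why the MSE does not reach exactly zero.

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, -0.25])                   # hypothetical unknown 2nd-order channel
x = rng.standard_normal(5000)                     # white training input
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))   # noisy desired signal

y, e, w = lms(x, d, n_taps=3, mu=0.05)            # `lms` as sketched earlier
print("estimated taps:", np.round(w, 3))          # approaches h; residual MSE stays above zero
```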
Adaptive Array Antenna (Smart Antennas)
Figure: adaptive array with a linear combiner used to suppress interference.
Adaptive Array Antenna
Applications are many:
- Digital communications (OFDM, MIMO, CDMA, and RFID)
- Channel equalisation
- Adaptive noise cancellation
- Adaptive echo cancellation
- System identification
- Smart antenna systems
- Blind system equalisation
- And many, many others
Introduction
Wireless communication is a very active field of communication today because it supports mobility (mobile users). However, many wireless applications now require high-speed communication (high data rates).
What is ISI?
Inter-symbol interference (ISI) occurs when a given transmitted symbol is distorted by other transmitted symbols.
Cause of ISI: ISI arises from the band-limiting effect of a practical channel and also from multipath effects (delay spread).
Definition of the equalizer: the equalizer is a digital filter that provides an approximate inverse of the channel frequency response.
Need for equalization: to mitigate the effects of ISI and so decrease the probability of error that would occur without ISI suppression; this reduction of ISI has to be balanced against avoiding noise power enhancement.
Types of Equalization Techniques
- Linear equalization techniques: simple to implement, but they greatly enhance the noise power because they work by inverting the channel frequency response.
- Non-linear equalization techniques: more complex to implement, but with much less noise enhancement than linear equalizers.
Equalization Techniques
Fig. 3: Classification of equalizers
Linear equalizer with N taps and (N - 1) delay elements; a training sketch using LMS follows below.
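A training sketch for such an equalizer, reusing the `lms` function from the LMS section: the filter input is the channel output and the desired signal is a delayed copy of the known training symbols. The channel, decision delay, tap count, and mu are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
symbols = rng.choice([-1.0, 1.0], size=4000)              # known training sequence
channel = np.array([0.9, 0.4, 0.2])                       # assumed dispersive channel (causes ISI)
received = np.convolve(symbols, channel)[:len(symbols)]

delay = 5                                                 # assumed decision delay for the inverse model
desired = np.roll(symbols, delay)                         # delayed training symbols as the desired signal
y, e, w_eq = lms(received, desired, n_taps=11, mu=0.01)   # `lms` as sketched earlier
print("final training MSE:", np.mean(e[-500:]**2))        # residual ISI vs. noise-enhancement trade-off
```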
Table of various algorithms and their trade-offs:

Algorithm     Multiplying operations (complexity)   Convergence   Tracking
LMS           Low                                   Slow          Poor
MMSE          Very high                             Fast          Good
RLS           High                                  Fast          Good
Fast Kalman   Fairly low                            Fast          Good
RLS-DFE       High                                  Fast          Good
Adaptive Filter Block Diagram
The LMS Equation
The least mean squares (LMS) algorithm updates each coefficient on a sample-by-sample basis, based on the error e(n): w_k(n+1) = w_k(n) + mu e(n) x(n - k). This equation minimises the power in the error e(n).
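Written coefficient by coefficient, the same update reads as below (a sketch only; w, the tap inputs, e(n), and mu follow the earlier notation).

```python
def lms_coefficient_update(w, x_taps, e_n, mu):
    """Per-coefficient LMS update: w_k <- w_k + mu * e(n) * x(n - k).
    x_taps holds x(n), x(n-1), ..., x(n-N+1); mu is an assumed step size."""
    for k in range(len(w)):
        w[k] = w[k] + mu * e_n * x_taps[k]
    return w
```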
The Least Mean Squares Algorithm The value of µ (mu) is critical. If µ is too small, the filter reacts slowly. If µ is too large, the filter resolution is poor. The selected value of µ is a compromise.
LMS Convergence vs. µ
Audio Noise Reduction
A popular application of acoustic noise reduction is headsets for pilots. This uses two microphones; a sketch of the two-microphone arrangement follows.
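A sketch of the two-microphone arrangement (an assumption about how the demo is wired), reusing the `lms` function from earlier: the reference microphone picks up noise only, the primary microphone picks up speech plus noise filtered by an acoustic path, and the adaptive filter learns that path so the error output approximates the cleaned speech. All signals and the path below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
speech = np.sin(2 * np.pi * 0.01 * np.arange(n))          # stand-in for the wanted signal
noise_ref = rng.standard_normal(n)                        # reference microphone: noise only
acoustic_path = np.array([0.5, 0.3, 0.1])                 # assumed path from noise source to primary mic
primary = speech + np.convolve(noise_ref, acoustic_path)[:n]   # primary microphone: speech + filtered noise

y, e, w = lms(noise_ref, primary, n_taps=8, mu=0.005)     # `lms` as sketched earlier
# After convergence, e (the error output) approximates the speech with the noise removed.
```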
The Simulink Model
Setting the Step size (mu) The rate of convergence of the LMS Algorithm is controlled by the “Step size (mu)”. This is the critical variable.
Trace of Input to Model “Input” = Signal + Noise.
Trace of LMS Filter Output “Output” starts at zero and grows.
Trace of LMS Filter Error “Error” contains the noise.
Typical C6713 DSK Setup
Figure: DSK board connected via USB to a PC, with a +5 V supply, headphones, and a microphone.
Acoustic Echo Canceller
New Trends in Adaptive Filtering
- Partial updating of weights
- Sub-band adaptive filtering
- Adaptive Kalman filtering
- Affine projection method
- Space-time adaptive processing
- Non-linear adaptive filtering: neural networks, the Volterra series algorithm, genetic & fuzzy approaches
- Blind adaptive filtering