

Yuan Chen
Advisor: Professor Paul Cuff

Introduction
Goal: Remove reverberation of the far-end input from the near-end input by forming an estimate of the echo path.

Review of Previous Work
- Considered a cascaded filter architecture: a memoryless nonlinearity followed by a linear FIR filter
- Applied the generalized nonlinear NLMS algorithm to perform the adaptation
- Choice of nonlinear functions: cubic B-spline, piecewise linear function
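In the purely linear special case, the adaptation reduces to the standard normalized LMS (NLMS) update. A minimal runnable sketch of that baseline; the signal length, tap count, step size, and echo path below are illustrative assumptions, not the presentation's actual settings:

```python
import numpy as np

def nlms(x, d, num_taps=4, mu=0.5, eps=1e-8):
    """Normalized LMS: adapt FIR taps w so the filter output tracks d."""
    w = np.zeros(num_taps)
    e = np.zeros(len(d))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # window [x[n], x[n-1], ...]
        y = w @ u                             # filter output
        e[n] = d[n] - y                       # a priori error
        w += mu * e[n] * u / (u @ u + eps)    # normalized step
    return w, e

# Identify a known FIR "echo path" from white-noise far-end input
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)]
w, e = nlms(x, d, num_taps=4)
```

In this noiseless setup the taps converge to the true echo path and the residual error decays toward zero.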

Spline (Nonlinear) Function
- Interpolation between evenly spaced control points
- Piecewise linear function
Reference: M. Solazzi et al., "An adaptive spline nonlinear function for blind signal processing."
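As a concrete illustration of interpolating between evenly spaced control points, here is a hedged sketch of the piecewise-linear (degree-1 spline) case; the interval [-1, 1] and the control-point count are assumptions for the example, not values taken from the slides:

```python
import numpy as np

def piecewise_linear(x, q, x_min=-1.0, x_max=1.0):
    """Evaluate a piecewise-linear function defined by evenly spaced
    control points q over [x_min, x_max] (a degree-1 spline)."""
    q = np.asarray(q, dtype=float)
    n = len(q)
    # index of the span containing each x, and local coordinate u in [0, 1]
    span = (x - x_min) / (x_max - x_min) * (n - 1)
    i = np.clip(np.floor(span).astype(int), 0, n - 2)
    u = span - i
    return (1.0 - u) * q[i] + u * q[i + 1]

# Five control points on [-1, 1]; identity mapping as a sanity check
q = np.linspace(-1.0, 1.0, 5)
y = piecewise_linear(np.array([-1.0, -0.25, 0.5, 1.0]), q)
```

With identity control points the function reproduces its input exactly; adapting q bends the map into an arbitrary piecewise-linear nonlinearity.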

Nonlinear, Cascaded Adaptation
- Linear filter taps
- Nonlinear filter parameters
- Step-size normalization
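The slide's update equations did not survive transcription. One plausible normalized form of the cascaded update, written here as an assumption (far-end input $x(n)$, nonlinearity output $\mathbf{s}(n)$ with entries $s(n-k) = f(x(n-k);\mathbf{q})$, taps $\mathbf{w}$, control points $\mathbf{q}$, error $e(n) = d(n) - \mathbf{w}^{T}\mathbf{s}(n)$, and regularizer $\delta$):

```latex
\mathbf{w}(n+1) = \mathbf{w}(n) + \frac{\mu_w\, e(n)\, \mathbf{s}(n)}{\|\mathbf{s}(n)\|^{2} + \delta},
\qquad
\mathbf{q}(n+1) = \mathbf{q}(n) + \frac{\mu_q\, e(n)\, \nabla_{\mathbf{q}}\, y(n)}{\|\nabla_{\mathbf{q}}\, y(n)\|^{2} + \delta}
```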

Optimal Filter Configuration
- For a stationary environment, LMS filters converge to the least squares (LS) filter
- Choose filter taps to minimize the MSE
- Solution to the normal equations
- Input data matrix
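The LS solution can be sketched as follows: with an input data matrix X whose row n holds the window [x(n), x(n-1), ...], the normal equations give w = (XᵀX)⁻¹Xᵀd. The signal, echo path, and tap count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)
h = np.array([0.4, 0.25, -0.1])          # hypothetical echo path
d = np.convolve(x, h)[:len(x)]           # desired (near-end) signal

num_taps = 3
# Build X so that row n is [x[n], x[n-1], x[n-2]]; drop the first rows
# where np.roll would wrap around.
X = np.column_stack([np.roll(x, k) for k in range(num_taps)])[num_taps:]
d_trim = d[num_taps:]

# Solve the normal equations X^T X w = X^T d
w_ls = np.linalg.solve(X.T @ X, X.T @ d_trim)
```

In this noiseless case the LS taps recover the echo path exactly (up to numerical precision), which is the optimum the adaptive filters converge toward.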

Nonlinear Extension – Least Squares Spline (Piecewise Linear) Function
- Choose control points to minimize the MSE
- The spline formulation provides a mapping from the input to control-point "weights"
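Written out (notation assumed, since the slide's equations were lost): each input sample activates a few spline basis weights $c_j(\cdot)$, so the nonlinearity output is linear in the control points $\mathbf{q}$:

```latex
s(n) = \sum_{j} c_j\big(x(n)\big)\, q_j = \mathbf{c}\big(x(n)\big)^{T}\mathbf{q},
\qquad
\mathbf{s} = C\,\mathbf{q}
```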

Optimality Conditions
- Optimize with respect to the control points
- First partial derivative
- Expressing all constraints in matrix form
- Solve the normal equations
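One way these steps fit together (an assumed reconstruction, as the slide's equations were images): since the spline output is linear in the control points, writing the cost as $J(\mathbf{q}) = \|\mathbf{d} - C\mathbf{q}\|^{2}$ with $C$ the matrix of spline weights, setting the first partial derivatives to zero collects all the per-sample conditions into normal equations:

```latex
\frac{\partial J}{\partial \mathbf{q}} = -2\,C^{T}\big(\mathbf{d} - C\mathbf{q}\big) = \mathbf{0}
\;\;\Longrightarrow\;\;
C^{T}C\,\mathbf{q} = C^{T}\mathbf{d}
```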

Least Squares Hammerstein Filter
- Difficult to solve directly for both filter taps and control points simultaneously
- Consider an iterative approach:
  1. Solve for the best linear FIR LS filter given the current control points
  2. Solve for the optimal configuration of the nonlinear function's control points given the updated filter taps
  3. Iterate until convergence
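The three steps above amount to alternating least squares, which can be sketched as follows. Everything here (the piecewise-linear basis construction, sizes, iteration count, and synthetic system) is an illustrative assumption, not the presentation's exact implementation:

```python
import numpy as np

def pwl_basis(x, n_pts, lo=-1.0, hi=1.0):
    """B[n, j] = weight of control point j at sample x[n] (piecewise linear)."""
    span = np.clip((x - lo) / (hi - lo) * (n_pts - 1), 0, n_pts - 1 - 1e-9)
    i = span.astype(int)
    u = span - i
    B = np.zeros((len(x), n_pts))
    B[np.arange(len(x)), i] = 1.0 - u
    B[np.arange(len(x)), i + 1] = u
    return B

def hammerstein_als(x, d, num_taps, n_pts, iters=10):
    """Alternate between the two least-squares subproblems."""
    q = np.linspace(-1.0, 1.0, n_pts)            # start from the identity map
    B = pwl_basis(x, n_pts)
    for _ in range(iters):
        # Step 1: best FIR taps given the current control points
        s = B @ q
        S = np.column_stack([np.roll(s, k) for k in range(num_taps)])[num_taps:]
        w = np.linalg.lstsq(S, d[num_taps:], rcond=None)[0]
        # Step 2: best control points given the updated taps;
        # y = (sum_k w[k] * B shifted by k) @ q is linear in q
        A = sum(w[k] * np.roll(B, k, axis=0) for k in range(num_taps))[num_taps:]
        q = np.linalg.lstsq(A, d[num_taps:], rcond=None)[0]
    return w, q

# Synthetic Hammerstein system: tanh-like distortion, then a short FIR path
rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, 3000)
d = np.convolve(np.tanh(2 * x), [0.5, 0.3, -0.2])[:len(x)]
w, q = hammerstein_als(x, d, num_taps=3, n_pts=9, iters=15)
```

Each subproblem is a plain linear LS solve, so the residual is non-increasing across iterations; note the usual Hammerstein scale ambiguity (a gain can move freely between w and q) without affecting the fit.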

Hammerstein Optimization
- Given the filter taps, choose control points for minimum MSE
- Define, rearrange, and substitute
- Similarity in problem structure
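The "define, rearrange, and substitute" step can be written out as follows (notation assumed): substituting the spline weight mapping $s(n) = \mathbf{c}(x(n))^{T}\mathbf{q}$ into the cascade output makes the problem linear in the control points, giving the same least-squares structure as the spline-only case:

```latex
y(n) = \sum_{k} w_k\, s(n-k)
     = \sum_{k} w_k\, \mathbf{c}\big(x(n-k)\big)^{T}\mathbf{q}
     = \mathbf{a}(n)^{T}\mathbf{q},
\qquad
\mathbf{a}(n) \triangleq \sum_{k} w_k\, \mathbf{c}\big(x(n-k)\big)
\;\;\Longrightarrow\;\;
A^{T}A\,\mathbf{q} = A^{T}\mathbf{d}
```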

Results
- Echo Return Loss Enhancement (ERLE)
- Simulate AEC using: (a) input samples drawn i.i.d. from a Gaussian N(0, 1) distribution; (b) voice audio input
- Use sigmoid distortion and a linear acoustic impulse response
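ERLE compares the echo power before and after cancellation, in dB. A small helper (the eps guard against empty or silent signals is an assumption added for numerical safety):

```python
import numpy as np

def erle_db(d, e, eps=1e-12):
    """Echo return loss enhancement: 10*log10(power of echo d / power of residual e)."""
    return 10.0 * np.log10((np.mean(np.square(d)) + eps) /
                           (np.mean(np.square(e)) + eps))

# A residual at 10% of the echo amplitude corresponds to 20 dB of ERLE
d = np.ones(100)
e = 0.1 * np.ones(100)
```

Higher ERLE means more of the echo has been removed; it is the standard figure of merit for comparing the adaptive and least-squares cancellers above.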

Conclusions
- Under ergodicity and stationarity constraints, the iterative least squares method converges to the optimal filter configuration for Hammerstein cascaded systems
- The generalized nonlinear NLMS algorithm does not always converge to the optimum provided by the least squares approach
- In general, Hammerstein cascaded systems introduce nonlinear compensation at low cost