ELG5377 Adaptive Signal Processing
Lecture 15: Recursive Least Squares (RLS) Algorithm
Introduction
The method of least squares states that ŵ(n) = Φ^{-1}(n) z(n). We would like to compute Φ(n) and z(n) recursively. To account for any time variation, we also incorporate a "forgetting" factor so that more weight is given to current inputs than to previous ones. To do this, we modify the cost function to be minimized.
Cost Function
We can show that the exponentially weighted cost function is
J(n) = Σ_{i=1}^{n} λ^{n-i} |e(i)|^2, where e(i) = d(i) - w^H(n)u(i)
and 0 < λ ≤ 1 is the forgetting factor.
Reformulation of Normal Equations
From the previous slide, we can reformulate the time-averaged autocorrelation matrix as
Φ(n) = Σ_{i=1}^{n} λ^{n-i} u(i)u^H(i),
and the time-averaged cross-correlation vector becomes
z(n) = Σ_{i=1}^{n} λ^{n-i} u(i)d*(i).
Derivation done on blackboard.
Recursive Computation of Φ(n)
Splitting off the i = n term of the sum gives
Φ(n) = λΦ(n-1) + u(n)u^H(n).
Recursive Computation of z(n)
Similarly,
z(n) = λz(n-1) + u(n)d*(n).
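The two recursive updates can be checked numerically against the direct exponentially weighted sums. A minimal sketch (the variable names, λ value, and random test signals are my own illustrative choices, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, M, N = 0.98, 3, 50                     # forgetting factor, filter order, samples
u = rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))  # inputs u(i)
d = rng.standard_normal(N) + 1j * rng.standard_normal(N)            # desired d(i)

# Recursive updates: Phi(n) = lam*Phi(n-1) + u(n)u^H(n), z(n) = lam*z(n-1) + u(n)d*(n)
Phi = np.zeros((M, M), dtype=complex)
z = np.zeros(M, dtype=complex)
for i in range(N):
    Phi = lam * Phi + np.outer(u[i], u[i].conj())
    z = lam * z + u[i] * np.conj(d[i])

# Direct exponentially weighted sums from the previous slide
Phi_direct = sum(lam ** (N - 1 - i) * np.outer(u[i], u[i].conj()) for i in range(N))
z_direct = sum(lam ** (N - 1 - i) * u[i] * np.conj(d[i]) for i in range(N))

print(np.allclose(Phi, Phi_direct), np.allclose(z, z_direct))  # → True True
```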
Result
By simply updating Φ(n) and z(n), we can compute ŵ(n) = Φ^{-1}(n) z(n). However, this requires a matrix inversion at each iteration, which has high computational complexity.
–Update Φ^{-1}(n) at each iteration instead!
Matrix Inversion Lemma
Let A and B be two positive definite M by M matrices related by:
–A = B^{-1} + C D^{-1} C^H,
–where D is a positive definite N by N matrix and C is an M by N matrix.
Then A^{-1} is given by:
–A^{-1} = B - BC(D + C^H BC)^{-1} C^H B.
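The lemma can be verified numerically on randomly generated positive definite matrices. A small sketch (the real-valued construction of B, C, and D is my own choice; for real matrices the Hermitian transpose reduces to the ordinary transpose):

```python
import numpy as np

rng = np.random.default_rng(1)
M, Nd = 4, 2
# Random positive definite B (M x M) and D (Nd x Nd), arbitrary C (M x Nd)
X = rng.standard_normal((M, M))
B = X @ X.T + M * np.eye(M)
Y = rng.standard_normal((Nd, Nd))
D = Y @ Y.T + Nd * np.eye(Nd)
C = rng.standard_normal((M, Nd))

# A = B^{-1} + C D^{-1} C^H
A = np.linalg.inv(B) + C @ np.linalg.inv(D) @ C.T
# Lemma: A^{-1} = B - B C (D + C^H B C)^{-1} C^H B
A_inv_lemma = B - B @ C @ np.linalg.inv(D + C.T @ B @ C) @ C.T @ B

print(np.allclose(np.linalg.inv(A), A_inv_lemma))  # → True
```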
Applying the Matrix Inversion Lemma to Finding Φ^{-1}(n) from Φ^{-1}(n-1)
Comparing Φ(n) = λΦ(n-1) + u(n)u^H(n) with the lemma, identify A = Φ(n), B^{-1} = λΦ(n-1), C = u(n) and D = 1. Applying the lemma:
Φ^{-1}(n) = λ^{-1}Φ^{-1}(n-1) - [λ^{-2}Φ^{-1}(n-1)u(n)u^H(n)Φ^{-1}(n-1)] / [1 + λ^{-1}u^H(n)Φ^{-1}(n-1)u(n)]
Applying the Matrix Inversion Lemma to Finding Φ^{-1}(n) from Φ^{-1}(n-1) (2)
For convenience, let P(n) = Φ^{-1}(n) and define the gain vector
k(n) = [λ^{-1}P(n-1)u(n)] / [1 + λ^{-1}u^H(n)P(n-1)u(n)].
Then the update becomes P(n) = λ^{-1}P(n-1) - λ^{-1}k(n)u^H(n)P(n-1).
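Combining P(n), the gain vector k(n), and the weight update ŵ(n) = ŵ(n-1) + k(n)e(n) from the standard RLS development gives the full iteration. A minimal real-valued system-identification sketch (the signal model, λ = 0.99, and the common δ^{-1}I initialization of P(0) are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 3, 400
lam, delta = 0.99, 1e-2
w_true = np.array([0.5, -0.3, 0.2])      # hypothetical system to identify

u = rng.standard_normal((N, M))          # input vectors u(n)
d = u @ w_true + 0.01 * rng.standard_normal(N)  # noisy desired response d(n)

P = np.eye(M) / delta                    # P(0) = delta^{-1} I
w = np.zeros(M)
for i in range(N):
    un = u[i]
    Pu = P @ un
    k = Pu / (lam + un @ Pu)             # gain k(n) = lam^{-1}P(n-1)u / (1 + lam^{-1}u^H P(n-1) u)
    e = d[i] - w @ un                    # a priori estimation error
    w = w + k * e                        # weight update
    P = (P - np.outer(k, Pu)) / lam      # P(n) = lam^{-1}[P(n-1) - k(n)u^H(n)P(n-1)]

print(np.round(w, 3))                    # converges toward w_true
```

Note that k is computed as Pu/(λ + u^H Pu), which is algebraically identical to the gain vector defined above after multiplying numerator and denominator by λ.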