ELG5377 Adaptive Signal Processing Lecture 15: Recursive Least Squares (RLS) Algorithm
Introduction
The method of least squares states that $\hat{w}(n) = \Phi^{-1}(n) z(n)$. We would like to compute $\Phi$ and $z$ recursively. To account for any time variance, we also incorporate a "forgetting" factor so that more weight is given to current inputs than to previous ones. To do this, we modify the cost function to be minimized.
Cost Function
The exponentially weighted cost function is
$$J(n) = \sum_{i=1}^{n} \lambda^{n-i} |e(i)|^2,$$
where $e(i) = d(i) - w^H(n)u(i)$ and $0 < \lambda \le 1$ is the forgetting factor. We can show that the minimizing tap-weight vector satisfies the normal equations $\Phi(n)\hat{w}(n) = z(n)$.
Reformulation of Normal Equations
From the previous result, the time-averaged autocorrelation matrix becomes
$$\Phi(n) = \sum_{i=1}^{n} \lambda^{n-i} u(i) u^H(i),$$
and the time-averaged cross-correlation vector becomes
$$z(n) = \sum_{i=1}^{n} \lambda^{n-i} u(i) d^*(i).$$
Derivation done on blackboard.
Recursive Computation of $\Phi(n)$
$$\Phi(n) = \lambda \Phi(n-1) + u(n) u^H(n)$$
Recursive Computation of $z(n)$
$$z(n) = \lambda z(n-1) + u(n) d^*(n)$$
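The two recursive updates above can be checked against the direct weighted sums. A minimal NumPy sketch, using illustrative random complex data (the dimensions, forgetting factor, and variable names are my own choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
M, n_iters, lam = 3, 50, 0.98  # filter order, iterations, forgetting factor

# Recursive accumulators, initialized to zero.
Phi_rec = np.zeros((M, M), dtype=complex)
z_rec = np.zeros(M, dtype=complex)
us, ds = [], []

for i in range(n_iters):
    u = rng.standard_normal(M) + 1j * rng.standard_normal(M)
    d = rng.standard_normal() + 1j * rng.standard_normal()
    us.append(u)
    ds.append(d)
    # Phi(n) = lam * Phi(n-1) + u(n) u^H(n)
    Phi_rec = lam * Phi_rec + np.outer(u, u.conj())
    # z(n) = lam * z(n-1) + u(n) d*(n)
    z_rec = lam * z_rec + u * np.conj(d)

# Direct evaluation of the time-averaged sums (0-based indexing,
# so the weight on sample i is lam^(n-1-i)).
n = n_iters
Phi_dir = sum(lam**(n - 1 - i) * np.outer(us[i], us[i].conj()) for i in range(n))
z_dir = sum(lam**(n - 1 - i) * us[i] * np.conj(ds[i]) for i in range(n))

assert np.allclose(Phi_rec, Phi_dir)
assert np.allclose(z_rec, z_dir)
```

The recursive form costs $O(M^2)$ per update instead of re-evaluating the full sum each step.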
Result
By simply updating $\Phi(n)$ and $z(n)$, we can compute $\hat{w}(n) = \Phi^{-1}(n) z(n)$. However, this needs a matrix inversion at each iteration, giving higher computational complexity.
–Update $\Phi^{-1}(n)$ each iteration instead!
Matrix Inversion Lemma
Let A and B be two positive definite M-by-M matrices related by:
–$A = B^{-1} + C D^{-1} C^H$,
–where D is a positive definite N-by-N matrix and C is an M-by-N matrix.
Then $A^{-1}$ is given by:
–$A^{-1} = B - BC(D + C^H B C)^{-1} C^H B$.
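The lemma is easy to verify numerically. A short NumPy check, with example sizes M = 4, N = 2 chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 4, 2

# Build positive definite B (M x M) and D (N x N), and an arbitrary C (M x N).
X = rng.standard_normal((M, M)) + 1j * rng.standard_normal((M, M))
B = X @ X.conj().T + M * np.eye(M)
Y = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
D = Y @ Y.conj().T + N * np.eye(N)
C = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))

# A = B^{-1} + C D^{-1} C^H
A = np.linalg.inv(B) + C @ np.linalg.inv(D) @ C.conj().T

# Lemma: A^{-1} = B - B C (D + C^H B C)^{-1} C^H B
A_inv_lemma = B - B @ C @ np.linalg.inv(D + C.conj().T @ B @ C) @ C.conj().T @ B

assert np.allclose(np.linalg.inv(A), A_inv_lemma)
```

The payoff is that the inverse on the right-hand side is only N-by-N; in RLS we will have N = 1, so no matrix inversion is needed at all.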
Applying the Matrix Inversion Lemma to Finding $\Phi^{-1}(n)$ from $\Phi^{-1}(n-1)$
With $A = \Phi(n)$, $B^{-1} = \lambda \Phi(n-1)$, $C = u(n)$, and $D = 1$, the lemma gives
$$\Phi^{-1}(n) = \lambda^{-1}\Phi^{-1}(n-1) - \frac{\lambda^{-2}\Phi^{-1}(n-1)\, u(n) u^H(n)\, \Phi^{-1}(n-1)}{1 + \lambda^{-1} u^H(n) \Phi^{-1}(n-1) u(n)}.$$
Applying the Matrix Inversion Lemma to Finding $\Phi^{-1}(n)$ from $\Phi^{-1}(n-1)$ (2)
For convenience, let $P(n) = \Phi^{-1}(n)$ and define the gain vector
$$k(n) = \frac{\lambda^{-1} P(n-1) u(n)}{1 + \lambda^{-1} u^H(n) P(n-1) u(n)}.$$
Then the update becomes
$$P(n) = \lambda^{-1} P(n-1) - \lambda^{-1} k(n) u^H(n) P(n-1).$$
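The inverse-correlation update can be sketched and verified against a directly tracked $\Phi(n)$. A minimal NumPy sketch, assuming the standard regularized initialization $\Phi(0) = \delta I$ (so $P(0) = \delta^{-1} I$); the data and the values of $\lambda$, $\delta$, and M are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(2)
M, lam, delta = 3, 0.98, 1e-2

# P(0) = delta^{-1} I; track Phi directly to verify P(n) = Phi^{-1}(n).
P = np.eye(M) / delta
Phi = delta * np.eye(M)

for n in range(100):
    u = rng.standard_normal(M) + 1j * rng.standard_normal(M)
    # Gain vector k(n) = lam^{-1} P(n-1) u(n) / (1 + lam^{-1} u^H(n) P(n-1) u(n));
    # multiplying numerator and denominator by lam gives the form below.
    Pu = P @ u
    k = Pu / (lam + u.conj() @ Pu)
    # P(n) = lam^{-1} [P(n-1) - k(n) u^H(n) P(n-1)]
    P = (P - np.outer(k, u.conj()) @ P) / lam
    # Reference recursion: Phi(n) = lam * Phi(n-1) + u(n) u^H(n)
    Phi = lam * Phi + np.outer(u, u.conj())

assert np.allclose(P, np.linalg.inv(Phi))
```

Each update costs $O(M^2)$, compared with $O(M^3)$ for inverting $\Phi(n)$ at every iteration.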