State-Space Recursive Least Squares with Adaptive Memory
College of Electrical & Mechanical Engineering
National University of Sciences & Technology (NUST)
EE-869 Adaptive Filters
Introduction
- A recursive algorithm built around the state-space model of an unforced system
- Based on the least squares approach
- Does not require process or observation noise statistics
- Works in time-invariant and time-varying environments alike
- Can handle scalar and vector observations
- Adapts the forgetting factor, which may be required due to:
  - model uncertainty
  - presence of unknown external disturbances
  - time-varying nature of the observed signal
  - non-stationary behaviour of the observation noise
- The fixed-memory algorithm is SSRLS; with forgetting-factor adaptation it becomes SSRLSWAM
Preview of SSRLS
State-Space Model (Unforced System)
- process states
- output signal
- observation noise
- system matrix (full rank)
- observation matrix (full rank)
- the model is L-step observable
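In a common notation (symbols are mine, not necessarily those of the slides: x for the process states, y for the output, v for the observation noise, A for the system matrix, C for the observation matrix), the unforced model reads:

```latex
% x[k] : process states, y[k] : output, v[k] : observation noise
x[k+1] = A\,x[k], \qquad y[k] = C\,x[k] + v[k]
```

Here A is square and full rank and C has full row rank; L-step observability means the stacked matrix with block rows C, CA, ..., CA^{L-1} has full column rank.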
Batch Processed Least Squares Approach
Batch of Observations
- stacked batch of observations
- noise vector
Least Squares Solution
- batch processed least squares solution (requires the stacked observation matrix to be full rank)
- batch processed weighted least squares solution
- weighting matrix
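As a hedged reconstruction in my own notation (assuming A invertible so past observations can be referenced to the current state), the batch weighted least squares solution has the standard form:

```latex
% Reference every observation to the current state x[k]:
y[i] = C\,A^{\,i-k}\,x[k] + v[i], \quad i = 0,\dots,k
\;\;\Rightarrow\;\; Y[k] = H[k]\,x[k] + V[k]
% Weighted LS with exponential weighting matrix \Lambda:
\hat{x}[k] = \big(H^{T}\Lambda H\big)^{-1} H^{T}\Lambda\,Y[k],
\qquad \Lambda = \mathrm{diag}\big(\lambda^{k},\dots,\lambda,1\big)
```

The solution exists when the weighted normal matrix H^T Λ H is full rank, which the L-step observability assumption guarantees once enough observations are available.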
Recursive Algorithm
Predict and Correct
- predicted states
- predicted signal
- prediction error
- estimator gain
- predictor-corrector form
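In a common predictor-corrector notation (symbols mine: hats for estimates, K for the estimator gain), the four quantities on this slide are:

```latex
\begin{aligned}
\hat{x}[k|k-1] &= A\,\hat{x}[k-1]                 && \text{predicted states} \\
\hat{y}[k]     &= C\,\hat{x}[k|k-1]               && \text{predicted signal} \\
e[k]           &= y[k] - \hat{y}[k]               && \text{prediction error} \\
\hat{x}[k]     &= \hat{x}[k|k-1] + K[k]\,e[k]     && \text{corrector, gain } K[k]
\end{aligned}
```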
Recursive Solution
- based on k+1 observations
- weighting matrix over the k+1 observations
Recursive Solution (cont'd)
- defined variables
- direct form of SSRLS
Recursive Update of the Information Matrix
- difference Lyapunov equation
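With Φ denoting the (exponentially weighted) information matrix in my notation, the difference Lyapunov equation referred to here takes the standard form for this construction (assuming A invertible):

```latex
\Phi[k] = \lambda\,A^{-T}\,\Phi[k-1]\,A^{-1} + C^{T}C
```

This follows from re-referencing the previous normal matrix to the current state before adding the new observation's contribution.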
Matrix Inversion Lemma
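For reference, the matrix inversion (Woodbury) lemma in its standard form:

```latex
(B + U D V)^{-1} = B^{-1} - B^{-1}U\,\big(D^{-1} + V B^{-1} U\big)^{-1}\,V B^{-1}
```

Applied to the information-matrix recursion, it converts the update of the inverse into a rank-m correction, avoiding a full matrix inversion at every step.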
Recursive Update of the Inverse Information Matrix
- Riccati equation for SSRLS
Recursive Update of the Inverse Information Matrix (cont'd)
- recursive solution
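The predict/correct and Riccati-update slides above can be collected into a minimal code sketch. This is my own NumPy rendering under assumed notation (P for the inverse information matrix, K = P C^T for the gain), not the slides' exact algorithm:

```python
import numpy as np

def ssrls_step(x_hat, P, y, A, C, lam):
    """One predict/correct iteration of SSRLS (sketch, notation assumed).

    x_hat : current state estimate
    P     : inverse information matrix
    y     : new observation (vector)
    A, C  : system and observation matrices (A full rank)
    lam   : forgetting factor, 0 < lam <= 1
    """
    # Predict through the unforced model.
    x_pred = A @ x_hat
    e = y - C @ x_pred                       # prediction error
    # Riccati-type update of P via the matrix inversion lemma.
    Q = A @ P @ A.T / lam
    G = np.eye(C.shape[0]) + C @ Q @ C.T
    P_new = Q - Q @ C.T @ np.linalg.solve(G, C @ Q)
    # Estimator gain and correction.
    K = P_new @ C.T
    return x_pred + K @ e, P_new, e
```

With a noiseless ramp generated by a constant-velocity model (A = [[1,1],[0,1]], C = [1,0]) and a large initial P as regularization, repeated calls drive the prediction error toward zero.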
Observer Gain
State-Space Representation of SSRLS
- recursive solution rewritten in state-space form
- resulting state-space matrices
Initializing SSRLS
Before enough observations are available the normal matrix is rank deficient. Two options:
1) Initialization using a regularization term
2) Initialization using the batch processing approach: leads to a delayed start of the recursion, but offers better initialization
Steady-State SSRLS
Steady-State Solution of SSRLS
- the Riccati recursion admits a steady-state solution under suitable conditions
- in particular, applies to neutrally stable systems
Direct Form of Steady-State SSRLS
Observer Gain for Steady-State SSRLS
Transfer Function Representation
Initialization of Steady-State SSRLS
- only the state estimate needs to be initialized
- zero initialization is the preferable choice if no other estimate is available
Memory Length
- filter memory as a function of the forgetting factor
- asymptotic result
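For exponential weighting, the effective memory length follows the standard asymptotic result (my notation):

```latex
M(\lambda) \;=\; \sum_{i=0}^{\infty} \lambda^{i} \;=\; \frac{1}{1-\lambda},
\qquad \lambda \to 1^{-}
```

So, for example, λ = 0.99 corresponds to an effective memory of roughly 100 samples.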
Model Uncertainty and Unknown External Disturbances
Underlying Model
- process states
- output
- external disturbance (bounded, deterministic)
- observation noise
- system matrix
- input matrix
- observation matrix
- the system and input matrices form a controllable pair
Assumptions about Observation Noise
- zero mean
- white
Perturbation Matrices
Estimation Error
- white input case
- deterministic input case
Steady-State Mean Estimation Error
- deterministic input case
Bounds on Steady-State Mean Estimation Error
Steady-State Mean Square Error
- estimation error correlation matrix
Bounds on Steady-State Mean Square Estimation Error
SSRLS with Adaptive Memory (SSRLSWAM)
The Cost Function
- cost function
- gradient of the cost function with respect to the forgetting factor
Gradient of Cost Function
- deterministic gradient
Gradient of Cost Function (cont'd)
Tuning the Forgetting Factor
- update of the forgetting factor using the stochastic gradient method
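Following the usual gradient-adaptive-memory construction (symbols mine; the slides' exact expressions may differ), the stochastic gradient update takes the form of a gradient step on the instantaneous squared prediction error, clipped to an admissible interval:

```latex
\lambda[k+1] \;=\; \Big[\,\lambda[k] - \mu\,\nabla_{\lambda} J[k]\,\Big]_{\lambda^{-}}^{\lambda^{+}},
\qquad J[k] = \tfrac{1}{2}\,\|e[k]\|^{2},
\qquad \nabla_{\lambda} J[k] = -\,e[k]^{T}\,C\,\psi[k]
```

where ψ[k] denotes the derivative of the predicted state with respect to λ, obtained recursively, and the bracket projects λ onto (λ⁻, λ⁺) to keep the memory length in a sensible range.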
SSRLSWAM: Complete Algorithm
Initializing SSRLSWAM
- initialization using the batch processing approach leads to a delayed recursion, but offers better initialization
- the forgetting factor is initialized to some suitable value < 1
Approximate Solution
Approximate Solution using Symbolic Computations
- discrete Lyapunov equation for S4RLS
- can be computed symbolically, off-line
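The slides compute the steady-state solution symbolically; numerically, the same fixed point can be found off-line by iterating the information-matrix recursion to convergence. A sketch in my notation (Phi for the information matrix, assuming A invertible), not the slides' symbolic procedure:

```python
import numpy as np

def steady_state_gain(A, C, lam, iters=2000):
    """Solve Phi = lam * A^-T Phi A^-1 + C^T C by fixed-point iteration,
    then return the steady-state observer gain K = Phi^-1 C^T.

    The iteration converges when sqrt(lam) * spectral_radius(A^-1) < 1,
    e.g. for neutrally stable A with lam < 1.
    """
    Ai = np.linalg.inv(A)
    Phi = C.T @ C
    for _ in range(iters):
        Phi = lam * Ai.T @ Phi @ Ai + C.T @ C
    K = np.linalg.inv(Phi) @ C.T
    return K, Phi
```

Because the gain is fixed, the on-line filter then needs no Riccati update at all, which is the source of the computational savings discussed later.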
Approximate Solution using Symbolic Computations (cont'd)
Approximate Solution using Symbolic Computations (cont'd)
- simplified algorithm
A Special Case (Constant Acceleration)
A Special Case (Constant Acceleration)
- system matrices
- symbolic computation
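The constant-acceleration system matrices on this slide likely follow the standard kinematic form; a sketch (my parameterisation, with sampling period T, state [position, velocity, acceleration] and position observed), together with an L-step observability check matching the model assumptions from the preview:

```python
import numpy as np

def const_accel_model(T):
    """Standard constant-acceleration kinematic model (the slides'
    exact parameterisation may differ)."""
    A = np.array([[1.0, T,   T * T / 2.0],
                  [0.0, 1.0, T],
                  [0.0, 0.0, 1.0]])
    C = np.array([[1.0, 0.0, 0.0]])
    return A, C

def is_L_step_observable(A, C, L):
    """Check L-step observability: rank [C; CA; ...; CA^(L-1)] = n."""
    n = A.shape[0]
    O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(L)])
    return np.linalg.matrix_rank(O) == n
```

With a single position observation and three states, the model is 3-step observable but not 2-step observable, so at least three samples are needed before the batch normal matrix becomes full rank.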
A Special Case (Constant Acceleration) (cont'd)
- symbolic computation
Computational Complexity
Computational Complexities: Standard Algorithms
Computational Complexities: SSRLSWAM and Variants
Example of Tracking a Noisy Chirp
Chirp Signal
- a sinusoid whose frequency drifts with time
- model used by the tracker: a sinusoid model, while the actual signal is a chirped sinusoid
- this is a deliberate model mismatch
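A sketch of this setup (all parameter names and values are illustrative, not the slides' simulation parameters): a linear chirp in noise, and the fixed-frequency sinusoid model the tracker would use, which is exactly where the model mismatch enters:

```python
import numpy as np

def noisy_chirp(n, f0, rate, fs, sigma, seed=0):
    """Linear chirp in white noise: instantaneous frequency f0 + rate*t."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) / fs
    phase = 2 * np.pi * (f0 * t + 0.5 * rate * t ** 2)
    return np.sin(phase) + sigma * rng.standard_normal(n)

def sinusoid_model(f, fs):
    """Fixed-frequency sinusoid model for the tracker: a rotation by
    w = 2*pi*f/fs per sample, first state observed.  Using this against
    a chirp is the deliberate model mismatch."""
    w = 2 * np.pi * f / fs
    A = np.array([[np.cos(w),  np.sin(w)],
                  [-np.sin(w), np.cos(w)]])
    C = np.array([[1.0, 0.0]])
    return A, C
```

The rotation matrix A is orthogonal with unit-magnitude eigenvalues, i.e. the tracker's model is neutrally stable, which is the case the steady-state results earlier in the deck cover.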
Performance of SSRLSWAM
- simulation parameters
Tuning of the Forgetting Factor
- simulation parameters
Performance of SSRLS
- simulation parameters
Conclusion
- SSRLSWAM is a combination of SSRLS and the stochastic gradient method
- S4RLSWAM alleviates the computational burden of SSRLSWAM
- suitable for time-varying scenarios
- compensates for model uncertainty to some extent
References
- M. B. Malik, "State-space recursive least squares with adaptive memory", Signal Processing, Vol. 86, pp. , 2006