State-Space Recursive Least Squares with Adaptive Memory
EE-869 Adaptive Filters
College of Electrical & Mechanical Engineering, National University of Sciences & Technology (NUST)
Introduction
SSRLS:
- A recursive algorithm, built around the state-space model of an unforced system
- Based on the least squares approach
- Does not require process or observation noise statistics
- Works in time-invariant and time-varying environments alike
- Can handle scalar and vector observations
SSRLSWAM:
- Adapts the forgetting factor, which may be required due to:
  - Model uncertainty
  - Presence of unknown external disturbances
  - Time-varying nature of the observed signal
  - Non-stationary behaviour of the observation noise
Preview of SSRLS
State-Space Model
Unforced system (see the sketch below):
- Process states
- Output signal
- Observation noise
- System matrix (full rank)
- Observation matrix (full rank)
- L-step observable
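A minimal sketch of the model, in notation assumed here rather than taken from the slide: x_k is the process state, y_k the output signal, v_k the observation noise, A the full-rank system matrix and C the observation matrix.

```latex
x_{k+1} = A x_k            % unforced process: no input, no process noise
y_k     = C x_k + v_k      % noisy observation of the state
```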
Batch Processed Least Squares Approach
Batch of Observations
- Batch of observations
- Noise vector
Least Squares Solution
- Full rank for the batch processed least squares solution
- Batch processed weighted least squares solution
- Weighting matrix
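As a sketch, with notation assumed here (Y_k the stacked batch of observations, H_k the corresponding stacked observation matrix built from C and powers of A, and W_k = diag(λ^k, ..., λ, 1) the exponential weighting matrix), the standard batch solutions are:

```latex
\hat{x}_k = (H_k^T H_k)^{-1} H_k^T Y_k                 % batch least squares (H_k full rank)
\hat{x}_k = (H_k^T W_k H_k)^{-1} H_k^T W_k Y_k         % batch weighted least squares
```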
Recursive Algorithm
Predict and Correct
- Predicted states
- Predicted signal
- Prediction error
- Predictor-corrector form
- Estimator gain
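In the same assumed notation, the predict-and-correct steps named above usually read:

```latex
\hat{x}_{k|k-1} = A \hat{x}_{k-1}                 % predicted states
\hat{y}_{k|k-1} = C \hat{x}_{k|k-1}               % predicted signal
e_k = y_k - \hat{y}_{k|k-1}                       % prediction error
\hat{x}_k = \hat{x}_{k|k-1} + K_k e_k             % predictor-corrector form with estimator gain K_k
```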
Recursive Solution
- Based on k+1 observations
- Weighting matrix (k+1 observations)
Recursive Solution (cont'd)
- Defined variables
- Direct form of SSRLS
Recursive Update: Difference Lyapunov Equation
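One common form of this recursion, assuming Φ_k denotes the exponentially weighted information matrix and λ the forgetting factor:

```latex
\Phi_k = \lambda A^{-T} \Phi_{k-1} A^{-1} + C^T C     % difference Lyapunov equation (A invertible)
```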
Matrix Inversion Lemma
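For reference, the standard statement of the matrix inversion lemma (Woodbury identity), written here with B and D to avoid clashing with the system matrices:

```latex
(B + U D V)^{-1} = B^{-1} - B^{-1} U \left( D^{-1} + V B^{-1} U \right)^{-1} V B^{-1}
```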
Recursive Update: Riccati Equation for SSRLS
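Applying the lemma to the difference Lyapunov equation gives a Riccati-type recursion for P_k = Φ_k^{-1}; the intermediate quantity Q_k below is an assumed name, not taken from the slide:

```latex
Q_k = \tfrac{1}{\lambda} A P_{k-1} A^T
P_k = Q_k - Q_k C^T \left( I + C Q_k C^T \right)^{-1} C Q_k
```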
Recursive Update (cont'd)
- Recursive solution
Observer Gain
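A definition consistent with the recursions sketched above (again assumed notation, with P_k = Φ_k^{-1}):

```latex
K_k = P_k C^T
```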
State-Space Representation of SSRLS
- Recursive solution
- State-space matrices
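Substituting the gain into the predictor-corrector form expresses the estimator itself as a (time-varying) state-space system driven by the observations:

```latex
\hat{x}_k = (I - K_k C)\, A\, \hat{x}_{k-1} + K_k y_k
```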
Initializing SSRLS
The information matrix is rank deficient at start-up, so the recursion cannot be run directly. Two options:
1) Initialize using a regularization term
2) Initialize using the batch processing approach: leads to a delayed recursion but offers better initialization
Steady-State SSRLS
Steady-State Solution of SSRLS
- For neutrally stable systems, the recursion admits a steady-state solution
Direct Form of Steady-State SSRLS
Observer Gain for Steady-State SSRLS
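In steady state the gain becomes constant and the estimator reduces to a fixed observer; under the assumed notation, with P the steady-state value of P_k:

```latex
K = P C^T, \qquad \hat{x}_k = (I - K C)\, A\, \hat{x}_{k-1} + K y_k
```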
Transfer Function Representation
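Since the steady-state estimator is linear time-invariant, taking the z-transform of the observer above gives its transfer matrix from the observations to the state estimate:

```latex
\hat{X}(z) = z \left( zI - (I - K C) A \right)^{-1} K \, Y(z)
```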
Initialization of Steady-State SSRLS
- Only the state estimate needs to be initialized
- Preferable choice if no other estimate is available
Memory Length
- Filter memory
- Asymptotic result
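For exponential weighting the usual asymptotic memory-length result applies: as λ approaches 1, the effective number of samples the filter remembers behaves like

```latex
M \approx \sum_{i=0}^{\infty} \lambda^{i} = \frac{1}{1-\lambda}
```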
Model Uncertainty and Unknown External Disturbances
Underlying Model
- Process states
- Output
- External disturbance (bounded, deterministic)
- Observation noise
- System matrix
- Input matrix
- Observation matrix
- Controllable pair
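A sketch of the perturbed model matching these labels, in assumed notation (u_k the bounded deterministic external disturbance, B the input matrix, (A, B) a controllable pair):

```latex
x_{k+1} = A x_k + B u_k        % state driven by the unknown external disturbance
y_k     = C x_k + v_k          % observation corrupted by noise
```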
Assumptions about Observation Noise
- Zero mean
- White
Perturbation Matrices
Estimation Error
- White input
- Deterministic input
Steady-State Mean Estimation Error
- Deterministic input
Bounds on Steady-State Mean Estimation Error
Steady-State Mean Square Error
- Estimation error correlation matrix
Bounds on Steady-State Mean Square Estimation Error
SSRLS with Adaptive Memory (SSRLSWAM)
The Cost Function
- Cost function
- Gradient of cost function (a row vector)
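One cost that is consistent with the stochastic-gradient slides that follow, stated here as an assumption rather than as the slide's own definition (e_k the a-priori prediction error, ψ_k = ∂x̂_k/∂λ the sensitivity of the state estimate to the forgetting factor):

```latex
J_k(\lambda) = \tfrac{1}{2}\, e_k^T e_k, \qquad e_k = y_k - C A \hat{x}_{k-1}
\frac{\partial J_k}{\partial \lambda} = -\, e_k^T C A\, \psi_{k-1}
```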
Gradient of Cost Function
- Deterministic gradient
Gradient of Cost Function (cont'd)
Tuning the Forgetting Factor
- Stochastic gradient
- Update using the stochastic gradient method
SSRLSWAM: Complete Algorithm (sketched below)
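A minimal Python sketch of one SSRLS-with-adaptive-memory loop, built from the hedged recursions above: the difference Lyapunov update of the information matrix, the gain K_k = P_k C^T, the predictor-corrector state update, propagation of the λ-sensitivity ψ_k, and a clipped stochastic-gradient step on the forgetting factor. The function name, step size mu, clipping limits and the scalar-observation restriction are illustrative assumptions, not taken from the slides.

```python
import numpy as np

def ssrlswam(y, A, C, lam0=0.95, mu=1e-4, lam_min=0.8, lam_max=0.9999):
    """Track the state of x_{k+1} = A x_k from scalar observations y_k = C x_k + v_k,
    adapting the forgetting factor lam by a stochastic gradient step at each sample.
    A: (n, n) full-rank system matrix; C: (n,) observation row; y: iterable of scalars."""
    n = A.shape[0]
    A_inv = np.linalg.inv(A)              # A is assumed invertible (full rank)
    x_hat = np.zeros(n)                   # state estimate
    psi = np.zeros(n)                     # psi_k = d x_hat_k / d lam
    Phi = 1e-2 * np.eye(n)                # small regularization term so Phi is invertible at start-up
    dPhi = np.zeros((n, n))               # d Phi_k / d lam
    lam = lam0
    estimates, lams = [], []

    for yk in np.atleast_1d(y):
        # Prediction and a-priori error
        x_pred = A @ x_hat
        e = yk - C @ x_pred

        # Gradient of J = 0.5 * e^2 w.r.t. lam, using the previous sensitivity psi_{k-1}
        grad = -(C @ (A @ psi)) * e

        # Difference Lyapunov update of the information matrix and its lam-derivative
        Phi_tilde = A_inv.T @ Phi @ A_inv
        dPhi = lam * (A_inv.T @ dPhi @ A_inv) + Phi_tilde
        Phi = lam * Phi_tilde + np.outer(C, C)

        # Gain and its lam-derivative
        P = np.linalg.inv(Phi)
        dP = -P @ dPhi @ P                 # derivative of a matrix inverse
        K = P @ C
        dK = dP @ C

        # Correction and sensitivity propagation
        x_hat = x_pred + K * e
        psi = (np.eye(n) - np.outer(K, C)) @ (A @ psi) + dK * e

        # Clipped stochastic-gradient update of the forgetting factor
        lam = float(np.clip(lam - mu * grad, lam_min, lam_max))

        estimates.append(x_hat.copy())
        lams.append(lam)
    return np.array(estimates), np.array(lams)
```

For example, with the constant-acceleration matrices discussed later (a 3-state A parameterized by the sampling period T and C = [1, 0, 0]), ssrlswam(y, A, C) would track position, velocity and acceleration from position-only measurements while adapting the memory of the filter.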
Initializing SSRLSWAM
- Initialization using the batch processing approach leads to a delayed recursion but offers better initialization
- Initial forgetting factor: some suitable value < 1
Approximate Solution
Approximate Solution using Symbolic Computations
- Discrete Lyapunov equation for S4RLS
- Can be computed symbolically, off-line
Approximate Solution using Symbolic Computations (cont'd)
Approximate Solution using Symbolic Computations (cont'd)
- Simplified algorithm
A Special Case (Constant Acceleration)
A Special Case (Constant Acceleration)
- System matrices (one standard choice is sketched below)
- Symbolic computation
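One standard choice of system matrices for a constant-acceleration model, assumed here rather than taken from the slide (T the sampling period, position-only observation):

```latex
A = \begin{bmatrix} 1 & T & T^2/2 \\ 0 & 1 & T \\ 0 & 0 & 1 \end{bmatrix},
\qquad
C = \begin{bmatrix} 1 & 0 & 0 \end{bmatrix}
```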
A Special Case (Constant Acceleration), cont'd
- Symbolic computation
Computational Complexity
Computational Complexities: Standard Algorithms
Computational Complexities: SSRLSWAM and Variants
Example of Tracking a Noisy Chirp
Chirp Signal
- A sinusoid whose frequency drifts with time
- Model used by the tracker: a sinusoid model (sketched below)
- Model mismatch: the actual signal is a chirped sinusoid
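As a sketch of the mismatch, a tracker built on a fixed-frequency sinusoid model might use the following matrices (assumed notation, with ω₀ the nominal frequency and T the sampling period), while the actual signal is a chirp whose instantaneous frequency drifts away from ω₀:

```latex
A = \begin{bmatrix} \cos\omega_0 T & \sin\omega_0 T \\ -\sin\omega_0 T & \cos\omega_0 T \end{bmatrix},
\qquad
C = \begin{bmatrix} 1 & 0 \end{bmatrix}
```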
Performance of SSRLSWAM
- Simulation parameters
Tuning of the Forgetting Factor
- Simulation parameters
Performance of SSRLS
- Simulation parameters
Conclusion
- SSRLSWAM is a combination of SSRLS and the stochastic gradient method
- S4RLSWAM alleviates the computational burden of SSRLSWAM
- Suitable for time-varying scenarios
- Compensates for model uncertainty to some extent
References
Mohammad Bilal Malik, "State-space recursive least squares with adaptive memory", Signal Processing, vol. 86, pp. 1365-1374, 2006.