Slide 1: Consensus-Based Distributed Least-Mean Square Algorithm Using Wireless Ad Hoc Networks
Gonzalo Mateos, Ioannis Schizas, and Georgios B. Giannakis
ECE Department, University of Minnesota
Acknowledgment: ARL/CTA grant no. DAAD19-01-2-0011
Slide 2: Motivation
Estimation using ad hoc WSNs raises exciting challenges:
- Communication constraints: single-hop communications, limited power budget
- Lack of hierarchy / decentralized processing -> consensus
Unique features:
- The environment is constantly changing (e.g., the WSN topology)
- Lack of statistical information at the sensor level
Bottom line: algorithms are required to be
- Resource efficient
- Simple and flexible
- Adaptive and robust to changes
Slide 3: Prior Works
Single-shot distributed estimation algorithms:
- Consensus averaging [Xiao-Boyd '05], [Tsitsiklis-Bertsekas '86, '97]
- Incremental strategies [Rabbat-Nowak et al. '05]
- Deterministic and random parameter estimation [Schizas et al. '06]
Consensus-based Kalman tracking using ad hoc WSNs:
- MSE-optimal filtering and smoothing [Schizas et al. '07]
- Suboptimal approaches [Olfati-Saber '05], [Spanos et al. '05]
Distributed adaptive estimation and filtering:
- LMS and RLS learning rules [Lopes-Sayed '06, '07]
Slide 4: Problem Statement
- Ad hoc WSN with J sensors; single-hop communications only
- Sensor j's neighborhood is N_j; connectivity information is captured in the neighborhoods {N_j}
- Zero-mean additive (e.g., Rx, quantization) noise corrupts inter-sensor exchanges
- Each sensor j, at time instant t, acquires a regressor h_j(t) and a scalar observation x_j(t), both zero-mean w.l.o.g. and spatially uncorrelated
- A least-mean-squares (LMS) estimation problem is of interest (see the sketch below)
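The slide's formula images did not survive this transcript; the following LaTeX restates the standard network-wide LMS cost, with the notation (J sensors, regressors h_j(t), observations x_j(t), common parameter s) assumed from context:

```latex
\hat{s} \;=\; \arg\min_{s}\; \sum_{j=1}^{J} \mathbb{E}\!\left[\big(x_j(t) - h_j^{\top}(t)\, s\big)^{2}\right]
```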
Slide 5: Centralized Approaches
- If observations and regressors are jointly stationary, the Wiener solution minimizes the LMS cost
- It requires the global (cross-)covariance matrices to be available
- Steepest descent converges while avoiding the matrix inversion
- If the (cross-)covariance information is not available or time-varying, low complexity suggests (C-)LMS adaptation
Goal: develop a distributed (D-)LMS algorithm for ad hoc WSNs
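The centralized quantities were equation images in the original; under the notation above, the textbook forms they describe are, as a sketch (not necessarily the slide's exact expressions):

```latex
s_o = R_h^{-1} r_{hx}, \qquad
R_h = \sum_{j=1}^{J} \mathbb{E}\big[h_j(t) h_j^{\top}(t)\big], \quad
r_{hx} = \sum_{j=1}^{J} \mathbb{E}\big[h_j(t)\, x_j(t)\big]
```

and the corresponding C-LMS adaptation, which replaces the unavailable statistics with instantaneous values:

```latex
s(t+1) = s(t) + \mu \sum_{j=1}^{J} h_j(t+1)\big(x_j(t+1) - h_j^{\top}(t+1)\, s(t)\big)
```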
Slide 6: A Useful Reformulation
- Introduce the bridge sensor subset B such that:
  1) every sensor j has at least one bridge neighbor, i.e., B ∩ N_j is nonempty
  2) any two bridge sensors are connected through a path of single-hop exchanges
- Consider the convex, constrained optimization below, which forces the local estimates to agree through the bridge variables
- Proposition [Schizas et al. '06]: if B satisfies 1)-2) and the WSN is connected, then the constrained problem is equivalent to the original LMS problem
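A hedged reconstruction of the constrained problem (each local estimate s_j is tied to the bridge variables s̄_b of its bridge neighbors, so the constraints enforce network-wide consensus):

```latex
\{\hat{s}_j\} = \arg\min_{\{s_j\},\{\bar{s}_b\}} \sum_{j=1}^{J}
\mathbb{E}\!\left[\big(x_j(t) - h_j^{\top}(t)\, s_j\big)^{2}\right]
\quad \text{s.t.} \quad s_j = \bar{s}_b, \;\; \forall\, j,\; b \in B \cap N_j
```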
Slide 7: Algorithm Construction
Problem of interest: the constrained reformulation of slide 6
Two key steps in deriving D-LMS:
- Resort to the alternating-direction method of multipliers to gain the desired degree of parallelization
- Apply stochastic approximation ideas to cope with the unavailability of statistical information
Slide 8: Derivation of Recursions
Form the associated augmented Lagrangian and apply the alternating-direction method of multipliers, which yields a three-step iterative update process:
- Step 1 (multipliers): dual iteration
- Step 2 (local estimates): minimize w.r.t. s_j
- Step 3 (bridge variables): minimize w.r.t. s̄_b
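The augmented Lagrangian itself was an image; a standard form consistent with the constraints above, where v_j^b are the multipliers and c > 0 is a penalty coefficient (a sketch, not necessarily the authors' exact scaling):

```latex
\mathcal{L}_a\big(\{s_j\},\{\bar{s}_b\},\{v_j^b\}\big)
= \sum_{j=1}^{J} \mathbb{E}\!\left[\big(x_j(t) - h_j^{\top}(t)\, s_j\big)^2\right]
+ \sum_{j=1}^{J} \sum_{b \in B \cap N_j}
\left[ (v_j^b)^{\top}(s_j - \bar{s}_b) + \frac{c}{2}\,\|s_j - \bar{s}_b\|^2 \right]
```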
Slide 9: Multiplier Updates
- Recall the constraints s_j = s̄_b for all b in B ∩ N_j
- Use a standard method-of-multipliers type of update (see below)
- Requires the bridge variables s̄_b from the bridge neighborhood B ∩ N_j
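The update rule was an equation image; the standard dual-ascent step it describes is (assuming the penalty coefficient c doubles as the dual step-size, as is customary in the method of multipliers):

```latex
v_j^b(t) = v_j^b(t-1) + c\,\big(s_j(t) - \bar{s}_b(t)\big), \qquad b \in B \cap N_j
```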
Slide 10: Local Estimate Updates
- Given by the local optimization of the augmented Lagrangian w.r.t. s_j
- Write the first-order optimality condition; the expectation in it is unavailable
- The proposed recursion is inspired by the Robbins-Monro algorithm: replace the expected gradient by its instantaneous value
- e_j(t) = x_j(t) - h_j^T(t) s_j(t-1) is the local prior error and mu_j is a constant step-size
- Requires the already-acquired bridge variables s̄_b and the updated local multipliers v_j^b
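A hedged reconstruction of the resulting stochastic-gradient recursion (the constant multiplying the gradient term may differ from the authors' version):

```latex
s_j(t+1) = s_j(t) + \mu_j \Big[ h_j(t+1)\, e_j(t+1)
- \sum_{b \in B \cap N_j} \big( v_j^b(t) + c\,(s_j(t) - \bar{s}_b(t)) \big) \Big],
\qquad e_j(t+1) = x_j(t+1) - h_j^{\top}(t+1)\, s_j(t)
```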
Slide 11: Bridge Variable Updates
- Similarly, minimize the augmented Lagrangian w.r.t. s̄_b (closed form, see below)
- Requires the multipliers v_j^b and the local estimates s_j from the neighborhood N_b
- The neighborhood size |N_b| is obtained in a startup phase
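The closed-form minimizer, derived from the augmented Lagrangian sketched on slide 8 (an average over the |N_b| neighbors of bridge sensor b):

```latex
\bar{s}_b(t+1) = \frac{1}{|N_b|} \sum_{j \in N_b}
\left[ \frac{1}{c}\, v_j^b(t) + s_j(t+1) \right]
```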
Slide 12: D-LMS Recap and Operation
- In the presence of communication noise, the received variables are noisy versions of the transmitted ones
- Simple, fully distributed, only single-hop exchanges needed
- Sensor j: receives s̄_b from its bridge neighbors and transmits s_j to them (Steps 1 and 2)
- Bridge sensor b: transmits s̄_b to its neighbors and receives their s_j (Step 3)
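To make the three-step operation concrete, here is a minimal, self-contained Python sketch of one D-LMS deployment. It assumes the hedged recursions from slides 9-11, noise-free links, and a hypothetical ring topology with even-indexed sensors as bridges; it is an illustration, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

J, p, T = 10, 4, 2000          # sensors, parameter dimension, iterations
mu, c = 5e-3, 1.0              # step-size mu_j (common) and penalty coefficient c
s_true = rng.standard_normal(p)

# Hypothetical topology: sensors on a ring; bridge set B = even-indexed sensors,
# chosen so that every sensor has at least one bridge neighbor (condition 1).
neighbors = {j: {(j - 1) % J, j, (j + 1) % J} for j in range(J)}
bridges = [b for b in range(J) if b % 2 == 0]
bridge_nbhd = {j: [b for b in bridges if b in neighbors[j]] for j in range(J)}

s = np.zeros((J, p))                              # local estimates s_j
s_bar = {b: np.zeros(p) for b in bridges}         # bridge variables s_bar_b
v = {(j, b): np.zeros(p) for j in range(J) for b in bridge_nbhd[j]}

for t in range(T):
    # Step 1: multiplier (dual) updates at every sensor
    for j in range(J):
        for b in bridge_nbhd[j]:
            v[j, b] += c * (s[j] - s_bar[b])
    # Step 2: LMS-type local estimate updates from fresh data
    for j in range(J):
        h = rng.standard_normal(p)                    # regressor h_j(t)
        x = h @ s_true + 0.1 * rng.standard_normal()  # observation x_j(t)
        e = x - h @ s[j]                              # local prior error e_j(t)
        penalty = sum(v[j, b] + c * (s[j] - s_bar[b]) for b in bridge_nbhd[j])
        s[j] = s[j] + mu * (h * e - penalty)
    # Step 3: bridge variable updates from single-hop neighbors
    for b in bridges:
        nb = [j for j in range(J) if b in bridge_nbhd[j]]   # neighborhood N_b
        s_bar[b] = np.mean([v[j, b] / c + s[j] for j in nb], axis=0)

print("mean deviation:", np.mean(np.linalg.norm(s - s_true, axis=1)))
```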
Slide 13: Further Insights
- Manipulating the recursions for the local estimates and the multipliers yields an equivalent form
- Introduce the instantaneous consensus error at sensor j, i.e., the mismatch between s_j and the bridge variables of its bridge neighbors
- The update of s_j becomes a superposition of two learning mechanisms:
  1) a purely local LMS-type of adaptation
  2) a PI consensus loop that tracks the consensus set-point
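Assuming the hedged recursions above and zero-initialized multipliers, the PI structure can be made explicit. With the instantaneous consensus error y_j(t) defined below, the multipliers accumulate past consensus errors:

```latex
y_j(t) = \sum_{b \in B \cap N_j} \big(s_j(t) - \bar{s}_b(t)\big), \qquad
\sum_{b \in B \cap N_j} v_j^b(t) = c \sum_{\tau \le t} y_j(\tau)
```

so the penalty in the local update equals c[y_j(t) + Σ_{τ≤t} y_j(τ)]: a proportional term plus an integrator acting on the consensus error.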
Slide 14: Further Insights (cont'd)
- Network-wide information enters through the consensus set-point
- Expect increased performance with this network-wide set-point
- Flexibility
[Block diagram: the D-LMS processor at sensor j, a local LMS algorithm in parallel with a PI regulator feeding the consensus loop]
Slide 15: Mean Analysis
Independence-setting signal assumptions:
- (As1) The regressor h_j(t) is a zero-mean white random vector whose covariance matrix has bounded spectral radius
- (As2) Observations obey a linear model, x_j(t) = h_j^T(t) s_0 + ε_j(t), where ε_j(t) is zero-mean white noise
- (As3) h_j(t) and ε_j(t) are statistically independent
- Define the local estimation errors as the deviations of the estimates from s_0
Goal: derive sufficient conditions under which the local estimates converge to s_0 in the mean
Slide 16: Dynamics of the Mean
Lemma: Under (As1)-(As3), consider the suitably initialized D-LMS algorithm. Then the mean estimation error evolves according to a second-order (two-step) matrix recursion, whose coefficient matrices depend on the step-sizes, the penalty coefficient c, and the regressor covariances.
An equivalent first-order system is obtained by state concatenation (see below).
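The recursion itself was an image; the state-concatenation step it refers to is the standard device below, with generic matrices A and B standing in for the slide's coefficients:

```latex
z(t) = A\, z(t-1) + B\, z(t-2)
\;\;\Longrightarrow\;\;
\begin{bmatrix} z(t) \\ z(t-1) \end{bmatrix}
= \begin{bmatrix} A & B \\ I & 0 \end{bmatrix}
\begin{bmatrix} z(t-1) \\ z(t-2) \end{bmatrix}
```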
Slide 17: First-Order Stability Result
Proposition: Under (As1)-(As3), the D-LMS algorithm, with positive step-sizes and relevant parameters chosen so that the transition matrix of the mean-error system is stable (spectral radius less than one), achieves consensus in the mean sense, i.e., every E[s_j(t)] converges to s_0.
Step-size selection is based on local information only:
- Local regressor statistics
- Bridge neighborhood size
Slide 18: Simulations
- J-node WSN
- Regressors: i.i.d.
- Observations: generated by the linear model of (As2)
- D-LMS run with a constant step-size
- True weight vector: time-varying, to assess tracking
Slide 19: Loop Tuning
Adequately selecting the PI loop parameters actually does make a difference.
Compared figures of merit:
- MSE (learning curve)
- MSD (normalized estimation error)
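The figure-of-merit definitions were equation images; standard network-averaged versions consistent with the slide's labels are (a sketch; the authors' normalization may differ):

```latex
\mathrm{MSE}(t) = \frac{1}{J} \sum_{j=1}^{J} \mathbb{E}\big[e_j^2(t)\big], \qquad
\mathrm{MSD}(t) = \frac{1}{J} \sum_{j=1}^{J} \mathbb{E}\big[\|s_j(t) - s_0(t)\|^2\big]
```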
Slide 20: Concluding Summary
Developed a distributed LMS algorithm for general ad hoc WSNs with intuitive sensor-level processing:
- Local LMS adaptation
- A tunable PI loop driving the local estimates to consensus
Mean analysis under independence assumptions:
- Step-size selection rules based on local information
Simulations validate mean-square sense (m.s.s.) convergence and tracking capabilities.
Ongoing research:
- Stability and performance analysis under general settings
- Optimality: selection of bridge sensors, D-RLS
- Estimation/learning performance vs. complexity tradeoff