Distributed Inference in Dynamical Systems

Applications
– Emergency response systems: monitoring in hazardous conditions; sensor calibration, localization.
– Autonomous teams of mobile robots: terrain mapping; robots temporarily out of communication range, operating under heavy interference.

Dynamic Bayesian Networks
– Example: in a mote network, nodes communicate locally observed temperatures and collaborate to estimate their bias.
– Transition model: p(X^{(t+1)} | X^{(t)}).
– Observation model: p(z^{(t)} | X^{(t)}).

Centralized Filtering
Recursively compute p(X^{(t)} | z^{(1:t)}) from p(X^{(t-1)} | z^{(1:t-1)}):
1. Estimation: multiply in the observation likelihood.
2. Prediction: multiply in the transition model.
3. Roll-up: marginalize out the state at the previous time step.

Assumed Density Filtering [Boyen & Koller 1998]
– The exact posterior quickly loses independence structure; project it to a simpler distribution.
– Junction-tree approximation: with cliques {X1,X2}, {X2,X3}, {X2,X4,X6}, {X4,X5,X6} and separators {X2}, {X2}, {X4,X6}, the true distribution over X1,...,X6 is approximated by a ratio of clique and separator marginals:
  p(X) ≈ p(X1,X2) p(X2,X3) p(X2,X4,X6) p(X4,X5,X6) / [p(X2) p(X2) p(X4,X6)]
– B&K filtering on the assumed density:
  1. Estimation: multiply in the observation likelihoods; recompute the clique and separator marginals.
  2. Prediction, roll-up, projection: only need a subtree that covers the parents of each new clique; prune the remaining terms of the junction tree.
(A toy numeric sketch of these steps appears below, after this part.)

Distributed Filtering
– Each observation is associated with a unique node (node n observes z_n^{(t)}, one-to-one).
– Task: compute the query p(Q_n^{(t)} | z^{(1:t)}) at each node n.
– The nodes need to communicate their information to maintain the global distribution locally.
– DBN vs. sensor network: the assumed density structure does not correspond to the network connectivity.

Proposed Approach
– External junction tree (assumed density) with cliques {X1,X2}, {X2,X3}, {X2,X4,X6}, {X4,X5,X6}.
– At each time step, each node computes a (possibly approximate) marginal distribution over its cliques.
– Cliques are associated redundantly with network nodes (over links of varying quality, stronger and weaker); together, these marginals represent the global distribution.

Network Junction Tree [Paskin et al., IPSN 2005]
– Nodes estimate link quality and build a network junction tree that routes messages.
– Satisfies the running intersection property; each cluster covers its associated cliques (e.g., the cluster {X2,X4,X5,X6} covers the clique {X2,X4,X6}). A mechanical check of this property is sketched below.
– Adaptive; can be optimized in order to minimize computational complexity.
– Cf. (centralized) junction tree calibration.

Estimation as Robust Distributed Inference
– Each node starts with marginals of the prior and its local observation likelihoods.
– Each node obtains marginals of the posterior.
– RDPI [Paskin, Guestrin, UAI 2003] transfers both the likelihoods and the missing priors.

Local Prediction
– Each node has obtained the posterior over its cliques and wants to perform the prediction step.
– Could emulate the centralized prediction by collecting the cliques of a subtree covering the parents Pa[C_i^{(t+1)}].
– Simpler solution: include Pa[C_i^{(t+1)}] in the cluster at each node (e.g., the cluster {X2,X4,X5,X6} grows to {X1,X2,X4,X5,X6}); estimation then obtains the posterior over Pa[C_i^{(t+1)}] directly.
– Computational complexity is still proportional to the tree width of the external junction tree.

Prediction with Missing Variables
– A marginal may be missing from the belief (e.g., the clique {X1,X2} was never obtained, so the variable X1^{(t)} is missing).
– Multiplying in the conditional transition model anyway effectively assumes a uniform distribution over X1^{(t)} (see the one-variable arithmetic sketch below).
– Instead, multiply in a term that introduces an independence assertion over the missing variable; such a term is available when the transition model is learnt, obtained e.g. from the empirical distribution.

Theorem: In the absence of partitions and given sufficient communication at each time step, the algorithm is equivalent to Boyen & Koller 1998.
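To make the estimation / prediction / roll-up / projection loop concrete, here is a minimal sketch of Boyen-Koller filtering. It is not the talk's code: it assumes a toy two-variable discrete DBN and the simplest possible assumed density (fully factored, i.e., singleton cliques), with made-up model parameters.

```python
import numpy as np

# Transition models p(Xi' = k | X1 = i, X2 = j): each new variable depends on
# both previous variables, so exact filtering would correlate X1 and X2.
# All numbers below are illustrative only.
T1 = np.array([[[0.9, 0.1], [0.6, 0.4]],
               [[0.4, 0.6], [0.1, 0.9]]])
T2 = np.array([[[0.8, 0.2], [0.5, 0.5]],
               [[0.5, 0.5], [0.2, 0.8]]])
O = np.array([[0.85, 0.15],      # observation model p(z | X): each z is a
              [0.15, 0.85]])     # noisy reading of one variable

def bk_step(m1, m2, z1, z2):
    # 1. Estimation: multiply in the observation likelihoods, renormalize.
    m1 = m1 * O[:, z1]; m1 /= m1.sum()
    m2 = m2 * O[:, z2]; m2 /= m2.sum()
    # 2. Prediction + roll-up: form the joint implied by the assumed density,
    #    multiply in the transition model, marginalize out the old state.
    joint = np.outer(m1, m2)
    p1 = np.einsum('ij,ijk->k', joint, T1)
    p2 = np.einsum('ij,ijk->k', joint, T2)
    # 3. Projection: keep only the clique marginals (here: the singletons).
    return p1, p2

m1 = m2 = np.array([0.5, 0.5])
for z1, z2 in [(0, 0), (0, 1), (1, 1)]:
    m1, m2 = bk_step(m1, m2, z1, z2)
print(m1, m2)
```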
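The running intersection property that the network junction tree must satisfy can also be checked mechanically. A small sketch, using the cluster contents from the slide's example; the routing-tree edges are an assumption for illustration, since the talk builds the tree adaptively from estimated link qualities.

```python
# Cluster contents follow the slide's example (six network nodes); the
# routing-tree edges below are purely illustrative.
clusters = {1: {'X1', 'X2', 'X3'}, 2: {'X1', 'X2'},
            3: {'X2', 'X4', 'X5', 'X6'}, 4: {'X4', 'X5', 'X6'},
            5: {'X2', 'X4', 'X5', 'X6'}, 6: {'X4', 'X5', 'X6'}}
edges = [(1, 2), (1, 3), (3, 5), (5, 4), (5, 6)]

def satisfies_rip(clusters, edges):
    """Running intersection property: for every variable, the clusters that
    contain it must form a connected subtree of the routing tree."""
    adj = {n: set() for n in clusters}
    for a, b in edges:
        adj[a].add(b); adj[b].add(a)
    for var in set().union(*clusters.values()):
        holders = {n for n, c in clusters.items() if var in c}
        seen, stack = set(), [next(iter(holders))]
        while stack:                      # DFS restricted to holder nodes
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack += [m for m in adj[n] if m in holders]
        if seen != holders:
            return False
    return True

print(satisfies_rip(clusters, edges))   # True for this assignment
```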
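The uniform-distribution effect of a missing parent marginal can be seen in one line of arithmetic. A one-variable sketch with illustrative numbers:

```python
import numpy as np

# T[i, k] = p(X1^{(t+1)} = k | X1^{(t)} = i); numbers are illustrative.
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])

informed = np.array([0.7, 0.3]) @ T   # prediction with the parent's marginal
missing  = np.array([0.5, 0.5]) @ T   # prediction when the marginal is absent

# Multiplying in the conditional p(X1' | X1) with no prior over X1 and then
# normalizing is exactly the row average of T, i.e., the uniform-prior case:
print(np.allclose(missing, T.sum(axis=0) / 2), informed, missing)
```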
Experimental Results: Convergence
– Testbed of 25 cameras used for the SLAT (simultaneous localization and tracking) experiments; real camera network, recorded object trajectory.
– Figures: convergence for individual cameras; convergence versus the amount of communication in each time step.
– Domains: camera localization, temperature monitoring.

Alignment Problem
– Under a network partition, the distribution computed by the nodes on the left differs from the distribution computed by the nodes on the right (over the object location M_t and the camera poses C1,...,C4).
– Clique marginals: π1 over {C1,C2,M_t}, π2 over {C2,C3,M_t}, π3 over {C3,C4,M_t}.
– The beliefs obtained by the left and the right sub-network do not agree on the shared variables (π1 ≠ π2 on {C2,M_t}, etc.), so they do not represent a globally consistent distribution.
– Alignment: obtain an informative, globally consistent distribution from the inconsistent marginals {π_i}.

Optimized Conditional Alignment (OCA)
– Define the distribution rooted at clique 1 as a product of conditionals:
  p_1 = π1(C1,C2,M_t) π2(C3 | C2,M_t) π3(C4 | C3,M_t)
– Which root is best? (Figure: exact solution with no partition; rooted at 1, highly uncertain; rooted at 3, more certain.)

Minimizing the Entropy
– Suppose we wish to minimize the entropy of the rooted distribution p_r.
– The entropy decomposes by cliques; for Gaussians, each term depends only on the conditional.
– The difference of entropies between the neighboring roots 1 and 2 can be written locally, in terms of π1 and π2: the remaining terms are the same for both roots. (See the worked-out decomposition below.)

Selecting the Root with Entropy Messages
– Define m_{i→j} = the loss (in entropy) of root j versus the best root among the nodes on i's side.
– a) If i is the best root in its subtree, then m_{k→i} < 0 for every child k.
– b) If there is a better root than i in i's subtree, i.e., some k is the best root in i's subtree, then m_{k→i} > 0 and is the largest incoming message.
– In RDPI, marginalization corresponds to the pruning of cliques; the computation of the entropy messages m_{i→j} piggy-backs on this pruning (e.g., the message m_{1→12}, then m_{12→23}, travels with the transferred cliques, and the final belief at a node combines the pruned cliques with the incoming entropy message).
– At convergence, each node can determine the optimal root locally.

Theorem: Given sufficient communication, the nodes will reach a globally consistent belief, selecting a root that minimizes the entropy.

Joint Alignment
– Highly-partitioned scenario with components A, B, C: a rooted alignment makes A and C uncertain about some of their cameras.
– Instead, define the aligned distribution to match the clique marginals, i.e., fit a consistent distribution to the inconsistent prior marginals.
– For Gaussians, this problem splits into independent convex problems: determinant maximization [Vandenberghe et al., SIAM 1998] and linear regression, which can be distributed [Guestrin et al., IPSN 04].

Experimental Results: Partitions
– Without optimized alignment, inconsistencies can degrade the solution (partition during t = 120, …, 120+M; final time t = 297).
– In each component, the nodes communicate fully; quality is evaluated at the end of the experiment.
– The choice of root is significant, especially as the number of partitions increases; Min-KL provides the best aligned distribution.
– In temperature monitoring, the effects of partitioning are less severe (the approximate transition model is accurate); the results follow the same trend as in SLAT.

Summary of the Algorithm
– Initialization: the cliques of the assumed density (e.g., {X1,X2}, {X2,X3}, ...).
– 1. Estimation, alignment (distributed): entropy messages such as m_{12→23} identify the optimal root.
– 2. Approximate prediction (local).
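Written out for the three-clique chain above (a reconstruction consistent with the slides' description, not necessarily the talk's exact notation), the rooted alignment and its entropy decomposition are:

```latex
% Aligned distribution rooted at clique 1, as a product of conditionals of
% the (inconsistent) clique marginals \pi_i:
p_1(C_{1:4}, M_t) \;=\; \pi_1(C_1, C_2, M_t)\, \pi_2(C_3 \mid C_2, M_t)\, \pi_3(C_4 \mid C_3, M_t)

% The entropy decomposes clique by clique, so neighboring roots can be
% compared locally (for Gaussians, each term depends only on the conditional):
H(p_1) \;=\; H_{\pi_1}(C_1, C_2, M_t) + H_{\pi_2}(C_3 \mid C_2, M_t) + H_{\pi_3}(C_4 \mid C_3, M_t)
```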
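A discrete toy version of OCA root selection on a three-clique chain; the inconsistent clique marginals are made up, and the rooted entropy is evaluated directly rather than by the distributed message passing the talk describes.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Inconsistent clique marginals on the chain (A,B) -- (B,C) -- (C,D);
# rows index the left variable, columns the right one. Numbers are made up.
q = [np.array([[0.40, 0.10], [0.10, 0.40]]),   # pi_1(A, B)
     np.array([[0.30, 0.20], [0.25, 0.25]]),   # pi_2(B, C)
     np.array([[0.05, 0.45], [0.45, 0.05]])]   # pi_3(C, D)

def rooted_entropy(root):
    """H of the aligned distribution rooted at clique `root`: the root's joint
    entropy plus the conditional entropy of every other clique given its
    separator (the variable shared with the neighbor closer to the root)."""
    h = entropy(q[root].ravel())
    for i in range(root):                 # cliques left of the root:
        h += entropy(q[i].ravel()) - entropy(q[i].sum(axis=0))  # given right var
    for i in range(root + 1, len(q)):     # cliques right of the root:
        h += entropy(q[i].ravel()) - entropy(q[i].sum(axis=1))  # given left var
    return h

scores = [rooted_entropy(r) for r in range(len(q))]
print([round(s, 3) for s in scores], "-> best root:", int(np.argmin(scores)))
```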
Conclusions and Related Work
Related work:
– Zhao et al., IEEE 2003: (mostly independent) particle filters for tracking.
– Rosencrantz et al., UAI 2003: particle filters in parallel, sharing observations.
– Pfeffer and Tai, UAI 2005: loopy belief propagation in continuous DBNs.
Our approach:
– does not assume point-to-point communication;
– is robust against node failures and partitions;
– addresses the belief inconsistency problem.

Conclusions:
– A distributed algorithm for filtering in general dynamic Bayesian networks:
  – converges to the centralized solution (B&K 98);
  – robust to partitions and message losses;
  – an online optimization algorithm resolves inconsistencies.
– Evaluated on data from real networks with 25 cameras and 54 temperature sensors.

Stanislav Funiak, Carlos Guestrin, Mark Paskin, Rahul Sukthankar
Carnegie Mellon University, Google, Intel Research

Challenges
– lossy communication
– node failures
– changing link qualities

Guarantees
– Global correctness: converges to the true posterior [Paskin, Guestrin, UAI 2003].
– Partial correctness: before convergence, forms a principled approximation.
– A node may not be able to obtain the distribution over all parents (out of time, node failures, interference): prediction with missing variables is compared against the centralized B&K solution, under the best and the worst interpretation of the partial results.

Distributing OCA
– Online algorithm for selecting the root of minimal entropy.
– Define the message m_{i→j} from clique C_i to C_j so that j is optimal if m_{i→j} < 0 for all neighbors i.
– Computing m_{i→j}: a brute-force check of this criterion appears in the sketch below.
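A brute-force illustration of the optimality criterion just stated: reading m_{i→j} as the entropy loss of rooting at j relative to the best root on i's side, a root is optimal exactly when all of its incoming messages are negative. The rooted-entropy values below are hypothetical stand-ins for those the distributed computation would produce.

```python
# Hypothetical rooted entropies H(p_r) for a 3-clique chain 0 -- 1 -- 2;
# in the real algorithm these comparisons arrive as messages m_{i->j}.
H_rooted = {0: 4.10, 1: 3.85, 2: 3.92}
chain = [0, 1, 2]

def m(i, j):
    """Entropy loss of root j vs. the best root on i's side of edge (i, j)."""
    side = [r for r in chain if (r <= i if j > i else r >= i)]
    return H_rooted[j] - min(H_rooted[r] for r in side)

for j in chain:
    neighbors = [n for n in (j - 1, j + 1) if n in chain]
    print(f"root {j}: optimal = {all(m(i, j) < 0 for i in neighbors)}")
```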