Consensus Problems in Networks Aman Agarwal EYES 2007 intern Advisor Prof. Mostofi ECE, University of New Mexico July 5, 2007.



Background Cooperative control for multi-agent systems has many applications: formation control and non-formation cooperative control. The key issue in cooperation is shared information.

Consensus Protocols
Let x_i be the information state of agent i.
Continuous time: x'_i(t) = Σ_j a_ij (x_j(t) − x_i(t)), i.e. x'(t) = −L x(t), so x(t) = e^{−Lt} x(0).
As t → ∞, e^{−Lt} → 1 v^T, where v^T 1 = 1 and v^T L = 0.
Hence x*(t) = 1 v^T x(0), where v is the left eigenvector of L corresponding to eigenvalue 0.
Discrete time: x_i[k+1] = Σ_j a_ij[k] x_j[k], i.e. x[k+1] = D[k] x[k].
For constant D, x[k] = D^k x[0], and as k → ∞, D^k → 1 v^T, where v^T 1 = 1 and v^T D = v^T.
Hence x* = 1 v^T x(0), where v is the left eigenvector of D corresponding to eigenvalue 1.
L has an eigenvalue 0 corresponding to the consensus solution, and D has an eigenvalue 1 corresponding to the consensus solution.
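The discrete-time protocol above can be simulated directly. A minimal sketch: the 4-node ring topology and averaging weights below are illustrative assumptions (not from the slides), iterating x[k+1] = D x[k] until the states reach 1 v^T x(0):

```python
# Discrete-time consensus x[k+1] = D x[k] on a hypothetical 4-node ring.
import numpy as np

# Row-stochastic update matrix: each node averages itself with its two
# neighbours.  The topology and weights are illustrative only.
D = np.array([[0.5 , 0.25, 0.  , 0.25],
              [0.25, 0.5 , 0.25, 0.  ],
              [0.  , 0.25, 0.5 , 0.25],
              [0.25, 0.  , 0.25, 0.5 ]])

x = np.array([1.0, 0.0, 0.0, 0.0])   # initial information states x(0)
for _ in range(200):                  # iterate x[k+1] = D x[k]
    x = D @ x

print(x)  # all entries ≈ 0.25
```

Because this particular D is doubly stochastic, v = (1/M) 1 and the consensus value is the plain average of the initial states.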

Convergence of Consensus Protocols
Equilibrium value: a function of the initial state. Agents that can pass information to all the other agents have a say in the final value.
The second-smallest eigenvalue of L (equivalently, the second-largest eigenvalue of −L), known as the Fiedler eigenvalue λ2, determines the speed of convergence.
Dense graphs: λ2 is relatively large. Sparse graphs: λ2 is relatively small.
The third-smallest eigenvalue of L should be well separated from λ2 for faster convergence. A repeated λ2 also slows convergence; ideally λ2 is a simple eigenvalue.
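The link between graph density and λ2 can be checked numerically. A minimal sketch comparing a sparse 6-node path with the complete graph K6 (both graphs are illustrative examples, not from the slides):

```python
# Fiedler eigenvalue lambda_2 of the graph Laplacian L = diag(deg) - A:
# larger for denser graphs, which means faster consensus.
import numpy as np

def fiedler_value(A):
    """Second-smallest eigenvalue of the Laplacian of adjacency matrix A."""
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))[1]

n = 6
path = np.zeros((n, n))
for i in range(n - 1):                   # sparse: path graph i -- i+1
    path[i, i + 1] = path[i + 1, i] = 1
complete = np.ones((n, n)) - np.eye(n)   # dense: complete graph K6

# lambda_2(path) = 2(1 - cos(pi/6)) ≈ 0.27, lambda_2(K6) = 6
print(fiedler_value(path), fiedler_value(complete))
```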

Binary Consensus Problems
In most consensus applications the agents communicate their states wirelessly, so at the bit level there is receiver noise.
The noise is not bounded, so there is no transition point beyond which consensus is guaranteed; instead, a probabilistic approach is needed to characterize and understand the behavior of the network.
To examine this effect we look at binary consensus problems, assuming the network is fully connected: each node polls the others to assess whether the majority is in consensus and updates its own information accordingly.

Binary Consensus Problems: Model 1 (noise → decision)
b_i(k+1) = Dec( Σ_j b_{j,i}(k) / M ), where Dec(x) = 1 if x ≥ 0.5 and 0 if x < 0.5,
= Dec( Σ_j b_j(k)/M + Σ_j n_{j,i}(k)/M )
= Dec( S(k)/M + w_i(k) ),
where b_{j,i}(k) = b_j(k) + n_{j,i}(k) is node j's bit as received by node i, and w_i(k) = Σ_j n_{j,i}(k)/M.
S(k) = Σ_j b_j(k): state of the system at time k.
π_i(k) = Prob[S(k) = i]; π(k) = [π_0(k), π_1(k), ..., π_M(k)]: probability vector.
P_ij = Prob[S(k+1) = j | S(k) = i] = C(M, j) k_i^j (1 − k_i)^{M−j}, where k_i = Prob[ i/M + w(k) > 0.5 | S(k) = i ].
π(k+1) = P^T π(k), with P = [P_ij], so π(k) = (P^T)^k π(0) gives the asymptotic behavior of the probabilities.
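The Markov-chain analysis of Model 1 can be sketched numerically. One assumption made below, for illustration: each link noise n_{j,i}(k) is i.i.d. N(0, σ²), so the averaged noise w(k) is N(0, σ²/M):

```python
# Transition matrix and state distribution of Model 1's Markov chain.
# Assumption: w(k) ~ N(0, sigma^2/M) (i.i.d. Gaussian link noise).
import math
import numpy as np

def q_func(x):
    """Gaussian tail probability Q(x) = P[N(0,1) > x]."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def transition_matrix(M, sigma):
    """P[i, j] = Prob[S(k+1) = j | S(k) = i] = C(M, j) k_i^j (1 - k_i)^(M-j)."""
    P = np.zeros((M + 1, M + 1))
    for i in range(M + 1):
        # k_i = Prob[i/M + w > 0.5] with w ~ N(0, sigma^2/M)
        k_i = q_func((0.5 - i / M) / (sigma / math.sqrt(M)))
        for j in range(M + 1):
            P[i, j] = math.comb(M, j) * k_i**j * (1 - k_i)**(M - j)
    return P

M, sigma = 4, 0.5
P = transition_matrix(M, sigma)
pi = np.zeros(M + 1)
pi[3] = 1.0                    # start with S(0) = 3 of 4 nodes holding a 1
for _ in range(50):            # pi(k+1) = P^T pi(k)
    pi = P.T @ pi
print(pi)                      # asymptotic distribution over states 0..M
```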

Probability plots for Model 1 with σ = 0.5, 0.75, 1, and 2. Model 1: M = 4 & X(0) = [ ].

Binary Consensus Problems: Models 2(a) & 2(b) (noise → noise filtering → decision)
Each received bit is first thresholded: b_{j,i}^D(k) = Dec( b_{j,i}(k) ), where Dec(x) = 1 if x ≥ th (normally th = 0.5) and 0 otherwise.
b_i(k+1) = Dec( Σ_j b_{j,i}^D(k) / M ) = Dec( Σ_j Dec( b_{j,i}(k) ) / M ).
P_ij = Prob[S(k+1) = j | S(k) = i] = C(M, j) k_i^j (1 − k_i)^{M−j}, where
k_i = Prob[ Σ_j Dec( b_{j,i}(k) )/M ≥ 0.5 | S(k) = i ] = Σ_{l=L}^{M} C(M, l) P[b_{j,i}^D(k) = 1]^l P[b_{j,i}^D(k) = 0]^{M−l}, with L = ⌈M/2⌉.
P[b_{j,i}^D(k) = 1] = P[b_{j,i}^D(k) = 1 | b_j(k) = 1] P[b_j(k) = 1] + P[b_{j,i}^D(k) = 1 | b_j(k) = 0] P[b_j(k) = 0] = i/M + Q(0.5/σ)(1 − 2i/M).
P[b_{j,i}^D(k) = 0] = P[b_{j,i}^D(k) = 0 | b_j(k) = 1] P[b_j(k) = 1] + P[b_{j,i}^D(k) = 0 | b_j(k) = 0] P[b_j(k) = 0] = 1 − i/M − Q(0.5/σ)(1 − 2i/M).
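The per-link statistics of Model 2(a) admit a closed form: a transmitted bit b arrives as b + n with n ~ N(0, σ²), so thresholding at 0.5 flips it with probability Q(0.5/σ). A minimal sketch of k_i (taking Dec as ≥ 0.5, consistent with L = ⌈M/2⌉):

```python
# Model 2(a): probability k_i that the thresholded majority poll exceeds 1/2,
# given i of M nodes currently hold a 1.
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P[N(0,1) > x]."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def k_i(i, M, sigma):
    """Prob[ sum_j Dec(b_ji)/M >= 0.5 | S(k) = i ] for Model 2(a)."""
    # Each thresholded link bit equals 1 with probability
    # p1 = i/M + Q(0.5/sigma) * (1 - 2i/M).
    p1 = i / M + q_func(0.5 / sigma) * (1 - 2 * i / M)
    L = math.ceil(M / 2)               # minimum count for a majority
    return sum(math.comb(M, l) * p1**l * (1 - p1)**(M - l)
               for l in range(L, M + 1))

M, sigma = 4, 0.5
for i in range(M + 1):
    print(i, round(k_i(i, M, sigma), 4))
```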

Probability plots for Model 2(a) with σ = 0.5, 0.75, 1, and 2. Model 2(a): the noise is filtered first by thresholding the received values at a threshold level of 0.5, to ensure that the majority decision is made on correct data only. M = 4 & X(0) = [ ].

Model 2(b): In this case the threshold for the communication noise is chosen dynamically: each node monitors the values the other nodes are sending and updates its threshold based on the differential probabilities of sending a 1 or a 0.

Probability plots for Model 2(b) with σ = 0.5, 0.75, and 1. Model 2(b): M = 4 & X(0) = [ ].

Binary Consensus Problems: Model 3 (noise → noise filtering → soft information → decision)
b_{j,i}^D(k) = Dec( b_{j,i}(k) ), where Dec(x) = 1 if x ≥ th (normally th = 0.5) and 0 otherwise.
b_i(k+1) = Dec( Σ_j E[ b_j(k) | b_{j,i}(k) ] / M ), where Dec(x) = 1 if x ≥ 0.5 and 0 otherwise, and
E[ b_j(k) | b_{j,i}(k) ] = f( b_{j,i}(k) − 1 ) P[ b_j(k) = 1 ] / ( f( b_{j,i}(k) − 1 ) P[ b_j(k) = 1 ] + f( b_{j,i}(k) ) P[ b_j(k) = 0 ] ),
where f(x) is the pdf of N(0, σ²).
P_ij = Prob[ S(k+1) = j | S(k) = i ]: finding the transition probabilities analytically becomes very tedious and complex in this case, so we simulate and estimate them statistically from a large number of samples (at least 1000).
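Since the slides note that Model 3's transition probabilities are estimated by simulation, the soft-information update itself can be sketched as a Monte Carlo step. One assumption made below, for illustration: the prior is taken as P[b_j(k) = 1] = S(k)/M, the current fraction of ones:

```python
# Monte Carlo sketch of Model 3's soft-information consensus update.
import math
import random

def gauss_pdf(x, sigma):
    """pdf of N(0, sigma^2) evaluated at x."""
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def step(bits, sigma, rng):
    """One synchronous update: every node averages E[b_j | received] over all j."""
    M = len(bits)
    p1 = sum(bits) / M                   # assumed prior P[b_j(k) = 1]
    new_bits = []
    for _ in range(M):                   # each receiving node
        total = 0.0
        for b in bits:                   # noisy copy of each node j's bit
            r = b + rng.gauss(0, sigma)
            num = gauss_pdf(r - 1, sigma) * p1
            den = num + gauss_pdf(r, sigma) * (1 - p1)
            total += num / den           # soft information E[b_j | b_ji]
        new_bits.append(1 if total / M >= 0.5 else 0)
    return new_bits

rng = random.Random(0)
bits = [1, 1, 1, 0]                      # M = 4, three nodes start at 1
for _ in range(10):
    bits = step(bits, sigma=0.5, rng=rng)
print(bits)
```

Repeating this from many random initial seeds and counting outcomes gives the statistical estimate of P_ij described above.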

Probability plots for Model 3 with σ = 0.5, 0.75, 1, and 2.

Comparison of models
Model 1: performance degrades sharply for larger noise variances (σ > 0.5).
Model 2(a): better than Model 1, but cannot handle large noise variances (σ > 1).
Model 2(b): better than Model 2(a). The dynamic threshold works, but only if σ < 1, because for larger noise a threshold between 0 and 1 will not work.
Model 3: very robust and can also perform with large noise (σ > 1), but we trade off speed of convergence for handling larger noise.

Detection & Estimation
A group of nodes, each with limited sensing capabilities, relies on the group to improve its estimation/detection quality.
Estimation: each agent has an estimate of the parameter of interest, which can take values over an infinite set or a known finite set.
Detection: the parameter of interest takes values from a known finite set.

Binary Detection (sensing noise → noise filtering → decision; comm. noise → noise filtering → decision)
For k ≥ 1:
Ŝ_j(k): event sensed by node j at time k.
O_j(k): opinion formed by node j at time k.
O_{j,i}(k) = O_j(k) + n_{j,i}: node j's opinion as received by node i, with communication noise of standard deviation σ_n.
O_i^D(k) = Dec( O_{j,i}(k) ), where Dec(x) = 1 if x ≥ 0.5 and 0 if x < 0.5.
O_j(k+1) = Dec( ( Σ_i O_i^D(k) + O_j(k) + Ŝ_j(k) ) / (M + 1) ).
Binary detection example: σ_n = 0.5, σ_s = 1 & S = 1.
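The detection update can be sketched as follows, using the slide's example noise levels (σ_n = 0.5 for communication, σ_s = 1 for sensing, true event S = 1). The initial opinions and the Gaussian noise model are illustrative assumptions:

```python
# Binary detection: each node fuses its sensed value, its own opinion, and
# the thresholded opinions received from the other M - 1 nodes.
import random

def dec(x, th=0.5):
    """Hard decision Dec(x): 1 if x >= th, else 0."""
    return 1 if x >= th else 0

def detection_step(opinions, S, sigma_n, sigma_s, rng):
    M = len(opinions)
    new_opinions = []
    for j in range(M):
        s_hat = dec(S + rng.gauss(0, sigma_s))   # noisy sensing, then decision
        received = [dec(opinions[i] + rng.gauss(0, sigma_n))  # comm noise + threshold
                    for i in range(M) if i != j]
        # O_j(k+1) = Dec( (sum of received + own opinion + sensed bit) / (M + 1) )
        new_opinions.append(dec((sum(received) + opinions[j] + s_hat) / (M + 1)))
    return new_opinions

rng = random.Random(0)
opinions = [0, 0, 0, 0]          # all nodes initially believe S = 0
for _ in range(20):
    opinions = detection_step(opinions, S=1, sigma_n=0.5, sigma_s=1.0, rng=rng)
print(opinions)
```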

Binary Detection: trust factors
Every node now has M + 1 different values to weigh at every time step.
Nodes with better communication or better sensing can be weighed differently: define a trust factor for each node.
Trust factors can be either time-invariant or time-varying, and should update themselves over time.

Trust factors: one way of implementing this is as follows. Plots: how nodes with good sensing and good communication affect the consensus, X(0) = [ ]; average consensus vs. different weights.
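The trust-factor idea can be illustrated on a fully connected network by replacing the plain average with a trust-weighted average. The trust values below are hypothetical, chosen only to contrast the two cases:

```python
# Average consensus vs. trust-weighted consensus on a fully connected network.
import numpy as np

def weighted_consensus_step(x, trust):
    """Each node replaces its value with the trust-weighted average of all values."""
    w = trust / trust.sum()      # normalise trust factors into weights
    avg = w @ x                  # trust-weighted average of current values
    return np.full_like(x, avg)

x = np.array([1.0, 0.0, 0.0, 1.0])          # initial opinions
equal = np.ones(4)                          # plain average consensus
trusted = np.array([3.0, 1.0, 1.0, 1.0])    # node 0 trusted more (hypothetical)

print(weighted_consensus_step(x, equal))    # all 0.5
print(weighted_consensus_step(x, trusted))  # all 4/6 ≈ 0.667: pulled toward node 0
```

A trusted node with good sensing thus pulls the consensus value toward its own (more reliable) opinion.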