
1 Collaborative Processing in Sensor Networks Lecture 4 - Distributed In-network Processing Hairong Qi, Associate Professor Electrical Engineering and Computer Science University of Tennessee, Knoxville Lecture Series at ZheJiang University, Summer 2008

2 Research Focus - Recap
Develop energy-efficient collaborative processing algorithms with fault tolerance in sensor networks
– Where to perform collaboration? Computing paradigms
– Who should participate in the collaboration? Reactive clustering protocols; sensor selection protocols
– How to conduct collaboration? In-network processing; self-deployment

3 A Signal Processing and Fusion Hierarchy for Target Classification
[Diagram: each node (node 1 … node i … node N) performs local processing per modality (modality 1 … modality M), followed by temporal fusion and multi-modality fusion within the node, and multi-sensor fusion across nodes on top of a mobile-agent middleware]

4 Signal - Acoustic/Seismic/IR

5 Local Signal Processing
[Diagram: time-series signal → feature extraction → classifier design]
– Feature extraction: amplitude statistics and shape statistics of the time-series signal, peak selection from the power spectral density (PSD), and wavelet analysis coefficients, forming 26-element feature vectors
– Classifier design: feature normalization, principal component analysis (PCA), and target classification with kNN
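A minimal sketch of this per-node chain in Python. The specific statistics, the number of spectral peaks, the PCA dimension, and k below are illustrative assumptions, not the actual 26-element layout used in the lecture.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def extract_features(frame, fs=1024):
    """Map one 1-second time-series frame to a fixed-length feature vector."""
    # Amplitude statistics
    amp = [frame.mean(), frame.std(), np.abs(frame).max()]
    # Power spectral density and peak selection
    f, psd = welch(frame, fs=fs, nperseg=256)
    peaks = np.sort(psd)[-10:]                    # 10 strongest spectral peaks (assumption)
    # Simple shape statistics of the PSD
    shape = [psd.mean(), psd.std(), f[np.argmax(psd)]]
    return np.hstack([amp, shape, peaks])          # illustrative, not the real 26 elements

def train_local_classifier(frames, labels):
    """Classifier design: normalization, PCA, then kNN."""
    X = np.array([extract_features(fr) for fr in frames])
    mean, std = X.mean(0), X.std(0) + 1e-9
    Xn = (X - mean) / std                          # feature normalization
    pca = PCA(n_components=10).fit(Xn)             # dimensionality reduction
    knn = KNeighborsClassifier(n_neighbors=5).fit(pca.transform(Xn), labels)
    return mean, std, pca, knn
```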

6 A Signal Processing and Fusion Hierarchy for Target Classification
[Roadmap diagram repeated; see slide 3]

7 Temporal Fusion
Fuse all the 1-sec sub-interval local processing results corresponding to the same event (an event usually lasts about 10 sec) by majority voting: fused class c* = argmax_c n_c, where c ranges over the possible local processing results and n_c is the number of occurrences of local output c.
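A minimal sketch of the majority-voting step, assuming each 1-sec sub-interval yields one class label for the event:

```python
from collections import Counter

def temporal_fusion(subinterval_labels):
    """Majority voting over the ~10 one-second sub-interval decisions of one event."""
    label, _ = Counter(subinterval_labels).most_common(1)[0]
    return label

# e.g. temporal_fusion(["AAV", "AAV", "DW", "AAV"]) -> "AAV"
```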

8 A Signal Processing and Fusion Hierarchy for Target Classification
[Roadmap diagram repeated; see slide 3]

9 Multi-modality Fusion
Different sensing modalities can compensate for each other's sensing capabilities and provide a comprehensive view of the event
Treat results from different modalities as independent classifiers – classifier fusion
Majority voting won't work
Behavior-Knowledge Space (BKS) algorithm (Huang & Suen)
Assumption: 2 modalities, 3 kinds of targets, with samples in the training set. Then there are 9 possible classification combinations:

c1, c2 | samples from each class | fused result
1, 1   | 10/3/3                  | 1
1, 2   | 3/0/6                   | 3
1, 3   | 5/4/5                   | 1, 3
…      | …                       | …
3, 3   | 0/0/6                   | 3
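A minimal sketch of the BKS lookup described above: the tuple of modality decisions indexes a table cell that records how often each true class produced that combination in training, and the fused result is the most frequent class in the cell (ties, as in the 1,3 row, keep both classes).

```python
from collections import defaultdict, Counter

def build_bks(train_decisions, train_truth):
    """train_decisions: list of tuples (c1, c2, ...); train_truth: true class labels."""
    table = defaultdict(Counter)
    for combo, truth in zip(train_decisions, train_truth):
        table[combo][truth] += 1
    return table

def bks_fuse(table, combo):
    """Fuse one decision tuple using the BKS table built on training data."""
    cell = table.get(combo)
    if not cell:
        return None                                # empty cell: reject / no decision
    best = max(cell.values())
    return [c for c, n in cell.items() if n == best]  # ties keep both classes
```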

10 A Signal Processing and Fusion Hierarchy for Target Classification
[Roadmap diagram repeated; see slide 3]

11 Value-based vs. Interval-based Fusion
Interval-based fusion can provide fault tolerance
Interval integration – overlap function
– Assuming each sensor in a cluster measures the same parameter, the integration algorithm constructs a simple function (the overlap function) from the outputs of the sensors in the cluster and can resolve it at different resolutions as required
[Figure: overlap function O(x) built from weighted sensor intervals w1s1 … w7s7]
Crest: the highest, widest peak of the overlap function
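A minimal sketch of the overlap function and its crest, assuming each sensor contributes a (possibly weighted) confidence interval; sampling the axis at a finite resolution mirrors the "different resolutions" remark above.

```python
import numpy as np

def overlap_function(intervals, weights=None, resolution=1000):
    """O(x): weighted count of sensor intervals that contain x."""
    intervals = np.asarray(intervals, dtype=float)
    weights = np.ones(len(intervals)) if weights is None else np.asarray(weights)
    xs = np.linspace(intervals.min(), intervals.max(), resolution)
    O = np.zeros(resolution)
    for (lo, hi), w in zip(intervals, weights):
        O += w * ((xs >= lo) & (xs <= hi))
    return xs, O

def crest(xs, O):
    """Return the region where O(x) attains its maximum.
    (If the maximum is attained on disjoint regions this spans them all; good enough for a sketch.)"""
    on_peak = np.where(O >= O.max())[0]
    return xs[on_peak[0]], xs[on_peak[-1]]
```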

12 Fault Tolerance Comparison of Different Overlap Functions
n: number of sensors
f: number of faulty sensors
M: finds the smallest interval that contains all the intersections of n-f intervals
S: returns the interval [a, b], where a is the (f+1)th left end point and b is the (n-f)th right end point
Ω: overlap function
N: interval over which the overlap function lies in [n-f, n]
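For instance, the S function defined above reduces to a selection of order statistics; a minimal sketch assuming the sensor readings are given as (left, right) interval pairs:

```python
def s_estimate(intervals, f):
    """S overlap estimator: with n intervals, at most f faulty, return [a, b] where
    a is the (f+1)th smallest left endpoint and b is the (n-f)th smallest right endpoint."""
    lefts = sorted(lo for lo, _ in intervals)
    rights = sorted(hi for _, hi in intervals)
    n = len(intervals)
    return lefts[f], rights[n - f - 1]   # 0-based indices for (f+1)th and (n-f)th

# e.g. s_estimate([(0, 4), (1, 5), (2, 3), (10, 11)], f=1) -> (1, 5)
```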

13 Multi-sensor Fusion for Target Classification
Generation of local confidence ranges (for example, at each node i, use kNN for each k ∈ {5, …, 15})
Apply the integration algorithm to the confidence ranges generated from each node to construct an overlap function
The confidence range of each class is the {smallest, largest} confidence level in its column:

                 Class 1        Class 2       …   Class n
k=5              3/5            2/5           …   0
k=6              2/6            3/6           …   1/6
…                …              …             …   …
k=15             10/15          4/15          …   1/15
confidence range {2/6, 10/15}   {4/15, 3/6}   …   {0, 1/6}
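A minimal sketch of the confidence-range generation, assuming scikit-learn's kNN, whose predict_proba returns the vote fractions (e.g. 3/5, 2/5 for k=5):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def confidence_ranges(X_train, y_train, x, classes, ks=range(5, 16)):
    """Per class, the smallest and largest kNN vote fraction over k = 5..15."""
    fracs = []
    for k in ks:
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
        proba = knn.predict_proba([x])[0]               # vote fractions for this k
        fracs.append(dict(zip(knn.classes_, proba)))
    return {c: (min(f.get(c, 0.0) for f in fracs),
                max(f.get(c, 0.0) for f in fracs)) for c in classes}
```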

14 Distributed Integration Scheme
Integration techniques must be robust and fault-tolerant to handle uncertainty and faulty sensor readouts
Provide progressive accuracy
A 1-D array serves to represent the partially-integrated overlap function
A mobile-agent-based framework is employed

15 Mobile-agent-based Multi-sensor Fusion for Target Classification
Node 1 generates confidence range CR1: Class 1 [2/6, 10/15], Class 2 [3/15, 3/6], …
The mobile agent carries CR1 to Node 2, whose local confidence range CR2 is: Class 1 [3/6, 4/5], Class 2 [0, 5/15], …
At Node 2, the fusion code generates the partially integrated confidence range CR12: Class 1 [3/6, 10/15], Class 2 [3/15, 5/15], …; fusion result at Node 2: class 1 (58%)
The mobile agent carries CR12 to Node 3, whose local confidence range CR3 is: Class 1 [3/5, 10/12], Class 2 [1/15, 2/8], …
At Node 3, the fusion code generates the partially integrated confidence range CR123: Class 1 [3/5, 10/15], Class 2 [3/15, 2/8], …; fusion result at Node 3: class 1 (63%)
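The numbers in this example behave like a per-class intersection of the confidence ranges carried by the agent; a minimal sketch under that reading (the general scheme integrates the overlap function of slide 11, but for fault-free readings its crest reduces to the intersection):

```python
def integrate(cr_acc, cr_node):
    """cr_*: dict class -> (low, high). Returns the updated partial integration."""
    out = {}
    for cls in cr_acc:
        lo1, hi1 = cr_acc[cls]
        lo2, hi2 = cr_node[cls]
        out[cls] = (max(lo1, lo2), min(hi1, hi2))
    return out

cr1 = {"class1": (2/6, 10/15), "class2": (3/15, 3/6)}
cr2 = {"class1": (3/6, 4/5),   "class2": (0.0, 5/15)}
cr3 = {"class1": (3/5, 10/12), "class2": (1/15, 2/8)}

cr12  = integrate(cr1, cr2)    # class1 -> (3/6, 10/15), class2 -> (3/15, 5/15)
cr123 = integrate(cr12, cr3)   # class1 -> (3/5, 10/15), class2 -> (3/15, 2/8)
```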

16 An Example of Multi-sensor Fusion
[Figure: confidence intervals and overlap functions at node 1 through node 4; numeric values not recoverable]
h: height of the highest peak
w: width of the peak
acc: confidence at the center of the peak

17 Example of Multi-sensor Integration Result at Each Stop
[Table: classification output c and confidence acc for class 1, class 2, and class 3 at stop 1 through stop 4; numeric entries not recoverable]

18 Performance Evaluation of Target Classification Hierarchy
[Figure: test site layout marked North-South, East-West, and Center]

19 SITEX02 Scenario Setup
Acoustic sampling rate: 1024 Hz
Seismic sampling rate: 512 Hz
Target types: AAV, DW, and HMMWV
Collaborative work with two other universities (Penn State, Wisconsin)
Data set is divided into 3 partitions: q1, q2, q3
Cross validation
– M1: Training (q2+q3); Testing (q1)
– M2: Training (q1+q3); Testing (q2)
– M3: Training (q1+q2); Testing (q3)

20 Confusion Matrices of Classification on SITEX02
Acoustic (75.47%, 81.78%)
Seismic (85.37%, 89.44%)
Multi-modality fusion (84.34%)
Multi-sensor fusion (96.44%)
[Confusion matrix with rows/columns AAV, DW, HMV; entries run together in the transcript as AAV 2921, DW 0188, HMV 0223 and are not cleanly recoverable]

21 Result from BAE-Austin Demo
Suzuki Vitara, Ford 350, Ford 250, Harley Motorcycle

22 Confusion Matrices
[Figures: confusion matrices for acoustic, seismic, and multi-modal classification]

23

24 Demo
Mobile-agent-based multi-modal multi-sensor classification
– Mobile agent migrates with a fixed itinerary (Tennessee)
– Mobile agent migrates with a dynamic itinerary realized by distributed services (Auburn)

25 run 1: car, walker; run 2: car

26 Target Detection in Sensor Networks
Single target detection
– Energy threshold
– Energy decay model
Multiple target detection
– Why is multiple target detection necessary?
– Requirements: energy-efficient and bandwidth-efficient
– Multiple target detection vs. source number estimation
[Diagram: source matrix mixed into the sensor observation matrix, recovered through an unmixing matrix; target separation is treated analogously to speaker separation]
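The multiple-target case rests on the linear mixing picture sketched above: the observation matrix is an unknown mixture of the source matrix, and an unmixing matrix recovers the sources. A minimal illustration using FastICA purely as a stand-in (the lecture's own estimators are Bayesian, slides 29-33); signal shapes and sizes here are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
S = np.vstack([np.sin(2 * np.pi * 30 * t),            # source 1 (source matrix row)
               np.sign(np.sin(2 * np.pi * 7 * t))])   # source 2
A = rng.normal(size=(4, 2))                            # unknown mixing: 4 sensors, 2 targets
X = A @ S                                              # observation matrix seen by the sensors

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X.T).T                       # recovered sources (up to scale/order)
```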

27 Source Number Estimation (SNE)
Task: estimate the number of sources present in the sensor field
Algorithms for source number estimation
– Heuristic techniques
– Principled approaches: Bayesian estimation, Markov chain Monte Carlo methods, etc.
Limitations
– Centralized structure
– Large amount of raw data transmission
– Computation burden on the central processing unit
[Diagram: centralized architecture in which Sensor 1 … Sensor i … Sensor n send raw data to a central processing unit, which outputs the decision]

28 Distributed Source Number Estimation Scheme
[System architecture: sensors organized into cluster 1 … cluster L, each with a cluster head; the cluster heads report to a fusion center that makes the decision]
[Algorithm structure: m targets present in the sensor field → progressive local estimation at cluster 1 … cluster L → posterior probability fusion (vs. centralized estimation)]

29 Centralized Bayesian Estimation Approach
X: sensor observation matrix
H_m: hypothesis of the number of sources
S: source matrix
A: mixing matrix, X = AS + noise
W: unmixing matrix
y: latent variable, y = WX
phi(.): non-linear transformation function
noise, with variance sigma^2
p(X | H_m): marginal distribution of the observations
Source: S. J. Roberts, "Independent component analysis: source assessment & separation, a Bayesian approach," IEE Proceedings on Vision, Image, and Signal Processing, 145(3), 1998.

30 Progressive Bayesian Estimation Approach
Motivation
– Reduce data transmission
– Conserve energy
[Algorithm structure diagram repeated: progressive local estimation at cluster 1 … cluster L, followed by posterior probability fusion (vs. centralized estimation)]

31 Progressive Bayesian Estimation Approach
[Diagram: Sensor 1 → Sensor 2 → … → Sensor i → … → Sensor n; estimation accuracy is progressively improved as the transmitted information accumulates along the path]

32 Progressive Update at Local Sensors
The progressive update of the log-likelihood function only depends on the updating rule of the mixing matrix
Progressive update of the log-likelihood function: [equation not shown in the transcript]

33 Progressive Update at Local Sensors
The progressive update only depends on the updating rule of the mixing matrix
Progressive update of the mixing matrix: BFGS method (quasi-Newton method)
Progressive update of the error: [equation not shown in the transcript]

34

35 An Example

36 Mobile-agent-based Implementation of Progressive Estimation
Sensor 1: initialization
Sensor 2: update the mixing matrix; update the terms in the log-likelihood function; compute the estimated latent variable; update the estimation error
Sensor 3: update the mixing matrix; update the terms in the log-likelihood function; compute the estimated latent variable; update the estimation error
Output
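A minimal sketch of the migration pattern on this slide: the agent initializes its state at the first sensor and carries it node to node, where local observations refine the estimate before the final output. The update rule itself (BFGS step on the mixing matrix, log-likelihood terms, error) is left as a placeholder; only the carrying pattern is shown.

```python
class ProgressiveAgent:
    """Mobile agent that carries the partial estimate along its itinerary."""
    def __init__(self, init_state):
        # e.g. {"W": W0, "loglik_terms": [], "error": None}
        self.state = init_state

    def visit(self, node_observations, update_rule):
        """Executed at each node: refine the carried state with local data."""
        self.state = update_rule(self.state, node_observations)
        return self.state

def run_itinerary(agent, itinerary, update_rule):
    """itinerary: local observation blocks of the nodes visited after initialization."""
    for node_obs in itinerary:
        agent.visit(node_obs, update_rule)
    return agent.state                     # final estimate, output at the last node
```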

37 Distributed Fusion Scheme
Bayes theorem
Dempster's rule of combination
– Utilizes probability intervals and uncertainty intervals to determine the likelihood of hypotheses based on multiple pieces of evidence
[Algorithm structure diagram repeated: progressive local estimation at each cluster, followed by posterior probability fusion]
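A minimal sketch of Dempster's rule for combining two clusters' mass assignments over singleton hypotheses (m = 1, 2, 3, … targets); with singletons only, combination reduces to normalized element-wise products, whereas the full Dempster-Shafer formulation would also carry the uncertainty intervals mentioned above.

```python
def dempster_combine(m1, m2):
    """m1, m2: dict hypothesis -> mass (each summing to 1), singleton hypotheses only."""
    combined = {h: m1.get(h, 0.0) * m2.get(h, 0.0) for h in set(m1) | set(m2)}
    conflict = 1.0 - sum(combined.values())      # mass assigned to conflicting pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# e.g. two clusters' posteriors over the number of targets:
# dempster_combine({1: 0.2, 2: 0.7, 3: 0.1}, {1: 0.1, 2: 0.8, 3: 0.1})
```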

38 Performance Evaluation
[Figures: target types and sensor node clustering]
Signals: 1-second acoustic, 500 samples each
Experiment setup (source number estimation with m targets present in the sensor field):
– Experiment 1: centralized estimation
– Experiment 2: centralized estimation + Bayesian fusion
– Experiment 3: centralized estimation + Dempster's rule
– Experiment 4: progressive estimation
– Experiment 5: progressive estimation + Bayesian fusion
– Experiment 6: progressive estimation + Dempster's rule

39 Average Log-likelihood Comparison
[Plots: Experiment 1 (centralized); Experiment 2 (centralized + Bayesian); Experiment 3 (centralized + Dempster's rule); Experiment 4 (progressive); Experiments 5 & 6 (progressive at cluster 1 vs. fusion); Experiments 5 & 6 (progressive at cluster 2 vs. fusion)]

40 Output Histogram Comparison
[Histograms: Experiment 1 (centralized); Experiment 2 (centralized + Bayesian fusion); Experiment 3 (centralized + Dempster's rule); Experiment 4 (progressive); Experiment 5 (progressive + Bayesian fusion); Experiment 6 (progressive + Dempster's rule)]

41 Performance Comparison
[Comparison of kurtosis, data transmission, detection probability, and energy consumption]

42 Discussion
Develop a novel distributed multiple target detection scheme with
– Progressive source number estimation approach (first in the literature)
– Mobile agent implementation
– Cluster-based probability fusion
The approach (progressive intra-cluster estimation + Bayesian inter-cluster fusion) has the best performance:

Scheme                         | Kurtosis | Detection probability (%) | Data transmission (bits) | Energy consumption (uW*sec)
Centralized                    | …        | …                         | …,000                    | 486,095
Centralized + Bayesian fusion  | …        | …                         | …,160                    | 433,424
Progressive + Bayesian fusion  | …        | …                         | …,160 (16.78%)           | 89,242 (18.36%)

43 Reference
X. Wang, H. Qi, S. Beck, H. Du, "Progressive approach to distributed multiple-target detection in sensor networks," in Sensor Network Operations, S. Phoha, T. F. La Porta, and C. Griffin, Eds., Wiley-IEEE Press, March.
H. Qi, Y. Xu, X. Wang, "Mobile-agent-based collaborative signal and information processing in sensor networks," Proceedings of the IEEE, 91(8), August.
H. Qi, X. Wang, S. S. Iyengar, K. Chakrabarty, "High performance sensor integration in distributed sensor networks using mobile agents," International Journal of High Performance Computing Applications, 16(3), Fall 2002.