Distributed Estimation of a Parametric Field Using Sparse Noisy Data
November 1, 2012
Presented by Marwan M. Alkhweldi
Co-authors: Natalia A. Schmid and Matthew C. Valenti
This work was sponsored by the Office of Naval Research under Award No. N

Outline

- Overview and Motivation
- Assumptions
- Problem Statement
- Proposed Solution
- Numerical Results
- Summary

Overview and Motivation

WSNs have been used for area monitoring, surveillance, target recognition, and other inference problems since the 1980s [1]. All designs and solutions are application oriented. Various constraints have been incorporated [2], and the performance of WSNs under those constraints has been analyzed. Distributed estimators have largely focused on estimating an unknown signal in the presence of channel noise [3]. We consider a more general estimation problem, in which an object is characterized by a physical field, and formulate the problem of distributed field estimation from noisy measurements in a WSN.

[1] C. Y. Chong and S. P. Kumar, "Sensor Networks: Evolution, Opportunities, and Challenges," Proceedings of the IEEE, vol. 91, no. 8, 2003.
[2] A. Ribeiro and G. B. Giannakis, "Bandwidth-Constrained Distributed Estimation for Wireless Sensor Networks - Part I: Gaussian Case," IEEE Trans. on Signal Processing, vol. 54, no. 3, 2006.
[3] J. Li and G. AlRegib, "Distributed Estimation in Energy-Constrained Wireless Sensor Networks," IEEE Trans. on Signal Processing, vol. 57, no. 10, 2009.

Assumptions

[Block diagram: K sensors produce quantized readings Z1, Z2, ..., ZK of the field over an area A (Observation Model) and send them through a Transmission Channel to a Fusion Center.]

The object generates fumes that are modeled as a Gaussian-shaped field.
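As a concrete sketch of this observation model (all parameter values and function names here are illustrative, not taken from the paper): a Gaussian-shaped field is sampled at K sensor locations, corrupted by observation noise, and quantized to M levels.

```python
import numpy as np

def gaussian_field(points, center, amplitude=1.0, spread=1.0):
    """Gaussian-shaped field value at each 2-D point (illustrative parameterization)."""
    d2 = np.sum((points - center) ** 2, axis=1)
    return amplitude * np.exp(-d2 / (2.0 * spread ** 2))

def sense_and_quantize(points, center, m_levels=8, noise_std=0.1, seed=0):
    """Each sensor observes the field plus Gaussian noise, then quantizes to M levels."""
    rng = np.random.default_rng(seed)
    noisy = gaussian_field(points, center) + noise_std * rng.standard_normal(len(points))
    edges = np.linspace(0.0, 1.0, m_levels + 1)[1:-1]   # uniform quantizer on [0, 1]
    return np.digitize(noisy, edges)                    # integer levels 0..M-1

# K = 20 sensors scattered uniformly over the 8-by-8 area A
rng = np.random.default_rng(1)
sensors = rng.uniform(0.0, 8.0, size=(20, 2))
levels = sense_and_quantize(sensors, center=np.array([4.0, 4.0]))
```

The quantized vector `levels` is what the transmission channel would carry to the Fusion Center.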

Problem Statement

Given noisy quantized sensor observations at the Fusion Center, the goal is to estimate the location of the target and the distribution of its physical field.

Proposed Solution: Signals received at the FC are independent but not identically distributed. Since the unknown parameters are deterministic, we take the maximum likelihood (ML) approach. Let l(θ) be the log-likelihood function of the observations at the Fusion Center. Then the ML estimates solve:

θ̂ = arg max_θ l(θ)

Proposed Solution

The log-likelihood function of the unknown parameters θ is the sum over the K independent sensor channels:

l(θ) = Σ_{k=1}^{K} log p(Z_k; θ)

The necessary condition to find the maximum is:

∇_θ l(θ) = 0
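The necessary condition can be sanity-checked numerically on a toy log-likelihood (i.i.d. Gaussian samples with known variance, a stand-in for l(θ), not the paper's actual model): the finite-difference gradient vanishes at the ML estimate.

```python
import numpy as np

def loglik(mu, z, sigma=1.0):
    """Log-likelihood of i.i.d. N(mu, sigma^2) samples (toy stand-in for l(theta))."""
    return float(-0.5 * np.sum((z - mu) ** 2) / sigma ** 2
                 - 0.5 * len(z) * np.log(2.0 * np.pi * sigma ** 2))

rng = np.random.default_rng(0)
z = rng.normal(3.0, 1.0, size=100)
mu_ml = z.mean()                 # closed-form ML estimate of the mean
h = 1e-5
grad = (loglik(mu_ml + h, z) - loglik(mu_ml - h, z)) / (2.0 * h)
# grad is numerically ~0: the gradient of l vanishes at the maximizer
```

In the paper's setting the gradient equation has no closed-form solution, which motivates the iterative EM solution on the next slide.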

Iterative Solution

A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. of the Royal Stat. Soc., Series B, vol. 39, no. 1, pp. 1-38, 1977.

Incomplete data: the observations Z available at the Fusion Center. Complete data: X, which augments Z with the unobserved sensor-side variables. Mapping: Z = g(X), a many-to-one mapping from complete to incomplete data. The complete-data log-likelihood: log f(X; θ).

E- and M-Steps

Expectation Step:

Q(θ | θ^(t)) = E[ log f(X; θ) | Z; θ^(t) ]

Maximization Step:

θ^(t+1) = arg max_θ Q(θ | θ^(t))
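A toy end-to-end sketch of these two steps (under simplifications not taken from the paper: a noiseless transmission channel, known observation-noise level, and a field with known unit amplitude and spread, so only the 2-D center is estimated): the E-step replaces each quantized reading with the conditional mean of a truncated Gaussian, and the M-step refits the center by least squares over a coarse grid.

```python
import numpy as np
from math import erf, exp, pi, sqrt

def gaussian_field(points, center):
    """Gaussian-shaped field with unit amplitude and spread (illustrative)."""
    return np.exp(-0.5 * np.sum((points - center) ** 2, axis=1))

def _phi(x):   # standard normal pdf
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def _Phi(x):   # standard normal cdf
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def truncated_mean(mu, sigma, lo, hi):
    """E[Y | lo < Y <= hi] for Y ~ N(mu, sigma^2): the E-step expectation."""
    a, b = (lo - mu) / sigma, (hi - mu) / sigma
    z = _Phi(b) - _Phi(a)
    if z < 1e-12:
        return float(min(max(mu, lo), hi))   # degenerate cell: clip to the cell
    return mu + sigma * (_phi(a) - _phi(b)) / z

def em_field_center(sensors, levels, edges, sigma, n_iter=15):
    """Estimate the field center from quantized readings via EM."""
    theta = sensors.mean(axis=0)        # initial guess: sensor centroid
    grid = np.linspace(0.0, 8.0, 33)    # coarse M-step search grid over A
    for _ in range(n_iter):
        mu = gaussian_field(sensors, theta)
        # E-step: expected unquantized reading given its quantization cell
        y_hat = np.array([truncated_mean(m, sigma, edges[k], edges[k + 1])
                          for m, k in zip(mu, levels)])
        # M-step: least-squares refit of the center over the grid
        best_err, best_c = np.inf, theta
        for gx in grid:
            for gy in grid:
                c = np.array([gx, gy])
                err = np.sum((y_hat - gaussian_field(sensors, c)) ** 2)
                if err < best_err:
                    best_err, best_c = err, c
        theta = best_c
    return theta

# Simulate: K = 40 sensors, true center (4, 4), M = 8 levels on [0, 1]
rng = np.random.default_rng(2)
sensors = rng.uniform(0.0, 8.0, size=(40, 2))
sigma = 0.05
noisy = gaussian_field(sensors, np.array([4.0, 4.0])) + sigma * rng.standard_normal(40)
edges = np.concatenate(([-np.inf], np.linspace(0.0, 1.0, 9)[1:-1], [np.inf]))
levels = np.digitize(noisy, edges[1:-1])    # level k corresponds to (edges[k], edges[k+1]]
theta_hat = em_field_center(sensors, levels, edges, sigma)
```

The grid-search M-step stands in for the paper's linearized update; any numerical maximizer of the refit criterion would do.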

Experimental Set Up

Assume the area A is of size 8-by-8; K sensors are randomly distributed over A; observations are quantized to M levels. SNRo denotes the SNR in the observation channel and SNRc the SNR in the transmission channel.
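The slide's defining formulas for the two SNRs were given as equations; as a generic stand-in (the standard definition, assumed here rather than taken from the slide), an SNR in dB is the ratio of mean signal power to noise variance:

```python
import numpy as np

def snr_db(signal, noise_std):
    """Standard SNR in dB: 10 log10(mean signal power / noise variance)."""
    p = np.mean(np.asarray(signal, dtype=float) ** 2)
    return 10.0 * np.log10(p / noise_std ** 2)

# A unit-power signal with noise_std = 0.1 gives 20 dB
x = snr_db([1.0, -1.0, 1.0, -1.0], 0.1)
```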

Performance Measures

- Target Localization
- Shape Reconstruction
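A minimal sketch of the two measures (the grid-based Riemann sum for the integrated error is an assumption; the paper's exact estimators may differ): squared error of the estimated target location, and integrated square error between the reconstructed and true fields over the 8-by-8 area.

```python
import numpy as np

def localization_se(theta_hat, theta_true):
    """Squared error of the estimated target location."""
    d = np.asarray(theta_hat, dtype=float) - np.asarray(theta_true, dtype=float)
    return float(np.sum(d ** 2))

def integrated_square_error(f_hat, f_true, area=64.0):
    """Integrated square error between reconstructed and true fields,
    approximated by a Riemann sum over a grid covering the 8-by-8 area."""
    f_hat = np.asarray(f_hat, dtype=float)
    f_true = np.asarray(f_true, dtype=float)
    return float(np.sum((f_hat - f_true) ** 2) * (area / f_true.size))
```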

Numerical Results

The simulated Gaussian field, and the squared difference between the original and reconstructed fields.

EM Convergence

SNRo = SNRc = 15 dB. Number of sensors K = 20.

Box-plot of Square Error

Monte Carlo realizations. SNRo = SNRc = 15 dB.

Box-plot of Integrated Square Error

Monte Carlo realizations. SNRo = SNRc = 15 dB. Number of quantization levels M = 8.

Probability of Outliers

Monte Carlo realizations. SNRo = SNRc = 15 dB. Number of quantization levels M = 8.

Effect of Quantization Levels

Monte Carlo realizations. SNRo = SNRc = 15 dB. Number of sensors K = 20.

Summary

- An iterative linearized EM solution to distributed field estimation is presented and numerically evaluated.
- SNRo dominates SNRc in terms of its effect on the performance of the estimator.
- Increasing the number of sensors results in fewer outliers and thus in higher-quality estimates.
- With a small number of sensors, the EM algorithm produces a substantial number of outliers.
- Increasing the number of quantization levels lets the EM algorithm converge in fewer iterations.
- For large K, further increasing the number of sensors has no notable effect on the performance of the algorithm.

Contact Information

Natalia A. Schmid
Marwan Alkhweldi
Matthew C. Valenti