On Systems with Limited Communication
PhD Thesis Defense, Jian Zou, May 6, 2004

On Systems with Limited Communication
PhD Thesis Defense
Jian Zou
May 6, 2004

Motivation I
- Information-theoretic issues have traditionally been decoupled from decision and control problems by ignoring communication constraints.
- Many newly emerged control systems are distributed, asynchronous, and networked. We are interested in integrating communication constraints into the analysis of control systems.

Examples
[Figures: MEMS device, UAV (picture courtesy: Aeronautical Systems), biological system.]

Theoretical Framework for Systems with Limited Communication
- A theoretical framework for systems with limited communication should answer many important questions: state estimation, stability and controllability, optimal control, and robust control.
- This effort has only just begun; there is still a long road ahead.

State Estimation
- Communication constraints introduce time delay and quantization of analog measurements.
- The state estimation problem from quantized measurements has two steps. First, for a given class of underlying systems and quantizers, we seek effective state estimators from quantized measurements. Second, we try to find the optimal quantizer with respect to those state estimators.

Motivation II
- Optimal reconstruction of a Gauss-Markov process from its quantized version requires exploiting the power spectrum (autocorrelation function) of the process.
- The mathematical model for this problem is similar to that of state estimation from quantized measurements.

Major Contributions
- We developed effective state estimators from quantized measurements, namely the quantized measurement sequential Monte Carlo method and the finite state approximation, for two broad classes of systems.
- We studied numerical methods to find the optimal quantizer with respect to those state estimators.

Thesis Roadmap
- Motivation: systems with limited communication; reconstruction of a Gauss-Markov process.
- Mathematical models (Chapter 2): noisy measurement; noiseless measurement.
- Suboptimal state estimators (Chapters 3, 4, and 5): for noisy measurements, the quantized measurement Kalman filter (or extended Kalman filter) and the quantized measurement sequential Monte Carlo method; for noiseless measurements, the quantized measurement Kalman filter and the finite state approximation.

System Block Diagram (Figure 2.1). [Figure omitted.]

Assumptions
- We only consider systems that can be modeled by the block diagram in Figure 2.1.
- Assumptions are made about the underlying physical object or process, the information to be transmitted, the type of communication channel, and the protocols.

Mathematical Model
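The equations on this slide are not reproduced in the transcript. As a hedged sketch, a generic discrete-time model consistent with the surrounding slides (linear Gauss-Markov dynamics observed through a finite-level quantizer; the symbols A, C, q and the noise covariances are illustrative, not necessarily the thesis's notation) is:

```latex
\begin{aligned}
x_{k+1} &= A\,x_k + w_k, \quad w_k \sim \mathcal{N}(0, Q_w)
        &&\text{(underlying Gauss--Markov state)}\\
y_k     &= C\,x_k + v_k, \quad v_k \sim \mathcal{N}(0, R_v)
        &&\text{(analog measurement; } v_k \equiv 0 \text{ in the noiseless case)}\\
z_k     &= q(y_k)
        &&\text{(quantized measurement sent over the channel)}
\end{aligned}
```

The estimation problem is then to approximate the conditional mean \(\hat{x}_k = \mathbb{E}[x_k \mid z_1,\dots,z_k]\) from the quantized measurements alone.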

State Estimation from Quantized Measurement

Optimal Reconstruction of a Colored Stochastic Process

Thesis Roadmap (repeated; current topic: noisy measurement).

Noisy Measurement

Two Approaches
- Treat quantization as additive noise and apply a Kalman filter (or extended Kalman filter); we call these the quantized measurement Kalman filter and the quantized measurement extended Kalman filter, respectively.
- Apply the sequential Monte Carlo method (particle filter); we call this the quantized measurement sequential Monte Carlo method (QMSMC).

Treating Quantization as Additive Noise
- Definition: reverse map and quantization function.
- Definition: quantization noise function n.
- Definition: quantization noise sequence.
- Assumptions are imposed on the statistics of the quantization noise.
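The formal definitions are not transcribed; in the usual formulation (a hedged reconstruction, with q denoting the quantization function and y_k the analog measurement at step k), the quantization noise objects are:

```latex
n(y) \;=\; q(y) - y \quad \text{(quantization noise function)}, \qquad
n_k \;=\; q(y_k) - y_k \quad \text{(quantization noise sequence)},
```

so that the quantized measurement can be rewritten as \(q(y_k) = y_k + n_k\), i.e. the analog measurement corrupted by an artificial additive noise term whose statistics must then be assumed or approximated.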

Quantized Measurement Kalman Filter (Extended Kalman Filter)
- The Kalman filter is modified to incorporate the artificially introduced quantization noise; the statistics of the quantization noise depend on the distribution of the measurement being quantized.
- The extended Kalman filter is modified in a similar way.
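As a rough, hedged illustration of this approach (a sketch, not the thesis's exact algorithm), the Python fragment below treats the error of a uniform quantizer with step `delta` as additive noise of variance delta^2/12 and folds it into the measurement-noise covariance of an otherwise standard Kalman filter; the matrices A, C, Q, R and the function name are placeholders.

```python
import numpy as np

def qm_kalman_step(x, P, z, A, C, Q, R, delta):
    """One predict/update step of a quantized-measurement Kalman filter sketch.

    The quantized measurement z is treated as the analog measurement plus an
    additive quantization noise, approximated as uniform on [-delta/2, delta/2]
    (variance delta**2 / 12) and folded into the measurement covariance.
    """
    # Time update (prediction).
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q

    # Measurement update with inflated covariance for the quantization noise.
    R_eff = R + (delta ** 2 / 12.0) * np.eye(R.shape[0])
    S = C @ P_pred @ C.T + R_eff
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```

The delta**2/12 term is simply the variance of a uniform random variable on an interval of width delta, the usual first-order model of uniform quantization error; an extended Kalman filter version would linearize the dynamics and measurement maps in the same two steps.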

QMSMC Algorithm (diagram): samples of step k-1 → prior samples → evaluation of likelihood → resampling and samples of step k.
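A minimal sketch of one pass through this loop, assuming a scalar AR(1) state observed through additive Gaussian noise and a known set of quantizer thresholds (all names, parameters, and the specific model are illustrative assumptions, not the thesis's): each particle is propagated through the dynamics, weighted by the probability that its noisy measurement would fall in the reported quantization cell, and the particle set is then resampled.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def qmsmc_step(particles, cell_index, thresholds,
               a=0.95, q_std=0.1, r_std=0.2, rng=None):
    """One QMSMC iteration for x_k = a*x_{k-1} + w_k observed as
    z_k = quantizer(x_k + v_k); cell_index selects the reported cell
    [thresholds[cell_index], thresholds[cell_index + 1])."""
    rng = rng if rng is not None else np.random.default_rng()

    # 1. Propagate: draw prior samples for step k from the dynamics.
    particles = a * particles + q_std * rng.standard_normal(particles.shape)

    # 2. Likelihood: probability that the noisy measurement of each particle
    #    falls inside the reported quantization cell.
    lo, hi = thresholds[cell_index], thresholds[cell_index + 1]
    weights = np.array([norm_cdf((hi - x) / r_std) - norm_cdf((lo - x) / r_std)
                        for x in particles])
    weights = np.maximum(weights, 1e-300)
    weights /= weights.sum()

    # 3. Estimate and resample to obtain the samples of step k.
    estimate = float(np.sum(weights * particles))
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate
```

Here `thresholds` is expected to start with -np.inf and end with +np.inf so that every reported cell has a well-defined probability; as the number of particles grows, the weighted estimate approaches the conditional mean, in line with the asymptotic optimality discussed two slides below.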

Diagram for the General Convergence Theorem: evolution of the approximate distribution vs. evolution of the posterior distribution. [Diagram omitted.]

Properties of QMSMC
- The computational cost per iteration scales with the number of samples; parallel computation can effectively reduce the computation time.
- The resulting sequence of estimates, indexed by the number of samples used, converges to the conditional mean in probability; this is the meaning of asymptotic optimality.

Simulation Results. [Three slides of simulation figures omitted.]

Simulation Results for the Navigation Model of the MIT Instrumented X-60 Helicopter. [Figures omitted.]

Thesis Roadmap (repeated; current topic: noiseless measurement).

Noiseless Measurement

Two Approaches
- Treat quantization as additive noise and apply a Kalman filter (or extended Kalman filter).
- Discretize the state space and apply the formulas for a partially observed HMM; we call this method the finite state approximation.

Finite State Approximation

Finite State Approximation
- We assume that the evolution of the discretized state distribution obeys a time-invariant linear rule, and that this rule can be obtained from the evolution of the underlying system.
- Under this assumption, we apply the formulas for a partially observed HMM for state estimation.
- The computational complexity is determined by the number of discrete states.
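A hedged sketch of this recursion (illustrative names only; the thesis's discretization and matrices are not reproduced here): the state space is discretized into M grid cells, the time-invariant linear rule becomes an M-by-M transition matrix, and filtering reduces to the standard forward recursion for a partially observed HMM, with the emission probability of each cell equal to the probability that its measurement is quantized to the observed symbol.

```python
import numpy as np

def hmm_forward_step(belief, transition, emission, symbol):
    """One step of the forward (filtering) recursion on a discretized state space.

    belief     : (M,) current filtering distribution over the M grid cells
    transition : (M, M) time-invariant matrix, transition[i, j] = P(cell j -> cell i)
    emission   : (M, K) emission[i, k] = P(quantized symbol k | state in cell i)
    symbol     : index of the quantized measurement received at this step
    """
    predicted = transition @ belief              # time update: the linear, time-invariant rule
    posterior = emission[:, symbol] * predicted  # measurement update
    return posterior / posterior.sum()

def conditional_mean(belief, grid_centers):
    """Approximate state estimate: belief-weighted average of the cell centers."""
    return float(np.dot(belief, grid_centers))
```

Each step costs O(M^2) for the matrix-vector product, which is why the granularity of the discretization drives the computational complexity.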

Finite State Approximation (continued). [Slide content omitted.]

Optimal Quantizer (Chapter 6)
- Optimal quantizer for the standard normal distribution.
- Numerical methods for finding the optimal quantizer for a second-order Gauss-Markov process.

Properties of the Optimal Quantizer for the Standard Normal Distribution
- Theorem 6.1.1 establishes bounds on the conditional mean in the tail of the standard normal distribution.
- A further theorem gives an upper bound on the quantization error contributed by the tail.
- Assuming Conjecture 6.1.1, we obtain upper bounds on the error of the optimal N-level quantizer for the standard normal distribution.
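The theorem statements themselves are not transcribed. For orientation, a standard bound of this type for the standard normal tail (not necessarily the thesis's exact Theorem 6.1.1) is:

```latex
\text{For } Y \sim \mathcal{N}(0,1) \text{ and } a > 0:\qquad
\mathbb{E}\!\left[\,Y \mid Y > a\,\right] \;=\; \frac{\varphi(a)}{1-\Phi(a)},
\qquad
a \;<\; \frac{\varphi(a)}{1-\Phi(a)} \;<\; a + \frac{1}{a},
```

where φ and Φ denote the standard normal density and distribution function. Bounds of this kind keep the conditional mean of the unbounded outermost cell within 1/a of its left edge, which is what allows the tail's contribution to the quantization error to be bounded.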

Numerical Methods for Finding the Optimal Quantizer for a Second-Order Gauss-Markov Process
- For a Gauss-Markov underlying process, the cost of a quantizer is defined as the square root of the mean squared estimation error achieved by the quantized measurement Kalman filter.
- The algorithm searches for a local minimum of this cost using gradient descent with respect to the quantizer parameters, as sketched below.
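A simplified, hedged sketch of such a search: here the cost is the Monte Carlo root-mean-square quantization error for standard normal samples (a stand-in for the thesis's quantized measurement Kalman filter estimation error), the gradient is approximated by central finite differences, and all names and step sizes are illustrative.

```python
import numpy as np

def quantize(samples, thresholds):
    """Quantize samples to the empirical conditional mean of their cell."""
    edges = np.concatenate(([-np.inf], np.sort(thresholds), [np.inf]))
    cells = np.digitize(samples, edges) - 1
    levels = np.array([samples[cells == c].mean() if np.any(cells == c) else 0.0
                       for c in range(len(edges) - 1)])
    return levels[cells]

def cost(thresholds, samples):
    """Root-mean-square quantization error over the sample set."""
    return float(np.sqrt(np.mean((samples - quantize(samples, thresholds)) ** 2)))

def search_quantizer(n_levels=4, iters=200, step=0.05, eps=1e-2,
                     n_samples=50_000, seed=0):
    """Finite-difference gradient descent on the quantizer thresholds."""
    rng = np.random.default_rng(seed)
    samples = rng.standard_normal(n_samples)
    thresholds = np.linspace(-1.0, 1.0, n_levels - 1)  # initial guess
    for _ in range(iters):
        grad = np.array([(cost(thresholds + eps * e, samples)
                          - cost(thresholds - eps * e, samples)) / (2 * eps)
                         for e in np.eye(len(thresholds))])
        thresholds = thresholds - step * grad
    return np.sort(thresholds), cost(thresholds, samples)
```

With n_levels=4, the search should settle near the known Lloyd-Max thresholds for the standard normal (approximately 0 and ±0.98), although the finite-difference, Monte Carlo gradient makes convergence noisy; the thesis's version replaces the cost evaluation with the filter-based estimation error described above.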

Numerical Results
- For second-order systems with different damping ratios, the optimal quantizers are indistinguishable under our criteria.
- A lower damping ratio reduces the error associated with the optimal quantizer.

Conclusions
- We considered systems with limited communication and the optimal reconstruction of a Gauss-Markov process.
- We developed effective suboptimal state estimators from quantized measurements.
- We studied the properties of the optimal quantizer for the standard normal distribution and numerical methods for finding the optimal quantizer for a Gauss-Markov process.

Thesis Roadmap (repeated).

Optimal Quantizer (Chapter 6): optimal quantizer for the standard normal distribution; numerical methods for finding the optimal quantizer for a second-order Gauss-Markov process.

Future Work
- Other topics in systems with limited communication, such as controllability, stability, optimal control with respect to new cost functions, and robust control.
- Improving the QMSMC and finite state approximation methods, and the related theoretical work.
- New methods to search for the optimal quantizer for a Gauss-Markov process.

Acknowledgements
- Prof. Roger Brockett.
- Prof. Alek Kavcic, Prof. Garrett Stanley, and Prof. Navin Khaneja.
- Haidong Yuan and Dan Crisan.
- Michael, Ben, Ali, Jason, Sean, Randy, Mark, Manuela.
- NSF and the U.S. Army Research Office.