On-Line Probabilistic Classification with Particle Filters. Pedro Højen-Sørensen, Nando de Freitas, and Torben Fog. Proceedings of the IEEE International Workshop on Neural Networks for Signal Processing (NNSP2000), 2000.

Presented by Cho, Dong-Yeon.

Introduction
Sequential Classification Problems
- Condition monitoring and real-time decision systems, e.g. monitoring patients or fault detection.
- Particle filters provide an efficient and elegant probabilistic solution to this problem.
- It becomes possible to compute the probabilities of class membership even when the classes overlap and evolve with time.
- This classification framework can be applied to any type of classifier; for demonstration purposes, multi-layer perceptrons (MLPs) are used.

Model Specification
Markov, Nonlinear, State Space Representation
- Transition model: p(θ_t | θ_{t-1}), where θ_t ∈ R^{n_θ} corresponds to the parameters (weights) of a neural network f(x_t, θ_t).
- The parameters are assumed to follow a random walk θ_t = θ_{t-1} + u_t. The process noise can be Gaussian, u_t ~ N(0, σ_t² I_{n_θ}).
- Observation model: p(y_t | x_t, θ_t), where x_t ∈ R^{n_x} denotes the input data at time t and y_t ∈ {0,1}^{n_y} represents the output class labels.
- The likelihood of the observations is given by the binomial (Bernoulli) distribution p(y_t | x_t, θ_t) = ∏_k f_k(x_t, θ_t)^{y_{t,k}} (1 − f_k(x_t, θ_t))^{1 − y_{t,k}}.
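The model specification can be made concrete with a small sketch. The 4-hidden-unit logistic MLP matches the toy example later in the talk, but the flat parameter packing, the function names, and all layer-size choices below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def mlp_output(x, theta, n_hidden=4):
    """Tiny MLP f(x_t, theta_t) with logistic hidden and output units.
    The flat packing of theta into (W1, b1, w2, b2) is an illustrative
    choice; any parameterization of a classifier would do."""
    n_in = x.shape[0]
    W1 = theta[:n_hidden * n_in].reshape(n_hidden, n_in)
    b1 = theta[n_hidden * n_in:n_hidden * (n_in + 1)]
    w2 = theta[n_hidden * (n_in + 1):n_hidden * (n_in + 2)]
    b2 = theta[-1]
    h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))     # logistic hidden layer
    return 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))  # logistic output in (0, 1)

def transition(theta, sigma_t, rng):
    """Random walk: theta_t = theta_{t-1} + u_t, u_t ~ N(0, sigma_t^2 I)."""
    return theta + sigma_t * rng.standard_normal(theta.shape)

def likelihood(y, x, theta):
    """Bernoulli observation model p(y_t | x_t, theta_t) for y in {0, 1}."""
    p = mlp_output(x, theta)
    return p ** y * (1.0 - p) ** (1 - y)
```

With two inputs and four hidden units, theta has 4·2 + 4 + 4 + 1 = 17 components.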

Estimation
Objectives
- Our goal is to approximate the posterior distribution p(θ_{0:t} | d_{1:t}) and one of its marginals, the filtering density p(θ_t | d_{1:t}), where d_{1:t} = {x_{1:t}, y_{1:t}}.
- By computing the filtering density recursively, we do not need to keep track of the complete history of the parameters.
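The recursion being referred to is the standard Bayes-filter identity, stated here for completeness in the notation of the transition and observation models above (the slide itself does not reproduce the equation):

```latex
p(\theta_t \mid d_{1:t}) \;\propto\; p(y_t \mid x_t, \theta_t)
\int p(\theta_t \mid \theta_{t-1})\, p(\theta_{t-1} \mid d_{1:t-1})\, d\theta_{t-1}
```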

Particle Filtering

Generic Particle Filter for Classification
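The generic filter named on this slide can be sketched as follows. This is a minimal illustration assuming the random-walk transition, Bernoulli likelihood, and simple multinomial selection; `f` stands for any classifier returning a class-1 probability, and all names and the parameter packing are our illustrative choices, not the authors' code:

```python
import numpy as np

def particle_filter_classify(f, xs, ys, n_particles=200, dim=17,
                             sigma0=10.0, sigma_t=0.2, seed=0):
    """Generic particle filter for on-line classification.

    f(x, theta) -> P(y = 1 | x, theta); xs, ys is the data stream.
    Returns the predictive probability P(y_t = 1 | d_{1:t-1}, x_t) per step.
    """
    rng = np.random.default_rng(seed)
    # Initial particles drawn from a broad Gaussian prior over theta_0
    particles = sigma0 * rng.standard_normal((n_particles, dim))
    preds = []
    for x, y in zip(xs, ys):
        # 1. Importance sampling: propose from the transition prior (random walk)
        particles = particles + sigma_t * rng.standard_normal(particles.shape)
        p1 = np.array([f(x, th) for th in particles])  # per-particle P(y=1)
        preds.append(p1.mean())                        # predictive probability
        # 2. Weight by the Bernoulli likelihood p(y_t | x_t, theta_t)
        w = p1 if y == 1 else 1.0 - p1
        w = w / w.sum()
        # 3. Selection: resample so that E(N_i) = N * w_i
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(preds)
```

Because the proposal is the transition prior, step 2 needs only the likelihood, matching the weight simplification discussed on the next slide.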

Bayesian Importance Sampling Step
- Importance function q(θ_{0:t} | d_{1:t}) and the recursive formula for the importance weights, w_t ∝ w_{t-1} p(y_t | x_t, θ_t) p(θ_t | θ_{t-1}) / q(θ_t | θ_{0:t-1}, d_{1:t}).
- The transition prior p(θ_t | θ_{t-1}) is used as the importance distribution for the MLPs, so the weight update reduces to w_t ∝ w_{t-1} p(y_t | x_t, θ_t).

Selection Step
- Resample the particles so that the expected number of offspring of particle i is E(N_i) = N w_t^(i).
MCMC Step
- Counters a skewed importance-weight distribution, in which many particles have no children whereas others have a large number of children.
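Any selection scheme satisfying E(N_i) = N w_t^(i) can be used; systematic resampling is one common low-variance choice. The specific scheme below is an illustrative assumption on our part, since the slide states only the expectation property:

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: returns N indices i with E(N_i) = N * w_i.
    A single uniform draw is stratified across N equally spaced positions,
    which gives lower variance than independent multinomial draws."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n  # one draw, n strata
    return np.searchsorted(np.cumsum(weights), positions)
```

Particles with large weights receive roughly N·w_i copies, while particles with negligible weight are dropped.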

A Simple Classification Example
Experimental Setup
- An MLP with 4 hidden logistic functions and an output logistic function.
- N = 200 particles, σ_t = 0.2 (σ_0 = 10).

Results

An Application to Fault Detection
Monitoring the exhaust-valve condition in a marine diesel engine
- The main goal: detect the leakage before engine performance becomes unacceptable or irreversible damage occurs.

Experimental Setup and Results
- An MLP with 2 hidden units and 5 input nodes (PCA is used for dimensionality reduction).
- 500 particles.

Conclusions
- We presented a novel on-line classification scheme and demonstrated it on two problems.
- We believe this strategy has great potential and that it needs to be further tested on other types of parametric classifiers and classification domains.