Random Set/Point Process in Multi-Target Tracking


Random Set/Point Process in Multi-Target Tracking
Ba-Ngu Vo, EEE Department, University of Melbourne, Australia
http://www.ee.unimelb.edu.au/staff/bv/
Collaborators (in no particular order): Mahler R., Singh S., Doucet A., Ma W.K., Panta K., Clark D., Vo B.T., Cantoni A., Pasha A., Tuan H.D., Baddeley A., Zuyev S., Schumacher D.
SAMSI, RTP, NC, USA, 8 September 2008

Outline
- The Bayes (single-target) filter
- Multi-target tracking
  - System representation
  - Random finite set & Bayesian multi-target filtering
- Tractable multi-target filters
  - Probability Hypothesis Density (PHD) filter
  - Cardinalized PHD filter
  - Multi-Bernoulli filter
- Conclusions

The Bayes (single-target) Filter
[Figure: a target moving in state space (x_k-1 to x_k) generates observations (z_k-1, z_k) in observation space; the target is described by a state vector.]
System model:
  f_k|k-1(x_k | x_k-1)   Markov transition density
  g_k(z_k | x_k)         measurement likelihood
Objective: compute p_k(x_k | z_1:k), the posterior (filtering) pdf of the state given the measurement history z_1:k = (z_1, ..., z_k).
(Speaker note: I will later present finite-set statistics, a statistical tool derived from random sets, for attacking multi-sensor multi-target tracking.)

The Bayes (single-target) Filter
The Bayes filter recursion propagates p_k-1(x_k-1 | z_1:k-1) to p_k(x_k | z_1:k) in two steps.
Prediction:
  p_k|k-1(x_k | z_1:k-1) = ∫ f_k|k-1(x_k | x_k-1) p_k-1(x_k-1 | z_1:k-1) dx_k-1
Data update:
  p_k(x_k | z_1:k) = g_k(z_k | x_k) p_k|k-1(x_k | z_1:k-1) / ∫ g_k(z_k | x) p_k|k-1(x | z_1:k-1) dx

The Bayes (single-target) Filter
The recursion p_k-1(. | z_1:k-1) -> p_k|k-1(. | z_1:k-1) -> p_k(. | z_1:k) can be carried out exactly or approximately:
- Kalman filter (linear Gaussian model): N(.; m_k-1, P_k-1) -> N(.; m_k|k-1, P_k|k-1) -> N(.; m_k, P_k)
- Particle filter: weighted particles {(w_k-1^(i), x_k-1^(i))}_{i=1..N} -> {(w_k|k-1^(i), x_k|k-1^(i))} -> {(w_k^(i), x_k^(i))}
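The particle-filter column of the recursion above can be sketched in a few lines. The following is a minimal bootstrap (SIR) particle filter for a hypothetical 1-D random-walk target with Gaussian measurement noise; the model and all parameter values are illustrative, not from the slides.

```python
import numpy as np

def bootstrap_pf(zs, n_particles=2000, q_std=0.5, r_std=1.0, rng=None):
    """Minimal bootstrap particle filter for a 1-D random-walk target.

    Dynamics:    x_k = x_{k-1} + N(0, q_std^2)   (plays the role of f_{k|k-1})
    Measurement: z_k = x_k + N(0, r_std^2)       (plays the role of g_k)
    Returns the posterior-mean estimate after each measurement.
    """
    rng = np.random.default_rng(rng)
    x = rng.normal(0.0, 5.0, n_particles)   # particles drawn from a diffuse prior
    estimates = []
    for z in zs:
        # Prediction: propagate each particle through the motion model.
        x = x + rng.normal(0.0, q_std, n_particles)
        # Update: weight particles by the measurement likelihood g_k(z | x).
        w = np.exp(-0.5 * ((z - x) / r_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.sum(w * x)))
        # Resample (multinomial) to avoid weight degeneracy.
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return estimates

# Example: noisy measurements of a target near x = 2.
est = bootstrap_pf([2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.05, 1.95], rng=0)
```

After a handful of measurements the posterior mean settles near the true state; practical implementations add adaptive resampling and better proposals.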

Multi-target tracking

Multi-target tracking
[Figure: multi-target states X_k-1 (5 targets) and X_k (3 targets) evolve in state space; each target may produce an observation in observation space.]
Objective: jointly estimate the number and states of the targets.
Challenges:
- Random number of targets and measurements
- Detection uncertainty, clutter, association uncertainty

System Representation
How can we mathematically represent the multi-target state? The usual practice is to stack the individual states into one large vector. Problem: suppose the true multi-target state is two targets, and the estimate contains the same two targets listed in the other order. The estimate is correct, yet the vector miss-distance is non-zero. What, then, is the estimation error? Remedy: represent the multi-target state as a finite set.

System Representation
More troublesome cases: the true multi-target state contains 1 target while the estimate contains 2; or the true state contains no targets while the estimate contains 2. With a vector representation the dimensions do not even match. What are the estimation errors?

System Representation
The error between an estimate and the true state (the miss-distance) is fundamental in estimation, filtering and control. It is well understood for a single target (Euclidean distance, MSE, etc.), but in the multi-target case it depends on the state representation:
- the vector representation does not admit a multi-target miss-distance;
- the finite-set representation does: a distance between two finite sets. The resulting "distance" is a metric on sets, not on vectors.
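To make the representation issue concrete, here is a small illustration (my own example, not from the slides): the Euclidean distance between stacked state vectors depends on the arbitrary ordering of the targets, while an optimal-assignment distance between the corresponding finite sets does not.

```python
import itertools
import math

def stacked_vector_distance(xs, ys):
    """Euclidean distance between stacked multi-target state vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(xs, ys)))

def set_distance(xs, ys):
    """Distance between two finite sets of equal size: minimise the
    stacked-vector distance over all orderings (optimal assignment)."""
    return min(stacked_vector_distance(xs, p) for p in itertools.permutations(ys))

truth = [1.0, 5.0]      # two 1-D targets
estimate = [5.0, 1.0]   # the same two targets, listed in the other order

print(stacked_vector_distance(truth, estimate))  # 5.656..., although the estimate is exact
print(set_distance(truth, estimate))             # 0.0
```

The set distance correctly reports zero error for a correct estimate regardless of listing order.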

System Representation
[Figure: targets in X_k generate observations in observation space.]
The number of measurements and their values are random variables, and the ordering of the measurements is not relevant. The multi-target measurement is therefore also represented by a finite set.

RFS & Bayesian Multi-target Filtering
Reconceptualize multi-target tracking as a generalized single-target problem [Mahler 94]: the collection of targets is treated as a single set-valued state X, and the collection of observations as a single set-valued observation Z.
Bayesian approach: model the state and observation as random finite sets [Mahler 94] and propagate the multi-target posterior:
  p_k-1(X_k-1 | Z_1:k-1) -> prediction -> p_k|k-1(X_k | Z_1:k-1) -> data update -> p_k(X_k | Z_1:k)
This requires suitable notions of density and integration on the space of finite sets.

RFS & Bayesian Multi-target Filtering
A random finite set (or random point pattern) S takes values that are finite subsets of a state space E. Equivalently, S can be described through its random counting measure N_S(A) = |S ∩ A|, the number of points of S falling in a region A of E; this is the point-process view.

RFS & Bayesian Multi-target Filtering
Let F(E) denote the collection of finite subsets of the state space E. Two ways to characterize an RFS S:
- Probability distribution (point-process theory, 1950s-1960s): P_S(T) = P(S ∈ T) for T ⊆ F(E), with probability density p_S : F(E) -> [0, ∞) such that P_S(T) = ∫_T p_S(X) μ(dX), a conventional integral.
- Belief "distribution" [Choquet 1968; Mahler's Finite Set Statistics, 1994]: β_S(T) = P(S ⊆ T) for T ⊆ E, with belief "density" f_S : F(E) -> [0, ∞) such that β_S(T) = ∫_T f_S(X) δX, a set integral.
The relationship between the two densities was established in [Vo et al. 2005].
(Speaker note: individual-target probability models are defined on subsets of R^n, and it is not tractable to derive the multi-target probability distribution on the abstract Borel subsets of the finite subsets of E. The belief distribution on the closed subsets of E, however, can be derived. Mahler's approach offers a more tractable modelling alternative. The aim is for practicing engineers to write down the belief distribution from the motion models of the individual targets and take a set derivative to get the multi-target transition density, then write down the belief distribution from the sensor models and take a set derivative to get the multi-target likelihood.)

Multi-target Motion Model
The multi-object state evolves as
  X_k = S_k|k-1(X_k-1) ∪ B_k|k-1(X_k-1) ∪ Γ_k
where each element x of the given multi-object state X_k-1 either dies or survives and moves (S_k|k-1), may spawn new objects (B_k|k-1), and Γ_k is the set of spontaneously created objects. This evolution is captured by the multi-object transition density f_k|k-1(X_k | X_k-1).

Multi-target Observation Model
The multi-object measurement is
  Z_k = Θ_k(X_k) ∪ K_k(X_k)
where each element x of the given multi-object state X_k is either detected (producing an observation z in observation space) or missed, and K_k is the set of clutter measurements. This observation process is captured by the multi-object likelihood g_k(Z_k | X_k).
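The observation model above can be simulated directly. The following sketch (illustrative parameters, not from the slides) draws one measurement set for 1-D target positions: each target is detected with probability p_D and, if detected, produces a noisy position measurement; an independent Poisson number of uniform clutter points is added; the ordering of the result carries no information.

```python
import numpy as np

def simulate_measurement_set(targets, p_detect=0.9, noise_std=0.1,
                             clutter_rate=2.0, region=(0.0, 10.0), rng=None):
    """Draw one multi-object measurement set Z_k = Theta_k(X_k) U K_k
    for 1-D target positions: detections (thinning + Gaussian noise)
    plus uniform Poisson clutter over the surveillance region."""
    rng = np.random.default_rng(rng)
    Z = []
    for x in targets:                        # detection process Theta_k
        if rng.random() < p_detect:
            Z.append(x + rng.normal(0.0, noise_std))
    n_clutter = rng.poisson(clutter_rate)    # clutter process K_k
    Z.extend(rng.uniform(region[0], region[1], n_clutter))
    Z = np.array(Z)
    rng.shuffle(Z)                           # ordering is irrelevant
    return list(Z)
```

Both the cardinality and the values of Z_k are random, exactly as the slide states.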

Multi-target Bayes Filter
  p_k-1(X_k-1 | Z_1:k-1) -> prediction -> p_k|k-1(X_k | Z_1:k-1) -> data update -> p_k(X_k | Z_1:k)
The recursion is computationally intractable in general and has no closed-form solution. Particle (SMC) implementations exist [Vo, Singh & Doucet 03, 05; Sidenbladh 03; Vihola 05; Ma et al. 06], but they are restricted to a very small number of targets.

Particle Multi-target Bayes Filter Algorithm
  for i = 1:N                          % initialise
      sample X_0^(i); compute weight w_0^(i)
  end; normalise weights
  for k = 1:kmax
      for i = 1:N                      % update
          sample X_k^(i) from a proposal; update weight w_k^(i) using the
          multi-object transition density and likelihood
      end; normalise weights
      resample; MCMC step
  end

The PHD Filter
The multi-target Bayes filter p_k-1(X_k-1 | Z_1:k-1) -> p_k|k-1(X_k | Z_1:k-1) -> p_k(X_k | Z_1:k) is very expensive. By analogy with the single-object case:
- Single-object: the state is a random vector; the single-object Bayes filter can be approximated by a first-moment filter (e.g. the alpha-beta-gamma filter).
- Multi-object: the state is a random set; the multi-object Bayes filter can be approximated by a first-moment filter: the PHD filter.

The Probability Hypothesis Density
The PHD (intensity function) v_S of an RFS S is defined so that v_S(x_0) is the density of the expected number of objects at x_0: for any region A of the state space,
  ∫_A v_S(x) dx = expected number of objects of S in A = mean of the random counting measure N_S(A).
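A quick numerical check of this defining property (a sketch with made-up numbers): for a Gaussian-mixture intensity, the integral of v over the whole state space equals the sum of the mixture weights, i.e. the expected number of objects.

```python
import numpy as np

def gaussian(x, m, s):
    """1-D Gaussian density N(x; m, s^2)."""
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def expected_count(weights, means, stds, lo=-30.0, hi=30.0, n=20001):
    """Numerically integrate a Gaussian-mixture intensity v(x) over [lo, hi]
    (trapezoidal rule); for a wide enough interval this is E[N_S(E)]."""
    x = np.linspace(lo, hi, n)
    v = sum(w * gaussian(x, m, s) for w, m, s in zip(weights, means, stds))
    return float(np.sum(0.5 * (v[1:] + v[:-1]) * np.diff(x)))

# Mixture weights sum to 2.5, so the intensity encodes 2.5 expected objects.
print(expected_count([1.0, 0.8, 0.7], [-4.0, 0.0, 5.0], [1.0, 0.5, 2.0]))  # ~2.5
```

Note that v is not a probability density: its total mass is the expected cardinality, not 1.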

The PHD Filter
The PHD filter propagates the intensity function instead of the full multi-object posterior:
  v_k-1(x_k-1 | Z_1:k-1) -> PHD prediction -> v_k|k-1(x_k | Z_1:k-1) -> PHD update -> v_k(x_k | Z_1:k)
in place of the multi-object Bayes recursion p_k-1 -> p_k|k-1 -> p_k. It avoids explicit data association.

PHD Prediction
  v_k|k-1(x_k | Z_1:k-1) = ∫ φ_k|k-1(x_k, x_k-1) v_k-1(x_k-1 | Z_1:k-1) dx_k-1 + γ_k(x_k)
where γ_k, the term for spontaneous object births, is the intensity of Γ_k, and the Markov transition intensity is
  φ_k|k-1(x_k, x_k-1) = e_k|k-1(x_k-1) f_k|k-1(x_k | x_k-1) + b_k|k-1(x_k | x_k-1)
with e_k|k-1 the probability of object survival, f_k|k-1 the Markov transition density, and b_k|k-1 the term for objects spawned by existing objects (the intensity of B_k(x_k-1)).
Predicted expected number of objects: N_k|k-1 = ∫ v_k|k-1(x | Z_1:k-1) dx.
In operator form: v_k|k-1 = Φ_k|k-1 v_k-1, where (Φ_k|k-1 a)(x_k) = ∫ φ_k|k-1(x_k, x) a(x) dx + γ_k(x_k).

PHD Update
  v_k(x_k | Z_1:k) = [1 - p_D,k(x_k)] v_k|k-1(x_k | Z_1:k-1) + Σ_{z ∈ Z_k} p_D,k(x_k) g_k(z | x_k) v_k|k-1(x_k | Z_1:k-1) / (κ_k(z) + D_k(z))
where p_D,k is the probability of detection, g_k the sensor likelihood function, κ_k the intensity of false alarms, and
  D_k(z) = ∫ p_D,k(x) g_k(z | x) v_k|k-1(x | Z_1:k-1) dx.
Updated expected number of objects: N_k = ∫ v_k(x | Z_1:k) dx.
In operator form: v_k = Ψ_k v_k|k-1, where (Ψ_k a)(x) = [1 - p_D,k(x) + Σ_{z ∈ Z_k} ψ_k,z(x) / (κ_k(z) + <ψ_k,z, a>)] a(x), with ψ_k,z(x) = p_D,k(x) g_k(z | x).

Particle PHD filter
The PHD (intensity function) v_k is not a probability density, and the PHD propagation equation is not a standard Bayesian recursion, so standard particle filtering does not apply directly. Sequential Monte Carlo implementations of the PHD filter: [Vo, Singh & Doucet 03, 05], [Sidenbladh 03], [Mahler & Zajic 03]. A particle approximation of v_k-1 is propagated to a particle approximation of v_k; the particles must then be clustered to obtain multi-target state estimates.

Particle PHD filter Algorithm
  initialise
  for k = 1:kmax
      for i = 1:J_k                          % birth particles
          sample x_k^(i); compute weight w_k|k-1^(i)
      end
      for i = J_k+1 : J_k+L_k-1              % surviving/spawned particles
          sample x_k^(i); compute weight w_k|k-1^(i)
      end
      for i = 1 : J_k+L_k-1                  % update
          update weight w_k^(i) via the PHD update
      end
      redistribute the total mass among L_k resampled particles
  end
Convergence results: [Vo, Singh & Doucet 05], [Clark & Bell 06], [Johansen et al. 06].

Gaussian Mixture PHD filter
For a linear Gaussian multi-target model, a closed-form solution to the PHD recursion exists: a Gaussian mixture prior intensity yields Gaussian mixture posterior intensities at all subsequent times. The PHD filter then propagates the mixture parameters:
  {(w_k-1^(i), m_k-1^(i), P_k-1^(i))}_{i=1..J_k-1} -> {(w_k|k-1^(i), m_k|k-1^(i), P_k|k-1^(i))}_{i=1..J_k|k-1} -> {(w_k^(i), m_k^(i), P_k^(i))}_{i=1..J_k}
Gaussian Mixture (GM) PHD filter [Vo & Ma 05, 06]; Extended & Unscented Kalman PHD filters [Vo & Ma 06]; Jump Markov PHD filter [Pasha et al. 06]; track continuity [Clark et al. 06].
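A minimal sketch of one GM-PHD prediction/update cycle for a scalar linear Gaussian model (no spawning or pruning; all parameter values are illustrative, not the slide's scenario). Each mixture component carries (w, m, P), and the sum of the weights is the expected number of targets.

```python
import math

def gm_phd_step(mixture, Z, F=1.0, Q=0.1, H=1.0, R=0.5,
                p_s=0.99, p_d=0.9, clutter=0.1, births=((0.1, 0.0, 10.0),)):
    """One GM-PHD prediction + update for scalar states.
    mixture: list of (w, m, P) components; Z: list of measurements;
    clutter: clutter intensity kappa(z), assumed constant over z."""
    # Prediction: survival components plus birth components.
    pred = [(p_s * w, F * m, F * P * F + Q) for (w, m, P) in mixture]
    pred += [tuple(b) for b in births]
    # Update: missed-detection components keep (1 - p_d) of each weight...
    updated = [((1.0 - p_d) * w, m, P) for (w, m, P) in pred]
    # ...plus one Kalman-updated component per (measurement, component) pair.
    for z in Z:
        comps = []
        for (w, m, P) in pred:
            S = H * P * H + R                      # innovation covariance
            q = math.exp(-0.5 * (z - H * m) ** 2 / S) / math.sqrt(2 * math.pi * S)
            K = P * H / S                          # Kalman gain
            comps.append((p_d * w * q, m + K * (z - H * m), (1 - K * H) * P))
        denom = clutter + sum(w for (w, _, _) in comps)
        updated += [(w / denom, m, P) for (w, m, P) in comps]
    return updated

post = gm_phd_step([(1.0, 0.0, 1.0)], Z=[0.2])
print(sum(w for (w, _, _) in post))   # expected number of targets
```

Practical implementations add pruning and merging of the mixture, which keeps the number of components bounded over time.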

Cardinalised PHD Filter
Drawback of the PHD filter: high variance of the cardinality estimate. The CPHD filter relaxes the Poisson assumption, allowing an arbitrary cardinality distribution, by jointly propagating the intensity function and the cardinality distribution (via its probability generating function):
  cardinality: p_k-1(n | Z_1:k-1) -> prediction -> p_k|k-1(n | Z_1:k-1) -> update -> p_k(n | Z_1:k)
  intensity:   v_k-1(x_k-1 | Z_1:k-1) -> prediction -> v_k|k-1(x_k | Z_1:k-1) -> update -> v_k(x_k | Z_1:k)
CPHD filter [Mahler 06, 07]. The price is a more complex PHD update step (higher computational cost).

Gaussian Mixture CPHD Filter
For a linear Gaussian multi-target model, a closed-form solution to the CPHD recursion also exists: a Gaussian mixture prior intensity yields Gaussian mixture posterior intensities at all subsequent times [Vo et al. 06, 07]. The particle PHD filter can likewise be extended to a particle CPHD filter [Vo 08], propagating the cardinality distribution {p(n)} alongside the weighted particles {(w^(i), x^(i))} through the prediction and update steps.

CPHD filter Demonstration
GM-PHD filter vs GM-CPHD filter, 1000 Monte Carlo trial average.
(Speaker note: the scenario shows targets moving in a given region; true target positions appear as blue crosses, the measurements received by the filter as black crosses, and the filter estimates in red, with a histogram of the posterior intensity alongside showing how well the closed-form Gaussian mixture implementation performs. A total of up to 10 targets are on the scene at any one time.)

CPHD filter Demonstration
Comparison with JPDA: linear dynamics, σ_v = 5, σ_h = 10, 4 targets, 1000 MC trial average.

CPHD filter Demonstration: sonar images.

MeMBer Filter
Instead of the full multi-object Bayes recursion, approximate the predicted and posterior RFSs by multi-Bernoulli RFSs and propagate the parameters (existence probability and track density of each Bernoulli component):
  {(r_k-1^(i), p_k-1^(i))}_{i=1..M_k-1} -> prediction -> {(r_k|k-1^(i), p_k|k-1^(i))}_{i=1..M_k|k-1} -> update -> {(r_k^(i), p_k^(i))}_{i=1..M_k}
(Multi-target Multi-Bernoulli) MeMBer filter [Mahler 07], biased; Cardinality-Balanced MeMBer filter [Vo et al. 07], unbiased. Valid for low clutter rate and high probability of detection.

Cardinality-Balanced MeMBer Filter: Prediction
The predicted multi-Bernoulli parameter set is the union of surviving components and birth components:
  {(r_P,k|k-1^(i), p_P,k|k-1^(i))}_{i=1..M_k-1} ∪ {(r_Γ,k^(i), p_Γ,k^(i))}_{i=1..M_Γ,k}
with, for each surviving component,
  r_P,k|k-1^(i) = r_k-1^(i) <p_k-1^(i), p_S,k>
  p_P,k|k-1^(i)(x) = <f_k|k-1(x | .), p_k-1^(i) p_S,k> / <p_k-1^(i), p_S,k>
where p_S,k is the survival probability and the second set is the term for object births. Cardinality-Balanced MeMBer filter [Vo et al. 07].

Cardinality-Balanced MeMBer Filter: Update
The updated parameter set is the union of legacy (missed-detection) components and measurement-updated components:
  {(r_L,k^(i), p_L,k^(i))}_{i=1..M_k|k-1} ∪ {(r_U,k(z), p_U,k(.; z))}_{z ∈ Z_k}
Legacy components:
  r_L,k^(i) = r_k|k-1^(i) (1 - <p_k|k-1^(i), p_D,k>) / (1 - r_k|k-1^(i) <p_k|k-1^(i), p_D,k>)
  p_L,k^(i) = p_k|k-1^(i) (1 - p_D,k) / (1 - <p_k|k-1^(i), p_D,k>)
Measurement-updated components, with ψ_k,z = p_D,k g_k(z | .):
  r_U,k(z) = [Σ_i r_k|k-1^(i) (1 - r_k|k-1^(i)) <p_k|k-1^(i), ψ_k,z> / (1 - r_k|k-1^(i) <p_k|k-1^(i), p_D,k>)^2]
           / [κ_k(z) + Σ_i r_k|k-1^(i) <p_k|k-1^(i), ψ_k,z> / (1 - r_k|k-1^(i) <p_k|k-1^(i), p_D,k>)]
  p_U,k(x; z) = Σ_i [r_k|k-1^(i) / (1 - r_k|k-1^(i))] p_k|k-1^(i)(x) ψ_k,z(x) / Σ_i [r_k|k-1^(i) / (1 - r_k|k-1^(i))] <p_k|k-1^(i), ψ_k,z>
Cardinality-Balanced MeMBer filter [Vo et al. 07].

Cardinality-Balanced MeMBer Filter
Closed-form (Gaussian mixture) solution [Vo et al. 07]: the track density of each Bernoulli component is propagated as a Gaussian mixture {(w^(i,j), m^(i,j), P^(i,j))}_{j=1..J^(i)} through the prediction and update. A particle implementation [Vo et al. 07] instead propagates weighted particles {(w^(i,j), x^(i,j))} per component; it is more useful than PHD filters in highly non-linear problems.

Performance comparison
Example: at most 10 targets on the scene, with births and deaths.
- 4-D states: x-y position and velocity; linear Gaussian dynamics (constant velocity model, σ_v = 5 ms^-2); survival probability p_S,k = 0.99.
- Observations: x-y position, linear Gaussian, additive Gaussian noise σ = 10 m; detection probability p_D,k = 0.98; uniform Poisson clutter, λ_c = 2.5x10^-6 m^-2.
[Figure: target start/end positions.]

Gaussian implementation, 1000 MC trial average: Cardinality-Balanced recursion vs Mahler's MeMBer recursion.

Gaussian implementation, 1000 MC trial average: the CPHD filter has better performance.

Particle implementation, 1000 MC trial average: the CB-MeMBer filter has better performance.

Concluding Remarks
Random Finite Set framework:
- Rigorous formulation of Bayesian multi-target filtering
- Leads to efficient algorithms
Future research directions:
- Track-before-detect
- Performance measures for multi-object systems
- Numerical techniques for estimation of trajectories
For more information, see http://randomsets.ee.unimelb.edu.au/
Thank You!

References
- D. Stoyan, D. Kendall, and J. Mecke, Stochastic Geometry and its Applications, John Wiley & Sons, 1995.
- D. Daley and D. Vere-Jones, An Introduction to the Theory of Point Processes, Springer-Verlag, 1988.
- I. Goodman, R. Mahler, and H. Nguyen, Mathematics of Data Fusion, Kluwer Academic Publishers, 1997.
- R. Mahler, "An introduction to multisource-multitarget statistics and applications," Lockheed Martin Technical Monograph, 2000.
- R. Mahler, "Multi-target Bayes filtering via first-order multi-target moments," IEEE Trans. AES, vol. 39, no. 4, pp. 1152-1178, 2003.
- B. Vo, S. Singh, and A. Doucet, "Sequential Monte Carlo methods for multi-target filtering with random finite sets," IEEE Trans. AES, vol. 41, no. 4, pp. 1224-1245, 2005.
- B. Vo and W. K. Ma, "The Gaussian mixture PHD filter," IEEE Trans. Signal Processing, vol. 54, no. 11, pp. 4091-4104, 2006.
- R. Mahler, "A theory of PHD filters of higher order in target number," in I. Kadar (ed.), Signal Processing, Sensor Fusion, and Target Recognition XV, SPIE Defense & Security Symposium, Orlando, April 17-22, 2006.
- B. T. Vo, B. Vo, and A. Cantoni, "Analytic implementations of the Cardinalized Probability Hypothesis Density filter," IEEE Trans. Signal Processing, vol. 55, no. 7, part 2, pp. 3553-3567, 2007.
- D. Clark and J. Bell, "Convergence of the particle-PHD filter," IEEE Trans. Signal Processing, 2006.
- A. Johansen, S. Singh, A. Doucet, and B. Vo, "Convergence of the SMC implementation of the PHD filter," Methodology and Computing in Applied Probability, 2006.
- A. Pasha, B. Vo, H. D. Tuan, and W. K. Ma, "Closed-form solution to the PHD recursion for jump Markov linear models," FUSION, 2006.
- D. Clark, K. Panta, and B. Vo, "Tracking multiple targets with the GM-PHD filter," FUSION, 2006.
- B. T. Vo, B. Vo, and A. Cantoni, "On multi-Bernoulli approximation of the multi-target Bayes filter," ICIF, Xi'an, 2007.
See also: http://www.ee.unimelb.edu.au/staff/bv/publications.html

Representation of Multi-target State
Optimal Subpattern Assignment (OSPA) metric [Schuhmacher et al. 08]: to compare a set X of m points against a set Y of n >= m points,
- fill up X with n - m dummy points located at a distance greater than the cut-off c from any point of Y;
- calculate the p-th order Wasserstein distance between the resulting sets.
It can be computed efficiently using the Hungarian algorithm.
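The two steps above collapse to a simple closed form, sketched here for small 1-D sets using brute-force optimal assignment (stdlib only; for larger sets one would use the Hungarian algorithm as the slide suggests, e.g. scipy's linear_sum_assignment).

```python
import itertools

def ospa(X, Y, c=10.0, p=2):
    """OSPA distance between finite sets of 1-D points [Schuhmacher et al. 08].
    Distances are cut off at c; each cardinality mismatch costs c.
    Brute-force over assignments, fine for small sets."""
    if len(X) > len(Y):
        X, Y = Y, X                      # ensure |X| = m <= |Y| = n
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0                       # both sets empty
    best = min(
        sum(min(abs(x - y), c) ** p for x, y in zip(X, perm))
        for perm in itertools.permutations(Y, m)
    )
    return ((best + c ** p * (n - m)) / n) ** (1.0 / p)

print(ospa([1.0, 2.0], [2.0, 1.0]))  # 0.0 -- ordering does not matter
```

OSPA penalises both localisation error and cardinality error on a common scale, which is exactly what the vector representation could not provide.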

Gaussian Mixture PHD Prediction
Given a Gaussian mixture posterior intensity at time k-1:
  v_k-1(x) = Σ_{i=1..J_k-1} w_k-1^(i) N(x; m_k-1^(i), P_k-1^(i))
the predicted intensity at time k (the Gaussian-mixture form of v_k|k-1 = Φ_k|k-1 v_k-1) is also a Gaussian mixture:
  v_k|k-1(x) = Σ_{i=1..J_k-1} [ p_S,k w_k-1^(i) N(x; m_S,k|k-1^(i), P_S,k|k-1^(i)) + Σ_{l=1..J_b,k} w_k-1^(i) w_b,k^(l) N(x; m_b,k|k-1^(i,l), P_b,k|k-1^(i,l)) ] + γ_k(x)
with survival components
  m_S,k|k-1^(i) = F_k-1 m_k-1^(i),   P_S,k|k-1^(i) = F_k-1 P_k-1^(i) F_k-1^T + Q_k-1
and spawned components
  m_b,k|k-1^(i,l) = F_b,k-1^(l) m_k-1^(i) + d_b,k-1^(l),   P_b,k|k-1^(i,l) = F_b,k-1^(l) P_k-1^(i) (F_b,k-1^(l))^T + Q_b,k-1^(l).

Gaussian Mixture PHD Update
Given the predicted Gaussian mixture intensity
  v_k|k-1(x) = Σ_{i=1..J_k|k-1} w_k|k-1^(i) N(x; m_k|k-1^(i), P_k|k-1^(i))
the updated intensity at time k (the Gaussian-mixture form of v_k = Ψ_k v_k|k-1) is
  v_k(x) = (1 - p_D,k) v_k|k-1(x) + Σ_{z ∈ Z_k} Σ_{i=1..J_k|k-1} w_k^(i)(z) N(x; m_k|k^(i)(z), P_k|k^(i))
with
  w_k^(i)(z) = p_D,k w_k|k-1^(i) q_k^(i)(z) / (κ_k(z) + p_D,k Σ_{j=1..J_k|k-1} w_k|k-1^(j) q_k^(j)(z))
  q_k^(i)(z) = N(z; H_k m_k|k-1^(i), H_k P_k|k-1^(i) H_k^T + R_k)
  m_k|k^(i)(z) = m_k|k-1^(i) + K_k^(i) (z - H_k m_k|k-1^(i))
  P_k|k^(i) = (I - K_k^(i) H_k) P_k|k-1^(i)
  K_k^(i) = P_k|k-1^(i) H_k^T (H_k P_k|k-1^(i) H_k^T + R_k)^-1.

Cardinalised PHD Prediction
Predicted cardinality distribution:
  p_k|k-1(n) = Σ_{j=0..n} p_Γ,k(n - j) Π_k|k-1[v_k-1, p_k-1](j)
i.e. the probability of n - j spontaneous births times the probability of j surviving targets, where
  Π_k|k-1[v, p](j) = Σ_{l=j..∞} C_j^l (<p_S,k, v>^j <1 - p_S,k, v>^(l-j) / <1, v>^l) p(l).
Predicted intensity:
  v_k|k-1(x_k) = ∫ p_S,k(x_k-1) f_k|k-1(x_k | x_k-1) v_k-1(x_k-1) dx_k-1 + γ_k(x_k)
with p_S,k the probability of survival, f_k|k-1 the Markov transition density, and γ_k the intensity of spontaneous object births Γ_k.

Cardinalised PHD Update
Updated cardinality distribution (from the predicted cardinality distribution p_k|k-1):
  p_k(n) = Υ_k^0[v_k|k-1, Z_k](n) p_k|k-1(n) / <Υ_k^0[v_k|k-1, Z_k], p_k|k-1>
Updated intensity (from the predicted intensity v_k|k-1):
  v_k(x) = ( <Υ_k^1[v_k|k-1, Z_k], p_k|k-1> / <Υ_k^0[v_k|k-1, Z_k], p_k|k-1> ) (1 - p_D,k(x)) v_k|k-1(x)
         + Σ_{z ∈ Z_k} ( <Υ_k^1[v_k|k-1, Z_k\{z}], p_k|k-1> / <Υ_k^0[v_k|k-1, Z_k], p_k|k-1> ) ψ_k,z(x) v_k|k-1(x)
where
  Υ_k^u[v, Z](n) = Σ_{j=0..min(|Z|,n)} (|Z| - j)! p_K,k(|Z| - j) P_{j+u}^n (<1 - p_D,k, v>^(n-(j+u)) / <1, v>^n) esf_j({<v, ψ_k,z>: z ∈ Z})
  ψ_k,z(x) = (<1, κ_k> / κ_k(z)) p_D,k(x) g_k(z | x)     (likelihood function)
  esf_j(Z) = Σ_{S ⊆ Z, |S|=j} Π_{ζ ∈ S} ζ               (elementary symmetric function)
with p_D,k the probability of detection, κ_k the clutter intensity, and p_K,k the clutter cardinality distribution.
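The elementary symmetric functions esf_j that appear in the update need not be computed by summing over subsets: all of them at once cost O(|Z|^2) via the coefficient recursion behind Vieta's formulas (expanding the product of (1 + ζ x) over the inputs ζ). A small sketch:

```python
def esf(values):
    """All elementary symmetric functions e_0..e_n of the inputs,
    computed as the coefficients of prod_i (1 + x * values[i])."""
    coeffs = [1.0]                 # e_0 = 1 for the empty product
    for v in values:
        # Multiply the running polynomial by (1 + v * x).
        coeffs = [c + v * prev
                  for c, prev in zip(coeffs + [0.0], [0.0] + coeffs)]
    return coeffs

print(esf([1.0, 2.0, 3.0]))  # [1.0, 6.0, 11.0, 6.0]
```

For inputs {1, 2, 3}: e_0 = 1, e_1 = 1+2+3 = 6, e_2 = 1*2 + 1*3 + 2*3 = 11, e_3 = 6, matching the subset-sum definition in the update.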

Multi-object Bayes filter Mahler’s MeMBer Filter Multi-object Bayes filter pk-1(Xk-1|Z1:k-1) prediction pk|k-1(Xk|Z1:k-1) update pk(Xk|Z1:k)   (i) (i) Mk-1 prediction (i) (i) Mk|k-1 update (i) (i) Mk  {(rk-1, pk-1)} {(rk|k-1, pk|k-1)} {(rk, pk )}  i=1 i=1 i=1 (Multi-target Multi-Bernoulli ) MeMBer filter [Mahler 07] Approximate predicted/posterior RFSs by Multi-Bernoulli RFSs Valid for low clutter rate & high probability of detection Biased in Cardinality (except when probability of detection = 1)

Cardinality-Balanced MeMBer Filter (update, in terms of auxiliary intensities)
Define, with ψ_k,z = p_D,k g_k(z | .),
  ṽ_k|k-1(x)  = Σ_{i=1..M_k|k-1} [r_k|k-1^(i) / (1 - r_k|k-1^(i) <p_k|k-1^(i), p_D,k>)] p_k|k-1^(i)(x)
  ṽ_k|k-1^(1)(x) = Σ_{i=1..M_k|k-1} [r_k|k-1^(i) (1 - r_k|k-1^(i)) / (1 - r_k|k-1^(i) <p_k|k-1^(i), p_D,k>)^2] p_k|k-1^(i)(x)
  ṽ_k|k-1^*(x) = Σ_{i=1..M_k|k-1} [r_k|k-1^(i) / (1 - r_k|k-1^(i))] p_k|k-1^(i)(x)
Then the legacy components are
  r_L,k^(i) = r_k|k-1^(i) (1 - <p_k|k-1^(i), p_D,k>) / (1 - r_k|k-1^(i) <p_k|k-1^(i), p_D,k>),
  p_L,k^(i) = p_k|k-1^(i) (1 - p_D,k) / (1 - <p_k|k-1^(i), p_D,k>)
and, for each z ∈ Z_k, the measurement-updated components are
  r_U,k(z) = <ṽ_k|k-1^(1), ψ_k,z> / (κ_k(z) + <ṽ_k|k-1, ψ_k,z>),
  p_U,k(x; z) = ṽ_k|k-1^*(x) ψ_k,z(x) / <ṽ_k|k-1^*, ψ_k,z>.
Cardinality-Balanced MeMBer filter [Vo et al. 07].

Extensions of the PHD filter
Linear Jump Markov PHD filter [Pasha et al. 06].

Extensions of the PHD filter
Example: 4-D linear jump Markov target dynamics with 3 models; 4 targets; birth rate = 3x0.05; death probability = 0.01; clutter rate = 40.

What is a Random Finite Set (RFS)?
Examples of point patterns: pine saplings in a Finnish forest [Kelomaki & Penttinen]; childhood leukaemia and lymphoma in Northumberland [Cuzick & Edwards]. In such patterns the number of points is random, and the points themselves are random and have no ordering. Loosely, an RFS is a finite-set-valued random variable, also known as a (simple finite) point process or random point pattern.