Target Recognition in Cluttered Infrared Scenes via Pattern Theoretic Representations and Jump-Diffusion Processes

Prof. Aaron D. Lanterman
School of Electrical & Computer Engineering
Georgia Institute of Technology

Variability in Complex Scenes
– Geometric variability: position, orientation, articulation, fingerprint (in the Bob Hummel sense)
– Environmental variability: thermodynamic variability in infrared, illumination variability in visual
– Complexity variability: the number of objects is not known

Pattern Theory: The Grenander Program
Representation:
– Incorporate variability in the parameter space (possibly many nuisance parameters)
– Model the mapping from parameters to data
Inference:
– Build inference algorithms using the representation
– Ex.: Markov chain Monte Carlo (jump-diffusion)
General notion: avoid "preprocessing" and "feature extraction" as much as possible, to avoid loss of information
– Recall Biao Chen's mention of the data processing inequality
Apply the tools of Bayesian inference to weird things

Legacy Work
Sponsored by:
– U.S. Army Center for Imaging Science (ARO: David Skatrud/Bill Sander)
– ONR (Bill Miceli)
Collaborators:
– Michael Miller (now at Johns Hopkins)
– Donald Snyder (Washington Univ.)
– Anuj Srivastava (now with Dept. of Statistics, Florida State): airborne targets via radar
– Matt Cooper (now with Xerox): thermodynamic variability of targets

Parameter Space for ATR
– Parameter space for a single target
– Number of targets not known in advance
– Parameter space for an n-target scene
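
The slide's equations did not survive transcription. A plausible reconstruction, following the usual pattern-theoretic ATR setup of pose parameters on a transformation group plus a discrete class label (the notation here is an assumption, not the talk's):

```latex
% Single target: pose (position and orientation) times a class label
x \in \mathcal{X} = \mathbb{R}^{2} \times SO(2) \times \mathcal{A},
\qquad \mathcal{A} = \{\text{target types}\}

% An n-target scene
x^{(n)} \in \mathcal{X}^{n}

% Number of targets unknown in advance:
% the full parameter space is a union of subspaces of differing dimension
x \in \bigcup_{n=0}^{\infty} \mathcal{X}^{n}
```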

Ingrid's Third Approach
Data, parameters
Likelihood:
– Render the infrared scene onto the detector plane
– Model sensor noise effects
Prior
Bayesian posterior:
– Analytically forbidding!
Sample via jump-diffusion processes:
– Jump from subspace to subspace
– Diffuse to refine estimates within a subspace
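
The symbols on this slide are also missing from the transcript; in generic notation (assumed here), with data y and scene parameters x, the posterior being sampled is:

```latex
p(x \mid y) \;\propto\; p(y \mid x)\, \pi(x)
```

The jump moves carry the sampler between subspaces of different dimension (different target counts or types), while the diffusion refines the continuous pose parameters within a subspace.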

Take Home Message
Go read Ulf Grenander's papers and books. They are very cool.

Perspective Projection
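
This slide is a figure in the original; only its title survives in the transcript. The standard pinhole model it presumably illustrates projects a camera-frame point (x, y, z) onto the image plane at focal length f:

```latex
u \;=\; f\,\frac{x}{z}, \qquad v \;=\; f\,\frac{y}{z}
```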

Sensor Effects
– Optical PSF
– Poisson photocounting noise
– Dead and saturated pixels
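
As a concrete illustration, here is a minimal sketch of chaining these three effects in a forward simulation. The Gaussian stand-in for the PSF, the full-well value, and the dead-pixel rate are illustrative assumptions, not values from the talk:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_sensor(scene, psf_sigma=1.5, full_well=255, dead_frac=0.001, rng=None):
    """Apply the slide's three effects to an ideal radiance image `scene`
    (a nonnegative array of expected photocounts). A sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    blurred = gaussian_filter(scene, psf_sigma)   # optical PSF (Gaussian stand-in)
    counts = rng.poisson(blurred).astype(float)   # Poisson photocounting noise
    counts = np.minimum(counts, full_well)        # saturated pixels clip at full well
    dead = rng.random(counts.shape) < dead_frac   # a few dead pixels read zero
    counts[dead] = 0.0
    return counts
```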

FLIR Loglikelihood
– CCD loglikelihood of Snyder, Hammoud, and White
– Cascade with the optical point spread function
– Sensor fusion is natural: just add the loglikelihoods
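
The formulas on this slide are lost in transcription. Assuming the Poisson photocounting model of the previous slide (notation mine, not the slide's), the loglikelihood presumably takes the standard form, with the rendered scene intensity cascaded through the PSF:

```latex
\ell(x) \;=\; \sum_{d} \Big[\, y_d \,\log \mu_d(x) \;-\; \mu_d(x) \,\Big],
\qquad
\mu_d(x) \;=\; \sum_{s} p(d \mid s)\, \lambda_s(x)
```

Here \(\lambda_s(x)\) is the rendered scene radiance and \(p(d \mid s)\) the optical PSF; for conditionally independent sensors, the fused loglikelihood is simply the sum of the individual ones.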

Langevin Processes
– Consider a fixed number N of targets with fixed target classes
– Write the posterior in Gibbs form
– Simulate the Langevin diffusion; the distribution of the samples converges to the posterior
– Compute desired statistics from the samples
– Generalizes to non-Euclidean groups like rotations
– Gradient computation: numeric approximations; easy and fast on modern 3-D graphics hardware
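
A minimal sketch of the within-subspace diffusion for a Euclidean parameter vector, discretizing dX = -(1/2)∇H(X) dt + dW for a posterior in Gibbs form π(x) ∝ exp(-H(x)). The step size and interface are assumptions; the talk's version also diffuses on rotation groups:

```python
import numpy as np

def langevin_chain(grad_H, x0, n_steps=10_000, eps=1e-3, rng=None):
    """Unadjusted Langevin sampler for pi(x) ~ exp(-H(x)).
    grad_H: callable returning the gradient of the energy H at x."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for t in range(n_steps):
        noise = rng.standard_normal(x.size)
        # Euler-Maruyama step of the Langevin SDE
        x = x - 0.5 * eps * grad_H(x) + np.sqrt(eps) * noise
        samples[t] = x
    return samples
```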

Diffusion Example on AMCOM Data

Jump Moves
– Birth
– Death
– Type-change

Helpful Properties of Jump Processes
– Connectedness: can go from any point to any other in a finite number of jumps (the continuous form is slightly more complicated)
– Jumps occur at exponentially distributed times
– Detailed balance (in discrete form)
– Move reversibility: the set of states reachable from a given state matches the set of states from which it can be reached
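
The slide's notation is lost; in generic notation (assumed here), detailed balance for jump intensities q and target posterior π reads:

```latex
\pi(x)\, q(x \to y) \;=\; \pi(y)\, q(y \to x)
```

Between jumps the process holds in its current state for an exponentially distributed time, so the jump chain is a continuous-time Markov process.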

Jumping Strategies
Gibbs:
– Sample from a restricted part of the posterior
Metropolis-Hastings style:
– Draw a "proposal" from a "proposal density"
– Accept (or reject) the proposal with a certain probability
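
The acceptance probability is not printed in the transcript; the standard Metropolis-Hastings form (notation assumed), for a proposal y drawn from proposal density g(· | x) with posterior π, is:

```latex
\alpha(x \to y) \;=\; \min\!\left(1,\;
\frac{\pi(y)\, g(x \mid y)}{\pi(x)\, g(y \mid x)}\right)
```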

Example Jump-Diffusion Process

How to Model Clutter?
Problem: the algorithm only knows about tanks (which Bob doesn't like anyway), and will try to explain everything with tanks!
– Cows, swimming pools, school buses
Solution (?): let the algorithm use a flexible representation in addition to rigid objects
– Blobs: simple connected shapes on the lattice to represent structured clutter
– Could use active curves, level-set methods (Clem Karl), or active polygons (Hamid Krim)
– Clutter might be interesting in its own right!

Random Sampling for Blobs
Set of jump moves:
– Add a pixel along the boundary
– Remove a pixel along the boundary
– Keep the blob a blob
Pick the move based on posterior probability
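
A sketch of one such move on a boolean pixel mask, using a Metropolis-style acceptance; the helper names, the connectivity test, and the symmetric-proposal simplification are illustrative, not the talk's actual implementation:

```python
import numpy as np
from scipy.ndimage import binary_dilation, label

def propose_blob_move(mask, log_post, rng=None):
    """One move on a binary blob `mask`: grow or shrink by a single
    boundary pixel, rejecting any change that breaks connectivity.
    `log_post(mask)` is the scene log posterior (assumed callable)."""
    rng = np.random.default_rng() if rng is None else rng
    proposal = mask.copy()
    if rng.random() < 0.5:
        # Grow: candidate pixels are background pixels touching the blob
        rim = binary_dilation(mask) & ~mask
        candidates = np.argwhere(rim)
    else:
        # Shrink: candidate pixels are blob pixels on the boundary
        interior = ~binary_dilation(~mask)
        candidates = np.argwhere(mask & ~interior)
    if len(candidates) == 0:
        return mask
    i, j = candidates[rng.integers(len(candidates))]
    proposal[i, j] = not proposal[i, j]
    # "Keep the blob a blob": reject moves that disconnect or empty it
    if proposal.sum() == 0 or label(proposal)[1] != 1:
        return mask
    # Metropolis accept/reject (treating the proposal as symmetric)
    if np.log(rng.random()) < log_post(proposal) - log_post(mask):
        return proposal
    return mask
```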

Blob Estimation Examples
– M60 spreading
– M60 decaying
– Ship decaying

NVESD M60 Example

Saccadic Detection
– The current implementation "births" specific target types
– May be better to birth simple shapes, and later change them to more specific target types (clutter or target)
– Example: birth squares, deform them into rectangles, then jump to more detailed targets

AMCOM Data Ex.: Finding Tank 1
– Initial detection
– Low-dimensional refinement
– High-dimensional refinement
– Equilibrium

AMCOM Data Ex.: Finding Tank 2
– Initial detection
– Low-dimensional refinement
– Equilibrium

AMCOM Data Ex.: Finding ?????
– Initial detection
– Low-dimensional refinement

Factory Example

Unified Algorithm
Extended jump moves:
– Saccadic → blob
– Saccadic → rigid
– Blob → rigid
– Break/combine blobs
– Change rigid target types
Difficulty: making the parameters make sense across representation types