
1 Tree Crown Extraction Using Marked Point Processes
Guillaume Perrin, Xavier Descombes, Josiane Zerubia
ARIANA, joint research group CNRS/INRIA/UNSA, INRIA Sophia Antipolis, France
MAS, Applied Mathematics Laboratory, Ecole Centrale Paris, France
EUSIPCO 2004 – 10th September 2004

2 Contents
- Motivations
- Notations and Definitions
- Our Model for Tree Crown Extraction
- Results
- Conclusion

3 Motivations
Remote sensing in forestry management
- Near-infrared images (example bands shown at 900 nm and 520 nm)
- Could avoid human field investigations: economic considerations
- More control over the evolution of forest stands
Forestry statistics to estimate
- Stem number
- Diameter distribution
- Forest cover area
Automatic extraction
- [Gougeon 95]: valley following
- [Larsen 97]: template-based model
French Forest Inventory (IFN) data
- Aerial images
- 50 cm/pixel

4 Contents
- Motivations
- Notations and Definitions
- Our Model for Tree Crown Extraction
- Results
- Conclusion

5 Notations and Definitions
Object space U, objects (position / marks), configurations of objects.
A marked point process X with
- Probability distribution P_X
- Unnormalized density h(.)
- Reference Poisson measure (uniform point process)
Example: Strauss process (figure: two configurations of disks with interaction radius r, one with s = 5 close pairs and parameter t_1, one with s = 1 and parameter t_2 < t_1)
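For reference (not on the slide, but the standard form), the Strauss process has unnormalized density, with respect to the Poisson reference measure,

\[
h(x) \propto \beta^{\,n(x)} \, \gamma^{\,s_r(x)}, \qquad \beta > 0,\; 0 \le \gamma \le 1,
\]

where n(x) is the number of objects in the configuration x and s_r(x) is the number of pairs of objects closer than r; γ < 1 gives repulsion and γ = 1 reduces to the reference Poisson process. The slide's parameters t_1 > t_2 presumably correspond to this interaction parameter.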

6 Contents
- Motivations
- Notations and Definitions
- Our Model for Tree Crown Extraction
- Results
- Conclusion

7 Proposed Model for Tree Crown Extraction
Objects of the process
- Disk process: position of the center and radius
Density of the marked point process (1)
- Prior density (knowledge)
  1. Penalizes intersections of disks
  2. Favours alignments
  3. Hard core (for stability reasons)
  (in the interaction terms, a parameter > 0 is repulsive, < 0 attractive)
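As a rough sketch only (my own notation; the exact functional forms are given in the paper, not on the slide), such a prior can be written in Gibbs form as

\[
h_p(x) \propto \mathbf{1}_{\mathrm{HC}}(x)\,
\exp\!\Big( -\gamma_{\cap}\, A_{\cap}(x) \;+\; \gamma_{\mathrm{al}}\, N_{\mathrm{al}}(x) \Big),
\]

where A_∩(x) measures the overlap between disks (penalized, γ_∩ > 0), N_al(x) counts aligned groups of disks (favoured), and 1_HC(x) is zero whenever two centers are closer than the hard-core distance.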

8 Proposed Model for Tree Crown Extraction
Density of the marked point process (2)
- Likelihood = Gaussian mixture
- Each pixel belongs to one of two classes:
  - Tree class, with a normal distribution
  - Background class, with a normal distribution
- Stability condition of the density
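Under that assumption (with hypothetical class parameters μ_t, σ_t and μ_b, σ_b, which the slide does not give), the image likelihood factorizes over pixels as

\[
L(y \mid x) \;=\; \prod_{p \in \mathcal{S}(x)} \mathcal{N}\!\big(y_p ;\, \mu_t, \sigma_t^2\big)
\prod_{p \notin \mathcal{S}(x)} \mathcal{N}\!\big(y_p ;\, \mu_b, \sigma_b^2\big),
\]

where y_p is the intensity of pixel p and S(x) denotes the pixels covered by the disks of the configuration x.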

9 Proposed Model for Tree Crown Extraction
MCMC simulation of point processes [Geyer 98]
- Markov chain (X_i) with equilibrium distribution P_X (ergodic convergence)
- Algorithm: Metropolis-Hastings with reversible jumps [Green 95]
- Application to feature extraction: Maximum A Posteriori (MAP) estimator
  1. Simulate a point process defined by a density h(.)
  2. Explore the whole state space
  3. Find one of the global maxima of h(.): simulated annealing
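The standard way simulated annealing is plugged into this scheme (not spelled out on the slide) is to sample, at iteration i, from the tempered density

\[
h_{T_i}(x) \;\propto\; h(x)^{1/T_i},
\]

with a temperature T_i decreasing towards 0, so that the invariant distribution concentrates on the global maxima of h(.).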

10 Proposed Model for Tree Crown Extraction
Reversible Jump MCMC algorithm: from the current configuration of objects X_i = x,
- Simulate y ~ Q(x, .) (proposal kernel)
- Evaluate the Green ratio R = F[Q(., .), h(.), x, y]
- Accept y with probability min(1, R)
Proposal kernels:
- Birth / Death
- Translation
- Dilation
- Split / Merge
- Birth / Death in alignment, …
Goal: find the MAP as fast as possible and avoid local maxima of h(.)
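As an illustration only (a generic uniform birth/death kernel for a disk process, with made-up names and a placeholder density h; this is not the authors' code and it ignores their other kernels), one such step could look like this in Python:

    import random

    def birth_death_step(x, h, width, height, r_min, r_max, lam, T=1.0):
        # One birth/death Metropolis-Hastings-Green step on a configuration x,
        # where x is a list of disks (cx, cy, r), h(x) is the unnormalized
        # density, lam is the intensity of the reference Poisson process and
        # T is the simulated-annealing temperature (target h(.)**(1/T)).
        nu = width * height * (r_max - r_min)   # measure of the object space U
        if random.random() < 0.5:
            # Birth: draw a new disk uniformly in the object space
            new = (random.uniform(0, width), random.uniform(0, height),
                   random.uniform(r_min, r_max))
            y = x + [new]
            R = (h(y) / h(x)) ** (1.0 / T) * lam * nu / len(y)
        else:
            # Death: remove a uniformly chosen disk (if any)
            if not x:
                return x
            i = random.randrange(len(x))
            y = x[:i] + x[i + 1:]
            R = (h(y) / h(x)) ** (1.0 / T) * len(x) / (lam * nu)
        return y if random.random() < min(1.0, R) else x

The factor lam * nu / n comes from the reference Poisson measure and the uniform choice of the object to add or remove; only the density ratio is tempered, which is why the Poisson measure ratio dominates at high temperature, as slide 14 notes.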

11 Contents
- Motivations
- Notations and Definitions
- Our Model for Tree Crown Extraction
- Results
- Conclusion

12 Results
Results depend on
- The simulated annealing scheme
  - In theory: logarithmic decrease of the temperature to reach the MAP estimator
  - In practice: geometric decrease, the temperature being lowered by a constant factor every N iterations
- The parameters of the density h(.)
  - Which parameters for the prior? Experimental choice / parameter estimation
  - Which parameters for the likelihood? K-means / parameter estimation
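A minimal sketch of such a geometric cooling schedule (T0, alpha and N are placeholders, not the values used in the paper):

    def temperature(i, T0=2.0, alpha=0.99, N=1000):
        # Geometric cooling: the temperature is multiplied by alpha every N iterations.
        return T0 * alpha ** (i // N)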

13 Results
- SLOW schedule: 50M iterations, ~17 minutes, 304 objects, U =
- FAST schedule: 1.5M iterations, ~30 seconds, 299 objects, U =
(figure: original image alongside the two extraction results)

14 Results
Extraction evolution / Green ratio
- High temperature: Green ratio dominated by the Poisson measure ratio
- Low temperature: Green ratio dominated by the density ratio
(figure: density ratio, Poisson measure ratio and kernel ratio over the course of the annealing, with the critical temperature marked; link to a slideshow of the extraction)
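Schematically (a heuristic way to read the slide, writing the ratios loosely rather than as Radon-Nikodym derivatives), the Green ratio splits into the three terms named in the figure:

\[
R \;=\;
\underbrace{\left(\frac{h(y)}{h(x)}\right)^{1/T}}_{\text{density ratio}}
\times
\underbrace{\frac{\pi(\mathrm{d}y)}{\pi(\mathrm{d}x)}}_{\text{Poisson measure ratio}}
\times
\underbrace{\frac{Q(y,\mathrm{d}x)}{Q(x,\mathrm{d}y)}}_{\text{kernel ratio}} .
\]

At high temperature the exponent 1/T flattens the density ratio towards 1, so the untempered Poisson measure ratio dominates; as T tends to 0 the density ratio takes over.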

15 Contents
- Motivations
- Notations and Definitions
- Our Model for Tree Crown Extraction
- Results
- Conclusion

16 Conclusion
Advantages of the modelling
- Geometrical information about the stands is taken into account
- Can be adapted to multi-species extraction
Drawbacks
- Computational time
- Trees have to be separable (problems in overly dense areas)
Future work
- Parameter estimation on the global model (in progress)
- Texture information in the density (to distinguish between different species)

17 References
[Gougeon 95] A crown-following approach to the automatic delineation of individual tree crowns in high spatial resolution aerial images. Canadian Journal of Remote Sensing, 21(3), 1995.
[Larsen 97] Using ray-traced templates to find individual trees in aerial photographs. Proc. 10th Scandinavian Conference on Image Analysis, vol. 2, 1997.
[Green 95] Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82, 1995.
[Geyer 98] Likelihood inference for spatial point processes. In: Stochastic Geometry: Likelihood and Computation, Chapman & Hall, London.

18 Results (backup slide)
Extraction evolution / Green ratio
- High temperature: Green ratio dominated by the Poisson measure ratio
- Low temperature: Green ratio dominated by the density ratio
(figure: density ratio, Poisson measure ratio and kernel ratio over the course of the annealing, with the critical temperature marked; "Back to presentation" link)