Shashi Shekhar Weili Wu Sanjay Chawla Ranga Raju Vatsavai


Modeling Spatial Context in Classification/Prediction Using MRF and SAR Techniques. Shashi Shekhar, Weili Wu, Sanjay Chawla, Ranga Raju Vatsavai. Department of Computer Science, Army HPC Research Center, University of Minnesota.

Outline Introduction Problem Definition Supervised Classification Spatial Context Markov Random Fields (MRF) Spatial Autoregression (SAR) Results

Introduction
Spatial Databases: Maps, ground observations, multi-spectral/multi-temporal remote sensing images; e.g. ecology (wetlands), forest inventory, aerial photographs, and satellite remote sensing images.
Objectives: Predict the spatial distribution of marsh-breeding birds; thematic classification, i.e. identification of objects in imagery.
Techniques: Supervised and unsupervised; statistical and neural; knowledge-based.

Example Datasets


Problem Definition
Given:
- A spatial framework S consisting of {s1, …, sn} for an underlying geographic space G
- A collection of explanatory functions fXk : S -> Rk, k = 1, …, K, where Rk is the range
- A dependent class variable fL : S -> L = {l1, …, lM}
- A value for the parameter α, the relative importance of spatial accuracy
Find: a classification model f^L : R1 x … x RK -> L.
Objective: maximize similarity = (1 - α) · classification_acc(f^L, fL) + α · spatial_acc(f^L, fL).
Constraints:
- S is a multi-dimensional Euclidean space
- The explanatory functions and the response function may not be independent, i.e. spatial autocorrelation exists.

Accuracy Measure: ADNP (Average Distance to Nearest Prediction)

Problem Definition
Given: a multi-spectral image X composed of N-dimensional pixels X(i,j) = [bv1, …, bvn]'
Find: the appropriate label for each pixel.

Classical Techniques
Logistic Regression: y = Xβ + ε
Assumptions: errors are independent, identically distributed, zero-mean normal, i.e. εi ~ N(0, σ²).
Bayesian Classification: Pr(li | X) = Pr(X | li) Pr(li) / Pr(X).
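Bayes' rule above can be sketched numerically; this is a toy 1-D illustration in which the Gaussian class-conditional densities, means, and priors are all made-up assumptions, not values from the slides.

```python
import numpy as np

# Toy illustration of Pr(li | X) = Pr(X | li) Pr(li) / Pr(X).
# All parameters are hypothetical.
means = np.array([0.0, 3.0])           # per-class means of Pr(X | li)
stds = np.array([1.0, 1.0])            # per-class standard deviations
priors = np.array([0.6, 0.4])          # Pr(li)

def posterior(x):
    # Gaussian likelihoods Pr(X | li)
    lik = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    joint = lik * priors               # Pr(X | li) Pr(li)
    return joint / joint.sum()         # dividing by Pr(X) normalizes

# The most likely class is the argmax of the posterior.
print(posterior(0.5), posterior(0.5).argmax())
```

Note that the denominator Pr(X) never has to be modeled explicitly; normalizing the joint terms is enough, which is exactly why the decision rule only compares Pr(X | li) Pr(li) across classes.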

Classification Schemes

Pixel Classification: the class to which a pixel at location x belongs is determined strictly by the conditional probability Pr(li | x); the decision rule assigns the class that maximizes it.

Classification

Comparison Criteria

                        Logistic Regression          Bayesian
Input                   fx1, …, fxk, fl              fx1, …, fxk, fl
Intermediate results    β                            Pr(li), Pr(X|li)
Output                  Pr(li|X) based on β          Pr(li|X) based on Pr(li), Pr(X|li)
Decision                Select most likely class     Select most likely class
                        for a given feature value    for a given feature value
Assumptions:
  Pr(X|li)              Exponential family           None
  Class boundary        Linearly separable           -
  Autocorrelation       None                         -

Spatial Context
What is spatial context? Observations at location i depend on observations at locations j ≠ i. Formally, yi = f(yj), j = 1, 2, …, n, j ≠ i.
Why? Natural objects (species) occur together; higher sensor resolution (i.e. the object is bigger than a pixel).

Prior Distribution Model: for a Markov random field, the conditional distribution of a point in the field given all other points depends only on its neighbors. [Diagram: neighborhood systems, with site s surrounded by its neighbors x.]
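In symbols, this Markov property can be written as follows (the notation, with N_s denoting the neighborhood of site s, is assumed here rather than taken from the slide):

```latex
P(x_s \mid x_r,\ r \neq s) \;=\; P(x_s \mid x_r,\ r \in N_s)
```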

MRF Continued: a clique is defined as a subset of points in S such that if s and r are two points contained in clique C, then s and r are neighbors.

Gibbs Distribution For a given neighborhood system, a Gibbs distribution is defined as any distribution expressed as:
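For reference, the standard Gibbs form is as follows, with Z the normalizing constant (partition function), C ranging over the cliques, and V_C the clique potential functions (this notation is assumed, not reproduced from the slide):

```latex
P(x) = \frac{1}{Z}\exp\Big(-\sum_{C} V_C(x)\Big),
\qquad
Z = \sum_{x} \exp\Big(-\sum_{C} V_C(x)\Big)
```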

Hammersley-Clifford Theorem The H-C theorem states that if X is an MRF with a strictly positive distribution and a neighborhood system, then the distribution of X can be written as a Gibbs distribution with cliques induced by the neighborhood system. Therefore, if p(X) is formulated as a Gibbs distribution, X has the properties of an MRF.

Gibbs Distribution

ICM (Iterated Conditional Modes)
1. Compute an initial labeling (e.g. the maximum-likelihood estimate).
2. For all pixels (i,j), update the label to the class that maximizes the conditional posterior given the current labels of the neighboring pixels.
3. Repeat step 2 until the labeling converges.
The class-conditional distribution is assumed multivariate normal.
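The three steps above can be sketched in numpy. This is a minimal toy version, assuming scalar Gaussian data terms and a Potts-style smoothness penalty on the 4-neighborhood; the weight beta and all numbers are hypothetical, not from the slides.

```python
import numpy as np

def icm(img, means, var, beta=3.0, n_iter=5):
    """Iterated Conditional Modes for pixel labeling (toy sketch).
    img: 2-D array of pixel values; means/var: per-class Gaussian params.
    beta is a hypothetical smoothness weight (Potts prior)."""
    # Step 1: initialize with the maximum-likelihood labeling.
    labels = np.argmin((img[..., None] - means) ** 2 / var, axis=-1)
    h, w = img.shape
    for _ in range(n_iter):                      # Step 3: repeat
        for i in range(h):                       # Step 2: update each pixel
            for j in range(w):
                best, best_e = labels[i, j], np.inf
                for k in range(len(means)):
                    # Gaussian data term ...
                    e = (img[i, j] - means[k]) ** 2 / (2 * var[k])
                    # ... plus a Potts penalty for disagreeing neighbors.
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and labels[ni, nj] != k:
                            e += beta
                    if e < best_e:
                        best, best_e = k, e
                labels[i, j] = best
    return labels

# Two flat regions plus one flipped pixel; ICM smooths the outlier away.
noisy = np.where(np.arange(36).reshape(6, 6) < 18, 0.0, 1.0)
noisy[0, 0] = 1.0
print(icm(noisy, means=np.array([0.0, 1.0]), var=np.array([0.1, 0.1])))
```

Because each update can only lower the local energy, ICM converges quickly but only to a local optimum, which is why the conclusion slide points to graph-cut methods as a more efficient alternative.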

SAR: y = ρWy + Xβ + ε, where W is the contiguity matrix and ρ is the strength of the spatial dependencies.
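The SAR equation above can be solved for y in reduced form as y = (I - ρW)^{-1}(Xβ + ε). A toy numpy sketch on four sites in a line, where W, ρ, and β are illustrative values rather than anything from the slides:

```python
import numpy as np

# Toy instance of y = rho*W*y + X*beta + eps on 4 sites in a line.
n = 4
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
W /= W.sum(axis=1, keepdims=True)      # row-normalized contiguity matrix

rho, beta = 0.5, 2.0                   # rho: strength of spatial dependence
X = np.arange(n, dtype=float).reshape(-1, 1)
eps = np.zeros((n, 1))                 # noise-free for illustration

# Reduced form: y = (I - rho*W)^{-1} (X*beta + eps)
y = np.linalg.solve(np.eye(n) - rho * W, X * beta + eps)
print(np.allclose(y, rho * W @ y + X * beta + eps))  # the SAR equation holds
```

Row-normalizing W keeps the spectral radius at most 1, so I - ρW is invertible for |ρ| < 1 and the reduced form is well defined.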

Experimental Results Experimental Setup

Results

Conclusion
- Comparison of the techniques in a common probabilistic framework.
- Modeling spatial dependencies: SAR makes more restrictive assumptions than MRF; MRF allows flexible modeling of spatial context. The relationship between SAR and MRF is analogous to that between logistic regression and Bayesian classifiers.
- Efficient solution procedures: graph-cut methods; future work is to extend graph-cut to SAR.