MIT AI Knowledge Based 3D Medical Image Segmentation Tina Kapur MIT Artificial Intelligence Laboratory

MIT AI Outline Goal of Segmentation Applications Why is segmentation difficult? My method for segmentation of MRI Future Work

MIT AI The Goal of Segmentation

MIT AI Applications of Segmentation Image Guided Surgery

MIT AI Applications of Segmentation Image Guided Surgery Surgical Simulation

MIT AI Applications of Segmentation Image Guided Surgery Surgical Simulation Neuroscience Studies Therapy Evaluation

MIT AI Limitations of Manual Segmentation slow (up to 60 hours per scan); variable (up to 15% between experts) [Warfield 95, Kaus 98]

MIT AI The Automatic Segmentation Challenge An automated segmentation method needs to reconcile –Gray-level appearance of tissue –Characteristics of imaging modality –Geometry of anatomy

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy Tissue Intensity Models

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy Tissue Intensity Models –Parametric [Vannier] –Non-Parametric [Gerig] –Point distribution Models [Cootes] –Texture [Mumford]

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy Tissue Intensity Models Imaging Modality Models

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy Tissue Intensity Models Imaging Modality Models –MRI inhomogeneity [Wells]

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy Tissue Intensity Models Imaging Modality Models Anatomy Models: Shape, Geometric/Spatial

MIT AI How to Segment? i.e. Issues in Segmentation of Anatomy Tissue Intensity Models Imaging Modality Models Anatomy Models: Shape, Geometric/Spatial –PCA [Cootes and Taylor, Gerig, Duncan, Martin] –Landmark Based [Evans] –Atlas [Warfield]

MIT AI Typical Pipeline for Segmentation of Brain MRI Pre-processing for noise reduction EM Segmentation Morphological or other post-processing pre-processing (noise removal)

MIT AI Typical Pipeline for Segmentation of Brain MRI Pre-processing for noise reduction EM Segmentation Morphological or other post-processing pre-processing (noise removal) intensity-based classification

MIT AI Typical Pipeline for Segmentation of Brain MRI –Pre-processing for noise reduction –EM Segmentation –Morphological or other post-processing (diagram: pre-processing (noise removal) → intensity-based classification → post-processing (morphology/other))
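A minimal sketch of such a three-stage pipeline, not the thesis method: it assumes a 3D MRI volume as a NumPy array, caller-supplied per-class Gaussian intensity models with illustrative means/variances, Gaussian smoothing as the noise-removal stage, and morphological opening as the post-processing stage.

```python
# Illustrative three-stage pipeline (assumed helpers and parameters, not the thesis method).
import numpy as np
from scipy import ndimage

def pipeline_segment(volume, class_means, class_vars):
    # 1. Pre-processing: Gaussian smoothing as a stand-in for noise removal.
    smoothed = ndimage.gaussian_filter(volume, sigma=1.0)

    # 2. Intensity-based classification: assign each voxel to the Gaussian
    #    class model with the highest likelihood.
    means = np.asarray(class_means, dtype=float)
    variances = np.asarray(class_vars, dtype=float)
    diffs = smoothed[..., None] - means                         # shape (X, Y, Z, K)
    log_lik = -0.5 * diffs**2 / variances - 0.5 * np.log(2 * np.pi * variances)
    labels = np.argmax(log_lik, axis=-1)

    # 3. Post-processing: morphological opening of each label to remove
    #    small isolated misclassifications.
    cleaned = np.zeros_like(labels)
    for k in range(len(means)):
        mask = ndimage.binary_opening(labels == k, iterations=1)
        cleaned[mask] = k
    return cleaned
```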

MIT AI Contributions of Thesis Developed an integrated Bayesian Segmentation Method for MRI that incorporates de-noising and global geometric knowledge using priors in EM-Segmentation. Applied the integrated Bayesian method to segmentation of Brain and Knee MRI.

MIT AI Contributions of Thesis The Priors –de-noising: novel use of a Mean-Field Approximation to a Gibbs random field in conjunction with EM-Segmentation (EM-MF) –geometric: novel statistical description of global spatial relationships between structures, used as a spatially varying prior in EM-Segmentation

MIT AI Background to My Work Expectation-Maximization Algorithm EM-Segmentation

MIT AI Expectation-Maximization Relevant Literature: –[Dempster, Laird, Rubin 1977] –[Neal 1998]

MIT AI Expectation-Maximization (what?) Search Algorithm for Parameters of a Model to Maximize Likelihood of Data. Data: some observed, some unobserved

MIT AI Expectation-Maximization (how?) Initial Guess of Model Parameters Re-estimate Model Parameters: –E Step: compute PDF for hidden variables, given observations and current model parameters –M Step: compute ML model parameters assuming pdf for hidden variables is correct

MIT AI Expectation-Maximization (how exactly?) Notation: –Observed Variables –Hidden Variables –Model Parameters

MIT AI Expectation-Maximization (how exactly?) Initial Guess, then Successive Estimation: –E Step –M Step
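For reference, the standard EM updates written in assumed notation (observed data Y, hidden variables W, model parameters θ); the slide's own symbols did not survive the transcript.

```latex
\text{E-step:}\quad
Q\!\left(\theta \mid \theta^{(t)}\right)
  \;=\; \mathbb{E}_{W \mid Y,\,\theta^{(t)}}\!\left[\log p\!\left(Y, W \mid \theta\right)\right]
\qquad\qquad
\text{M-step:}\quad
\theta^{(t+1)} \;=\; \arg\max_{\theta}\; Q\!\left(\theta \mid \theta^{(t)}\right)
```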

MIT AI Expectation-Maximization Summary/Intuition: –If we had complete data, maximize likelihood –Since some data is missing, approximate likelihood with its expectation –Converges to local maximum of likelihood
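As a concrete illustration of this alternation (a generic 1-D Gaussian-mixture example, not the thesis method), a minimal EM loop; `intensities` is an assumed flat array of voxel values.

```python
import numpy as np

def em_gmm(intensities, n_classes, n_iters=20):
    """Minimal EM for a 1-D Gaussian mixture (illustrative sketch only)."""
    x = np.asarray(intensities, dtype=float)
    # Initial guess of model parameters.
    mu = np.linspace(x.min(), x.max(), n_classes)
    var = np.full(n_classes, x.var())
    pi = np.full(n_classes, 1.0 / n_classes)
    for _ in range(n_iters):
        # E-step: posterior probability of each class for every sample,
        # given the current model parameters.
        lik = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = lik / lik.sum(axis=1, keepdims=True)
        # M-step: maximum-likelihood parameters assuming the posteriors are correct.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, var, pi, resp
```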

MIT AI EM-Segmentation [Wells 1994] Observed Signal is modeled as a product of the true signal and a corrupting gain field due to the imaging equipment. Expectation-Maximization is used on log-transformed observations for iterative estimation of –tissue classification –corrupting bias field (inhomogeneity correction)
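In symbols (assumed notation, since the slide's equations are not in the transcript): with observed intensity I_s, true tissue intensity I*_s, and gain G_s at voxel s, the product model becomes additive after the log transform.

```latex
I_s \;=\; I^{*}_s\, G_s
\qquad\Longrightarrow\qquad
Y_s \;\triangleq\; \log I_s \;=\; \log I^{*}_s \;+\; \beta_s,
\qquad \beta_s \;=\; \log G_s \ \text{(slowly varying bias field)}
```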

MIT AI EM-Segmentation [Wells 1994] (diagram of the E-Step / M-Step iteration)

MIT AI EM-Segmentation [Wells 1994] E-Step: compute tissue posteriors using the current intensity correction. M-Step: estimate the intensity correction using residuals based on the current posteriors.

MIT AI EM-Segmentation [Wells 1994] Observed Variables –log-transformed intensities in the image Hidden Variables –indicator variables for classification Model Parameters –the slowly varying corrupting bias field (the per-voxel quantities refer to variables at voxel s in the image)

MIT AI EM-Segmentation [Wells 1994] Initial Guess, then Successive Estimation: –E Step –M Step
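A minimal sketch of this alternation under stated assumptions: Gaussian class models with fixed means and variances on the log intensities, a stationary class prior, and a Gaussian low-pass filter standing in for the bias-field smoothing used in [Wells 1994]. It is an illustration of the idea, not the exact published algorithm.

```python
import numpy as np
from scipy import ndimage

def em_bias_segmentation(log_intensity, means, variances, priors,
                         n_iters=10, smooth_sigma=20.0):
    """Sketch of a Wells-style EM loop: alternate tissue posteriors and bias estimate."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    priors = np.asarray(priors, dtype=float)
    bias = np.zeros_like(log_intensity)
    for _ in range(n_iters):
        # E-step: tissue posteriors given the current bias-field estimate.
        corrected = log_intensity - bias
        lik = priors * np.exp(-0.5 * (corrected[..., None] - means) ** 2 / variances) \
              / np.sqrt(2 * np.pi * variances)
        post = lik / lik.sum(axis=-1, keepdims=True)
        # M-step: bias field from posterior-weighted residuals, low-pass
        # filtered to enforce the slowly-varying assumption.
        residual = (post * (log_intensity[..., None] - means) / variances).sum(axis=-1)
        weight = (post / variances).sum(axis=-1)
        bias = ndimage.gaussian_filter(residual, smooth_sigma) / \
               np.maximum(ndimage.gaussian_filter(weight, smooth_sigma), 1e-8)
    return post, bias
```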

MIT AI Situating My Work Prior in EM-Segmentation: –Independent and Spatially Stationary My contribution is addition of two priors: –a spatially stationary Gibbs prior to model local interactions between neighbors (thermal noise) –spatially varying prior to model global relationships between geometry of structures

MIT AI The Gibbs Prior Gibbs Random Field (GRF) –natural way to model piecewise homogeneous phenomena –used in image restoration [Geman & Geman 84] –probability model on a lattice –partially relaxes the independence assumption to allow interactions between neighbors

MIT AI EM-MF Segmentation: EM + Gibbs Prior We model tissue classification W as a Gibbs random field:

MIT AI We model tissue classification W as a Gibbs random field: EM-MF Segmentation: Gibbs Prior on Classification

MIT AI EM-MF Segmentation: Gibbs Prior on Classification To fully specify the Gibbs model: –define the neighborhood system as a first-order neighborhood system, i.e. the 6 closest voxels –use a tissue-class interaction matrix to define the clique potentials

MIT AI EM-MF Segmentation: Gibbs form of Posterior Gibbs prior and Gaussian Measurement Models lead to Gibbs form for Posterior:

MIT AI EM-MF Segmentation: Gibbs form of Posterior Gibbs prior and Gaussian Measurement Models lead to a Gibbs form for the Posterior:
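A sketch of that statement in assumed notation: multiplying the Gaussian likelihoods of the bias-corrected log intensities by the Gibbs prior yields a posterior that is again a Gibbs distribution, with the data terms acting as a per-voxel external field.

```latex
P(W \mid Y, \beta)
\;\propto\;
\Big[\prod_{s} p\!\left(y_s - \beta_s \mid w_s\right)\Big] P(W)
\;=\;
\frac{1}{Z'}\exp\!\Big(\sum_{s}\log p\!\left(y_s-\beta_s \mid w_s\right)
\;-\;\sum_{s}\sum_{t\in\mathcal{N}(s)} V\!\left(w_s,w_t\right)\Big)
```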

MIT AI EM-MF Segmentation For E-Step: Need values for

MIT AI EM-MF Segmentation For E-Step: Need values for Cannot compute directly from Gibbs form

MIT AI EM-MF Segmentation For E-Step: Need values for Cannot compute directly from Gibbs form Note

MIT AI EM-MF Segmentation For the E-Step: need values for the tissue posteriors. Cannot compute them directly from the Gibbs form. Note: can approximate using the Mean-Field Approximation to the GRF.

MIT AI Mean-Field Approximation Deterministic Approximation to GRF [Parisi 84] –the mean/expected value of a GRF is obtained as a solution to a set of consistency equations Update equation is obtained using the derivative of the partition function with respect to the external field g. [Elfadel 93] Used in image reconstruction [Geiger, Yuille, Girosi 91]

MIT AI Mean-Field Approximation to Posterior GRF Intuition: the denominator is a normalizer; the numerator captures the effect of labels at neighbors and the measurement at the voxel itself.
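The consistency equation implied by that intuition, sketched in assumed notation with m_s(k) denoting the mean-field estimate of P(w_s = k | Y, β):

```latex
m_s(k) \;=\;
\frac{\,p\!\left(y_s-\beta_s \mid k\right)\,
      \exp\!\Big(-\sum_{t\in\mathcal{N}(s)}\sum_{j} V(k,j)\,m_t(j)\Big)}
     {\sum_{k'} p\!\left(y_s-\beta_s \mid k'\right)\,
      \exp\!\Big(-\sum_{t\in\mathcal{N}(s)}\sum_{j} V(k',j)\,m_t(j)\Big)}
```

The equations are solved by iterating this update until the m_s values stop changing.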

MIT AI Summary of EM-MF Segmentation Modeled piecewise homogeneity of tissue using a Gibbs prior on classification. This leads to a Gibbs form for the Posteriors. Posterior Probabilities in the E-Step are approximated as a Mean-Field solution.

MIT AI EM-MF Results Application: Brain MRI –white matter, gray matter, fluid/air, skin/scalp Results Comparison with Manual Segmentation

MIT AI Some Results (figures): EM vs. EM-MF

MIT AI More Results (figures): Noisy MRI, EM Segmentation, EM-MF Segmentation

MIT AI Posterior Probabilities (EM) (figures): White matter, Gray matter

MIT AI Posterior Probabilities (EM-MF) (figures): White matter, Gray matter

MIT AI Results

MIT AI Modeling Global Geometric Relationships between Structures

MIT AI Modeling Global Geometric Relationships between Structures Relative Geometry Models –motivate using Knee MRI –Brain MRI Example

MIT AI Segmented Knee MRI (figure): Femur, Tibia, Femoral Cartilage, Tibial Cartilage. MERL, SPL, MIT, CMU Surgical Simulation (Sarah Gibson, PI)

MIT AI Motivation Primary Structures –image well –easy to segment Secondary Structures –image poorly –relative to primary (figure: Femur, Tibia, Femoral Cartilage, Tibial Cartilage)

MIT AI Relative Geometric Prior Approach Select primary/secondary structures Measure geometric relation between primary and secondary structures from training data Given novel image –segment primary structures –use geometric relation as prior on secondary structure in EM-MF Segmentation

MIT AI Segment Primary Structures: Femur, Tibia (figure): Seed → Region Growing → Boundary Localization

MIT AI Status Have Bone Want Cartilage

MIT AI Measure Geometric Relationship between Primary and Secondary Structures (figure: Femur, Tibia, Femoral Cartilage, Tibial Cartilage) Using primitives such as –distances between surfaces –local normals of primary structures –local curvature of primary structures –etc.
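A minimal sketch of turning one such primitive, the distance to a segmented primary structure, into a spatially varying prior. The histogram-based conditional estimate and the names (`primary_mask`, `secondary_mask`, boolean NumPy volumes) are illustrative assumptions, not the exact construction used in the thesis.

```python
import numpy as np
from scipy import ndimage

def train_distance_prior(primary_mask, secondary_mask, n_bins=50, d_max=30.0):
    """Estimate P(secondary | distance to primary) from a training segmentation."""
    # Unsigned distance to the primary structure (0 inside the primary).
    dist = ndimage.distance_transform_edt(~primary_mask)
    bins = np.linspace(0.0, d_max, n_bins + 1)
    idx = np.clip(np.digitize(dist, bins) - 1, 0, n_bins - 1)
    prob = np.zeros(n_bins)
    for b in range(n_bins):
        in_bin = idx == b
        if in_bin.any():
            prob[b] = secondary_mask[in_bin].mean()   # fraction of secondary voxels at this distance
    return bins, prob

def apply_distance_prior(primary_mask, bins, prob):
    """Map the learned distance profile onto a novel image's voxel lattice."""
    dist = ndimage.distance_transform_edt(~primary_mask)
    idx = np.clip(np.digitize(dist, bins) - 1, 0, len(prob) - 1)
    return prob[idx]          # spatially varying prior for the secondary class
```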

MIT AI Measure Geometric Relationship between Primary and Secondary Structures (figure: Femur, Tibia, Femoral Cartilage, Tibial Cartilage)

MIT AI Estimate of the geometric relationship (equation/figure)

MIT AI Status Have Bone Have spatial relation between Bone and Cartilage Need Cartilage

MIT AI Use Relative Geometric Prior in EM Segmentation Replace stationary prior with relative geometric prior:
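In the E-step this amounts to swapping the stationary class prior for a voxel-dependent one (a sketch in assumed notation):

```latex
P(w_s = k) \;=\; \pi_k
\quad\longrightarrow\quad
P(w_s = k) \;=\; \pi_k(s),
\qquad
P\!\left(w_s = k \mid y_s, \beta_s\right) \;\propto\; \pi_k(s)\,
p\!\left(y_s - \beta_s \mid k\right)
```

Here π_k(s) is the relative geometric prior evaluated at voxel s, for example as a function of the distance to the segmented primary structures.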

MIT AI Results: Segmentation of Femoral & Tibial Cartilage (figures): MRI Image, Model-Based Segmentation, Manual Segmentation

MIT AI Relative Geometric Priors for Brain Tissue Prior Estimation –Select primary structures (boundary of skin, ventricles) –Estimate the geometric relation Using the Prior in Segmentation –Segment primary structures: skin, ventricles –Use the estimated relation as a geometric prior

MIT AI Estimate (figures): White Matter, Gray Matter; axes: distance to Ventricles, distance to Skin

MIT AI Resultant Segmentation (figures): MRI, EM Segmentation, EM-MF with Geometric Prior

MIT AI Posterior Probabilities (figures): Gray Matter, White Matter; EM-MF + Geometric Prior vs. EM

MIT AI In Summary Incorporated robustness to thermal noise by using Mean-Field Approximation to Gibbs model in conjunction with EM Segmentation. Applied to Brain MRI. Introduced Relative-Geometry Models and applied to Brain and Knee MRI.

MIT AI Future Work Further development of Relative-Geometry Models: –Automatic selection of primary/secondary structures –Additional primitives for Spatial Relationships

MIT AI STOP

MIT AI Class Conditional Density (figure): axes d1 (distance to Ventricles) and d2 (distance to Skin); classes: White Matter, Grey Matter, Fluid, Air, Left Caudate, Right Caudate

MIT AI Unified Bayesian Segmentation Method Simultaneous noise reduction, intensity-based classification, and use of geometric information for segmentation.

MIT AI Rest of Talk: 1. The Unified Segmentation Method 2. Two Priors 3. Results on Brain, Knee Segmentation 4. Conclusions

MIT AI The Gibbs Prior To fully specify the Gibbs model: –define the neighborhood system as a first-order neighborhood system, i.e. the 6 closest voxels –use a tissue-class interaction matrix to define the clique potentials

MIT AI Components of EM Framework Measurement models for tissue Prior models for tissue Model for bias field –piecewise smooth

MIT AI Addition of Two Priors Gibbs prior on tissue appearance –models tissue as piecewise constant Geometric Prior to encode spatial relations –gray matter is outside ventricles and inside skull

MIT AI Gibbs Model

MIT AI Gibbs Model probability model on a lattice independence assumption is partially relaxed spatial range of interaction is local neighborhood

MIT AI Gibbs Model –probability model on a lattice –independence assumption is partially relaxed –spatial range of interaction is the local neighborhood Mean-Field Approximation –approximates neighboring random variables with their mean values –algebraic and computational simplicity

MIT AI Contributions of Thesis (figures): MRI + Noise, EM Segmentation, EM-MF with Geometric Prior

MIT AI Proposed MRI Segmentation Method Bayesian Statistical Classification Scheme that uses Expectation-Maximization. Replaces the pipeline with Priors on intensity and geometry.

MIT AI Proposed MRI Segmentation Method Previous Work [Wells 1994, 1996] –derived as a special case –spatially stationary, independent priors –piecewise smooth inhomogeneity model This Work: –locally interacting prior for intensity –spatially varying prior for geometry

MIT AI Next Background on Expectation-Maximization

MIT AI Computation of the Posterior 1. Use Bayes' rule, and independence between the classification and the bias field, to write: 2. Specify Measurement Models as Gaussian

MIT AI Computation of the Posterior 3. Specify Prior on Classification as a Gibbs Random Field. 4. Gibbs Prior + Gaussian Measurement Model imply the posterior is also a GRF. 5. Approximate the posterior using the Mean-Field Solution for the GRF.

MIT AI Computation of the Posterior Recap: 1. Bayes' Rule to rewrite … 2. Gaussian Measurement Models → 3. Gibbs form of Prior → 4. Gibbs form of Posterior → 5. Mean-Field Approximation to Gibbs form

MIT AI The Gibbs Prior (annotated equation; labels: Tissue-class interaction Matrix, Stationary Prior)

MIT AI The Mean-Field Solution Gibbs models can be solved using the methods of [Metropolis 1953], [Geman and Geman 1984], [Besag 1986], etc. We use the Mean-Field Approximation to estimate the expected value of the posterior GRF as a solution to a set of consistency equations.

MIT AI The Mean-Field Approximation Deterministic approximation. The update equation is obtained using the derivative of the partition function with respect to the external field g.

MIT AI The Mean-Field Solution Intuition: the denominator is a normalizer; the numerator captures the effect of labels at neighbors and the measurement at the voxel itself.

MIT AI EM-MF Summary Initial Guess, then Successive Estimation: –E Step: Estimate the posteriors as the Mean-Field Solution to a Gibbs Random Field –M Step: Compute the bias field, same as [Wells 1996]