Mutual Information
Narendhran Vijayakumar
03/14/2008

Definition
Mutual Information – I(A,B)
Similarity measure – amount of information B contains about A
Registration metric – maximum MI corresponds to perfect alignment
– The amount of information B contains about A is maximum when the images are aligned perfectly

Mathematical Definition
I(A,B) = H(A) + H(B) - H(A,B)
– A is the reference image
– B is the floating image
– H(·) is entropy
– I(·,·) is mutual information
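As a minimal sketch of how this formula can be evaluated in practice, the entropies can be estimated from a joint intensity histogram of the two images. This is an illustrative Python/NumPy implementation only; the function name, the 32-bin histogram, and the variable names are assumptions, not details taken from the slides.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Estimate I(A,B) = H(A) + H(B) - H(A,B) from a joint intensity histogram.

    a is the reference image, b is the floating image (equal-shaped 2-D arrays).
    """
    # Joint probability p(a, b) from a 2-D histogram of intensity pairs
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()

    # Marginal probabilities p(a) and p(b)
    p_a = p_ab.sum(axis=1)
    p_b = p_ab.sum(axis=0)

    def entropy(p):
        p = p[p > 0]              # convention: 0 * log 0 = 0
        return -np.sum(p * np.log2(p))

    return entropy(p_a) + entropy(p_b) - entropy(p_ab.ravel())
```

The binning is the main design choice here: fewer bins give a smoother, more robust probability estimate, while more bins resolve finer intensity structure.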

Entropy
H(A) is defined as
– H(A) = -∑ p_i log2 p_i
– p_i is the probability of intensity i in the image
Entropy is a measure of the dispersion of the intensity pdf
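For the marginal entropy on its own, a short sketch assuming an 8-bit grayscale image and NumPy; the name image_entropy and the 256-level assumption are illustrative.

```python
import numpy as np

def image_entropy(img, levels=256):
    """H(A) = -sum_i p_i log2(p_i), where p_i is the probability of intensity i."""
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]                  # skip empty bins: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

# A perfectly uniform 8-bit histogram gives the maximum value log2(256) = 8 bits;
# an image with a single intensity gives 0 bits.
```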

Joint Probability
(figure: Image 1 and Image 2)

Joint Probability – Registered Image

Misaligned Image
(figure: example intensity grids)

Joint Probability – Misregistered Images
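The effect illustrated by the two joint-probability figures can be reproduced numerically: an image paired with itself gives a compact joint histogram and a low joint entropy, while pairing it with a misaligned copy disperses the histogram and raises H(A,B). This is a hedged sketch using a synthetic gradient image, NumPy, and an arbitrary 5-pixel shift; none of these specifics come from the slides.

```python
import numpy as np

def joint_entropy(a, b, bins=32):
    """H(A,B) from a normalized 2-D joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Synthetic test image: horizontal gradient plus noise (illustrative only)
rng = np.random.default_rng(0)
img = np.tile(np.linspace(0, 255, 128), (128, 1)) + 10 * rng.standard_normal((128, 128))

h_aligned = joint_entropy(img, img)                         # registered: mass on the diagonal
h_shifted = joint_entropy(img, np.roll(img, 5, axis=1))     # misaligned: dispersed histogram
print(f"H(A,B) aligned = {h_aligned:.2f} bits, misaligned = {h_shifted:.2f} bits")
```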

Conclusion
– For misaligned images, the dispersion of the joint histogram increases
– The joint entropy H(A,B) therefore increases
– I(A,B) decreases
– Registration is achieved by minimizing H(A,B), i.e. by maximizing I(A,B)
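Putting the slides together, registration can be sketched as a search over candidate transformations that keeps whichever one maximizes I(A,B). Below is a brute-force sketch over integer translations only, assuming NumPy; the function names, the ±10-pixel search window, and the restriction to translations are illustrative simplifications, not the registration method used in this work.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """I(A,B) = H(A) + H(B) - H(A,B), repeated here so the block runs on its own."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)
    def h(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    return h(p_a) + h(p_b) - h(p_ab.ravel())

def register_translation(reference, floating, max_shift=10):
    """Exhaustive search over integer (dy, dx) shifts; returns the MI-maximizing shift."""
    best_shift, best_mi = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            candidate = np.roll(floating, shift=(dy, dx), axis=(0, 1))
            mi = mutual_information(reference, candidate)
            if mi > best_mi:
                best_shift, best_mi = (dy, dx), mi
    return best_shift, best_mi
```

A real registration would also handle sub-pixel shifts, rotations, or deformations and use a smarter optimizer; the exhaustive loop is only meant to make the "maximize MI" criterion concrete.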