Algebraic and Statistical Reconstruction Algorithms Liran Levy Advanced Topics in Sampling (049029), Winter 2008/9.

Presentation Outline
Chapter 7: Algebraic Reconstruction Algorithms. Principles of Computerized Tomographic Imaging, A. C. Kak and Malcolm Slaney, IEEE Press, 1988.
An Evaluation of Maximum Likelihood Reconstruction for SPECT. E. S. Chornoboy, C. J. Chen, M. I. Miller, T. R. Miller, and D. L. Snyder, IEEE Transactions on Medical Imaging, vol. 9, no. 1, March 1990.
Algebraic Reconstruction Techniques Can Be Made Computationally Efficient. Gabor T. Herman and Lorraine B. Meyer, IEEE Transactions on Medical Imaging, vol. 12, no. 3, September 1993.

Algebraic Reconstruction Algorithms
Motivation: Transform-based techniques require projections uniformly distributed over 180/360 degrees, and a large number of them. When this is not possible, we instead model the cross section as an array of unknowns and solve for it with algebraic algorithms.

Image and Projection Representation
One-index representation: the image is a grid of N cells with unknown values f_1, ..., f_N. M rays, each of finite width, intersect the grid; w_ij is the fractional area of the j-th cell intersected by the i-th ray. A line integral along a ray is called a ray-sum, so the measured data are p_i = sum_j w_ij f_j, for i = 1, ..., M. Direct matrix inversion is impossible, since N, M ~ 65,000, M < N, and the measurements are noisy.

Iterative Solutions
The image (f_1, ..., f_N) is a single point in an N-dimensional space, and each ray-sum equation defines a hyperplane. If a unique solution exists, it is the intersection of all these hyperplanes. The computational procedure (for N = 2):
1. Make an initial guess.
2. Project the initial guess onto the first hyperplane.
3. Project the result of (2) onto the second hyperplane.
4. Project the result back onto the first hyperplane, and so forth.
2D realization: (figure)
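The procedure above is the Kaczmarz method. A minimal sketch for a hypothetical 2-unknown, 2-ray system (the weights and ray-sums here are made-up illustration values, not from the text):

```python
import numpy as np

# Two ray-sum hyperplanes in 2D: w_i . f = p_i (made-up example system).
W = np.array([[1.0, 2.0],
              [3.0, 1.0]])
p = np.array([5.0, 5.0])

f = np.zeros(2)                      # step 1: initial guess
for _ in range(50):                  # repeat the projection cycle
    for i in range(len(p)):          # steps 2-4: project onto each plane in turn
        w = W[i]
        # Orthogonal projection of f onto the hyperplane w . f = p_i
        f = f + (p[i] - w @ f) / (w @ w) * w

print(f)  # converges to the unique intersection point [1, 2]
```

Since the two hyperplanes here meet at 45 degrees, the iterates spiral quickly into the intersection, illustrating the angle-dependent convergence rate discussed on the next slides.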

Iterative Solutions, Notations
For an N-dimensional space, the initial guess is f^(0). Projecting f^(i-1) onto the i-th hyperplane gives f^(i); the process of projection is described by
f^(i) = f^(i-1) + ((p_i - w_i . f^(i-1)) / (w_i . w_i)) w_i,
where w_i = (w_i1, ..., w_iN). The final result after M projections, which is one iteration, is f^(M). If there is a unique solution f*, then f^(kM) -> f* as k -> infinity.

Iterative Solutions, Comments
If M < N there is no unique solution, and the final result depends on the initial guess. If the hyperplanes are orthogonal, convergence is fast (1 iteration); if the angle between the hyperplanes is small, convergence is slow. A priori conditions can be applied after each step of the iteration, such as non-negativity (setting negative components to zero) or an upper bound on the values. In the presence of noise the hyperplanes have no common intersection, so there is no unique solution and the algorithm's iterates oscillate.

Iterative Solutions, Noise

ART: Algebraic Reconstruction Techniques
Storing and retrieving all the weights w_ij can pose a problem, so we approximate w_ij as 0 or 1, depending on whether the i-th ray intersects the j-th pixel; the correction to each pixel in the ray is then (p_i - q_i)/N_i, where q_i is the current pseudo ray-sum and N_i is the number of pixels the ray crosses. Another approach replaces this correction with a length-normalized version. To cancel salt-and-pepper noise, we scale the update by a relaxation parameter that decreases along the iterations.
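A sketch of the binary-weight ART update with a decaying relaxation parameter, on a made-up consistent toy system (the sizes, ray density, and decay schedule are arbitrary choices for illustration, not values from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 16, 48                                  # unknowns and rays (toy sizes)
W = (rng.random((M, N)) < 0.4).astype(float)   # w_ij in {0,1}: does ray i hit pixel j?
f_true = rng.random(N)
p = W @ f_true                                 # noise-free ray-sums

f = np.zeros(N)
lam = 1.0                                      # relaxation parameter
for _ in range(200):
    for i in range(M):
        w = W[i]
        N_i = w.sum()                          # number of pixels crossed by ray i
        if N_i:
            # binary-weight ART correction, shared by all pixels in the ray
            f += lam * (p[i] - w @ f) / N_i * w
    lam *= 0.99                                # relax less as iterations proceed

print(abs(W @ f - p).max())                    # residual shrinks toward zero
```

With noise-free data the relaxed iterates still converge to a consistent solution; the decaying relaxation mainly matters for noisy data, where it damps the oscillation noted on the earlier slide.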

SIRT: Simultaneous Iterative Reconstruction Techniques
First we calculate the corrections from all the rays, then average them and update each pixel once per iteration. The result is better-looking images, at the expense of convergence rate.
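A sketch of that idea: accumulate every ray's correction first, then apply the pixel-wise average once per iteration (toy made-up system; sizes and iteration count are arbitrary, and the large iteration count reflects the slower convergence the slide mentions):

```python
import numpy as np

rng = np.random.default_rng(1)
W = (rng.random((48, 16)) < 0.4).astype(float)  # binary ray/pixel weights
f_true = rng.random(16)
p = W @ f_true                                  # noise-free ray-sums

f = np.zeros(16)
hits = W.sum(axis=0)                            # how many rays cross each pixel
for _ in range(2000):
    corr = np.zeros_like(f)                     # accumulated corrections
    for i in range(W.shape[0]):
        w = W[i]
        if w.any():
            corr += (p[i] - w @ f) / w.sum() * w
    f += corr / np.maximum(hits, 1.0)           # apply the *average* correction

print(abs(W @ f - p).max())
```

Unlike ART, no single noisy ray can imprint itself on the image between updates, which is why the images look smoother.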

SART: Simultaneous Algebraic Reconstruction Techniques
We use N basis functions b_1(x, y), ..., b_N(x, y) to approximate the image: f(x, y) ~ sum_j f_j b_j(x, y).

SART (2): Simultaneous Algebraic Reconstruction Techniques
If we choose the basis functions to be N squares, we get back the pixel basis functions. Another choice is bilinear functions, which leads to a continuous form of the projection. The ray integrals are then evaluated over a set of equidistant points along each ray; the step size is a parameter that can be changed.

Reconstruction: SART We examine the following test image:

Reconstruction: SART (2)
Using N = 128 x 128 and I = 12,700 measurements (underdetermined by 25%), we get: (line plot at y = -0.605). Salt-and-pepper noise is dominant, even for the big tumors.

Reconstruction: SART (3)
Another solution is to calculate, for each pixel, the average of the corrections from all the rays. This technique converges fast like ART, and suppresses noise like SIRT.

Reconstruction: SART (4)
Another solution: add a longitudinal Hamming window, replacing the correction coefficients with weighted correction terms. 1 iteration of SART + Hamming window:

Reconstruction: SART (5) 1 iteration sequential ART+Hamming window:

Reconstruction: SART (6) 2 iterations SART+Hamming window: 3 iterations SART+Hamming window:

Conclusions
For the 1-iteration reconstruction: all the structures are fairly well distinguished, and the noise is practically gone. As the number of iterations increases, the salt-and-pepper noise increases as well. The Hamming window and the averaging of corrections help the algorithms.

Maximum Likelihood Reconstruction An Evaluation of Maximum Likelihood Reconstruction for SPECT

Background
SPECT (Single Photon Emission Computed Tomography): the body is injected with a radioactive tracer, which emits gamma radiation. A gamma camera measures these rays over multiple angles. Motivation: use photon emission statistics to obtain an iterative algorithm that maximizes the log-likelihood functional. Fundamental phenomena for SPECT imaging: 1. Random radioactive process ("low count" data). 2. Depth-dependent response function. 3. Anisotropic attenuation response.

Assumptions
A mapping of the emission space into the measurement space; a discrete set of angular camera positions; each position consists of a finite set of measurements.

A Statistical Model, Basics
Radioactive emission is modeled as a Poisson process. Ideal case: perfect line-integral collimation, zero scattering, zero attenuation; the measurement is then a Poisson process whose intensity is the line integral of the emission intensity. Many problems: the collected data is a poor estimate of these intensities; the inversion is not well defined; FBP (filtered backprojection) needs a smoothing function; and not all 3 phenomena are included.

A Statistical Model for SPECT
Emitted photons undergo a random translation, governed by a transition probability density; the translated photons form a measurement process that is again Poisson, with intensity obtained by integrating the emission intensity against the transition density. Random deletion of the translated photons (with a given detector efficiency and attenuation density) yields the final measurement, a Poisson process with the correspondingly thinned mean intensity.
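The measurement model can be sketched numerically: an emission intensity mapped through a detection-probability matrix, then realized as Poisson counts. All numbers here are made up for illustration; translation (scatter), attenuation, and detector efficiency are folded into one matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

lam = np.array([50.0, 120.0, 80.0, 30.0])   # emission intensities (made up)
# a_ij: probability that a photon emitted in voxel j is recorded in detector
# bin i, with scatter, attenuation, and detector efficiency folded in.
A = rng.random((6, 4))
A /= 1.5 * A.sum(axis=0)                    # columns sum to 2/3: some photons are lost

mean_counts = A @ lam                       # mean intensity seen by each detector bin
counts = rng.poisson(mean_counts)           # one realization of the "low count" data
print(counts)
```

Repeated runs give different `counts` around the same means, which is exactly why the collected data is a poor estimate of the underlying intensities.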

Application of the EM Procedure
Assuming the transition density and deletion probabilities are known quantities, we obtain an iterative algorithm for the emission intensity that maximizes the log-likelihood function. The estimate is defined by a recursion.

Application of the EM Procedure (2)
For reference: the error term compares the measurement with the i-th iteration's projected estimate, and the correction term multiplies the current estimate by the normalized backprojection of this error.
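The recursion has the form of the standard ML-EM update: project the current estimate, take the ratio of measurement to projection, backproject, normalize by the sensitivity, and correct multiplicatively. A sketch on a made-up system matrix (sizes, symbols, and iteration count are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.random((40, 16)) + 0.01             # strictly positive system matrix a_ij
lam_true = 100.0 * rng.random(16)
counts = rng.poisson(A @ lam_true)          # Poisson measurement

sens = A.sum(axis=0)                        # sensitivity s_j = sum_i a_ij
lam = np.ones(16)                           # strictly positive initial estimate
for _ in range(500):
    proj = A @ lam                          # projected estimate of the means
    ratio = counts / proj                   # error: measurement / projected estimate
    lam *= (A.T @ ratio) / sens             # multiplicative EM correction

# lam stays non-negative by construction, and the Poisson
# log-likelihood is non-decreasing over the iterations.
```

The multiplicative form is what gives EM its built-in non-negativity, with no explicit constraint needed.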

Experimental Results: Comparison Criteria
Let lambda be the source intensity and g a two-dimensional circular Gaussian smoothing kernel; the correlation factor is computed between the reconstruction and the smoothed source. The SNR is derived from this correlation factor, and the image resolution is the FWHM of the kernel that maximizes the correlation.

1st Image Comparison: Bar Phantom
This is low-count ("noisy") data, in the absence of photon attenuation. FBP: ~0.5M measurements, 90 viewing angles, Hanning window (K = 1). ML: ~0.5M measurements, 90 viewing angles, 500 iterations.

2nd Image Comparison: Circular Cold-Spot Phantom
A uniform/nonuniform attenuation medium: a 6.35 cm diameter rod is placed in a 21.6 cm diameter cylinder; 3M measurements of 1.1 cm.

3rd Image Comparison: Chest Phantom
Source imaged over 360 degrees (2-degree steps), 0.4M counts. The attenuation map contains 3 different values: air, soft tissue, bone.

Final Conclusions
ML reconstructions exhibit improved SNR (compared to a standard FBP), improved resolution, better image quantifiability, and a better ability to define object boundaries. These results hold even for low-count data. ROI estimation is better, even without higher data counts or more iterations. Future research: the PSRF and the attenuation function are assumed to be known; in real life they must be measured, which introduces further errors that should be characterized.

Comparison: ART vs. ML-EM Algebraic Reconstruction Techniques Can Be Made Computationally Efficient

Comparison Layout
Improving ART efficiency: 1. Building the mapping. 2. The iterative step. 3. Choosing the relaxation parameter. ML-EM: an iterative algorithm, best suited to PET (Positron Emission Tomography). Building a testing set and a training set (for the ART efficiency improvement). Comparing results obtained using ART to those obtained using ML-EM. Conclusions.

PET: Positron Emission Tomography

Improving ART Efficiency: Introduction
Two choices are left open in ART: 1. the order in which the collected data is accessed during the reconstruction procedure; 2. the choice of the relaxation parameter. The image is approximated by a finite set of basis functions, with a coefficient giving the value of the i-th measurement of the j-th basis function. The initial guess is the average activity.

ART: (1) The Mapping
A permutation of the measurement order. The mapping calculation (skipping the mathematics):
1. Divide P (the number of views) into its prime factors.
2. Assign each index a vector (skipping the details of how the vector is built).
3. Calculate the permuted index.
Result: this ensures that when the views are divided into sets of size L, the members of each subset are as far apart as possible, so the subsets are highly independent.
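The slide skips the exact prime-factor construction, so as a stand-in with the same goal here is a simple multiplicative ordering: step through the P view indices with a stride of roughly 0.618 P, made coprime to P so every view is visited exactly once. This is an illustration of the "consecutive views far apart" idea, not the algorithm from the paper:

```python
from math import gcd

def view_order(P):
    """Permutation of 0..P-1 that keeps consecutive views far apart.

    Illustrative stand-in for the prime-factor mapping: a stride near
    0.618*P (golden ratio), adjusted to be coprime to P so that the
    sequence (i * stride) mod P visits each view exactly once.
    """
    stride = max(1, round(0.618 * P))
    while gcd(stride, P) != 1:      # coprime stride => a full permutation
        stride += 1
    return [(i * stride) % P for i in range(P)]

print(view_order(12))  # [0, 7, 2, 9, 4, 11, 6, 1, 8, 3, 10, 5]
```

Note how any run of consecutive entries spreads across the whole range, so consecutive subgroups of views are nearly independent.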

ART: (2) Iterative Step
Given the access order, the iteration step is the ART update scaled by the relaxation parameter.

ART: (3) Relaxation Parameter
The algorithm suggested for choosing the relaxation parameter:
1. Calculate the FOM for three values, and decide on a direction (up/down).
2. Build an ascending/descending series, and find a point where the FOM direction reverses.
3. Continue until convergence to within 5% or less.
4. Repeat to find the parameter for every iteration k.

EM Iteration
A very basic EM iterative algorithm is used, one that is superior to other variants. ART and EM have the same computational cost per iterative step. The FOM (Figure Of Merit) used is the MSE.

Comparing: ART vs. ML-EM
We use 300 views, with 99 rays each. Phantom originals (3 copies); EM after 15, 30, 45 iterations; ART after 1, 2, 3 iterations.

Comparing: ART vs. ML-EM (2)
Divide 26 images simulating brain slices into a training set of 6 images and a testing set of 20 images. 5 phantoms are generated from each image (a total of 30 training and 100 testing phantoms). The above is repeated 10 times (choosing 6 out of the 26 each time). Comparing the relaxation parameters shows that the iteration number is much more significant than the random selection of the training set.

Comparing: ART vs. ML-EM (3)

Comparing: ART vs. ML-EM (4)
Results after 1 (2) iterations of ART are better than those after 45 (60) iterations of EM. After 6 (90) iterations the results are equal and stable; ART (EM) doesn't improve beyond 6 (90) iterations. PET measurements have inaccuracies beyond the Poisson random variable model; otherwise, EM would be better than ART.

Comparing: ART vs. ML-EM (5)
EM can be "accelerated" by a factor of 3-4 at most (to achieve a specific likelihood, not a specific MSE, which would reduce this speedup). Matching a relaxation parameter and a training set to optimize EM would speed it up by a factor of 3 at most. ART deals with data items one by one, while EM deals with them all simultaneously; EM can also deal with a block of items at a time, but this changes the concept of EM. The results cannot be extended to other FOMs (figures of merit).

Comparing ART to Other Versions of ART
Ensuring non-negativity doesn't improve results, due to the nature of the phantom. A variant of ART guaranteed to converge to a regularized least-squares solution is not superior. Using a whole block of measurements at a time (instead of one item at a time) is not superior. Conclusion: choosing the data access order and the relaxation parameter greatly improves the performance of ART (compared to a basic version), and influences the results of any comparison to other methods.

The End