Iterative Algorithm: g = Af


Iterative Algorithm

g = Af

g : vector of projection data
f : vector of image data
A : the matrix that maps the image data to the projection data

The principle of the iterative algorithms is to find a solution by successive estimates.
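
As a minimal sketch of this model (a made-up 2x2 image and a toy system matrix whose four "rays" simply sum the two rows and the two columns), the projection data are obtained as a matrix-vector product:

    import numpy as np

    # Hypothetical toy example: a 2x2 image flattened into the vector f,
    # and a system matrix A whose rows sum the pixels along each "ray".
    f = np.array([1.0, 2.0, 3.0, 4.0])        # image data (4 pixels)
    A = np.array([[1, 1, 0, 0],               # ray through the top row
                  [0, 0, 1, 1],               # ray through the bottom row
                  [1, 0, 1, 0],               # ray through the left column
                  [0, 1, 0, 1]], dtype=float) # ray through the right column

    g = A @ f                                 # projection data
    print(g)                                  # [3. 7. 4. 6.]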

Iterative Algorithm

General concept of the iterative methods: make a first arbitrary estimate of the slice, then refine it in a loop:

    int maximumOfLoop = MAX_NUM;
    int current_loop = 0;
    while (current_loop <= maximumOfLoop) {
        // generate the estimated projections from the current estimate
        // compare the estimated projections with the measured projections
        // calculate the correction factors
        if (all correction factors are zero)
            break;                // the estimate already matches the data
        // apply the corrections to the estimate and form the new estimate of the slice
        current_loop++;           // advance the loop counter
    }

Iterative Algorithm

Example: the ART (Algebraic Reconstruction Technique) update. The standard additive form of the formula is

    f_j^(k+1) = f_j^(k) + (g_i - q_i^(k)) / N_i    for the pixels j along ray i

f^(k+1) : the new estimate
f^(k) : the current estimate
g_i : the measured projection value for ray i
N_i : the number of pixels along ray i
q_i^(k) : the sum of counts in the N_i pixels along ray i for the k-th iteration
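
A small sketch of one additive-ART sweep on the same kind of toy system (the 0/1 matrix A, the data g, and the helper name art_update are illustrative assumptions, not the presenter's code):

    import numpy as np

    def art_update(f, A, g, i):
        # One additive-ART correction for ray i.  A is assumed to be a 0/1
        # matrix, so row i of A selects the N_i pixels along ray i.
        row = A[i]
        N_i = row.sum()                  # number of pixels along ray i
        q_i = row @ f                    # current sum of counts along ray i
        correction = (g[i] - q_i) / N_i  # spread the mismatch evenly
        return f + correction * row      # update only the pixels on ray i

    # One full ART iteration sweeps over all rays once.
    A = np.array([[1, 1, 0, 0],
                  [0, 0, 1, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1]], dtype=float)
    g = np.array([3.0, 7.0, 4.0, 6.0])
    f = np.zeros(4)                      # arbitrary first estimate
    for i in range(len(g)):
        f = art_update(f, A, g, i)
    print(f)                             # recovers [1, 2, 3, 4] on this toy data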

Gradient Algorithm

Features of the gradient algorithm:
It can handle a huge number of images.
The difference between the measured projections and the projections of the current image is viewed as a mountainous terrain; a 3D plot can be used to display this difference.

The goal of the gradient algorithm: find the location of the point with the minimal difference.

Gradient Algorithm

Steps of the gradient algorithm:
1. Start from an arbitrary location.
2. Find the direction of steepest descent.
3. Step in that direction with an optimized step length.
4. Stop before ascending.
5. If a new direction and a new step length can be found, step in the new direction with the new step length; otherwise, the algorithm ends.

Gradient Algorithm

Gradient Algorithm

    f^(k+1) = f^(k) + α_k p_k

f^(k+1) : the new estimate
f^(k) : the current estimate
α_k : the step length
p_k : the new (search) direction
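
A minimal sketch of this update for an assumed least-squares objective 0.5*||Af - g||^2, with p_k the steepest-descent direction and α_k chosen by an exact line search (the objective and the function name are assumptions; the slides do not specify them):

    import numpy as np

    def steepest_descent(A, g, f0, n_iter=50):
        # Minimizes 0.5 * ||A f - g||^2 by steepest descent with an exact
        # line search for the step length.
        f = f0.copy()
        for _ in range(n_iter):
            grad = A.T @ (A @ f - g)       # gradient of the objective
            p = -grad                      # steepest-descent direction p_k
            Ap = A @ p
            denom = Ap @ Ap
            if denom == 0:                 # zero gradient: already at a minimum
                break
            alpha = (grad @ grad) / denom  # optimal step length alpha_k
            f = f + alpha * p              # f^(k+1) = f^(k) + alpha_k * p_k
        return f

    # e.g. steepest_descent(A, g, np.zeros(4)) with the toy A and g above
    # converges toward [1, 2, 3, 4].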

CG Algorithm

Disadvantages of the gradient algorithm:
It is not efficient.
It can get trapped in a local minimum.

Conjugate gradient (CG) algorithm: the new direction is found from the gradient at the current location combined with the previous search direction.
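
A sketch of a conjugate-gradient iteration in the CGLS form for the same assumed least-squares objective (an illustrative variant, not necessarily the presenter's implementation):

    import numpy as np

    def conjugate_gradient(A, g, f0, n_iter=20):
        # CGLS-style conjugate gradient for min ||A f - g||^2: each new
        # search direction mixes the current (negative) gradient with the
        # previous direction.
        f = f0.copy()
        r = g - A @ f                    # residual in projection space
        s = A.T @ r                      # negative gradient of 0.5*||Af - g||^2
        p = s.copy()                     # first search direction
        gamma = s @ s
        for _ in range(n_iter):
            if gamma < 1e-12:            # gradient (numerically) zero: converged
                break
            q = A @ p
            alpha = gamma / (q @ q)      # step length along p
            f = f + alpha * p
            r = r - alpha * q
            s = A.T @ r
            gamma_new = s @ s
            beta = gamma_new / gamma     # weight of the previous direction
            p = s + beta * p             # new direction: gradient + previous direction
            gamma = gamma_new
        return f

    # e.g. conjugate_gradient(A, g, np.zeros(4)) with the toy A and g above.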

MLEM Algorithm

g = Af

Features of MLEM:
1. It uses the Poisson probabilistic nature of the radioactive disintegrations.
2. g is one particular measurement.
3. f is the particular solution corresponding to that particular measurement g.

The goal of the MLEM algorithm: find a general solution, the best estimate for f, i.e. the mean number of radioactive disintegrations in the image that produces the sinogram g with the highest likelihood.

MLEM Algorithm

Steps of MLEM:
1. Find an arbitrary starting point.
2. Calculate the likelihood of any reconstructed image given the measured data.
3. Find the image that has the greatest likelihood of producing the measured data.
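
These steps are usually realized with the standard MLEM multiplicative update; the sketch below assumes the toy system matrix A and data g used earlier, a uniform start image, and the hypothetical helper name mlem:

    import numpy as np

    def mlem(A, g, n_iter=100, eps=1e-12):
        # MLEM multiplicative update: each iteration compares the measured
        # counts g with the counts expected from the current estimate and
        # corrects the estimate pixel by pixel.
        f = np.ones(A.shape[1])          # arbitrary (positive) start point
        sensitivity = A.sum(axis=0)      # sum over rays of a_ij for each pixel j
        for _ in range(n_iter):
            expected = A @ f             # projections of the current estimate
            ratio = g / np.maximum(expected, eps)   # measured / expected counts
            f = f * (A.T @ ratio) / np.maximum(sensitivity, eps)
        return f

    # e.g. mlem(A, g) with the toy A and g above yields an estimate whose
    # projections A @ f match the measured g.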

MLEM Algorithm

Problem of the MLEM algorithm: as the number of iterations increases, the noise in the reconstruction also increases. Why? Because noisy reconstructed images may yield projections that are very close to the measured noisy projections.

New Criterion

Old criterion: 1. The estimated projections have to be as close as possible to the measured projections.
New criterion: 2. The reconstructed image must not be too noisy.

Maximum A Posteriori (MAP) Algorithms: used to enforce smoothing.
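
One common way to enforce smoothing is Green's one-step-late (OSL) MAP-EM update with a quadratic roughness prior; the sketch below is an illustrative choice (the difference operator D, the weight beta, and the function name are assumptions, not necessarily what the presentation used):

    import numpy as np

    def map_em_osl(A, g, D, beta=0.1, n_iter=100, eps=1e-12):
        # One-step-late MAP-EM with a quadratic smoothness prior
        # 0.5 * beta * ||D f||^2, where D takes differences between
        # neighbouring pixels.  Larger beta enforces stronger smoothing.
        f = np.ones(A.shape[1])              # positive start point
        sensitivity = A.sum(axis=0)
        for _ in range(n_iter):
            ratio = g / np.maximum(A @ f, eps)
            prior_grad = beta * (D.T @ (D @ f))       # gradient of the prior term
            denom = np.maximum(sensitivity + prior_grad, eps)
            f = f * (A.T @ ratio) / denom             # MLEM update with the prior in the denominator
        return f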