Figure 6. Parameter Calculation. Parameters R, T, f, and c are found from the projection matrix entries m_ij. (f: camera focal vector along the optical axis; c: camera center offset.)

Calculate Parameters (procedure PROJ_MAT_CALIB; Trucco & Verri, "Introductory Techniques for 3-D Computer Vision"):
- Write the world-to-image transformation equations: 2 equations in 12 unknowns per point pair.
- Use enough point pairs to determine the system.
- Solve the system of equations using singular value decomposition (SVD).
- Extract the parameters from the SVD solution.

Coordinate systems:
- Gamma camera / world coordinates x_w.
- Optical camera coordinates x_c: related to world coordinates x_w by a rotation and translation.
- Image coordinates x_i: a projection from camera coordinates x_c.

Equations. A world point x_w = (x_w, y_w, z_w) projects to the image point x_i = (x_i, y_i), where

  x_i = (m_11 x_w + m_12 y_w + m_13 z_w + m_14) / (m_31 x_w + m_32 y_w + m_33 z_w + m_34)
  y_i = (m_21 x_w + m_22 y_w + m_23 z_w + m_24) / (m_31 x_w + m_32 y_w + m_33 z_w + m_34)

Rewriting these to be linear in the m_ij:

  (m_11 - x_i m_31) x_w + (m_12 - x_i m_32) y_w + (m_13 - x_i m_33) z_w + (m_14 - x_i m_34) = 0
  (m_21 - y_i m_31) x_w + (m_22 - y_i m_32) y_w + (m_23 - y_i m_33) z_w + (m_24 - y_i m_34) = 0

Each world/image point pair gives 2 such equations; N point pairs yield 2N equations, so at least 6 point pairs are needed to compute the m_ij. Stacking the coefficients gives the homogeneous system Am = 0, where A is the 2N x 12 coefficient matrix and m is the vector of the 12 entries m_ij. Find m in the nullspace of A using singular value decomposition.

2. Procedure

We designed a calibration phantom, comprising a set of Lucite disks, that can be imaged by both an optical camera and a gamma camera. The disk arrangement is non-coplanar and asymmetric, guaranteeing a unique solution to the calibration parameter equations.

Abstract

Objectives: One approach to motion detection in SPECT is to observe the patient using optical cameras. Patient motion is estimated from changes in the images and is used to modify the reconstruction algorithm. An important subproblem is calibrating the optical cameras and the gamma camera: it is necessary to determine the transformation from the gamma camera coordinate system to the optical camera coordinate system, so that, given a gamma camera point, one may compute the corresponding optical camera point, and conversely, given an optical camera point, one may compute the corresponding patient ray.

Methods: We have devised a calibration phantom that can be imaged using both optical and gamma cameras.
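The Am = 0 system described above under Calculate Parameters can be sketched in a few lines of NumPy (a minimal illustration, not the PROJ_MAT_CALIB implementation itself; the function names are ours):

```python
import numpy as np

def calibrate(world_pts, image_pts):
    """Estimate the 3x4 projection matrix (entries m_ij) from >= 6
    world/image point pairs by solving A m = 0 with SVD."""
    A = []
    for (xw, yw, zw), (xi, yi) in zip(world_pts, image_pts):
        # Two equations per point pair, linear in the 12 unknowns m_ij.
        A.append([xw, yw, zw, 1, 0, 0, 0, 0, -xi * xw, -xi * yw, -xi * zw, -xi])
        A.append([0, 0, 0, 0, xw, yw, zw, 1, -yi * xw, -yi * yw, -yi * zw, -yi])
    # m is the right singular vector for the smallest singular value,
    # i.e. the (approximate) nullspace of A.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def residual(M, world_pts, image_pts):
    """Sum of squared distances between measured and reprojected image points."""
    err = 0.0
    for (xw, yw, zw), (xi, yi) in zip(world_pts, image_pts):
        p = M @ np.array([xw, yw, zw, 1.0])
        err += (p[0] / p[2] - xi) ** 2 + (p[1] / p[2] - yi) ** 2
    return err
```

With noise-free point pairs the recovered matrix reprojects essentially exactly; with measured blob centroids, residual() is the quantity that the match-selection step minimizes.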
The phantom comprises a set of Lucite disks; each disk supports 2 low-intensity light bulbs and a 0.8 mm diameter hole, centered between the bulbs, that holds a 99mTc point source. The source location for each disk in image coordinates is taken to be the midpoint of the bulbs. The source location in gamma camera coordinates is found by segmenting the reconstructed source distribution and computing the centroid of activity for each source. At least 6 such point pairs are needed, although 7 are used in practice for increased accuracy. Using procedure PROJ_MAT_CALIB of Trucco & Verri, "Introductory Techniques for 3-D Computer Vision", we compute the 11 parameters of the coordinate transformation and the residual error. Because we do not know in advance which optical camera points match which gamma camera points, an exhaustive search is used to find the lowest-error match.

Results: We have been able to match optical and gamma camera points and determine the transformation. Tomographic reconstruction and segmentation take up most of the processing time; point matching and parameter calculation take less than 14 seconds of processor time on a Digital Alpha 433au workstation.

Conclusions: A calibration phantom can be imaged simultaneously by optical and gamma cameras, and the calibrating transformation computed, with no other input required.

1. Introduction

One approach to motion detection in SPECT is to observe the patient using optical cameras. Patient motion is estimated from changes in the images and is used to modify the reconstruction algorithm. In order to relate changes in patient position, as observed by the optical cameras, to SPECT data, as observed by the gamma camera, it is necessary to determine the camera parameters. This is the calibration problem: determining the transformation from the gamma camera coordinate system to an optical camera coordinate system and vice versa.

(Labels in Figure 2: 99mTc well, approx. 0.1 mCi; light bulbs.)

Figure 2.
Calibration phantom comprising 7 Lucite disks (left), each holding 2 light bulbs and a 99mTc source. The complete phantom is shown at right.

Figure 3. Optical (left) and gamma (right) images of the phantom. The 7 pairs of light bulbs are clearly visible in the optical image. The reconstructed gamma image shows 64 of the 128 slices, containing 5 of the 7 gamma source blobs.

Figure 4. Blob Detection. Processing is similar for optical and gamma blob detection; detected optical blobs are tabulated by X and Y centroid, and gamma blobs by X, Y, and Z centroid. For optical blobs, final centroids are computed by taking midpoints of the closest pairs of blobs.

Blob detection algorithm:
- Read image
- Threshold
- Segment
- Select regions (optical: manually; gamma: the largest regions)
- Compute centroids

Match generation:
- Generate all possible image-to-world point matches: for N points, generate the N! permutations.
- Compute camera parameters for each possible match.
- For each parameter set, calculate the residual error (the sum of squared distances between the measured image points and the reprojected world points).
- Select the parameter set with the lowest residual error.

Figure 5. Match Generation. The best match is found by exhaustive search.

Figure 1. Calibration Processing Flow (Figures 2-7 show module details):
  Calibration Phantom -> Acquire Optical Image -> Detect Blobs -> image point list
  Calibration Phantom -> Acquire Gamma Image -> Detect Blobs -> world point list
  Both point lists -> Generate Matches -> possible matches -> Calculate Parameters -> Select Best -> camera parameters

3. Conclusions

We have successfully calibrated optical and gamma cameras. The best match residual error is well under 1000 pixel^2 (9.05; see Figure 7), giving confidence that the correct correspondence between optical and gamma points has been found.

Figure 7. Sample Output, showing image points and world points in correct correspondence, with residual error = 9.05. Camera parameters are computed from the matrix entries m_ij. For a 640x480 image the expected camera center offset is [319.5, 239.5]; note the close agreement with the computed image center IC = [317.2, 238.6].
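The match-generation steps above can be sketched as an exhaustive search (a minimal NumPy/itertools illustration; the fitting routine repeats the Am = 0 solve so the block is self-contained, and all names are ours). For N = 7 points this fits 7! = 5040 candidate matches:

```python
import itertools
import numpy as np

def dlt_residual(world_pts, image_pts):
    """Fit a 3x4 projection matrix to this pairing by SVD and return the
    residual error: the sum of squared reprojection distances."""
    rows = []
    for (xw, yw, zw), (xi, yi) in zip(world_pts, image_pts):
        rows.append([xw, yw, zw, 1, 0, 0, 0, 0, -xi * xw, -xi * yw, -xi * zw, -xi])
        rows.append([0, 0, 0, 0, xw, yw, zw, 1, -yi * xw, -yi * yw, -yi * zw, -yi])
    M = np.linalg.svd(np.asarray(rows, dtype=float))[2][-1].reshape(3, 4)
    err = 0.0
    for (xw, yw, zw), (xi, yi) in zip(world_pts, image_pts):
        p = M @ np.array([xw, yw, zw, 1.0])
        err += (p[0] / p[2] - xi) ** 2 + (p[1] / p[2] - yi) ** 2
    return err

def best_match(world_pts, image_pts):
    """Try every ordering of the world points against the fixed image point
    list; keep the permutation with the lowest residual error."""
    best = min(itertools.permutations(world_pts),
               key=lambda perm: dlt_residual(perm, image_pts))
    return list(best), dlt_residual(best, image_pts)
```

For 7 points the search amounts to a few thousand small SVDs and runs in seconds; the correct pairing is expected to stand out with a residual far below the rest.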
Sample Output:

  IPL = ImagePointList[
          ImagePoint(177.5,  82.0),
          ImagePoint(222.5, 211.0),
          ImagePoint(193.5, 349.0),
          ImagePoint(293.5, 289.5),
          ImagePoint(359.0,  83.5),
          ImagePoint(312.0, 347.0),
          ImagePoint(269.5, 437.0)]
  WPL = WorldPointList[
          WorldPoint[88.3, 39.7, 62.2],
          WorldPoint[66.9, 87.3, 79.6],
          WorldPoint[82.4, 87.1, 31.0],
          WorldPoint[53.8, 87.5, 48.3],
          WorldPoint[42.0, 40.2, 71.6],
          WorldPoint[53.7, 86.8, 29.7],
          WorldPoint[70.5, 86.4, 6.8]]
  Res = 9.05
  CPs = Camera Parameters[
          T: [79.8, -19.6, 104.2],
          R: [[-0.923,      ,      ],
              [-0.010, 0.747,      ],
              [-0.383, 0.610, 0.692]],
          IC: [[317.2], [238.6]],
          fx: 650.7, fy: 672.4]

Calibrating Optical Images and Gamma Camera Images for Motion Detection

Michael A. Gennert 1,2, Philippe P. Bruyant 1, Manoj V. Narayanan 1, Michael A. King 1
1 University of Massachusetts Medical School, Worcester, MA
2 Worcester Polytechnic Institute, Worcester, MA
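The "Extract parameters" step of Figure 6, which produces values like the IC, fx, and fy shown in the sample output, can be sketched as follows (a simplified version of the decomposition in Trucco & Verri; sign handling and the recovery of R and T are omitted, and the function name is ours):

```python
import numpy as np

def extract_intrinsics(M):
    """Recover the image center (ox, oy) and focal lengths (fx, fy)
    from a 3x4 projection matrix M estimated by the SVD step."""
    M = np.asarray(M, dtype=float)
    M = M / np.linalg.norm(M[2, :3])   # fix scale so |third row| = 1
    q1, q2, q3 = M[0, :3], M[1, :3], M[2, :3]
    ox, oy = q1 @ q3, q2 @ q3          # camera center offset c
    fx = np.sqrt(q1 @ q1 - ox ** 2)    # focal lengths along x and y
    fy = np.sqrt(q2 @ q2 - oy ** 2)
    return ox, oy, fx, fy
```

Applied to the estimated m_ij, this is the step that yields the image center and focal lengths reported above.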