3D Scene Calibration for Infrared Image Analysis

3D Scene Calibration for Infrared Image Analysis
V. Martin, V. Gervaise, V. Moncada, M.H. Aumeunier, M. Firdaouss, J.M. Travere (CEA), S. Devaux (IPP), G. Arnoux (CCFE) and JET-EFDA contributors
Workshop on Fusion Data Processing Validation and Analysis, ENEA Frascati, 26-28 March 2012

3D IR Scene Calibration
Issue: a complex thermal scene (JET #81313, KL7, images in digital levels)
- Wide-angle views with strong geometrical effects (depth of field, curvature)
- Many metallic materials (Be, W) with different and changing optical (reflectance) and thermal (emissivity) properties
Objective: match each pixel with the 3D scene model of the in-vessel components in order to
- recover the real geometry of the viewed objects
- link viewed objects reliably to their related properties
Applications:
- Image processing (e.g. event characterization)
- IR data calibration: Tsurf = f(material emissivity)
(Figure: wide-angle view with materials labelled: bulk Be, Be-coated Inconel, W-coated CFC, bulk W.)

2D/3D Scene Model Mapping Methodology
Calibration chain: Camera → Image Correction (NUC, dead pixel map) → Image Stabilization (reference image) → 2D/3D Scene Model Mapping (2D/3D scene models) → Image Processing (knowledge base of the thermal scene) → registered & geo-calibrated image
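The chain above is strictly sequential: each stage consumes the previous stage's output plus its own static inputs. The sketch below shows one possible way to organise such a chain in code; every class and parameter name is an illustrative assumption and does not correspond to the actual PINUP implementation.

```python
# Minimal sketch of the calibration chain as a sequential pipeline.
# All names and parameters are illustrative assumptions, not the real PINUP code.
import numpy as np


class ImageCorrection:
    """Non-uniformity correction (NUC) and dead-pixel flagging."""
    def __init__(self, nuc_gain, nuc_offset, dead_pixel_map):
        self.gain, self.offset, self.dead = nuc_gain, nuc_offset, dead_pixel_map

    def process(self, frame):
        corrected = self.gain * np.asarray(frame, dtype=float) + self.offset
        corrected[self.dead] = np.nan          # dead pixels flagged for later interpolation
        return corrected


class ImageStabilization:
    """Shift estimation/compensation against a reference image (see the FFT sketches below)."""
    def __init__(self, reference_image):
        self.reference = reference_image

    def process(self, frame):
        return frame                           # placeholder: registration goes here


class SceneModelMapping:
    """Attach per-pixel geometry/material information from the 2D/3D scene model."""
    def __init__(self, scene_model):
        self.scene_model = scene_model

    def process(self, frame):
        return frame                           # placeholder: pixel-to-CAD mapping goes here


def run_chain(frame, stages):
    """Camera frame in, registered & geo-calibrated image out."""
    for stage in stages:
        frame = stage.process(frame)
    return frame
```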

Illustration of Motion in Images
Camera vibrations lead to misalignment of the ROIs used for real-time PFC protection, causing false alarms or, worse, missed alarms.
Image stabilization is therefore a mandatory step for heat flux deposit analysis based on Tsurf(t) - Tsurf(t-1) estimates.

Image Stabilization
Important factors for method selection:
- Deformation type: planar (homothety), non-planar
- Target application: real-time processing, off-line analysis
- Data quality and variability: noise level, pixel intensity changes, image entropy
- Required precision level: pixel, sub-pixel
Applications in tokamaks (non-exhaustive list):
- JET KL7 wide-angle: motion amplitude 5-10 pixels (camera vibrations); target application: hot spot detection, PFC protection; required precision: pixel; difficulty: low image entropy
- JET KL7 (windowed): motion amplitude up to 15 pixels (disruptions); target application: physics analysis (e.g. heat load during disruptions); required precision: pixel; difficulty: intensity changes
- JET KL9 divertor tiles: motion amplitude <1 pixel (sensor affected by magnetic fields); target application: physics analysis (power deposit influx); required precision: sub-pixel; difficulty: low resolution, slow motion, aliasing

Image Stabilization: Classical Methodology
1. Feature detection
   - Local descriptors: Harris corners, MSER, codebooks, Gabor wavelets (see Craciunescu's talk), SIFT, SURF, FAST…
   - Global descriptors: Tsallis entropy (see Murari's talk), edge detectors…
   - Fourier analysis: spectral magnitude & phase, pixel gradients, log-polar mapping…
2. Feature matching
   - Spatial cross-correlation techniques: normalized cross-correlation, Hausdorff distance…
   - Fourier domain: normalized cross-spectrum and its extensions
3. Transform model estimation
   - Shape-preserving mapping (rotation, translation and scaling only)
   - Elastic mapping: warping techniques…
4. Image transformation
   - 2D interpolation: nearest neighbour, bilinear, bicubic…
See Zitová and Flusser's survey, Image and Vision Computing, vol. 21 (2003), pp. 977-1000. A code sketch of this four-step chain follows below.
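As a concrete illustration of the detection → matching → model-estimation → warping chain above, here is a minimal sketch using OpenCV with ORB features and a shape-preserving (similarity) transform. OpenCV, ORB and every name below are assumptions made purely for illustration; the work presented in the talk uses the Fourier-based approach of the next slides instead.

```python
# Sketch of the classical feature-based registration chain using OpenCV (illustrative only).
# Assumes 8-bit single-channel input images.
import cv2
import numpy as np


def register_feature_based(ref, img):
    """Estimate a shape-preserving transform (rotation, translation, scale) aligning img to ref."""
    orb = cv2.ORB_create(nfeatures=1000)                         # 1) feature detection + description
    kp_ref, des_ref = orb.detectAndCompute(ref, None)
    kp_img, des_img = orb.detectAndCompute(img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)   # 2) feature matching
    matches = sorted(matcher.match(des_img, des_ref), key=lambda m: m.distance)

    src = np.float32([kp_img[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches])

    # 3) transform model estimation (similarity transform, robust to outliers via RANSAC)
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)

    # 4) image transformation with bilinear interpolation
    h, w = ref.shape[:2]
    return cv2.warpAffine(img, M, (w, h), flags=cv2.INTER_LINEAR)
```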

Proposed Algorithm
Masked FFT-based image registration [1]:
- Deterministic computing time
- Compatible with accelerating hardware (e.g. FFT on GPU) → real-time applications
- Local analysis with dynamic intensity-based pixel masking (e.g. mask the bright divertor region), with sub-pixel precision [2]
- Slow-drift compensation and dynamic update of the reference image
- Robust to image intensity and structural changes
- Evaluation of the registration quality over time
[1] D. Padfield, IEEE CVPR'10, pp. 2918-2925, 2010
[2] M. Guizar-Sicairos et al., Opt. Lett., vol. 33, no. 2, pp. 156-158, 2008
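For readers who want to experiment with the same family of techniques, scikit-image exposes a phase_cross_correlation routine that accepts pixel masks (its masked mode is reported to follow [1]). The sketch below, with an assumed intensity threshold for the divertor masking, is only an illustration, not the authors' implementation.

```python
# Sketch of masked FFT-based shift estimation using scikit-image (illustrative only; the
# algorithm in the talk is a custom implementation following [1] and [2]).
import numpy as np
from skimage.registration import phase_cross_correlation


def estimate_shift_masked(ref, img, saturation_level=0.9):
    """Estimate the (row, col) translation, ignoring very bright pixels (e.g. the divertor)."""
    # Dynamic intensity-based masking: keep only pixels below a brightness threshold.
    mask_ref = ref < saturation_level * ref.max()
    mask_img = img < saturation_level * img.max()

    result = phase_cross_correlation(ref, img,
                                     reference_mask=mask_ref,
                                     moving_mask=mask_img)
    # Depending on the scikit-image version, masked registration may return only the shift
    # (older releases) or a (shift, error, phasediff) tuple, and may restrict some options
    # such as sub-pixel up-sampling; handle both return forms.
    shift = result[0] if isinstance(result, tuple) else result
    return shift
```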

Principle of Fourier-based Correlation
Let Iref be a reference image, It the image at time t, and DFT the discrete 2D Fourier transform, with It(x, y) = Iref(x - x0, y - y0).
NCC(Iref, It) denotes the normalized cross-correlation surface (itself an image), computed efficiently in the Fourier domain; the position of its peak gives the coordinates of the translation (x0, y0).
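A minimal NumPy sketch of this principle, using the normalized cross-spectrum (phase correlation) variant listed on the methodology slide; the actual algorithm uses the masked normalized cross-correlation of [1] and [2], so this is only illustrative.

```python
# Integer-pixel translation estimation from the peak of the Fourier-domain correlation surface.
import numpy as np


def estimate_shift_fft(i_ref, i_t):
    """Return the (row, col) translation mapping i_ref onto i_t, to integer-pixel precision."""
    f_ref = np.fft.fft2(i_ref)
    f_t = np.fft.fft2(i_t)

    # Normalized cross-spectrum; its inverse DFT is the (phase) correlation surface.
    cross = f_t * np.conj(f_ref)
    ncc = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real

    # The peak position gives (x0, y0); shifts past N/2 wrap around to negative values.
    peak = np.array(np.unravel_index(np.argmax(ncc), ncc.shape), dtype=float)
    shape = np.array(ncc.shape)
    peak[peak > shape / 2] -= shape[peak > shape / 2]
    return peak
```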

Sub-pixel Precision
Up-sample the DFT of the NCC by a factor k (trigonometric interpolation); the peak coordinates (x0, y0) of the up-sampled correlation then give the translation with a precision of 1/k pixel.
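This up-sampled DFT refinement of [2] is available off the shelf in scikit-image through the upsample_factor argument of phase_cross_correlation; the snippet below is a convenience illustration, not the authors' code.

```python
# Sub-pixel shift estimation by local up-sampling of the correlation DFT (after [2]).
from skimage.registration import phase_cross_correlation


def estimate_shift_subpixel(i_ref, i_t, k=4):
    """Refine the correlation peak to 1/k pixel without building a full k-times-up-sampled image."""
    # Only a small neighbourhood around the coarse peak is up-sampled (matrix-multiply DFT),
    # so the extra cost stays modest even for large k.
    shift, error, phasediff = phase_cross_correlation(i_ref, i_t, upsample_factor=k)
    return shift
```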

Reference Image Updating
Goal: maintain a reliable motion estimator (a high NCC peak value) while the image appearance changes during the pulse.

Reference Image Updating
Solution: use the NCC peak value to trigger the update of Iref: as long as the peak stays high, the current registered frame replaces the reference; when the NCC peak drops too low, Iref is not updated. A sketch of this trigger logic follows.
(Figure: NCC peak value over time, with the successive Iref update events marked.)
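A hedged sketch of that trigger logic; the threshold value, the shift-estimator interface and the class name are assumptions, since the talk does not give the exact criterion.

```python
# Sketch of NCC-peak-driven reference updating (threshold and interface are assumptions).
import numpy as np


class ReferenceUpdater:
    def __init__(self, first_frame, peak_threshold=0.5):
        self.i_ref = first_frame
        self.peak_threshold = peak_threshold

    def register(self, frame, estimate_shift):
        """estimate_shift(ref, img) is assumed to return (shift, ncc_peak)."""
        shift, ncc_peak = estimate_shift(self.i_ref, frame)
        if ncc_peak >= self.peak_threshold:
            # Correlation still strong: the realigned frame becomes the new reference,
            # compensating slow appearance drift during the pulse.
            # (The sign convention of the roll depends on the shift estimator used.)
            self.i_ref = np.roll(frame, -np.round(shift).astype(int), axis=(0, 1))
        # NCC peak too low (e.g. during a disruption): keep the previous reference.
        return shift, ncc_peak
```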

Results
JET #81313 (MARFE, disruption), KL7, 480x512 pixels, 50 Hz, 251 frames; sub-pixel precision: 1/4 pixel.

Results
JET #80827 (disruption), KL7, 128x256 pixels, 540 Hz, 13425 frames; sub-pixel precision: 1/2 pixel.

Results
JET #82278, KL9B (slow drift), 32x96 pixels, 6 kHz, 4828 frames.

Computational Performance
High frame-rate performance using a GPU: 256x256-pixel images at 1/4-pixel precision are registered at 700 fps.
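The quoted throughput relies on moving the FFTs to the GPU. As a rough illustration only (the talk does not say which GPU library was used), the CPU sketch above translates almost line for line to CuPy.

```python
# GPU version of the Fourier correlation using CuPy (library choice is an assumption).
import cupy as cp


def estimate_shift_gpu(i_ref, i_t):
    """Integer-pixel shift estimated on the GPU; wrap-around and sub-pixel refinement omitted."""
    f_ref = cp.fft.fft2(cp.asarray(i_ref))
    f_t = cp.fft.fft2(cp.asarray(i_t))
    cross = f_t * cp.conj(f_ref)
    ncc = cp.fft.ifft2(cross / (cp.abs(cross) + 1e-12)).real
    peak = cp.unravel_index(cp.argmax(ncc), ncc.shape)
    return tuple(int(p) for p in peak)
```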

From 2D to 3D
Challenge: transform pixel coordinates into machine coordinates, (x, y) → (r, θ, φ).
Method: ray tracing from the 3D (or simplified) CAD models.
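A hedged sketch of that idea: back-project each pixel through a pinhole camera model and intersect the resulting ray with a mesh of the in-vessel components. The pinhole model, the trimesh library and all parameter names are assumptions for illustration; the actual tool works from the machine CAD files and its own calibrated camera model.

```python
# Pixel -> machine coordinates by ray casting against a (simplified) CAD mesh of the vessel.
# Pinhole model, trimesh and all names here are illustrative assumptions.
import numpy as np
import trimesh


def pixel_to_machine(u, v, K, R, t, mesh):
    """u, v: pixel; K: 3x3 intrinsics; R, t: world-to-camera pose; mesh: trimesh.Trimesh."""
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])    # viewing direction in the camera frame
    d_world = R.T @ d_cam                               # rotate into the machine frame
    origin = -R.T @ t                                   # camera centre in the machine frame

    locations, _, _ = mesh.ray.intersects_location(
        ray_origins=[origin], ray_directions=[d_world])
    if len(locations) == 0:
        return None                                     # ray misses the in-vessel components

    # Keep the first surface hit and convert to cylindrical machine coordinates (R, phi, Z);
    # a poloidal angle theta can then be derived from (R, Z) if an (r, theta, phi)
    # convention is preferred.
    x, y, z = min(locations, key=lambda p: np.linalg.norm(p - origin))
    return np.hypot(x, y), np.arctan2(y, x), z
```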

3D Scene Model for Image Processing
S. Palazzo, A. Murari et al., RSI 81, 083505, 2010; V. Martin et al.
Blobs 1 and 2 must not be merged: although adjacent in the image, the Z map (per-pixel depth) from the 3D scene model places them at roughly 2 m and 7 m from the camera.
(Figure: IR image with the two labelled blobs and the corresponding Z (depth) map.)
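The same argument in miniature: a depth-aware merge test vetoes the fusion of image-adjacent blobs whose depths differ strongly. The threshold, data layout and values below are assumptions chosen only to mirror the 2 m / 7 m example.

```python
# Sketch of depth-aware blob merging: two hot spots that touch in the image may lie on
# components metres apart and must not be merged. Threshold and layout are assumptions.
import numpy as np


def should_merge(blob_a, blob_b, z_map, max_gap_m=0.05):
    """blob_a, blob_b: (row, col) centroids; z_map: per-pixel distance to the camera in metres."""
    z_a = z_map[blob_a]
    z_b = z_map[blob_b]
    # Without the 3D model, only pixel adjacency would be tested; with the Z map the depth
    # discontinuity between the two blobs vetoes the merge.
    return abs(z_a - z_b) < max_gap_m


# Example: two blobs touch in the image but sit at about 2 m and 7 m respectively.
z_map = np.full((480, 512), 2.0)
z_map[:, 256:] = 7.0
print(should_merge((240, 250), (240, 260), z_map))   # False: do not merge
```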

2D/3D Scene Model Mapping: Integrated Framework
An integrated software package for IR data stabilization and analysis: the Plasma ImagiNg data Understanding Platform (PINUP).
Pipeline: Camera → Image Correction (NUC, dead pixel map) → Image Stabilization (reference image; user settings for the sub-pixel precision factor, the mask, and loading/saving of translations) → 2D/3D Scene Model Mapping (2D/3D scene models) → Image Processing (knowledge base of the thermal scene) → registered & calibrated image.
The outputs are used for PFC protection, temperature evaluation and event triggering.

Conclusion
Summary:
- Complex IR scenes require a new approach to reliable data analysis, including image stabilization and 3D mapping.
- A robust and fast image stabilization algorithm with sub-pixel precision has been proposed.
- A first demonstration of a 3D model for IR data analysis has been successfully carried out at JET on the wide-angle ITER-like viewing system (KL7).
- An integrated software package (PINUP) implementing these features is available to users upon request.
Outlook:
- Test the stabilization algorithm on visible imaging data (JET KL8) with rotation compensation.
- Full integration of 3D scene models into PINUP.
- Improvement of image processing algorithms (e.g. hot spot detection) with 3D information.