Structured Light
Prasanna Rangarajan, Dr. Marc P. Christensen, Vikrant R. Bhakta, Dr. Panos Papamichalis


Page  2 Organization Imaging under “Structured Light” –what is “Structured Light” ? –estimating depth using “Structured Light” –Optical Super-Resolution : using Structured Light to overcome the lowpass nature of an imaging system –how is Optical Super-Resolution different from Digital Super-Resolution ? –what is wrong with state-of-the-art in Optical Super-Resolution ? Macroscopic OSR using Structured Light ( Uncalibrated ) Depth estimation using Structured Light ( Uncalibrated ) OSR + Depth estimation in a single setup ( Experimental Results )

Page  3 Structured Light and its applications What is Structured Light ?..... periodic light patterns Why is it useful ? –Traditionally, used to recover depth maps & surface topology –Recently, used in microscopes to resolve spatial detail that cannot be resolved by the microscope

Page  4 Closer look at Depth from Structured Light Phase Measuring Profilometry Principle –project a sinusoidal (periodic) pattern onto the scene, at a known angle –image of scene viewed from a different position AND-OR angle, reveals lateral displacements + frequency changes related to topological variations Mephisto 3D Scanner from 3D Dynamics SL hits from TI website 1.Application Report DLPA021 “Using the DLP Pico 2.0 Kit for Structured Light Applications” 2.Blog entry “3D Metrology and Structured Light”, by Dennis Doane DLP Other DLP based SL-Scanners ViaLUX, GFM, 3D3, ShapeQuest

Page  5 Problem : Cameras behave like low-pass filters because their impulse response is real non-negative finite bandwidth & resolution Objective of Optical Super-Resolution : Improve the resolution of a camera without altering its physical parameters: Optical Super-Resolution using Structured Light has revolutionized microscopy in recent years Principle : shift frequencies outside the passband into the passband How ? modulate the amplitude of a periodic pattern with scene information

Page  6 Optical Super-Resolution using Structured Light How is it different from Digital Super-Resolution ? Optical Super-Resolution See Optical Super-Resolution in action Digital Super-Resolution Recover spatial frequencies ( beyond the optical cutoff ) Recover spatial frequencies lost to aliasing ( but upto the optical cutoff )

Page  7 Optical Super-Resolution using Structured Light Perspective & De-magnification scene-dependent distortion ( useful for recovering depth but not OSR ) Perspective & de-magnification present a real challenge for macroscopic imaging /illumination systems such as commercial cameras/projectors Imaging & illumination systems in Structured Light-microscopy DO NOT experience significant perspective effects imaging parallel lines on railroad track How do we eliminate the scene-dependent distortion ?

Page  8 Solution-1 : Collocate the camera & projector, and illuminate the scene with a specific periodic pattern Daniel A. Vaquero, Ramesh Raskar, Rogerio S. Feris, & Matthew Turk. ”A Projector-Camera Setup for Geometry-Invariant Frequency Demultiplexing”. In IEEE Computer Vision and Pattern Recognition (CVPR'09) Macroscopic OSR using Structured Light Eliminating the scene-dependent distortion Are we really shifting frequencies outside the passband of the optics, into the passband ? Solution-2 : Coincide the camera & projector using a beam-splitter L. Zhang & S. K. Nayar, “Projection Defocus Analysis for Scene Capture and Image Display”, SIGGRAPH2006. “Macroscopic OSR” for imaging systems observing a 3D scene unsolved since 1963 W. Lukosz and M. Marchand, "Optischen Abbildung Unter Ueberschreitung der Beugungsbedingten Aufloesungsgrenze," Opt. Acta 10, (1963)

Page  9 Macroscopic OSR using Structured Light Our contributions –Identify a family of camera+projector setups that can realize OSR in macroscopic imaging, for arbitrary scenes –Unify existing embodiments of Structured Light –Single setup for recovering depth & realizing OSR raw image super-resolved image depth map Publications “Perspective Imaging under Structured Light”, accepted for publication in European Conference on Computer Vision, 2010 “Surpassing the Diffraction-limit of Digital Imaging Systems using Sinusoidal Illumination Patterns”, Computational Optical Sensing and Imaging, OSA Technical Digest (Optical Society of America), 2009 “A Method and Apparatus for Surpassing the Diffraction Limit in Imaging Systems”, filed patent

Page  10 Organization Imaging under “Structured Light” –what is “Structured Light” ? –estimating depth using “Structured Light” –Optical Super-Resolution : using Structured Light to overcome the lowpass nature of an imaging system –how is Optical Super-Resolution different from Digital Super-Resolution ? –what is wrong with state-of-the-art in Optical Super-Resolution ? Macroscopic OSR using Structured Light ( Uncalibrated ) Depth estimation using Structured Light ( Uncalibrated ) OSR + Depth estimation in a single setup ( Experimental Results )

Page  11 Identify the raw image and the exponentially modulated images Macroscopic OSR under Structured Light Complete Workflow Camera images under sinusoidal ilumination

Page  12 Identify the frequency of the modulating pattern After modulation, the DC component in shifts to the carrier frequency Macroscopic OSR under Structured Light Complete Workflow The DC component of the super-resolved image must have zero phase. Use this to identify Camera images under sinusoidal ilumination

Page  13 Aliasing Management avoid aliasing demodulated spatial frequencies that exceed the detector Nyquist frequency Macroscopic OSR under Structured Light Complete Workflow Camera images under sinusoidal ilumination

Page  14 Aliasing Management avoid aliasing demodulated spatial frequencies that exceed the detector Nyquist frequency Macroscopic OSR under Structured Light Complete Workflow How is it done ? (sinc-interpolation) symmetrically increase the size of the modulated images by prior to demodulation Camera images under sinusoidal ilumination

Page  15 Macroscopic OSR under Structured Light Complete Workflow Without aliasing management With aliasing management Aliasing Management avoid aliasing demodulated spatial frequencies that exceed the detector Nyquist frequency Camera images under sinusoidal ilumination

Page  16 Demodulation + Phase Compensation Macroscopic OSR under Structured Light Complete Workflow Any collocated/co-incident camera+projector setup can be used to recover spatial frequencies exceeding the bandwidth of an imaging system Quick Recap Camera images under sinusoidal ilumination

Page  17 Organization Imaging under “Structured Light” –what is “Structured Light” ? –estimating depth using “Structured Light” –Optical Super-Resolution : using Structured Light to overcome the lowpass nature of an imaging system –how is Optical Super-Resolution different from Digital Super-Resolution ? –what is wrong with state-of-the-art in Optical Super-Resolution ? Macroscopic OSR using Structured Light ( Uncalibrated ) Depth estimation using Structured Light ( Uncalibrated ) OSR + Depth estimation in a single setup ( Experimental Results )

Page  18 Recap : Depth from Structured Light Phase Measuring Profilometry Principle –project a sinusoidal (periodic) pattern onto the scene, at a known angle –image of scene viewed from a different position AND-OR angle, reveals lateral displacements + frequency changes related to topological variations Mephisto 3D Scanner from 3D Dynamics SL hits from TI website 1.Application Report DLPA021 “Using the DLP Pico 2.0 Kit for Structured Light Applications” 2.Blog entry “3D Metrology and Structured Light”, by Dennis Doane DLP Other DLP based SL-Scanners ViaLUX, GFM, 3D3, ShapeQuest

Page  19 Depth from Collocated Structured Light Complete Workflow To avoid ambiguities in phase unwrapping, 2 patterns ( 1 small frequency, 1 large frequency) are employed

Page  20 Organization Imaging under “Structured Light” –what is “Structured Light” ? –estimating depth using “Structured Light” –Optical Super-Resolution : using Structured Light to overcome the lowpass nature of an imaging system –how is Optical Super-Resolution different from Digital Super-Resolution ? –what is wrong with state-of-the-art in Optical Super-Resolution ? Macroscopic OSR using Structured Light ( Uncalibrated ) Depth estimation using Structured Light ( Uncalibrated ) OSR + Depth estimation in a single setup ( Experimental Results )

Page  21 Experimental Results Setup-1 : vertically collocated camera+projector

Page  22 Experimental Results - OSR Setup-1 : vertically collocated camera+projector OSR is possible only in the horizontal direction

Page  23 Experimental Results – Estimating depth Setup-1 : vertically collocated camera+projector

Page  24 Experimental Results - OSR Setup-2 : non-collocated camera+projector

Page  25 Experimental Results Setup-2 : non-collocated camera+projector Without aliasing management With aliasing management

Page  26 Closing Arguments & Open Issues Putting things in perspective It is possible to resolve detail exceeding the BW of a macroscopic imaging system There are camera+projector setups that can recover depth information + resolve detail exceeding the bandwidth of the imaging system Can we super-reslove when the optical axes of the camera and projector are crossed ? Can we accommodate aliasing during image capture ? Bar-code scanners Counterfeit Bill Detection Non-contact fingerprint scanning Non-contact archived document scanning Artwork authentication