When Does a Camera See Rain? Kshitiz Garg, Shree K. Nayar, ICCV 2005.

Outline: Abstract; Dynamic Weather and Vision; Visibility of Rain; Camera Parameters for Rain Removal; Camera-Based Rain Gauge; Conclusion.

Abstract: The intensity fluctuations produced by rain depend on the camera parameters, the properties of rain, and the brightness of the scene. We set the camera parameters to remove rain from images and videos.

Dynamic Weather and Vision: Dynamic weather (rain and snow) introduces sharp intensity fluctuations in images and videos.

Dynamic Weather and Vision (Cont'd): Analysis of the visibility of rain, which depends on the square of the raindrop size, the brightness of the background scene, the exposure time, and the depth of field.

Dynamic Weather and Vision (Cont'd): The camera parameters used for the removal of rain are the exposure time, the F-number, and the focus setting.

Dynamic Weather and Vision (Cont'd): Camera-based rain gauge. Camera parameters can also be set to enhance the visual effects of rain, so a camera-based rain gauge can be built to measure rain rate.

Visibility of Rain: The variance of intensity at a pixel over time serves as a quantitative measure of the visibility of rain.
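As a rough illustration of this measure (a sketch, not the authors' implementation), the per-pixel temporal variance of a short grayscale frame stack can be computed directly; the synthetic clip below is an assumption used only to make the example runnable.

```python
import numpy as np

def rain_visibility_map(frames):
    """Per-pixel temporal variance of a grayscale frame stack.

    frames: array of shape (T, H, W). Higher values indicate stronger
    rain-induced intensity fluctuations at that pixel.
    """
    return np.asarray(frames, dtype=np.float64).var(axis=0)

def mean_rain_visibility(frames):
    """Scalar summary: average temporal variance over all pixels."""
    return float(rain_visibility_map(frames).mean())

if __name__ == "__main__":
    # Synthetic stand-in for a rainy clip: a static scene plus sparse
    # positive intensity spikes that mimic rain streaks.
    rng = np.random.default_rng(0)
    scene = rng.uniform(50, 200, size=(120, 160))
    clip = np.repeat(scene[None], 60, axis=0)
    streaks = (rng.random(clip.shape) < 0.01) * rng.uniform(10, 40, clip.shape)
    print("mean temporal variance:", mean_rain_visibility(clip + streaks))
```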

Visibility of Rain (Cont'd): Camera and Intensity of a Raindrop. Ref: B. K. P. Horn, Robot Vision, The MIT Press, 1986.
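The equation on this slide was not transcribed. Given the Horn reference, it presumably builds on the standard thin-lens image-irradiance relation; the sketch below states that relation (with d the aperture diameter, f the focal length, N = f/d the F-number, and alpha the angle off the optical axis), not the slide's exact expression.

```latex
% Image irradiance at a pixel viewing a scene point (here, a raindrop)
% of radiance L (the standard relation from Horn, Robot Vision, 1986):
E \;=\; L\,\frac{\pi}{4}\left(\frac{d}{f}\right)^{2}\cos^{4}\alpha
\;=\; L\,\frac{\pi}{4N^{2}}\cos^{4}\alpha
```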

Visibility of Rain (Cont'd): Raindrops and Exposure Time. [Figure: the rain-visible region.]

Visibility of Rain (Cont'd): Raindrops and Exposure Time. [Condition equation not transcribed.] Ref: K. Garg and S. K. Nayar, Detection and Removal of Rain from Videos, CVPR 2004.
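The condition itself was not transcribed. From the referenced CVPR 2004 paper, the photometric model is roughly the following temporal blending, where a drop projects onto a pixel for a time tau much shorter than the exposure time T (the notation is an approximation of that paper's):

```latex
% A falling drop covers a pixel only for \tau \ll T, so the recorded
% intensity is a blend of the drop's time-averaged brightness \bar{I}_d
% and the background brightness I_b:
I_r \;=\; \frac{\tau}{T}\,\bar{I}_d + \Bigl(1 - \frac{\tau}{T}\Bigr) I_b,
\qquad
\Delta I \;=\; I_r - I_b \;=\; \frac{\tau}{T}\bigl(\bar{I}_d - I_b\bigr)
% Longer exposures (larger T) therefore shrink the streak's intensity change.
```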

Visibility of Rain (Cont'd): Rain and Depth of Field. Since raindrops fall at high velocity, we assume [assumption not transcribed on the slide]. Ref: K. Garg and S. K. Nayar, Detection and Removal of Rain from Videos, CVPR 2004.

Visibility of Rain (Cont'd): Rain and Depth of Field. [Condition equation not transcribed.] Ref: K. Garg and S. K. Nayar, Detection and Removal of Rain from Videos, CVPR 2004.
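The expressions on these two depth-of-field slides were not transcribed. The geometry behind them is the standard thin-lens defocus relation; the sketch below (with focal length f, F-number N, focus distance z_0, and drop distance z) is a generic statement of that relation, not the paper's exact formula.

```latex
% Diameter of the defocus blur circle for a drop at distance z when the
% lens (aperture diameter f/N) is focused at distance z_0:
b_c(z) \;=\; \frac{f^{2}\,\lvert z - z_0 \rvert}{N\,z\,(z_0 - f)}
% A wide aperture (small N) gives a shallow depth of field, so drops away
% from the focal plane are spread over a large blur circle and their
% per-pixel intensity contribution (and hence the visibility of rain) drops.
```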

Visibility of Rain (Cont'd): Camera Parameters and Volume of Rain. We partition the rain volume into thin layers. The variance due to the whole volume of rain is then the sum of the variances due to the individual layers.
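In symbols (the notation below is assumed for this sketch, since the slide's equations were not transcribed), with layers at depths z_i of thickness Delta z contributing independently:

```latex
\sigma^{2}_{\mathrm{vol}} \;=\; \sum_{i} \sigma^{2}_{\mathrm{layer}}(z_i)
\;\approx\; \int_{z_{\min}}^{z_{\max}} s(z)\,dz,
\qquad \text{where } s(z)\,\Delta z \text{ is the variance due to the layer at depth } z.
```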

Visibility of Rain (Cont'd): Camera Parameters and Volume of Rain. The variance due to a single layer of rain: [equation not transcribed].

Visibility of Rain (Cont'd): Camera Parameters and Volume of Rain. The standard deviation due to a volume of rain: [equation not transcribed].

Visibility of Rain (Cont'd): Camera Parameters and Volume of Rain. The visibility of rain decreases with the exposure time of the camera.
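A back-of-the-envelope argument for this trend (my sketch under a Poisson-arrival assumption, not the paper's derivation): combining the blending model above with the number of drops seen during one exposure gives

```latex
% Each drop changes the pixel intensity by \Delta I = (\tau/T)(\bar{I}_d - I_b).
% If drops cross the pixel as a Poisson process of rate \lambda, about \lambda T
% of them arrive during one exposure, so the rain-induced variance scales as
\sigma^{2} \;\sim\; \lambda T \left(\frac{\tau}{T}\right)^{2}
\bigl(\bar{I}_d - I_b\bigr)^{2}
\;=\; \frac{\lambda\,\tau^{2}\bigl(\bar{I}_d - I_b\bigr)^{2}}{T}
\;\propto\; \frac{1}{T}
% so longer exposures suppress the visibility of rain.
```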

Visibility of Rain (Cont'd): Camera Parameters and Volume of Rain. The visibility of rain initially increases rapidly with the F-number and then saturates at higher F-numbers.

Visibility of Rain (Cont'd): Camera Parameters and Volume of Rain. The visibility of rain also depends on the distance of the focal plane from the camera.

Camera Parameters for Rain Removal: Reducing rain using depth of field. As the depth of field decreases, the visibility of rain decreases.

Camera Parameters for Rain Removal (Cont'd): Reducing rain using exposure time. As the exposure time increases, the visibility of rain decreases.

Camera Parameters for Rain Removal (Cont'd): Reducing Rain Using Multiple Parameters.

Camera Parameters for Rain Removal (Cont'd): Reducing Heavy Rain.

Camera Parameters for Rain Removal (Cont'd): This method cannot reduce rain in scenes with very fast motion or when objects are very close to the camera.

Camera-Based Rain Gauge: A camera-based rain gauge is a device that measures rain rate. Ref: R. Gunn and G. D. Kinzer, The Terminal Velocity of Fall for Water Droplets in Stagnant Air, J. Meteor., 6:243-248, 1949; T. Wang and R. S. Clifford, Use of Rainfall-Induced Optical Scintillations to Measure Path-Averaged Rain Parameters, JOSA, 8:927-937, 1975.
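As an illustration of what such a gauge ultimately computes (an illustrative sketch only, not the paper's algorithm), rain rate is the flux of water volume through a horizontal unit area; it can be estimated from a drop-size distribution recovered from video together with a terminal-velocity fit to the Gunn-Kinzer data. The function names and the power-law fit constants below are assumptions.

```python
import numpy as np

def terminal_velocity(d_mm):
    """Approximate terminal velocity (m/s) of a raindrop of diameter d_mm (mm).

    One common empirical power-law fit to the Gunn-Kinzer measurements;
    the exact constants are an assumption of this sketch.
    """
    return 3.78 * np.asarray(d_mm, dtype=float) ** 0.67

def rain_rate_mm_per_hr(diameters_mm, number_density_per_m3):
    """Estimate rain rate from a sampled drop-size distribution.

    diameters_mm:           drop diameters of the distribution bins (mm)
    number_density_per_m3:  drops per cubic metre in each bin

    Rain rate is the water-volume flux: sum over bins of
    (drop volume) * (number density) * (terminal velocity).
    """
    d_m = np.asarray(diameters_mm, dtype=float) * 1e-3           # mm -> m
    vol_m3 = (np.pi / 6.0) * d_m ** 3                            # sphere volume
    flux_m_per_s = np.sum(vol_m3 * number_density_per_m3 *
                          terminal_velocity(diameters_mm))       # metres of water per second
    return flux_m_per_s * 1e3 * 3600.0                           # -> mm/hr

if __name__ == "__main__":
    # Hypothetical drop-size histogram recovered from rain detected in video.
    diameters = np.array([0.5, 1.0, 1.5, 2.0, 2.5])              # mm
    densities = np.array([400.0, 150.0, 60.0, 20.0, 5.0])        # drops per m^3
    print(f"estimated rain rate: {rain_rate_mm_per_hr(diameters, densities):.1f} mm/hr")
```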

Camera-Based Rain Gauge (Cont'd)

Conclusion: The visibility of rain is affected by factors such as the camera parameters, the properties of rain, and the brightness of the scene. The method is less effective at reducing rain in scenes with very heavy rain or with fast-moving objects close to the camera.

Conclusion (Cont'd): The visibility of rain can also be enhanced, which allows an inexpensive and portable camera-based rain gauge that provides instantaneous rain-rate measurements.