Introduction to Computational Photography

Presentation transcript:

Introduction to Computational Photography

Computational Photography: What is computational photography? It is the second breakthrough brought to cameras by information technology. The first was the electronic image sensor (the digital camera), which gave a digital representation of the image formed by the lens. The second is a re-definition of the whole camera, including its optics and its usage: the final image is reconstructed by computation. Whereas the film-to-digital transition replaced only the recording medium, computational photography affects every part of the camera (optics, image sensor, image processing).

Light field (light space): What is a camera? A camera is a machine that records the distribution of light in a scene. How can that distribution be represented? A light ray is specified by: the 3-D coordinates of a point it passes through (X, Y, Z); its direction (θ, φ); its wavelength, i.e. color (λ); and time (t). The 7-parameter function P(X, Y, Z, θ, φ, λ, t) that gives the distribution of light is called the "plenoptic function".

Integration by the camera: light field (light ray), optics (lens), image sensor (pixel). A camera integrates the light over all 7 parameters. Position: a range of X, Y, Z (the aperture size is not zero). Direction: a range of θ, φ (the pixel size is not zero). Wavelength: a range of λ (no filter passes a single wavelength). Exposure time: a range of t (the shutter speed cannot be infinitely short). The camera also takes multiple samples: over θ and φ (the number of pixels), over λ (RGB), and over t (burst shots). So what does multiple sampling over X, Y, Z look like?
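The integration above can be sketched numerically. This is a minimal toy illustration, not a measured scene: the plenoptic function below is made up, and a pixel value is simply P averaged over a small range of position, direction, wavelength, and time.

```python
import numpy as np

def plenoptic(X, Y, Z, theta, phi, lam, t):
    """Toy 7-parameter plenoptic function P: radiance of the ray
    through (X, Y, Z) in direction (theta, phi), at wavelength lam
    (nm) and time t. Purely illustrative; Z and t barely matter here."""
    return ((1 + np.cos(theta) * np.cos(phi))
            * np.exp(-((lam - 550) / 80) ** 2)
            * np.exp(-(X ** 2 + Y ** 2)))

def pixel_value(n=8):
    """One pixel = P integrated (here: averaged on a coarse grid) over
    a small aperture (X, Y), a small solid angle (theta, phi), the
    sensor's spectral band (lam), and the exposure time (t)."""
    X, Y = np.meshgrid(np.linspace(-1e-3, 1e-3, n),
                       np.linspace(-1e-3, 1e-3, n))
    vals = []
    for th in np.linspace(-0.01, 0.01, n):       # pixel's angular extent
        for lam in np.linspace(500, 600, n):     # e.g. a green filter band
            for t in np.linspace(0.0, 0.01, n):  # shutter open 10 ms
                vals.append(plenoptic(X, Y, 0.0, th, 0.0, lam, t).mean())
    return float(np.mean(vals))

print(pixel_value())
```

Every camera design in this lecture can be read as a different choice of which ranges to integrate over and which to sample.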

Camera array: measuring the distribution of light at multiple positions. Examples: ProFUSION25 (ViewPlus, Inc.); the Stanford Multi-Camera Array (Marc Levoy's group, Stanford University).

Uses of a camera array: free-viewpoint images; defocus generation by synthetic aperture; 3-D video (Matsuyama lab, Kyoto Univ.). By shifting and averaging the views, the array behaves like a single camera with a very large (synthetic) aperture, so its depth of field can be made very shallow.
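Synthetic-aperture defocus can be sketched as shift-and-add refocusing. A minimal sketch: the function name and the integer-pixel disparities are assumptions, and real arrays need calibrated, sub-pixel shifts.

```python
import numpy as np

def synthetic_aperture(images, shifts):
    """Shift-and-add refocusing: translate each camera's image so the
    chosen focal plane aligns across views, then average. Points on
    that plane reinforce; points off it land at different positions in
    each view and are averaged into defocus blur."""
    acc = np.zeros(images[0].shape, dtype=float)
    for img, (dy, dx) in zip(images, shifts):
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(images)

# Example: a point on the focal plane appears shifted by k pixels in
# camera k; undoing those disparities realigns it exactly.
scene = np.zeros((8, 8)); scene[4, 4] = 1.0
views = [np.roll(scene, k, axis=1) for k in range(3)]   # cameras 0..2
refocused = synthetic_aperture(views, [(0, -k) for k in range(3)])
```

Choosing a different disparity-per-camera refocuses on a different plane, which is exactly the "defocus generation" use above.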

Defocus control by uncalibrated synthetic aperture. Natsumi Kusumoto, Shinsaku Hiura and Kosuke Sato, "Uncalibrated Synthetic Aperture for Defocus Control," CVPR 2009 (Jun. 2009).

Reviewing "integration": some information is lost by integration. A sine wave whose period exactly divides the integration duration integrates to zero and is lost. This is what happens in motion blur of an object during the exposure time, and in defocus from misfocus.
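A quick numerical check of the lost sine wave (a sketch; the duration and sample count are arbitrary):

```python
import numpy as np

T = 1.0                                # integration (exposure) duration
dt = 1e-4
t = np.arange(0.0, T, dt)
for period in (T, T / 2, T / 4):       # periods that divide T exactly
    s = np.sin(2 * np.pi * t / period)
    integral = s.sum() * dt            # rectangle-rule integration
    print(f"period {period}: integral = {integral:.2e}")  # all ~0: lost

# A sine whose period does NOT divide T survives integration:
print(np.sum(np.sin(2 * np.pi * t / (0.7 * T))) * dt)
```

In the frequency domain this says the box filter of length T has zeros at multiples of 1/T, which is exactly the problem coded exposure is designed to avoid.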

Coded exposure: the exposure is coded along the time axis (the "flutter shutter" camera).

[Figure: traditional vs. coded exposure of a static object, and the deblurred image. Slide by R. Raskar]

Coded exposure: a temporal 1-D broadband code, for motion deblurring. Coded aperture: a spatial 2-D broadband mask, for focus deblurring. (Slide by R. Raskar)
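The benefit of a broadband code shows up in its frequency response. A sketch: the 32-tap code below is an arbitrary pseudorandom pattern chosen for illustration, not the published flutter-shutter sequence.

```python
import numpy as np

# A flat (box) shutter blurs motion with a box filter whose spectrum
# has exact zeros: those frequencies are destroyed and cannot be
# deconvolved. A broadband on/off code keeps every frequency nonzero,
# so the blur remains invertible.
box = np.ones(32)
code = np.array([1,0,1,1,0,0,1,1, 1,0,1,0,0,1,1,0,
                 1,1,0,0,1,0,1,1, 0,0,1,0,1,1,0,1], dtype=float)

mtf_box = np.abs(np.fft.rfft(box, 256))    # zero-padded spectra
mtf_code = np.abs(np.fft.rfft(code, 256))

print("box  min response:", mtf_box.min())   # ~0: lost frequencies
print("code min response:", mtf_code.min())  # bounded away from zero
```

The same reasoning applies in 2-D to the coded aperture mask: a broadband mask avoids the spectral zeros of an open circular aperture.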

[Figure: captured blurred photo. Slide by R. Raskar]

[Figure: refocused on the person. Slide by R. Raskar]

Coded aperture: depth estimation from a single image (manual operation is necessary).

Multi-focus camera with coded aperture: stabilizing depth estimation and deblurring by means of a coded aperture, with simultaneous capture of three images focused at different distances. Hiura et al., CVPR 1998, SSII 1999.
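Once the aperture code, and hence the PSF, is known, deblurring reduces to deconvolution. A minimal sketch using Wiener filtering; the tiny PSF and the SNR value are made up, and this is not the authors' actual pipeline.

```python
import numpy as np

def wiener_deblur(blurred, psf, snr=1e4):
    """Frequency-domain Wiener deconvolution with a known PSF.
    A broadband (coded) aperture keeps |H| away from zero, so the
    regularized division stays well conditioned."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(W * G))

# Example: blur a single bright point with a small PSF (circular
# convolution via FFT), then recover it.
psf = np.array([[0.6, 0.2],
                [0.2, 0.0]])            # its spectrum has no zeros
img = np.zeros((16, 16)); img[5, 7] = 1.0
H = np.fft.fft2(psf, s=img.shape)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
restored = wiener_deblur(blurred, psf)
```

Depth estimation fits the same framework: each candidate depth predicts a different scaled PSF, and the depth whose deconvolution best explains the image wins.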

Multi-Focus Range Sensor using Coded Aperture

Invariant integration: defocus changes with distance, and motion blur changes with the speed of the object. Reconstruction is not easy because the speed or distance must first be estimated. Is it possible to make defocus or blur invariant to distance or speed?

Invariant integration. For defocus: special optics (wavefront coding, CDM Optics, Inc.), or motion of the image sensor during the exposure. For blur: reciprocating motion of the camera.

Motion of the image sensor for invariant defocus. H. Nagahara, S. Kuthirummal, C. Zhou, and S. K. Nayar, "Flexible Depth of Field Photography," ECCV 2008.

Motion of the image sensor for invariant defocus (continued). H. Nagahara, S. Kuthirummal, C. Zhou, and S. K. Nayar, "Flexible Depth of Field Photography," ECCV 2008.
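Why sweeping the sensor helps can be sketched in 1-D. This is a toy model, not the paper's optics: defocus is modeled as a box PSF whose width grows with distance from the in-focus plane, and the sweep range and depths are arbitrary.

```python
import numpy as np

def box_psf(width, n=101):
    """Toy 1-D defocus PSF: a normalized box, wider when the scene
    point is farther from the in-focus plane."""
    psf = np.zeros(n)
    c, half = n // 2, width // 2
    psf[c - half:c + half + 1] = 1.0
    return psf / psf.sum()

def swept_psf(depth, sweep=range(0, 21, 2), n=101):
    """During the exposure the in-focus plane sweeps through the
    scene, so the recorded PSF is the average of the instantaneous
    defocus PSFs. It comes out (nearly) the same at every depth."""
    return np.mean([box_psf(abs(depth - d), n) for d in sweep], axis=0)

p_near, p_far = swept_psf(4), swept_psf(16)   # two very different depths
# One deconvolution kernel now serves the whole scene.
```

A static sensor gives sharply different PSFs at these two depths, so deblurring would require knowing the depth first; after the sweep the kernels match and a single deconvolution suffices.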

Deblurring by reciprocating motion of the camera. A. Levin, P. Sand, T. S. Cho, F. Durand, W. T. Freeman, "Motion-Invariant Photography," SIGGRAPH 2008. [Figures: input image and deblurred image]

Deblurring by reciprocating motion of the camera (continued). [Figures: the equipment, and a conceptual figure for light sources moving at different speeds.] A. Levin, P. Sand, T. S. Cho, F. Durand, W. T. Freeman, "Motion-Invariant Photography," SIGGRAPH 2008.
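A one-line derivation sketches why a single kernel can serve all object speeds (standard reasoning under the assumption that the camera's viewpoint sweeps with constant acceleration a): a point moving at speed v has image-plane position

```latex
x(t) \;=\; v\,t - \frac{a}{2}\,t^{2}
      \;=\; \frac{v^{2}}{2a} - \frac{a}{2}\left(t - \frac{v}{a}\right)^{2}
```

so every speed v traces the same parabola, merely shifted in time and position. Within the range of speeds covered by the sweep, the blur kernel is therefore identical for all objects up to translation, and one deconvolution deblurs the whole scene.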

More: resources on the web: the Wikipedia article on computational photography. Conferences: the International Conference on Computational Photography (ICCP), and sessions on computational photography at SIGGRAPH, CVPR, and other venues.