Modeling Light 15-463: Rendering and Image Processing Alexei Efros.


Modeling Light. 15-463: Rendering and Image Processing. Alexei Efros

On Simulating the Visual Experience. Just feed the eyes the right data and no one will know the difference! Philosophy: the ancient question, "Does the world really exist?" Science fiction: many, many books on the subject; the latest take is The Matrix. Physics: slow glass might be possible? Computer Science: Virtual Reality. To simulate, we need to know: how and what does a person see?

Today: How do we see the world? (Geometry of image formation.) What do we see? (The plenoptic function.) How do we recreate visual reality? (Sampling the plenoptic function, ray reuse, and the "Theatre Workshop" metaphor.)

How do we see the world? Let's design a camera. Idea 1: put a piece of film in front of an object. Do we get a reasonable image? Slide by Steve Seitz

Pinhole camera. Add a barrier to block off most of the rays; this reduces blurring. The opening is known as the aperture. How does this transform the image? Slide by Steve Seitz

Camera Obscura. The first camera, known to Aristotle. The depth of the room is the focal length. Pencil of rays: all rays through a point. Can we measure distances? Slide by Steve Seitz

Distant objects are smaller Figure by David Forsyth

Camera Obscura. Drawing from "The Great Art of Light and Shadow" by the Jesuit Athanasius Kircher. How does the aperture size affect the image?

Shrinking the aperture. Why not make the aperture as small as possible? Less light gets through, and diffraction effects appear. Slide by Steve Seitz

Shrinking the aperture

Home-made pinhole camera

The reason for lenses Slide by Steve Seitz

Adding a lens. A lens focuses light onto the film. There is a specific distance at which objects are "in focus"; other points project to a "circle of confusion" in the image. Changing the shape of the lens changes this distance. Slide by Steve Seitz

Modeling projection: the coordinate system. We will use the pinhole model as an approximation. Put the optical center (center of projection, COP) at the origin, and put the image plane (projection plane, PP) in front of the COP. Why? The camera looks down the negative z axis, and we need this if we want right-handed coordinates. Slide by Steve Seitz

Modeling projection: projection equations. Compute the intersection with the PP of the ray from (x,y,z) to the COP, derived using similar triangles (on board). We get the projection by throwing out the last coordinate. Slide by Steve Seitz
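The equation itself did not survive the transcript; reconstructed from the similar-triangles setup above, with d the distance from the COP to the PP:
\[
(x,\, y,\, z) \;\mapsto\; \left(-d\,\frac{x}{z},\; -d\,\frac{y}{z},\; -d\right) \;\Rightarrow\; \left(-d\,\frac{x}{z},\; -d\,\frac{y}{z}\right)
\]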

Homogeneous coordinates. Is this a linear transformation? No: division by z is nonlinear. Trick: add one more coordinate, giving homogeneous image coordinates and homogeneous scene coordinates, and convert back from homogeneous coordinates by dividing. Slide by Steve Seitz
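In symbols (the slide's formulas were stripped; this is the standard construction):
\[
(x,\, y) \Rightarrow \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}, \qquad
(x,\, y,\, z) \Rightarrow \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}, \qquad
\begin{bmatrix} x \\ y \\ w \end{bmatrix} \Rightarrow \left(\frac{x}{w},\; \frac{y}{w}\right)
\]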

Perspective Projection. Projection is a matrix multiply using homogeneous coordinates, followed by a divide by the third coordinate. This is known as perspective projection, and the matrix is the projection matrix. It can also be formulated as a 4x4 matrix with a divide by the fourth coordinate. Slide by Steve Seitz
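The projection matrix was stripped from the transcript; in one common convention (image plane at distance d in front of the COP) it reads:
\[
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1/d & 0 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
=
\begin{bmatrix} x \\ y \\ -z/d \end{bmatrix}
\;\Rightarrow\;
\left(-d\,\frac{x}{z},\; -d\,\frac{y}{z}\right)
\]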

Orthographic Projection. A special case of perspective projection in which the distance from the COP to the PP is infinite; also called "parallel projection". What's the projection matrix? Slide by Steve Seitz
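The orthographic projection matrix (also stripped from the transcript) simply drops z:
\[
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
=
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
\;\Rightarrow\;
(x,\, y)
\]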

Spherical Projection. What if the PP is spherical with its center at the COP? In spherical coordinates, projection is trivial: (θ, φ) = (θ, φ). Note: it doesn't depend on the focal length d!
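For reference, one common convention for the spherical angles of a scene point (an assumption; the slide does not fix the convention):
\[
\theta = \operatorname{atan2}(y,\, x), \qquad
\phi = \operatorname{atan2}\!\left(z,\, \sqrt{x^2 + y^2}\right)
\]
Every scene point along a given ray maps to the same (θ, φ), which is why the result is independent of d.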

The eye. The human eye is a camera! Iris: colored annulus with radial muscles. Pupil: the hole (aperture) whose size is controlled by the iris. What's the "film"? Photoreceptor cells (rods and cones) in the retina.

The Plenoptic Function Q: What is the set of all things that we can ever see? A: The Plenoptic Function (Adelson & Bergen) Let’s start with a stationary person and try to parameterize everything that he can see… Figure by Leonard McMillan

Grayscale snapshot: P(θ, φ) is the intensity of light seen from a single viewpoint, at a single time, averaged over the wavelengths of the visible spectrum. (Can also do P(x,y), but spherical coordinates are nicer.)

Color snapshot: P(θ, φ, λ) is the intensity of light seen from a single viewpoint, at a single time, as a function of wavelength.

A movie: P(θ, φ, λ, t) is the intensity of light seen from a single viewpoint, over time, as a function of wavelength.

Holographic movie: P(θ, φ, λ, t, V_X, V_Y, V_Z) is the intensity of light seen from ANY viewpoint, over time, as a function of wavelength.

The Plenoptic Function P(θ, φ, λ, t, V_X, V_Y, V_Z). Can reconstruct every possible view, at every moment, from every position, at every wavelength. Contains every photograph, every movie, everything that anyone has ever seen! It completely captures our visual reality! Not bad for a function…

Sampling the Plenoptic Function (top view). New views are just a lookup: QuickTime VR.

Ray. Let's not worry about time and color: P(θ, φ, V_X, V_Y, V_Z) is 5D (3D position + 2D direction). Slide by Rick Szeliski and Michael Cohen

Ray Reuse. Along an infinite line, assume light is constant (a vacuum, or any non-dispersive medium): 4D (2D direction + 2D position). Slide by Rick Szeliski and Michael Cohen

Can still sample all images! Slide by Rick Szeliski and Michael Cohen

Lumigraph / Lightfield. Outside a convex space enclosing the stuff of interest (the surrounding region is empty), the light field is 4D. Slide by Rick Szeliski and Michael Cohen

Lumigraph - Organization. 2D position, 2D direction: (s, θ). Slide by Rick Szeliski and Michael Cohen

Lumigraph - Organization. 2D position on each of two planes: the 2-plane parameterization (s, u). Slide by Rick Szeliski and Michael Cohen

Lumigraph - Organization. 2D position, 2-plane parameterization: a ray is indexed by its intersection (s, t) with one plane and (u, v) with the other. Slide by Rick Szeliski and Michael Cohen
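A minimal Python sketch of this two-plane indexing (plane positions, axis choice, and names are illustrative assumptions, not from the slides):

import numpy as np

def ray_to_stuv(origin, direction, z_st=0.0, z_uv=1.0):
    # Intersect the ray p(a) = origin + a*direction with the planes
    # z = z_st and z = z_uv; return the (x, y) hits as (s, t, u, v).
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    a_st = (z_st - o[2]) / d[2]          # ray parameter at the (s, t) plane
    a_uv = (z_uv - o[2]) / d[2]          # ray parameter at the (u, v) plane
    s, t = (o + a_st * d)[:2]
    u, v = (o + a_uv * d)[:2]
    return s, t, u, v

# Example: a ray from (0.2, 0.1, -1) pointing straight down +z hits both planes at the same (x, y).
print(ray_to_stuv([0.2, 0.1, -1.0], [0.0, 0.0, 1.0]))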

Lumigraph - Organization. Hold (s, t) constant and let (u, v) vary: this gives an image. Slide by Rick Szeliski and Michael Cohen

Lumigraph / Lightfield

Lumigraph - Capture, Idea 1: move the camera carefully over the (s, t) plane using a gantry (see the Lightfield paper). Slide by Rick Szeliski and Michael Cohen

Lumigraph - Capture, Idea 2: move the camera anywhere and rebin the samples (see the Lumigraph paper). Slide by Rick Szeliski and Michael Cohen

Lumigraph - Rendering. For each output pixel, determine (s, t, u, v); then either use the closest discrete RGB sample or interpolate nearby values. Slide by Rick Szeliski and Michael Cohen

Lumigraph - Rendering. Nearest: take the closest s and closest u and draw it. Blend: quadrilinear interpolation over the 16 nearest samples. Slide by Rick Szeliski and Michael Cohen
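A minimal sketch of the quadrilinear blend in Python (the array layout L[s, t, u, v] and all names are illustrative assumptions, not from the Lumigraph paper):

import numpy as np

def lightfield_lookup(L, s, t, u, v):
    # Blend the 16 discrete samples surrounding the continuous coordinate
    # (s, t, u, v). L may have a trailing color axis.
    dims = np.array(L.shape[:4])
    coords = np.clip(np.array([s, t, u, v], dtype=float), 0, dims - 1)
    lo = np.minimum(np.floor(coords).astype(int), dims - 2)  # lower corner; lo+1 stays in bounds
    f = coords - lo                                           # fractional offset per axis
    out = 0.0
    for corner in range(16):                                  # the 16 corners of the 4D cell
        offs = np.array([(corner >> k) & 1 for k in range(4)])
        weight = np.prod(np.where(offs == 1, f, 1.0 - f))
        out = out + weight * L[tuple(lo + offs)]
    return out

Each of the 16 weights is a product of per-axis linear weights, so they sum to 1; dropping the blend and taking the single largest-weight sample gives the "nearest" variant.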

2D: Image. What is an image? All rays through a point. Panorama? Slide by Rick Szeliski and Michael Cohen

Spherical Panorama. All light rays through a point form a panorama, totally captured in a 2D array: P(θ, φ). Where is the geometry??? See also: 2003 New Year's Eve

The "Theatre Workshop" Metaphor (Adelson & Pentland, 1996). Desired image: Painter, Lighting Designer, Sheet-metal Worker.

Painter (images)

Lighting Designer (environment maps) Show Naimark SF MOMA video

Sheet-metal Worker (geometry) Let surface normals do all the work!

… working together. We want to minimize cost, so each one does what's easiest for him: geometry for the big things, images for detail, lighting for illumination effects. (Clever Italians.)

Façade demo Campanile Movie

Next Time. Start small: Image Processing. Assignment 1: out by Monday (check the web).