1
C280, Computer Vision. Prof. Trevor Darrell, trevor@eecs.berkeley.edu. Lecture 2: Image Formation
2
Administrivia We’re now in 405 Soda… New office hours: Thurs. 5-6pm, 413 Soda. I’ll make waitlist decisions tomorrow. Any Matlab issues yet? Roster…
3
Physical parameters of image formation Geometric – Type of projection – Camera pose Optical – Sensor’s lens type – focal length, field of view, aperture Photometric – Type, direction, intensity of light reaching sensor – Surfaces’ reflectance properties Sensor – sampling, etc.
4
Physical parameters of image formation Geometric – Type of projection – Camera pose Optical – Sensor’s lens type – focal length, field of view, aperture Photometric – Type, direction, intensity of light reaching sensor – Surfaces’ reflectance properties Sensor – sampling, etc.
5
Perspective and art Use of correct perspective projection indicated in 1st-century B.C. frescoes. Skill resurfaces in the Renaissance: artists develop systematic methods to determine perspective projection (around 1480-1515). Dürer, 1525; Raphael. K. Grauman
6
Perspective projection equations The 3D world is mapped to a 2D projection in the image plane. [Figure (Forsyth and Ponce): camera frame, image plane, optical axis, focal length, scene/world point, image coordinates.]
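The equations themselves appear only in the slide’s figure; a standard reconstruction of the pinhole projection, assuming a scene point P = (X, Y, Z) in the camera frame, focal length f, and image point (x', y'):

x' = f\,\frac{X}{Z}, \qquad y' = f\,\frac{Y}{Z}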
7
Homogeneous coordinates Is perspective projection a linear transformation? No: division by z is nonlinear. Trick: add one more coordinate, giving homogeneous image coordinates and homogeneous scene coordinates; convert back from homogeneous coordinates by dividing by the added coordinate. Slide by Steve Seitz
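A sketch of the convention, written in standard notation rather than copied from the slide’s (image-only) equations:

(x, y) \Rightarrow \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}, \qquad (X, Y, Z) \Rightarrow \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, \qquad \begin{bmatrix} x \\ y \\ w \end{bmatrix} \Rightarrow (x/w,\ y/w)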
8
Perspective Projection Matrix Projection is a matrix multiplication using homogeneous coordinates; divide by the third coordinate to convert back to non-homogeneous coordinates. Is this the complete mapping from world points to image pixel positions? Slide by Steve Seitz
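The matrix on the slide is an image; one common form, consistent with the focal-length-f convention above (assumed, not copied from the slide):

\begin{bmatrix} x \\ y \\ z/f \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1/f & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \;\Rightarrow\; \left( f\,\frac{X}{Z},\ f\,\frac{Y}{Z} \right)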
9
Perspective projection & calibration The perspective equations so far are in terms of the camera’s reference frame. The camera’s intrinsic and extrinsic parameters are needed to calibrate the geometry. K. Grauman
10
Perspective projection & calibration
2D point (3x1) = camera-to-pixel coordinate transform (3x3) × perspective projection matrix (3x4) × world-to-camera coordinate transform (4x4) × 3D point (4x1)
Intrinsic: image coordinates relative to camera → pixel coordinates
Extrinsic: world frame → camera frame
K. Grauman
11
Intrinsic parameters: from idealized world coordinates to pixel values Forsyth&Ponce Perspective projection W. Freeman
12
Intrinsic parameters But “pixels” are in some arbitrary spatial units W. Freeman
13
Intrinsic parameters Maybe pixels are not square W. Freeman
14
Intrinsic parameters We don’t know the origin of our camera pixel coordinates W. Freeman
15
Intrinsic parameters There may be skew between the camera pixel axes W. Freeman
16
Intrinsic parameters, homogeneous coordinates Using homogeneous coordinates, we can write the mapping from camera-based coordinates to pixels as a single matrix: W. Freeman
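A common form of that intrinsic matrix K, given here as an assumption since the slide’s equation is an image (α, β: focal length in pixel units along each axis; s: skew; (u_0, v_0): principal point):

\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \cong \begin{bmatrix} \alpha & s & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = K \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}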
17
Extrinsic parameters: translation and rotation of camera frame Non-homogeneous coordinates Homogeneous coordinates W. Freeman
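For reference, the two forms the slide contrasts, in standard notation rather than copied from the slide’s images (R a 3x3 rotation, t a 3x1 translation):

P_{cam} = R\,P_{world} + t, \qquad \begin{bmatrix} P_{cam} \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} P_{world} \\ 1 \end{bmatrix}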
18
Combining extrinsic and intrinsic calibration parameters, in homogeneous coordinates. [Figure (Forsyth & Ponce): world coordinates → camera coordinates → pixels, via the extrinsic and then the intrinsic matrix.] W. Freeman
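A standard way to write the combined mapping (assumed notation: pixel coordinates p, intrinsic matrix K, extrinsics R and t, homogeneous world point P):

p \cong K\,[\,R \;\; t\,]\,P = M\,P, \qquad M \in \mathbb{R}^{3 \times 4}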
19
Other ways to write the same equation (pixel coordinates in terms of world coordinates). Conversion back from homogeneous coordinates leads to the expressions below. W. Freeman
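Writing m_1^T, m_2^T, m_3^T for the rows of M (standard notation, assumed here):

u = \frac{m_1 \cdot P}{m_3 \cdot P}, \qquad v = \frac{m_2 \cdot P}{m_3 \cdot P}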
20
Calibration target http://www.kinetic.bc.ca/CompVision/opti-CAL.html Find the position, u_i and v_i, in pixels, of each calibration object feature point.
21
Camera calibration From before, we had these equations relating image positions u, v to points at 3D positions P (in homogeneous coordinates). So for each feature point i we get the constraints below. W. Freeman
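Cross-multiplying the projection equations gives two linear constraints per feature point; a standard reconstruction of the slide’s (image-only) equations:

(m_1 - u_i\,m_3) \cdot P_i = 0, \qquad (m_2 - v_i\,m_3) \cdot P_i = 0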
22
Camera calibration Stack all these measurements of i=1…n points into a big matrix: W. Freeman
23
Camera calibration Showing all the elements, and in vector form: W. Freeman
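Stacking the 2n constraints, with m the 12-vector holding the entries of M, gives a homogeneous linear system (again a standard reconstruction rather than the slide’s own figure):

Q\,m = 0, \qquad Q \in \mathbb{R}^{2n \times 12}, \qquad m = \begin{bmatrix} m_1^T & m_2^T & m_3^T \end{bmatrix}^T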
24
Camera calibration We want the unit vector m (the stacked one) that minimizes ||Q m||^2. The eigenvector of Q^T Q with the smallest eigenvalue gives us that (see Forsyth & Ponce, 3.1), because it is the unit vector x that minimizes x^T Q^T Q x. W. Freeman
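A minimal numerical sketch of this step (Python/NumPy assumed; not from the course materials). It builds Q from the correspondences and takes the right singular vector of Q for the smallest singular value, which is the same minimizer as the smallest eigenvector of Q^T Q:

import numpy as np

def calibrate_projection_matrix(world_pts, image_pts):
    """Estimate the 3x4 camera matrix M from n >= 6 point correspondences.
    world_pts: (n, 3) array of 3D points; image_pts: (n, 2) array of (u, v) pixels."""
    n = world_pts.shape[0]
    Q = np.zeros((2 * n, 12))
    for i in range(n):
        P = np.append(world_pts[i], 1.0)   # homogeneous 3D point
        u, v = image_pts[i]
        Q[2 * i, 0:4] = P                  # (m1 - u*m3) . P = 0
        Q[2 * i, 8:12] = -u * P
        Q[2 * i + 1, 4:8] = P              # (m2 - v*m3) . P = 0
        Q[2 * i + 1, 8:12] = -v * P
    # Unit vector minimizing ||Q m||: right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(Q)
    return Vt[-1].reshape(3, 4)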
25
Camera calibration Once you have the M matrix, you can recover the intrinsic and extrinsic parameters as in Forsyth & Ponce, sect. 3.2.2. W. Freeman
26
Recall, perspective effects… Far away objects appear smaller Forsyth and Ponce
27
Perspective effects
29
Parallel lines in the scene intersect in the image: they converge in the image on the horizon line. [Figure: scene, pinhole, (virtual) image plane.]
30
Projection properties Many-to-one: any points along the same ray map to the same point in the image. Points project to points. Lines project to lines (collinearity is preserved). Distances and angles are not preserved. Degenerate cases: – A line through the focal point projects to a point. – A plane through the focal point projects to a line. – A plane perpendicular to the image plane projects to part of the image.
31
Weak perspective Approximation: treat the magnification as constant. Assumes scene depth << average distance to the camera. [Figure: world points projected onto the image plane.]
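In equation form (standard notation, assumed here: average scene depth Z_0, magnification m = f / Z_0):

x' = m\,X, \qquad y' = m\,Y, \qquad m = \frac{f}{Z_0}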
32
Orthographic projection Given a camera at constant distance from the scene, world points are projected along rays parallel to the optical axis.
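In the orthographic limit the magnification is fixed (often taken as 1), so the projection simply drops the depth coordinate:

x' = X, \qquad y' = Y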
35
2D
36
3D
37
Other types of projection Lots of intriguing variants… (I’ll just mention a few fun ones) S. Seitz
38
360 degree field of view… Basic approach – Take a photo of a parabolic mirror with an orthographic lens (Nayar) – Or buy a lens from a variety of omnicam manufacturers… See http://www.cis.upenn.edu/~kostas/omni.html S. Seitz
39
Tilt-shift Tilt-shift images from Olivo Barbieri and Photoshop imitations http://www.northlight-images.co.uk/article_pages/tilt_and_shift_ts-e.html S. Seitz
40
tilt, shift http://en.wikipedia.org/wiki/Tilt-shift_photography
41
Tilt-shift perspective correction http://en.wikipedia.org/wiki/Tilt-shift_photography
42
Normal lens vs. tilt-shift lens http://www.northlight-images.co.uk/article_pages/tilt_and_shift_ts-e.html
43
Rollout Photographs © Justin Kerr http://research.famsi.org/kerrmaya.html Rotating sensor (or object) Also known as “cyclographs”, “peripheral images” S. Seitz
44
Photofinish S. Seitz
45
Physical parameters of image formation Geometric – Type of projection – Camera pose Optical – Sensor’s lens type – focal length, field of view, aperture Photometric – Type, direction, intensity of light reaching sensor – Surfaces’ reflectance properties Sensor – sampling, etc.
46
Pinhole size / aperture How does the size of the aperture affect the image we’d get? [Figure: images taken with smaller vs. larger pinholes.] K. Grauman
47
Adding a lens A lens focuses light onto the film – Rays passing through the center are not deviated – All parallel rays converge to one point on a plane located at the focal length f [Figure label: focal point f] Slide by Steve Seitz
48
Pinhole vs. lens K. Grauman
49
Cameras with lenses A lens focuses parallel rays onto a single focal point. Gathers more light while keeping focus; makes pinhole perspective projection practical. [Figure labels: focal point F, optical center (center of projection).] K. Grauman
50
Human eye Rough analogy with the human visual system: Pupil/iris – controls the amount of light passing through the lens. Retina – contains the sensor cells, where the image is formed. Fovea – highest concentration of cones. Fig. from Shapiro and Stockman
51
Thin lens Rays entering parallel on one side go through the focus on the other, and vice versa. In the ideal case, all rays from P are imaged at P’. [Figure labels: left focus, right focus, focal length f, lens diameter d.] K. Grauman
52
Thin lens equation Any object point satisfying this equation is in focus K. Grauman
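The equation itself is an image on the slide; in the usual positive-distance convention (object distance z, image distance z', focal length f) the thin lens equation reads:

\frac{1}{z'} + \frac{1}{z} = \frac{1}{f}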
53
Focus and depth of field Image credit: cambridgeincolour.com
54
Focus and depth of field Depth of field: distance between image planes where blur is tolerable Thin lens: scene points at distinct depths come in focus at different image planes. (Real camera lens systems have greater depth of field.) Shapiro and Stockman “circles of confusion”
55
Focus and depth of field How does the aperture affect the depth of field? A smaller aperture increases the range in which the object is approximately in focus. Flower images from Wikipedia: http://en.wikipedia.org/wiki/Depth_of_field Slide from S. Seitz
56
Depth from focus [figs from H. Jin and P. Favaro, 2002] Images from same point of view, different camera parameters 3d shape / depth estimates
57
Field of view Angular measure of portion of 3d space seen by the camera Images from http://en.wikipedia.org/wiki/Angle_of_view K. Grauman
58
Field of view depends on focal length As f gets smaller, the image becomes more wide angle – more world points project onto the finite image plane. As f gets larger, the image becomes more telescopic – a smaller part of the world projects onto the finite image plane. from R. Duraiswami
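This relationship is often summarized as (assumed notation: sensor extent d, focal length f):

\mathrm{FOV} = 2 \arctan\!\left(\frac{d}{2f}\right)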
59
Field of view depends on focal length Smaller FOV = larger Focal Length Slide by A. Efros
60
Vignetting http://www.ptgui.com/examples/vigntutorial.html http://www.tlucretius.net/Photo/eHolga.html
61
Vignetting “Natural” vignetting, and “mechanical” vignetting: intrusion on the optical path.
62
Chromatic aberration
64
Physical parameters of image formation Geometric – Type of projection – Camera pose Optical – Sensor’s lens type – focal length, field of view, aperture Photometric – Type, direction, intensity of light reaching sensor – Surfaces’ reflectance properties Sensor – sampling, etc.
65
Environment map http://www.sparse.org/3d.html
66
BRDF (bidirectional reflectance distribution function)
67
Diffuse / Lambertian
68
Foreshortening
69
Specular reflection
70
Phong Diffuse+specular+ambient:
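A common statement of the model (assumed notation: ambient, diffuse, and specular coefficients k_a, k_d, k_s; surface normal N, light direction L, viewing direction V, mirror-reflection direction R, shininess exponent n, light intensity I_l, ambient intensity I_a):

I = k_a I_a + k_d (N \cdot L)\, I_l + k_s (V \cdot R)^{n}\, I_l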
71
Physical parameters of image formation Geometric – Type of projection – Camera pose Optical – Sensor’s lens type – focal length, field of view, aperture Photometric – Type, direction, intensity of light reaching sensor – Surfaces’ reflectance properties Sensor – sampling, etc.
72
Digital cameras Film → sensor array, often an array of charge-coupled devices (CCDs). Each CCD element is a light-sensitive diode that converts photons (light energy) to electrons. [Pipeline: camera optics → CCD array → frame grabber → computer.] K. Grauman
73
Historical context Pinhole model: Mozi (470-390 BCE), Aristotle (384-322 BCE) Principles of optics (including lenses): Alhacen (965-1039 CE) Camera obscura: Leonardo da Vinci (1452-1519), Johann Zahn (1631-1707) First photo: Joseph Nicephore Niepce (1822) Daguerréotypes (1839) Photographic film (Eastman, 1889) Cinema (Lumière Brothers, 1895) Color photography (Lumière Brothers, 1908) Television (Baird, Farnsworth, Zworykin, 1920s) First consumer camera with CCD: Sony Mavica (1981) First fully digital camera: Kodak DCS100 (1990) [Figures: Niepce, “La Table Servie,” 1822; CCD chip; Alhacen’s notes.] Slide credit: L. Lazebnik, K. Grauman
75
Digital Sensors
76
Resolution Sensor resolution: the size of the real-world scene element that images to a single pixel. Image resolution: the number of pixels. Influences what analysis is feasible and affects the best representation choice. [fig from Mori et al]
77
Digital images Think of images as matrices taken from CCD array. K. Grauman
78
Digital images im[176][201] has value 164; im[194][203] has value 37. [Figure: image grid with width 520 (j = 1…520) and height 500 (i = 1…500); intensity values in [0, 255].] K. Grauman
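A tiny illustration of the image-as-matrix view (Python/NumPy assumed; the file name is hypothetical):

import numpy as np
from PIL import Image

im = np.array(Image.open("frame.png").convert("L"))  # grayscale image as a 2D uint8 array
print(im.shape)      # (height, width), e.g. (500, 520)
print(im[176, 201])  # intensity at row 176, column 201, in [0, 255]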
79
Color sensing in digital cameras Bayer grid: each sensor element measures only one color channel; estimate the missing components from neighboring values (demosaicing). Source: Steve Seitz
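A minimal sketch of the demosaicing idea (not the course’s code; it assumes an RGGB Bayer mosaic stored as a 2D array and recovers only the green channel by averaging the green neighbors at red/blue sites):

import numpy as np

def interpolate_green(bayer):
    """bayer: (H, W) float array in RGGB layout (green at (0,1) and (1,0) of each 2x2 block)."""
    H, W = bayer.shape
    green = bayer.copy()
    for r in range(H):
        for c in range(W):
            if (r % 2) == (c % 2):          # red or blue site: green is missing here
                neighbors = []
                if r > 0:     neighbors.append(bayer[r - 1, c])
                if r < H - 1: neighbors.append(bayer[r + 1, c])
                if c > 0:     neighbors.append(bayer[r, c - 1])
                if c < W - 1: neighbors.append(bayer[r, c + 1])
                green[r, c] = np.mean(neighbors)   # all in-bounds 4-neighbors are green sites
    return green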
80
Color images, RGB color space (R, G, B channels). Much more on color in next lecture… K. Grauman
81
Physical parameters of image formation Geometric – Type of projection – Camera pose Optical – Sensor’s lens type – focal length, field of view, aperture Photometric – Type, direction, intensity of light reaching sensor – Surfaces’ reflectance properties Sensor – sampling, etc.
82
Summary Image formation is affected by geometry, photometry, and optics. Projection equations express how world points are mapped to the 2D image. Homogeneous coordinates allow a linear system for the projection equations. Lenses make the pinhole model practical. Photometry models: Lambertian, BRDF. Digital imagers, Bayer demosaicing. Parameters (focal length, aperture, lens diameter, sensor sampling…) strongly affect the image obtained. K. Grauman
83
Slide Credits Bill Freeman Steve Seitz Kristen Grauman Forsyth and Ponce Rick Szeliski and others, as marked…
84
Next time Color Readings: – Forsyth and Ponce, Chapter 6 – Szeliski, 2.3.2