03/03/03 © 2003 University of Wisconsin
Last Time
Subsurface scattering models
Sky models

03/03/03 © 2003 University of Wisconsin
Today
Tone reproduction

03/03/03 © 2003 University of Wisconsin
Dynamic Range
Real scenes contain both very bright and very dark regions
Dynamic range is the ratio of the brightest to darkest points in the scene
The standard measurement is candelas per m² (defined shortly)
For example, in the interior of the church the dynamic range is 100,000 to 1

03/03/03 © 2003 University of Wisconsin
Tone Reproduction
The human eye can globally adapt to about 10^9:1
– Adjusts for the average brightness we perceive in one scene
– Global adaptation lets us see in very low light or very bright conditions
– But it's slow - how slow?
The human eye can locally adapt to about 10,000:1
– In a single scene (global adaptation level), we can perceive contrast across this range
Most display devices have a very limited dynamic range
– On the order of 100:1 for a very good monitor or film
Tone reproduction is the problem of making the 10,000:1 scene look right on a 100:1 display device

03/03/03 © 2003 University of Wisconsin
An Artist's Approach
Artists have known how to do this for centuries, e.g. Vermeer (spot the tricks)

03/03/03 © 2003 University of Wisconsin
More Vermeer

03/03/03 © 2003 University of Wisconsin
Measurements
We have discussed things in terms of radiometry, with quantities such as Watts, meters, seconds, and steradians
Photometry deals with radiometric quantities pushed through a response function
– A sensor (the human eye) has a response curve, V
– Luminance (candelas per meter squared, cd/m²) is the integral of radiance against the luminous efficiency function (the sensor response function)
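Stated as a formula (a standard photometric definition, not reproduced on the slide): luminance is

L_v = K_m ∫ L(λ) V(λ) dλ,  with K_m ≈ 683 lm/W,

where L(λ) is spectral radiance and V(λ) is the CIE luminous efficiency function, integrated over visible wavelengths.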

03/03/03 © 2003 University of Wisconsin
Luminous Efficiency

03/03/03 © 2003 University of Wisconsin
Color and Luminance
The CIE XYZ color space is intended to encode luminance in the Y channel
To get from RGB to XYZ, apply the following linear transform:
– Taken from Sillion and Puech; other sources differ
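The transform itself did not survive in this transcript; as a stand-in, the widely used Rec. 709 / sRGB version (the Sillion and Puech matrix differs somewhat) is:

X = 0.4124 R + 0.3576 G + 0.1805 B
Y = 0.2126 R + 0.7152 G + 0.0722 B
Z = 0.0193 R + 0.1192 G + 0.9505 B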

03/03/03 © 2003 University of Wisconsin
CIE (Y,x,y)
The XYZ space includes brightness info with color info
– X and Z get bigger as the color gets brighter
To avoid this, use (Y,x,y)
– x = X/(X+Y+Z)
– y = Y/(X+Y+Z)
(x,y) are chromaticity values
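A minimal sketch of this conversion in Python/NumPy, assuming the Rec. 709 matrix given above (the function name is ours, not from the lecture):

import numpy as np

# Assumed Rec. 709 RGB-to-XYZ matrix; the slide's matrix (from Sillion and Puech) may differ
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def rgb_to_Yxy(rgb):
    """Convert linear RGB values (array of shape (..., 3)) to luminance Y and chromaticity (x, y)."""
    xyz = rgb @ RGB_TO_XYZ.T           # apply the linear transform per pixel
    X, Y, Z = xyz[..., 0], xyz[..., 1], xyz[..., 2]
    s = X + Y + Z + 1e-12              # guard against division by zero for black pixels
    return Y, X / s, Y / s

Tone-reproduction operators typically compress Y only and reuse (x,y) to recolor the result.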

03/03/03 © 2003 University of Wisconsin
Automatic Tone Reproduction
There are three main classes of solutions:
– Global operators find a mapping from image to display luminance that is the same for every pixel
– Local operators change the mapping from one pixel to the next
– Perceptually guided operators use elements of human perception to guide the tone reproduction process

03/03/03 © 2003 University of Wisconsin
Linear Mappings
The simplest thing to do is to linearly map the highest intensity in the image to the highest display intensity, or the lowest to the lowest
– This gives very bad results, shown here for the mapping that sends the lowest intensity to the lowest
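A minimal sketch of the max-to-max linear mapping, assuming world luminances in a NumPy array (the function name and display maximum are assumptions):

import numpy as np

def linear_tonemap(L_w, L_d_max=100.0):
    """Scale world luminance so the brightest pixel maps to the brightest display value.
    When a few pixels are extremely bright, everything else is crushed toward black."""
    return L_w * (L_d_max / L_w.max())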

03/03/03 © 2003 University of Wisconsin
Non-Linear Mappings
Instead of a linear map, define some other mapping: L_d = M(L_w)
– Display luminance is some function of the world luminance
It is important to retain relative brightness, but not absolute brightness
– If one point is brighter than another in the source, it should be brighter in the output
– The mapping M should be strictly increasing
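One familiar example of such a strictly increasing mapping (Reinhard et al.'s global operator, not part of this slide) is L_d = L_w / (1 + L_w), usually applied after scaling L_w by the scene's log-average luminance; it compresses bright values smoothly toward 1 while leaving dark values nearly linear.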

03/03/03 © 2003 University of Wisconsin
Histogram Methods (Ward Larson, Rushmeier, and Piatko, TVCG 97)
In any one scene, the dynamic range is not filled uniformly
The aim of histogram methods is to generate a mapping that "fills in the gaps" in the range

03/03/03 © 2003 University of Wisconsin
Building Histograms
Work in brightness: B = log10(L)
– Humans are more sensitive to "brightness," but the formula is a hack
The histogram is a count of how many pixels have each brightness
– Choose a set of bins
– Break the image into chunks that subtend about 1° of arc (assumes you know the camera)
– Average the brightness in each chunk
– Count the chunks that fall in each bin, f(b_i)
– The result is the graph on the previous image
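A rough sketch of this construction in Python/NumPy; the 1° chunks are approximated here by fixed-size pixel blocks, and the block size and bin count are assumptions:

import numpy as np

def brightness_histogram(L, block=8, n_bins=100):
    """Build a brightness histogram from a 2D luminance image L.
    Each block of pixels (standing in for a 1-degree patch) contributes one sample."""
    h, w = L.shape
    h, w = h - h % block, w - w % block
    blocks = L[:h, :w].reshape(h // block, block, w // block, block)
    chunk_lum = blocks.mean(axis=(1, 3))           # average luminance per chunk
    B = np.log10(np.maximum(chunk_lum, 1e-6))      # brightness of each chunk
    counts, bin_edges = np.histogram(B, bins=n_bins)
    return counts, bin_edges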

03/03/03 © 2003 University of Wisconsin
Cumulative Distribution
We can consider the histogram counts as the probability of seeing a pixel in each range
The cumulative distribution function is defined:
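The definition itself is missing from the transcript; in Ward Larson et al.'s formulation it is

P(b) = Σ_{b_i < b} f(b_i) / T,  with T = Σ_i f(b_i),

i.e. the fraction of samples whose brightness falls below b.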

03/03/03 © 2003 University of Wisconsin
Histogram Equalization
The aim is an output histogram in which all the bins are roughly equally filled
The naïve way to do this is to set the display brightness via the cumulative distribution (see the equation below)
Then, go through and convert brightness back into luminance for display
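Following Ward Larson et al. (the exact equation is not in the transcript), the naïve equalization maps world brightness to display brightness through the cumulative distribution:

B_d(b) = log10(L_d,min) + [log10(L_d,max) − log10(L_d,min)] · P(b)

so equal fractions of the samples land in equal slices of the display brightness range.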

03/03/03 © 2003 University of Wisconsin
Naïve Histogram
Linear mapping on the left, naïve histogram equalization on the right. What went wrong?

03/03/03 © 2003 University of Wisconsin
Avoiding Super-Sensitivity
We should make sure that we do not increase contrast beyond the linear mapping's contrast (first condition below)
This imposes a constraint on the frequency in each bin (second condition below)
Reduce the count of bins that exceed the maximum
– We have to iterate, because changing the counts changes T
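Both conditions are missing from the transcript; in Ward Larson et al.'s derivation the contrast requirement is dL_d/dL_w ≤ L_d/L_w, which works out to a ceiling on each bin count,

f(b_i) ≤ T · Δb / (log10(L_d,max) − log10(L_d,min)),

where Δb is the bin width and T is the total sample count.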

03/03/03 © 2003 University of Wisconsin
Better Histogram Adjustment
(Figure: old result vs. new result)

03/03/03 © 2003 University of Wisconsin
Cumulative Distributions
When does it fail to converge? When does it fail to maintain contrast?

03/03/03 © 2003 University of Wisconsin
More Histogram Methods
The process can be further adapted to handle human contrast sensitivity
– Avoid wasting range on things that cannot be resolved
– Explicitly make sure that irresolvable features remain that way
Details next lecture

03/03/03 © 2003 University of Wisconsin
Local Methods
Histogram methods do poorly if there is too much dynamic range in the input
– They can't avoid reducing contrast below what should be resolvable
Local methods exploit the local nature of contrast
– You don't compare the brightness of two things across the room, only neighboring points
These methods are justified (in part) by the following:
– Pixel intensity depends on the product of reflectance and illumination
– Reflectance changes fast
– Illumination changes slowly

03/03/03 © 2003 University of Wisconsin
Basic Idea
Filter to separate high and low frequencies
Compress the low frequencies
– Using diffusion filters, typically
Add back in the high frequencies
LCIS is one algorithm
– Tumblin and Turk, SIGGRAPH 99
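A minimal sketch of this separate-compress-recombine idea, using a Gaussian blur in the log domain as a simple stand-in for the diffusion filtering LCIS actually performs (all names and the compression factor are assumptions):

import numpy as np
from scipy.ndimage import gaussian_filter

def base_detail_tonemap(L, sigma=10.0, compression=0.3):
    """Compress the low-frequency 'base' layer and add the high-frequency 'detail' back.
    L is a 2D array of world luminances."""
    logL = np.log10(np.maximum(L, 1e-6))
    base = gaussian_filter(logL, sigma)          # low frequencies: illumination-like layer
    detail = logL - base                         # high frequencies: reflectance-like layer
    log_display = compression * base + detail    # compress only the base layer
    return 10.0 ** log_display

The halo artifacts discussed on the later slides come from the smoothing step blurring across sharp shadow and occlusion boundaries.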

03/03/03 © 2003 University of Wisconsin
What's the Problem?

03/03/03 © 2003 University of Wisconsin
For Comparison

03/03/03 © 2003 University of Wisconsin
LCIS Problems
Luminance does change quickly sometimes
– At shadow boundaries and occlusion boundaries
The filtering has to be carefully adapted to make sure the reflectance and lighting components are separated
– The standard problem is a dark halo around a bright object
– Sharp changes at shadows get interpreted as shading
– When added back into the contrast-reduced base layer, they give artifacts
– This can be fixed (next lecture)