Film: Digital Image Synthesis, Yung-Yu Chuang, 11/8/2007, with slides by Pat Hanrahan and Matt Pharr

Film

The Film class simulates the sensing device in the simulated camera. It determines each sample's contribution to the nearby pixels and writes the final floating-point image to a file on disk. Tone-mapping operations can then be used to show the floating-point image on a display. (core/film.*)

Film class

class Film {
public:
    Film(int xres, int yres)
        : xResolution(xres), yResolution(yres) {}
    virtual ~Film() {}
    virtual void AddSample(Sample &sample, Ray &ray,
                           Spectrum &L, float alpha);
    virtual void WriteImage();
    virtual void GetSampleExtent(int *xstart, int *xend,
                                 int *ystart, int *yend);
    // Film Public Data
    const int xResolution, yResolution;
};

(The Camera uses the film's xResolution and yResolution to compute the raster-to-camera transform.)

ImageFilm

film/image.cpp implements the only film plug-in in pbrt. It simply filters samples and writes the resulting image.

ImageFilm::ImageFilm(int xres, int yres, Filter *filt, float crop[4],
                     string &filename, bool premult, int wf) {
    ...
    pixels = new BlockedArray<Pixel>(xPixelCount, yPixelCount);
    <precompute filter table>
}

(crop[4]: the crop window, specified in NDC space, useful for debugging; wf: the write frequency.)

AddSample

[Diagram] A sample's filter extent overlaps a grid of pixels; the filter weight for each covered pixel is found by taking the nearest neighbor in the precomputed filter table.
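
A minimal standalone sketch of this filtering step (the function, struct, and parameter names are illustrative, not pbrt's actual ImageFilm code): the sample is splatted into every pixel its filter support overlaps, and each weight comes from a nearest-neighbor lookup in the precomputed filter table.

#include <algorithm>
#include <cmath>
#include <vector>

struct Pixel { float L = 0, weightSum = 0; };

void AddSampleSketch(float imageX, float imageY, float L,
                     float xWidth, float yWidth,        // filter extent
                     const std::vector<float> &filterTable, int tableSize,
                     std::vector<Pixel> &pixels, int xRes, int yRes) {
    // Convert continuous image coordinates to discrete pixel coordinates.
    float dx = imageX - 0.5f, dy = imageY - 0.5f;
    // Range of pixels whose filter support overlaps the sample.
    int x0 = std::max(0, (int)std::ceil(dx - xWidth));
    int x1 = std::min(xRes - 1, (int)std::floor(dx + xWidth));
    int y0 = std::max(0, (int)std::ceil(dy - yWidth));
    int y1 = std::min(yRes - 1, (int)std::floor(dy + yWidth));
    for (int y = y0; y <= y1; ++y)
        for (int x = x0; x <= x1; ++x) {
            // Nearest-neighbor lookup into the precomputed filter table
            // instead of evaluating the filter for every sample.
            int ix = std::min(tableSize - 1,
                              (int)(std::fabs(x - dx) / xWidth * tableSize));
            int iy = std::min(tableSize - 1,
                              (int)(std::fabs(y - dy) / yWidth * tableSize));
            float w = filterTable[iy * tableSize + ix];
            Pixel &p = pixels[y * xRes + x];
            p.L += w * L;          // weighted radiance contribution
            p.weightSum += w;      // divided out later when the image is written
        }
}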

WriteImage

Called to store the final image, or partial images, to disk. The device-independent RGB is converted to device-dependent RGB: first convert to device-independent XYZ, then convert to device-dependent RGB according to your display. Here, pbrt uses the HDTV standard. pbrt uses the EXR format to store the image.
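
For concreteness, a small sketch of the XYZ-to-RGB step using the standard Rec. 709 (HDTV) matrix coefficients; the function name is illustrative rather than taken from pbrt.

// Device-independent XYZ to device-dependent HDTV (ITU-R Rec. 709) RGB.
void XYZToRGB709(const float xyz[3], float rgb[3]) {
    rgb[0] =  3.240479f * xyz[0] - 1.537150f * xyz[1] - 0.498535f * xyz[2];
    rgb[1] = -0.969256f * xyz[0] + 1.875991f * xyz[1] + 0.041556f * xyz[2];
    rgb[2] =  0.055648f * xyz[0] - 0.204043f * xyz[1] + 1.057311f * xyz[2];
}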

Portable floatMap (.pfm)

12 bytes per pixel, 4 bytes for each channel (a 32-bit IEEE float: sign, exponent, mantissa). Text header similar to Jeff Poskanzer's .ppm image format:

PF
768 512
1
<binary image data>

Floating Point TIFF is similar.

Radiance format (.pic, .hdr, .rad)

32 bits / pixel: 8-bit Red, Green, and Blue mantissas plus a shared 8-bit Exponent. A pixel (R, G, B, E) decodes to (R, G, B) / 256 * 2^(E-128):

(145, 215, 87, 149) = (145, 215, 87) / 256 * 2^(149-128) ≈ (1190000, 1760000, 713000)
(145, 215, 87, 103) = (145, 215, 87) / 256 * 2^(103-128) ≈ (0.0000000169, 0.0000000250, 0.0000000101)

Ward, Greg. "Real Pixels," in Graphics Gems IV, edited by James Arvo, Academic Press, 1994.
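
A minimal decoder sketch for one RGBE pixel under the convention above (the function name is illustrative; real Radiance readers differ in small details such as mantissa rounding):

#include <cmath>

void DecodeRGBE(const unsigned char rgbe[4], float rgb[3]) {
    if (rgbe[3] == 0) {                        // a zero exponent encodes black
        rgb[0] = rgb[1] = rgb[2] = 0.f;
        return;
    }
    float scale = std::ldexp(1.0f, (int)rgbe[3] - (128 + 8)); // 2^(E-128) / 256
    rgb[0] = rgbe[0] * scale;
    rgb[1] = rgbe[1] * scale;
    rgb[2] = rgbe[2] * scale;
}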

ILM's OpenEXR (.exr)

6 bytes per pixel, 2 bytes for each channel (a 16-bit "half" float: sign, exponent, mantissa), compressed. Several lossless compression options, 2:1 typical. Compatible with the "half" datatype in NVIDIA's Cg. Supported natively on GeForce FX and Quadro FX. Available at http://www.openexr.net/
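
As an aside, a small illustrative helper showing how one such 16-bit half (1 sign bit, 5 exponent bits with bias 15, 10 mantissa bits) expands to a 32-bit float; infinities and NaNs are ignored for brevity.

#include <cmath>
#include <cstdint>

float HalfToFloatSketch(uint16_t h) {
    int sign = (h >> 15) & 0x1;
    int exp  = (h >> 10) & 0x1F;
    int man  = h & 0x3FF;
    float value;
    if (exp == 0)
        value = std::ldexp((float)man, -24);              // denormalized: man / 2^10 * 2^-14
    else
        value = std::ldexp(1.f + man / 1024.f, exp - 15); // normalized: implicit leading 1
    return sign ? -value : value;
}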

Tone mapping

Converts HDR images to LDR images for display.

void ApplyImagingPipeline(float *rgb, int xResolution, int yResolution,
                          float *yWeight, float bloomRadius, float bloomWeight,
                          const char *toneMapName, const ParamSet *toneMapParams,
                          float gamma, float dither, int maxDisplayValue)

Not called inside pbrt itself, but used by its tools. It is also possible to write a Film plugin that calls tone mapping and stores a regular image. (yWeight: the weights used to convert RGB to luminance Y.)

Image pipeline

- Possibly apply bloom effect to image
- Apply tone reproduction to image
- Handle out-of-gamut RGB values
- Apply gamma correction to image
- Map image to display range
- Dither image

Bloom

[Images] Without bloom vs. with bloom: the blurred glow gives the image a much brighter feel.

Bloom

- Apply a very wide filter that falls off quickly to obtain a filtered image
- Blend the original image and the filtered image by a user-specified weight to obtain the final image
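
A rough, self-contained sketch of these two steps on a single-channel image; the (1 - d)^4 falloff and all names are assumptions of this sketch, not pbrt's exact bloom filter.

#include <algorithm>
#include <cmath>
#include <vector>

std::vector<float> ApplyBloomSketch(const std::vector<float> &img,
                                    int xRes, int yRes,
                                    float bloomRadius,   // filter radius, in pixels
                                    float bloomWeight) { // blend factor
    // 1. Filter the image with a very wide kernel that falls off quickly.
    std::vector<float> blurred(img.size(), 0.f);
    int r = (int)std::ceil(bloomRadius);
    for (int y = 0; y < yRes; ++y)
        for (int x = 0; x < xRes; ++x) {
            float sum = 0.f, wSum = 0.f;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx) {
                    int sx = x + dx, sy = y + dy;
                    if (sx < 0 || sx >= xRes || sy < 0 || sy >= yRes) continue;
                    float d = std::sqrt((float)(dx * dx + dy * dy)) / bloomRadius;
                    float w = std::pow(std::max(0.f, 1.f - d), 4.f); // sharp falloff
                    sum += w * img[sy * xRes + sx];
                    wSum += w;
                }
            blurred[y * xRes + x] = wSum > 0.f ? sum / wSum : 0.f;
        }
    // 2. Blend the original and filtered images by a user-specified weight.
    std::vector<float> out(img.size());
    for (size_t i = 0; i < img.size(); ++i)
        out[i] = (1.f - bloomWeight) * img[i] + bloomWeight * blurred[i];
    return out;
}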

Tone mapping

Two categories:
- Spatially uniform (global): find a monotonic mapping from pixel values to the display's dynamic range
- Spatially varying (local): exploits the fact that the human eye is more sensitive to local contrast than to overall luminance

core/tonemap.h, tonemaps/*

class ToneMap {
public:
    // ToneMap Interface
    virtual ~ToneMap() { }
    virtual void Map(const float *y, int xRes, int yRes,
                     float maxDisplayY, float *scale) const = 0;
};

(y: the input radiance array; maxDisplayY: the display's limit; scale: the scale factor computed for each pixel.)
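
Using the interface above, applying a tone map amounts to computing the per-pixel scale from the luminance array and multiplying it into the RGB values. A hypothetical driver might look like the following; this is roughly what the tone-reproduction stage of ApplyImagingPipeline does, but the function here is illustrative.

#include <vector>

void ApplyToneMapSketch(const ToneMap &op, float *rgb, const float *y,
                        int xRes, int yRes, float maxDisplayY) {
    std::vector<float> scale(xRes * yRes);
    op.Map(y, xRes, yRes, maxDisplayY, &scale[0]);   // per-pixel scale factors
    for (int i = 0; i < xRes * yRes; ++i) {
        rgb[3 * i]     *= scale[i];                  // apply the same scale to R, G, B
        rgb[3 * i + 1] *= scale[i];
        rgb[3 * i + 2] *= scale[i];
    }
}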

Maximum to white

class MaxWhiteOp : public ToneMap {
public:
    // MaxWhiteOp Public Methods
    void Map(const float *y, int xRes, int yRes,
             float maxDisplayY, float *scale) const {
        // Compute maximum luminance of all pixels
        float maxY = 0.;
        for (int i = 0; i < xRes * yRes; ++i)
            maxY = max(maxY, y[i]);
        float s = maxDisplayY / maxY;
        for (int i = 0; i < xRes * yRes; ++i)
            scale[i] = s;
    }
};

1. Does not consider the HVS: two images that differ only by a scale factor are rendered identically.
2. A small number of bright pixels can make the overall image too dark to see.

Results

[Images] Input vs. max-to-white.

Contrast-based scale

Developed by Ward (1994); compresses the range but maintains the JND (just noticeable difference). If the adaptation luminance is Ya, a difference larger than

    ΔY(Ya) = 0.0594 * (1.219 + Ya^0.4)^2.5

is noticeable. Find a scale s so that JNDs are preserved on the display, i.e. ΔY(Yd) = s * ΔY(Ywa), where Yd is the display adaptation luminance (half the maximum display luminance) and Ywa is the real-world adaptation luminance; it gives

    s = ((1.219 + Yd^0.4) / (1.219 + Ywa^0.4))^2.5

We calculate Ywa as the log average radiance:

    Ywa = exp( (1/N) * Σ log Y(x, y) )
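
Putting the formulas together, a sketch of a ContrastOp-style Map(); variable names are illustrative, and the display adaptation luminance is taken as half of maxDisplayY as noted above.

#include <algorithm>
#include <cmath>

void ContrastMapSketch(const float *y, int xRes, int yRes,
                       float maxDisplayY, float *scale) {
    // Log-average world adaptation luminance over the non-zero pixels.
    float Ywa = 0.f;
    int n = 0;
    for (int i = 0; i < xRes * yRes; ++i)
        if (y[i] > 0.f) { Ywa += std::log(y[i]); ++n; }
    Ywa = std::exp(Ywa / std::max(1, n));
    // Ward's contrast-preserving scale factor.
    float s = std::pow((1.219f + std::pow(maxDisplayY * 0.5f, 0.4f)) /
                       (1.219f + std::pow(Ywa, 0.4f)), 2.5f);
    for (int i = 0; i < xRes * yRes; ++i)
        scale[i] = s;                    // one global scale for every pixel
}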

Results

[Images] Input vs. contrast-based.

Varying adaptation luminance

It computes a local adaptation luminance that varies smoothly over the image; this local adaptation luminance is then used to compute a scale factor. How is the local adaptation luminance computed? Find the most blurred value Bs(x, y) such that the local contrast lc(x, y) = (Bs(x, y) - B2s(x, y)) / Bs(x, y), the relative difference between the blur at radius s and the blur at radius 2s, is still smaller than a threshold.
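
A conceptual sketch of finding that local adaptation luminance; a simple box blur stands in for the Gaussian blurs, and the radius schedule and threshold parameter are assumptions of this sketch.

#include <cmath>
#include <vector>

// Simple box blur of radius r (a stand-in for the blurred images B_s).
static std::vector<float> Blur(const std::vector<float> &y,
                               int xRes, int yRes, int r) {
    std::vector<float> out(y.size(), 0.f);
    for (int py = 0; py < yRes; ++py)
        for (int px = 0; px < xRes; ++px) {
            float sum = 0.f; int n = 0;
            for (int dy = -r; dy <= r; ++dy)
                for (int dx = -r; dx <= r; ++dx) {
                    int sx = px + dx, sy = py + dy;
                    if (sx < 0 || sx >= xRes || sy < 0 || sy >= yRes) continue;
                    sum += y[sy * xRes + sx]; ++n;
                }
            out[py * xRes + px] = sum / n;
        }
    return out;
}

// For each pixel, accept wider and wider blurs as the local adaptation
// luminance while the local contrast between successive levels stays small.
std::vector<float> LocalAdaptationSketch(const std::vector<float> &y,
                                         int xRes, int yRes,
                                         int maxRadius, float threshold) {
    std::vector<float> adapt = Blur(y, xRes, yRes, 1);
    std::vector<bool> done(y.size(), false);
    for (int s = 1; 2 * s <= maxRadius; s *= 2) {
        std::vector<float> Bs  = Blur(y, xRes, yRes, s);
        std::vector<float> B2s = Blur(y, xRes, yRes, 2 * s);
        for (size_t i = 0; i < y.size(); ++i) {
            if (done[i]) continue;
            float lc = Bs[i] > 0.f ? std::fabs(Bs[i] - B2s[i]) / Bs[i] : 0.f;
            if (lc > threshold) done[i] = true;  // contrast too high: stop here
            else adapt[i] = B2s[i];              // still smooth: accept the wider blur
        }
    }
    return adapt;
}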

Varying adaptation luminance

With the smooth local adaptation luminance image, the scale can be computed in a way similar to the contrast-based method: a capacity function C(Y) counts the intensity levels (in terms of JNDs) below luminance Y, and the local adaptation luminance is remapped through it to a target display luminance, which gives the per-pixel scale.

Results

[Images] With a fixed radius vs. based on local contrast.

Spatially varying nonlinear scale

An empirical approach that works very well in practice; similar to Reinhard (2002), whose global operator has the form

    yd = y * (1 + y / ywhite^2) / (1 + y)

where ywhite is the smallest value mapped to pure white. (The y here is not the luminance Y but the y component in XYZ space.)
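
A sketch of such an operator written against the ToneMap interface. As an assumption of this sketch, the log-average luminance stands in for the white value in Reinhard's formula; this is not necessarily what pbrt's NonLinearOp does.

#include <algorithm>
#include <cmath>

void NonLinearMapSketch(const float *y, int xRes, int yRes,
                        float maxDisplayY, float *scale) {
    // Log-average adaptation value used as the normalization constant.
    float Ywa = 0.f; int n = 0;
    for (int i = 0; i < xRes * yRes; ++i)
        if (y[i] > 0.f) { Ywa += std::log(y[i]); ++n; }
    Ywa = std::exp(Ywa / std::max(1, n));
    float invYwa2 = 1.f / (Ywa * Ywa);
    for (int i = 0; i < xRes * yRes; ++i) {
        if (y[i] <= 0.f) { scale[i] = 0.f; continue; }
        // yd = y * (1 + y / Ywa^2) / (1 + y), then rescale to the display range;
        // values above the white point may still exceed it and are handled
        // by the later out-of-gamut stage.
        float yd = y[i] * (1.f + y[i] * invYwa2) / (1.f + y[i]);
        scale[i] = maxDisplayY * yd / y[i];
    }
}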

Results

[Images] Input vs. nonlinear scale.

Final stages

- Handle out-of-gamut RGB values: scale each pixel by its maximum channel value
- Apply gamma correction: the inverse of the CRT's gamma mapping
- Map to display range: scale by maxDisplayValue
- Dither image: add a small amount of noise to the pixel values
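
A per-pixel sketch of the last three stages (gamma, display range, dither) for an already tone-mapped value in [0, 1]; the function name and the use of rand() for dithering are illustrative.

#include <algorithm>
#include <cmath>
#include <cstdlib>

int FinalizePixelSketch(float v, float gamma, float dither, int maxDisplayValue) {
    v = std::pow(v, 1.f / gamma);                           // inverse of the CRT's gamma
    v *= maxDisplayValue;                                   // map to display range
    v += dither * (2.f * rand() / (float)RAND_MAX - 1.f);   // dither: add a little noise
    int out = (int)(v + 0.5f);
    return std::min(std::max(out, 0), maxDisplayValue);     // clamp to the valid range
}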