RGB Models human visual system? Gives an absolute color description? Models color similarity? Linear model? Convenient for color displays?

RGB: does not model the human visual system, does not give an absolute color description, and does not model color similarity; it is a linear model and is convenient for color displays.

Spectra Light reaching the retina is characterized by its spectral power distribution, i.e. the (relative) amount of power at each wavelength. Each kind of cone (S, M, L) responds differently. [Note to self: add daylight spectrum MATLAB figure here]

Sources of colored light used in modern fireworks:
- Yellow: sodium D-line, 589 nm
- Orange: CaCl, 591–599 nm, 603–608 nm
- Red: SrCl, 617–623 nm, 627–635 nm, 640–646 nm
- Green: BaCl, 511–515 nm, 524–528 nm, 530–533 nm
- Blue: CuCl, 403–456 nm
From http://cc.oulu.fi/~kempmp/colours.html

[Figure: cross-section of the eye, labeling the cornea, pupil, iris, lens, retina, fovea, and optic nerve]

[Figure: retinal cross-section. Light passes through ganglion, amacrine, bipolar, and horizontal cells before reaching the cones, rods, and epithelium; ganglion-cell axons form the optic nerve.]

Photoreceptors: Cones
- respond in high (photopic) light
- three types with differing wavelength responses
- single cones feed retinal ganglion cells, giving high spatial resolution but low sensitivity
- highest sampling rate at the fovea

Photoreceptors: Rods
- respond in low (scotopic) light
- none in the fovea
- one type of spectral response
- several hundred feed each ganglion cell, giving high sensitivity but low spatial resolution

Optic nerve 130 million photoreceptors feed 1 million ganglion cells, whose output is the optic nerve. The optic nerve feeds the Lateral Geniculate Nucleus (LGN) roughly one-to-one; the LGN feeds area V1 of the visual cortex in complex ways.

Rods and cones
- Rods saturate at ~100 cd/m², so only cones work at high (photopic) light levels
- All rods have the same spectral sensitivity
- The low-light condition is called scotopic
- The three cone types differ in spectral sensitivity and somewhat in spatial distribution

Cones L (long wave), M (medium), S (short) describe the sensitivity curves. "Red", "green", "blue" is a misnomer; see the spectral sensitivities.

[Note: the spectrum should be displayed compressed, from about 400 to 680 nm]

Trichromacy Helmholtz thought three separate images went forward: R, G, B. Wrong, because retinal processing combines them into opponent channels. Hering proposed opponent models, which are close to right.

Opponent Models Three channels leave the retina:
- Red–Green: L − M + S
- Yellow–Blue: L + M − S
- Achromatic: L + M + S
Note that the chromatic channels can have a negative response (inhibition); this is difficult to model with light.

Adaptation Luminance adaptation allows greater sensitivity but over narrow ranges Chromatic adaptation supports color constancy by compensating for changes in illuminating spectra.

[Figure: contrast sensitivity vs. log spatial frequency (cpd) for the luminance, red–green, and blue–yellow channels; the luminance channel remains sensitive out to much higher spatial frequencies than the two chromatic channels]

Weber Fraction
ΔI/I = c, where ΔI is the just-perceivable change, so log ΔI = log I + log c.
More generally, plotting perceived change vs. I: log ΔI = λ log I + a yields ΔI = c·I^λ, a power law.
Many perceptual responses follow power laws with λ < 1, i.e. a compressive non-linearity.
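The power-law relation can be sketched numerically; in this minimal example the constants c and λ are illustrative, not values from the slides:

```python
# Compressive power law: dI = c * I**lam with lam < 1.
# Constants are illustrative only.
c, lam = 0.02, 0.5

def jnd(I):
    """Just-noticeable increment at intensity I under the power law."""
    return c * I ** lam

# With lam < 1 the *relative* JND shrinks as intensity grows:
# equal intensity ratios produce ever-smaller perceived steps.
print(jnd(10.0) / 10.0 > jnd(1000.0) / 1000.0)  # True: compressive
```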

Other non-linearities

Color matching Grassmann's laws of linearity:
(r1 + r2)(λ) = r1(λ) + r2(λ)
(k·r)(λ) = k·r(λ)
Hence for any stimulus s(λ) and response function r(λ), the total response is ∫ s(λ)·r(λ) dλ, taken over all λ, or approximately Σ s(λ)·r(λ).
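The discretized response sum Σ s(λ)r(λ), and the additivity Grassmann's laws give it, can be sketched directly; the sampled spectra below are made-up numbers:

```python
# Discretized response: total response ≈ Σ s(λ) r(λ).
def response(s, r):
    """Approximate the integral of s(λ)·r(λ) by a sum over samples."""
    return sum(si * ri for si, ri in zip(s, r))

r = [0.1, 0.5, 0.9, 0.4]    # a sensor's spectral sensitivity (illustrative)
s1 = [1.0, 2.0, 0.5, 0.0]   # stimulus 1
s2 = [0.0, 1.0, 1.0, 3.0]   # stimulus 2

# Grassmann linearity: the response to a sum of lights is the sum of responses.
mixed = [a + b for a, b in zip(s1, s2)]
print(abs(response(mixed, r) - (response(s1, r) + response(s2, r))) < 1e-9)
```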

[Figure: bipartite color-matching apparatus. The subject views a white screen split into two fields within a surround field; the test light illuminates one half, the adjustable primary lights the other.]

Color matching M(λ) = R·R(λ) + G·G(λ) + B·B(λ); metamers are possible.
Good: the RGB matching functions are like cone responses.
Bad: we can't match all visible lights with any triple of monochromatic primaries; some matches require adding primary light to the matched side.


Color matching Solution: XYZ basis functions

Color matching Note Y is V(λ), the luminous-efficiency function. None of X, Y, Z are physical lights. Euclidean distance in RGB or XYZ is not perceptually useful, and XYZ says nothing about color appearance.

CIE L*a*b* Normalized to the white point. L* is (relative) lightness; a* is (relative) redness–greenness; b* is (relative) yellowness–blueness. C*, the length in the a*–b* plane, is chroma, i.e. the degree of colorfulness. h = tan⁻¹(b*/a*) is the hue.
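The chroma and hue formulas translate directly into code; a small sketch (the a*, b* values are arbitrary):

```python
import math

# Chroma and hue from CIELAB a*, b*:
#   C* = sqrt(a*^2 + b*^2),  h = atan2(b*, a*) in degrees.
def chroma_hue(a, b):
    chroma = math.hypot(a, b)
    hue = math.degrees(math.atan2(b, a)) % 360.0
    return chroma, hue

c, h = chroma_hue(50.0, 50.0)
print(round(c, 2), round(h, 1))  # 70.71 45.0
```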

CIE L*a*b*, L*u*v* Euclidean distance corresponds to judgments of color difference, especially in lightness. The perceptual non-linearities are modeled somewhat realistically.

[MATLAB demos: Lightness.m, colorPatch.m (image representation), umbColormatching.m]

Color Appearance
Absolute: brightness, colorfulness.
Relative:
- Lightness
- Chroma, relative to the white point: colorfulness / brightness(white)
- Saturation, relative to own brightness: colorfulness / brightness

Photoshop Calibration File → Color → RGB. The RGB space is defined by gamma, white point, and primaries. Reset to sRGB!

Photoshop color picker Examine planes of fixed hue, saturation, lightness, L*, a*, b*.

[Figure: CIE Lab space. L* runs from dark to light, a* from green to red, b* from blue to yellow.]


xyz2displayrgb
SPD of a displayed color [r,g,b]: phosphor * [r,g,b]'
XYZ tristimulus values: xyz' = xyzbar' * phosphor * [r,g,b]'  (mon2XYZ)
Inverting: [r,g,b]' = inv(xyzbar' * phosphor) * xyz'  (xyz2displayrgb)
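The forward and inverse transforms above are just a 3×3 matrix and its inverse; the sketch below uses a made-up stand-in for xyzbar' * phosphor rather than real monitor data:

```python
# mon2XYZ / xyz2displayrgb as a 3x3 matrix and its inverse.
# A stands in for xyzbar' * phosphor; the numbers are illustrative only.
A = [[0.41, 0.36, 0.18],
     [0.21, 0.72, 0.07],
     [0.02, 0.12, 0.95]]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

def inv3(M):
    """Inverse of a 3x3 matrix via the adjugate."""
    (a, b, c), (d, e, f), (g, h, i) = M
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [
        [e * i - f * h, c * h - b * i, b * f - c * e],
        [f * g - d * i, a * i - c * g, c * d - a * f],
        [d * h - e * g, b * g - a * h, a * e - b * d],
    ]
    return [[adj[r][s] / det for s in range(3)] for r in range(3)]

rgb = [0.2, 0.5, 0.8]
xyz = matvec(A, rgb)             # mon2XYZ
rgb_back = matvec(inv3(A), xyz)  # xyz2displayrgb
print([round(v, 6) for v in rgb_back])  # [0.2, 0.5, 0.8]
```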

Viewing Conditions The illuminant matters: Table 7-1 shows ΔE* under two different illuminants. ΔE* ≤ 2.5 is typically deemed a match. On the midterm: using the chromaticities of the Munsell principal hues, calculate ΔE* for the hues with the Wandell monitor white point and with D65.

Viewing Modes Viewing mode = to what we attribute color:
- Illuminant: the illuminating light is colored
- Illumination: prevailing changes to the illuminant, e.g. shading from obstruction
- Surface: color belongs to the surface
- Volume: color belongs to the volume
- Aperture: pure color, absent an object

Adaptation Light adaptation - quick Dark adaptation - slow

Chromatic Adaptation Occurs at all levels: cones, other retinal layers, LGN, cortex, including the opponent mechanisms (e.g. green flash). Subserves discounting the illuminant when the illuminant is spatially uniform.

Adaptation mechanisms Neural gain control: reduced sensitivity at high input, increased sensitivity at low input. For cones this is photochemical dynamics; further up it is neurochemical dynamics. Temporal mechanisms: evidence for cortical adaptation mechanisms (e.g. the waterfall illusion).

Chromatic adaptation models von Kries: chromatic adaptation is cone-mediated, with independent linear mechanisms in L, M, and S. All such models are slightly wrong, but a good place to start.

Chromatic adaptation models Three independent gain controls:
La = kL·L, Ma = kM·M, Sa = kS·S
where L is the L-cone response, La the adapted L-cone response, etc.

Chromatic adaptation models The choice of gain-control parameters depends on the model. Often they are simply defined so that the adapted response is 1 at the maximum of the unadapted response, or at scene white:
kL = 1/Lmax or kL = 1/Lwhite, so that La,max = kL·Lmax = 1, etc.

Chromatic adaptation models If we have two viewing conditions, and M is the transform from CIE XYZ to cone responses, then we can convert from adaptation in one condition to adaptation in the other by:

Chromatic adaptation models Conversion from one adaptation to another:
[X2, Y2, Z2]' = M⁻¹ · diag(Lmax2, Mmax2, Smax2) · diag(1/Lmax1, 1/Mmax1, 1/Smax1) · M · [X1, Y1, Z1]'
See Figure 9.2 for the predictions of such a model.
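The conversion can be sketched with a minimal von Kries-style example; for brevity this works directly in cone (LMS) space, omitting the XYZ↔LMS matrix M, and the adapting whites are illustrative numbers:

```python
# von Kries adaptation between two viewing conditions, in cone (LMS) space.
# The XYZ<->LMS matrix M from the slide is omitted; whites are illustrative.
white1 = (0.9, 1.0, 1.1)   # (L, M, S) of the adapting white, condition 1
white2 = (1.1, 1.0, 0.8)   # (L, M, S) of the adapting white, condition 2

def adapt(lms, src_white, dst_white):
    """Scale each cone channel by dst_white/src_white (kL = 1/Lwhite)."""
    return tuple(v / sw * dw for v, sw, dw in zip(lms, src_white, dst_white))

# The source white itself maps exactly onto the destination white:
print(adapt(white1, white1, white2))  # (1.1, 1.0, 0.8)
```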

Non-linear chromatic adaptation models Nayatani: adds noise and a power law in brightness.
La = aL·((L + Ln)/(L0 + Ln))^βL, etc.
La: adapted L-cone response; Ln: noise signal; L0: response to the adapting field; aL: fit from a color-constancy hypothesis.

Nayatani Color Appearance Model Components:
- Non-linear chromatic adaptation
- One achromatic and two chromatic color-opponent channels, weighted by cone population ratios

Nayatani Color Appearance Model Outputs:
- Brightness, as a linear function of the adapted cone responses (which are non-linear!)
- Lightness: the achromatic channel with its origin translated so that black = 0, white = 100
- Brightness of "ideal white" (a perfect reflector)
- Hue angle (from the chromatic channels)

Nayatani Color Appearance Model Outputs (continued):
- Hue quadrature: interpolation between four hues defined by the chromatic channels: red (20.14), yellow (90.00), green (164.25), blue (231.00)
- Saturation: depends on hue and luminance (predicts changes of chromaticity with luminance)
- Chroma = saturation × lightness
- Colorfulness = chroma × brightness of ideal white

Nayatani Color Appearance Model Advantages: invertible for many outputs, i.e. measure the output quantities and predict the inputs; accounts for changes in color appearance with chromatic adaptation and luminance.

Nayatani Color Appearance Model Weaknesses. Doesn't predict:
- effects of changes in background color or relative luminance
- incomplete chromatic adaptation
- cognitive discounting of the illuminant
- appearance of complex patches or backgrounds
- mesopic color vision

Color Appearance (recap)
Absolute: brightness, colorfulness.
Relative:
- Lightness
- Chroma, relative to the white point: colorfulness / brightness(white)
- Saturation, relative to own brightness: colorfulness / brightness

Hunt Color Appearance Model Inputs:
- chromaticity of the adapting field
- chromaticity of the illuminant
- chromaticity and reflectivity of the background
- proximal field (up to 2° from the stimulus)
- reference white

Hunt Color Appearance Model Inputs (continued):
- absolute luminance of the reference white and the adapting field
- scotopic luminance data
- parameters for chromatic and brightness induction

Hunt Color Appearance Model Properties:
- Non-linear responses
- Models incomplete chromatic adaptation
- Chromatic adaptation constants depend on luminance
- Models saturation
- Models brightness, lightness, chroma, and colorfulness

Hunt Color Appearance Model
Good: predicts many color appearance phenomena; useful for both unrelated and related colors; handles a large range of luminance levels of stimuli and background.
Bad: complex and computationally expensive; not analytically invertible.

Testing Color Appearance Models
- Qualitative tests
- Corresponding-colors data (colors which appear the same when viewed under different conditions)
- Magnitude-estimation tests
- Psychophysics

Testing Color Appearance Models - Qualitative Tests Predictions of color appearance phenomena, e.g. illuminant effects. Comparisons with color order systems, e.g. the Helson–Judd effect: the perceived hue of neutral Munsell colors is not neutral under strong chromatic illumination, but depends on the hue of the illuminant and the relative brightness of test to background. The Hunt model successfully predicts this; the von Kries model does not.

Testing Color Appearance Models - Qualitative Tests Magnitude estimation of appearance attributes, and comparisons with color order systems (e.g. the Helson–Judd effect, as above).

Testing Color Appearance Models - Qualitative Tests Adjust parameters to predict constancies in standard color order systems (e.g. constant L*a*b* chroma of Munsell colors), then test the model for related properties (e.g. hue shift under luminance change). Predict complex related-colors phenomena, e.g. local vs. global color filtering.

Testing Color Appearance Models - Corresponding Colors Corresponding colors: two different colors C1, C2 which appear the same under two different viewing conditions V1, V2. Test a model by transforming C1 to V2. Importance: correcting images made under the assumption of V1 but actually produced under V2, e.g. photos under illuminants D65 vs. F vs. A.

Testing Color Appearance Models - Magnitude Estimation Observers assign numerical values to color appearance attributes. Example results:
- Background and white point have the most influence on colorfulness, lightness, and hue
- Magnitude estimation of lightness is predicted best by Hunt, next by CIELAB, then Nayatani
- Estimation of colorfulness is badly predicted by all models

Testing Color Appearance Models - Magnitude Estimation Estimation of hue is predicted best by the Hunt model, which was revised as suggested by these experiments; etc. See Chapter 15 of Fairchild.

Testing Color Appearance Models - Psychophysics Techniques starting with paired quality judgments can lead to a precise interval scale. (This is the way eyeglasses are prescribed.) Good for predicting media changes. (Review Fairchild 15.7)

MacAdam Ellipses JND of chromaticity: bipartite equiluminant color matching to a given stimulus. The JND depends on chromaticity, both in magnitude and in direction.

MacAdam Ellipses For each observer, high correlation with the variance of repeated color matches in direction, shape, and size; 2-D normal distributions have elliptical level sets - neural noise? See Wyszecki and Stiles, Fig. 1(5.4.1), p. 307.

MacAdam Ellipses JND of chromaticity: weak inter-observer correlation in size, shape, and orientation. No explanation in Wyszecki and Stiles (1982). Are there more modern models that can normalize to the observer?

MacAdam Ellipses JND of chromaticity: extension to varying luminance gives ellipsoids in XYZ space which project appropriately for fixed luminance.

MacAdam Ellipses Technology application - bit stealing: points inside the chromatic JND ellipsoid are not distinguishable chromatically but may be above the luminance JND. Using those points in RGB space can thus increase luminance resolution; in turn, this gives the appearance of increased spatial resolution ("anti-aliasing"), as in Microsoft ClearType. See http://www.grc.com/freeandclear.htm and http://www.ductus.com/cleartype/cleartype.html

Complementary Colors Colors which sum to the white point are called complementary: a·c1 + b·c2 = wp. Some monochromatic colors have complements, others don't - see ComplementaryColors.m. Complements may be out of gamut; see Photoshop.

Printer/monitor incompatibilities
- Gamut: colors in one that are not in the other
- Different white points
- Complements of one not in the other
- Luminance ranges have different quantization (especially gray)

Photography, Painting Photo printing is via filters: really multiplicative (e.g. 0.2 × 0.2 = 0.04), but the convention is to take logarithms and regard it as subtractive. Oil paint mixing is additive; watercolor is subtractive.

Printing Inks are subtractive: cyan (white − red), magenta (white − green), yellow (white − blue). In practice inks are opaque, so mixing doesn't work as with oil paints. Black ink may be used on economic and physical grounds.

Halftoning The problem with ink: it's opaque. Screening: the luminance range is achieved by printing dots of varying size. Collections of big dots appear dark; small dots appear light. The percentage of area covered gives the darkness.
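Screening can be sketched with ordered dithering; this example uses a standard 4×4 Bayer threshold matrix (a choice made here for illustration, not from the slides) so that the fraction of "on" dots tracks the gray level:

```python
# Ordered dithering with a 4x4 Bayer threshold matrix: a tiny sketch of
# screening, where dot coverage approximates the requested gray level.
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def halftone(gray, size=4):
    """Render a constant gray patch (0..1) as a binary dot pattern."""
    return [[1 if gray * 16 > BAYER4[y % 4][x % 4] else 0
             for x in range(size)] for y in range(size)]

dots = halftone(0.5)
coverage = sum(map(sum, dots)) / 16
print(coverage)  # 0.5 -- half the area is covered for 50% gray
```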

Halftoning references
- A commercial but good set of tutorials
- Digital Halftoning, by Robert Ulichney, MIT Press, 1987
- Stochastic halftoning

Color halftoning Needs screens at different angles to avoid moiré. Needs differential color weighting, due to the nonlinear visual color response and its spatial-frequency dependencies.

The right image is a JPEG from a digital photograph at 144 dpi, 1120×686 pixels, scaled by 25% in Photoshop and imported as JPEG. The left image is a Photoshop screening emulation with default screening parameters, imported as JPEG and reduced 25% in PowerPoint.

Device Independence Calibration to standard space typically CIE XYZ Coordinate transforms through standard space Gamut mapping

Device independence Stone et al., "Color Gamut Mapping and the Printing of Digital Color Images", ACM Transactions on Graphics, 7(4), October 1988, pp. 249-292. The following slides refer to their techniques.

Device to XYZ Sample the gamut in device space on an 8×8×8 mesh (7×7×7 = 343 cubes). Measure (or model) the device on the mesh. Interpolate with trilinear interpolation; for a small mesh and a reasonable function XYZ = f(device1, device2, device3), this approximates interpolating to the tangent.
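Trilinear interpolation over one mesh cube can be sketched as follows; the corner values below sample a linear function, which the interpolant reproduces exactly:

```python
# Trilinear interpolation inside one mesh cube, as used to approximate
# XYZ = f(device1, device2, device3) between measured mesh points.
def trilerp(corners, u, v, w):
    """corners[i][j][k] is the value at cube corner (i, j, k); u, v, w in [0, 1]."""
    c00 = corners[0][0][0] * (1 - u) + corners[1][0][0] * u
    c01 = corners[0][0][1] * (1 - u) + corners[1][0][1] * u
    c10 = corners[0][1][0] * (1 - u) + corners[1][1][0] * u
    c11 = corners[0][1][1] * (1 - u) + corners[1][1][1] * u
    c0 = c00 * (1 - v) + c10 * v
    c1 = c01 * (1 - v) + c11 * v
    return c0 * (1 - w) + c1 * w

# Corner values sample the linear function f(u, v, w) = u + 2v + 4w.
corners = [[[0, 4], [2, 6]], [[1, 5], [3, 7]]]
print(trilerp(corners, 0.5, 0.5, 0.5))  # 3.5
```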

XYZ to Device Invert the function XYZ = f(device1, device2, device3) - hard to do in general if f is ill-behaved. At least make f monotonic by throwing out distinct points with the same XYZ, e.g. for a CMY device: (continued)

XYZ to CMY Invert the function XYZ = f(c,m,y): given XYZ = [x,y,z], find CMY = [c,m,y] such that f(CMY) = XYZ. Consider the components X(c,m,y), Y(c,m,y), Z(c,m,y). Within each mesh cube, the trilinear interpolant attains its max and min at the cube vertices. Also, if a continuous function has opposite signs at two points, it is zero somewhere in between.

XYZ to CMY Given X0, find [c,m,y] such that X(c,m,y) = X0. If [ci,mi,yi] and [cj,mj,yj] are vertices of a given cube, and U = X(c,m,y) − X0 has opposite signs at them, then U is zero somewhere in the cube; similarly for Y and Z. If such vertex pairs are found for all of X0, Y0, Z0, then that cube contains the desired point (then use interpolation). Applying this recursively finds the desired point if there is one.

Gamut Mapping Criteria:
- preserve the gray axis of the original image
- maximize luminance contrast
- few colors map outside the destination gamut
- minimize hue and saturation shifts
- increase, rather than decrease, saturation
- do not violate color knowledge, e.g. sky is blue, fruit colors, skin colors

Gamut Mapping Special colors and problems. Highlights: a luminance issue, so it concerns the gray axis. Colors near black: the locus of these colors in the image gamut must map into something of reasonably similar shape, else contrast and saturation are wrong.

Gamut Mapping Special colors and problems. Highly saturated colors (far from the white point): printers are often incapable of reproducing them. Colors on the image-gamut boundary that occupy large parts of the image should map inside the target gamut; otherwise they must all be projected onto the target boundary.

[Figure: CRT and printer gamuts compared]

Gamut Mapping First try: map black points and fill destination gamut.

[Figure sequence: the image gamut inside the device gamut - first translate the image black Bi to the device black Bd, then scale by csf, then rotate]

Gamut Mapping
Xd = Bd + csf · R · (Xi − Bi)
where Bi = image black, Bd = destination black, R = rotation matrix, csf = contrast scaling factor, Xi = image color, Xd = destination color.
Problems: image colors near black that fall outside the destination are especially bad - loss of detail, hue shifts due to quantization error, ...

Shift and scale along the destination gray axis:
Xd = Bd + csf · R · (Xi − Bi) + bs · (Wd − Bd)
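The transform can be sketched directly; here R is taken to be the identity for simplicity, and all vectors are illustrative:

```python
# Black-point gamut mapping with gray-axis shift:
#   Xd = Bd + csf * R * (Xi - Bi) + bs * (Wd - Bd),
# with R = identity; all values are illustrative.
def map_color(Xi, Bi, Bd, Wd, csf, bs):
    return tuple(bd + csf * (xi - bi) + bs * (wd - bd)
                 for xi, bi, bd, wd in zip(Xi, Bi, Bd, Wd))

Bi = (0.0, 0.0, 0.0)      # image black
Bd = (0.05, 0.05, 0.05)   # destination black
Wd = (0.95, 1.0, 1.05)    # destination white

# Image black lands at Bd shifted bs of the way toward Wd along gray:
mapped = map_color(Bi, Bi, Bd, Wd, csf=0.8, bs=0.1)
print(mapped)
```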

Fig. 14a: bs > 0, csf small - the image gamut maps entirely into the printer gamut, but contrast is low. Fig. 14b: bs = 0, csf large - more contrast and more colors inside the printer gamut, but also more outside.

Saturation control - "umbrella transformation" [Rs Gs Bs] = monitor white point; [Rn Gn Bn] = new RGB coordinates such that Rs + Gs + Bs = Rn + Gn + Bn and [Rn Gn Bn] maps inside the destination gamut. First map R·Rs + G·Gs + B·Bs to R·Rn + G·Gn + B·Bn, then map into printer coordinates. This makes minor hue changes, but "relative" colors are preserved, and achromatic colors remain achromatic.

Projective Clipping After all this, some colors remain outside the printer gamut. Project these onto the gamut surface: try a perpendicular projection onto the nearest triangular face of the printer-gamut surface; if none, a perpendicular projection onto the nearest edge of the surface; if none, use the closest vertex.

Projective Clipping This gives the closest point on the surface to the given color. The result is a continuous projection if the gamut is convex, but not otherwise. Bad: we want nearby image colors to stay nearby in the destination gamut.

Projective Clipping Problems:
- Printer gamuts have their worst concavities near the black point, giving quantization errors.
- Nearest-point projection uses Euclidean distance in XYZ space, which is not perceptually uniform. Try CIELAB? S-CIELAB?
- Keep out-of-gamut distances small, at the cost of using less than the full printer gamut.

Color Management Systems
Problems: solve gamut-matching issues; attempt uniform appearance.
Solutions: image-dependent manipulations (e.g. Stone); device-independent image editors (e.g. Photoshop) with embedded CMS; ICC profiles.

ICC Color Profiles International Color Consortium, http://www.color.org. An ICC profile holds: device description text, characterization data, calibration data, and invertible transforms to a fixed virtual color space, the Profile Connection Space (PCS).

Profile Connection Space Presently only two PCSs: CIELAB and CIEXYZ, both specified with a D50 white point. Device↔PCS transforms must account for viewing conditions, gamut mapping, and tone (e.g. gamma) mapping.

[Diagram: color-reproduction pipeline - input image/device → input-device colorimetric characterization → device-independent color space (e.g. XYZ, the PCS) → chromatic adaptation and color appearance models → viewing-condition-independent space → gamut mapping, tone control, etc. → back through the appearance models and output-device characterization → output image/device]

ICC Profiles
- Device profiles
- Colorspace profiles: data conversion
- Device Link profiles: concatenated D1 → PCS → D2
- Abstract profiles: generic, for private purposes, e.g. special effects

ICC Profiles Named color profiles allow data described in the Pantone system (and others?) to map to other devices, e.g. for viewing. Supported in Photoshop. [Photoshop demo: nearest Pantone etc. colors to a selected color]

ICC Profile Data Tags Profile header tags (administrative and descriptive): start of header; byte count of profile; profile version number; profile or device class (input, display, output, link, colorspace, abstract, named color profile); PCS target (CIEXYZ or CIELab).

ICC Profile Data Tags Profile header tags (continued): ICC-registered device manufacturer and model; media attributes - 64 attribute bits, 32 reserved (reflective/transparent; glossy/matte); XYZ of illuminant; rendering intent (perceptual, relative colorimetric, saturation, absolute colorimetric).

ICC Profile Rendering Intents
- Perceptual: "full gamut of the image is compressed or expanded to fill the gamut of the destination device. Gray balance is preserved but colorimetric accuracy might not be preserved." (ICC Spec Clause 4.9)
- Saturation: "specifies the saturation of the pixels in the image is preserved perhaps at the expense of accuracy in hue and lightness." (ICC Spec Clause 4.12)
- Absolute colorimetric: relative to the illuminant only
- Relative colorimetric: relative to the illuminant and the media white point

ICC Profile Data Tags Tone Reproduction Curve (TRC) tags: grayTRC, redTRC, greenTRC, blueTRC. Each is either a single number (gamma), if the TRC is exponential, or an array of samples of the TRC suitable for interpolation.
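A gamma-valued TRC corresponds to a simple power curve; a minimal sketch, with an illustrative gamma value:

```python
# A gamma-style tone reproduction curve (TRC), the single-number case of a
# profile's redTRC/greenTRC/blueTRC tag. GAMMA is illustrative.
GAMMA = 2.2

def encode(linear):
    """Linear light -> device value."""
    return linear ** (1 / GAMMA)

def decode(device):
    """Device value -> linear light."""
    return device ** GAMMA

x = 0.25
print(abs(decode(encode(x)) - x) < 1e-12)  # True: the round trip holds
```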

ICC Profile Data Tags Mapping tags ("AtoB0Tag", "BtoA0Tag", etc.) map between device and PCS. Each includes a 3×3 matrix if the mapping is a linear map of CIEXYZ spaces, or a lookup table on sample points if not.

ICC Profile Special Goodies
- Intimate with PostScript: support for PostScript Color Rendering Dictionaries reduces processing in the printer; support for argument lists to PostScript Level 2 color handling
- Halftone screen geometry and frequency
- Undercolor removal
- Embedding profiles in PICT, GIF, TIFF, JPEG, EPS

JPEG DCT Quantization FDCT of 8×8 blocks, ordered in increasing spatial frequency (zigzag). Low frequencies carry more shape information and get finer quantization; high frequencies are often very small, so they go to zero after quantizing. If the source has 8-bit entries (s in [−2⁷, 2⁷−1]), one can show the quantized DCT needs at most 11 bits (c in [−2¹⁰, 2¹⁰−1]). See the Wallace paper, p. 12. Note the high-frequency contributions are small.
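The zigzag ordering can be generated programmatically; a sketch of one common way to enumerate the anti-diagonals:

```python
# Generate the JPEG zigzag scan order for an 8x8 block: coefficients are
# visited along anti-diagonals of increasing spatial frequency.
def zigzag(n=8):
    order = []
    for s in range(2 * n - 1):                 # s = row + col, the anti-diagonal
        rows = range(max(0, s - n + 1), min(s, n - 1) + 1)
        # even diagonals run bottom-left -> top-right, odd ones the reverse
        for r in (reversed(rows) if s % 2 == 0 else rows):
            order.append((r, s - r))
    return order

print(zigzag()[:6])  # [(0, 0), (0, 1), (1, 0), (2, 0), (1, 1), (0, 2)]
```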

JPEG DCT Quantization Quantize with a single table of 64 divisors. The quantization table can be stored in the file or referenced to a standard table. The standard quantizer is based on JNDs. Note there can be one quantizer table per image component. See Wallace, p. 12.

JPEG DCT Intermediate Entropy Coding Variable-length (Huffman) code: high-occurrence symbols are coded with fewer bits. The intermediate code uses symbol pairs. Symbol-1 is chosen from a table of symbols s(i,j), where i is the run length of zeros preceding a quantized DCT amplitude and j is the bit length of that amplitude; i = 0…15, j = 1…10, with s(0,0) = 'EOB' and s(15,0) = 'ZRL'. Symbol-2 is the encoding of the DCT amplitude itself. Finally, these 162 symbols are Huffman encoded.

JPEG components
Y = 0.299R + 0.587G + 0.114B
Cb = −0.1687R − 0.3313G + 0.5B
Cr = 0.5R − 0.4187G − 0.0813B
Optionally subsample Cb, Cr: replace each pixel pair with its average. Not much loss of fidelity; reduces data by 1/2·1/3 + 1/2·1/3 = 1/3. There is more shape information in the achromatic component than in the chromatic ones (color vision is poor at localization).
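The component transform translates directly into code. Note the Cb row needs leading minus signs; the usual level offsets for Cb/Cr are omitted here:

```python
# RGB -> YCbCr with the slide's coefficients (no Cb/Cr level offsets).
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.1687 * r - 0.3313 * g + 0.5 * b
    cr = 0.5 * r - 0.4187 * g - 0.0813 * b
    return y, cb, cr

# White has full luma and (numerically) zero chroma:
y, cb, cr = rgb_to_ycbcr(255, 255, 255)
print(round(y), abs(cb) < 1e-9, abs(cr) < 1e-9)  # 255 True True
```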

JPEG goodies
- Progressive mode: multiple scans, e.g. in increasing spatial frequency, so decoding gives shapes first, then detail
- Hierarchical encoding: multiple resolutions
- Lossless coding mode
- JFIF: user-embedded data; more than 3 components possible?

Huffman Encoding Example code tree, leaves read from the figure: s1 = 00, s2 = 01, s3 = 11, s4 = 100, s5 = 1010, s6 = 1011 (internal nodes 1, 10, 101)

Huffman Encoding Decode 1110101101100: traverse from root to leaf, then repeat: 11 1010 11 01 100 → s3 s5 s3 s2 s4 (using the code s1 = 00, s2 = 01, s3 = 11, s4 = 100, s5 = 1010, s6 = 1011)

Charge Coupled Device (CCD) < 10mm x 10mm Silicon cells emit electrons when light falls on them

Charge Coupled Device (CCD) [Diagram: charge accumulating in a cell (< 10mm x 10mm) over time in proportion to luminance]

Filters over cells More green than red or blue Y = 0.299R + 0.587G + 0.114B (For color TV and…?)

CCD Cameras Good links: Some device specs: http://denton.chem.arizona.edu/ccd/ Some device specs: http://www.MASDKODAK.com/

Color TV Multiple standards - US, 2 in Europe, HDTV standards, Digital HDTV, Japanese analog. US: 525 lines (US HDTV is digital, and the data stream defines resolution. Typically MPEG encoded to provide 1088 lines, of which 1080 are displayed)

NTSC Analog Color TV 525 lines/frame Interlaced to reduce bandwidth; small interframe changes help Primary chromaticities: [values not shown]

NTSC Analog Color TV These yield
RGB2XYZ =
   1.909  -0.985   0.058
  -0.532   1.997  -0.119
  -0.288  -0.028   0.902
Y = 0.299R + 0.587G + 0.114B (same as the luminance channel for JPEG!); R = G = B = 1 gives the Y value of the white point. Cr = R-Y, Cb = B-Y, with chromaticities Cr: x=1.070, y=0; Cb: x=0.131, y=0; y(C)=0 => Y(C)=0 => achromatic
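A sketch of how such a conversion matrix follows from the primary chromaticities and white point. The NTSC 1953 primaries and Illuminant C values are assumed here (the slide's own table is not reproduced), and small rounding differences from the matrix above are expected. The luminance row of the result is where the 0.299/0.587/0.114 weights come from:

```python
import numpy as np

# Assumed NTSC 1953 primaries and Illuminant C white point (x, y).
PRIMARIES = {'R': (0.67, 0.33), 'G': (0.21, 0.71), 'B': (0.14, 0.08)}
WHITE = (0.3101, 0.3162)

def rgb2xyz_matrix():
    """Build RGB->XYZ: columns are the primaries' XYZ directions
    (normalized to Y = 1), scaled so that R = G = B = 1 maps to the
    white point with luminance Y = 1."""
    cols = [[x / y, 1.0, (1 - x - y) / y] for x, y in PRIMARIES.values()]
    M = np.array(cols).T
    xw, yw = WHITE
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    scale = np.linalg.solve(M, white_xyz)  # per-primary luminance weight
    return M * scale
```

The middle (Y) row of the returned matrix comes out close to 0.299, 0.587, 0.114.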

NTSC Analog Color TV Signals are gamma corrected under the assumption of dim-surround viewing conditions (high saturation). Y, Cr, Cb signals (EY, Er, Eb) are sent per scan line; NTSC, SECAM, PAL do this in differing clever ways EY typically has twice the bandwidth of Er, Eb

NTSC Analog Color TV Y, Cr, Cb signals (EY, Er, Eb) are sent per scan line; NTSC, SECAM, PAL do this in differing clever ways. EY with 4-10 x bandwidth of Er, Eb “Blue saving”

Digital HDTV 1987 - FCC seeks proposals for advanced TV Broadcast industry wants analog, 2x the lines of NTSC, for compatibility Computer industry wants digital 1993 (February) DHDTV demonstrated in four incompatible systems 1993 (May) Grand Alliance formed

Digital HDTV 1996 (Dec 26) FCC accepts the Grand Alliance proposal of the Advanced Television Systems Committee (ATSC) 1999 first DHDTV broadcasts

Digital HDTV MPEG video compression, Dolby AC-3 audio compression
lines  hpix  aspect ratio  scan         frame rates
 720   1280  16/9          progressive  24, 30, or 60
1080   1920  16/9          interlaced   60
1080   1920  16/9          progressive  24, 30

Some gamuts [Figure: gamut comparison - SWOP vs. ENCAD GA ink]

Color naming A Computational model of Color Perception and Color Naming, Johann Lammens, Buffalo CS Ph.D. dissertation http://www.cs.buffalo.edu/pub/colornaming/diss/diss.html Cross language study of Berlin and Kay, 1969 “Basic colors”

Color naming “Basic colors” Meaning not predicted from parts (e.g. blue, yellow, but not bluish) Not subsumed in another color category (e.g. red, but not crimson or scarlet) Can apply to any object (e.g. brown, but not blond) Highly meaningful across informants (red, but not chartreuse) Ask audience to give basic colors they can identify in Berlin-Kay colors (adapted by Lammens)

Color naming “Basic colors” Vary with language

Color naming Berlin and Kay experiment: Elicit all basic color terms from 329 Munsell chips (40 equally spaced hues x 8 values, plus 9 neutral chips) Find the best representative Find the boundaries of that term

Color naming Berlin and Kay experiment: Representative (“focus”) constant across languages; boundaries vary even across subjects and trials Lammens fits a linear+sigmoid model to each of the R-G, B-Y, and brightness channels in the macaque LGN data of DeValois et al. (1966) to get a color model. As usual this is two chromatic and one achromatic
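A minimal sketch of such a linear+sigmoid stage (the function name and default parameters are purely illustrative; Lammens fits actual gains and offsets per channel to the LGN data):

```python
import math

def sigmoid_response(channel_value, gain=1.0, offset=0.0):
    """Linear combination of the input pushed through a sigmoid, of the
    kind fit to each opponent channel (R-G, B-Y) and brightness.
    Output saturates toward 0 or 1 at channel extremes."""
    return 1.0 / (1.0 + math.exp(-(gain * channel_value + offset)))
```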

Color naming To account for boundaries, Lammens used standard statistical pattern recognition, with the feature set determined by the coordinates in his color space defined by macaque LGN opponent responses. Has some theoretical but no(?) experimental justification for the model.

Pantone Color Combo of the Month, January 1999: Mecca Orange (Pantone 1675 C) Canteen (Pantone 405 C) Violet Quartz (Pantone 689 C) That's all for today Pantone’s RGB for Violet Quartz has some values greater than 100%, i.e. than 255. Converting all three from CMYK to RGB with Photoshop using default monitor characteristics, including D65 whitepoint, gives different RGB from Pantone. Probably Pantone assumes a D50 whitepoint.