
1 RGB Models human visual system? Gives an absolute color description?
Models color similarity? Linear model? Convenient for color displays?

2 Convenient for color displays
RGB Models human visual system Gives an absolute color description Models color similarity Linear model Convenient for color displays

3 Spectra Light reaching the retina is characterized by its spectral distribution, i.e. the (relative) amount of power at each wavelength. Each kind of cone (S, M, L) responds differently. (add daylight spectrum MATLAB figure here)

4

5 Sources of colored light used in modern fireworks.
Yellow: sodium D-line, 589 nm. Orange: CaCl emission bands. Red: SrCl emission bands. Green: BaCl emission bands. Blue: CuCl emission bands.

6 Anatomy of the eye (figure): cornea, pupil, iris, lens, retina, fovea, optic nerve.

7 Retinal cross section (figure): light passes the ganglion, amacrine, bipolar, and horizontal cells before reaching the cones, rods, and epithelium; ganglion-cell axons form the optic nerve.

8 Photoreceptors Cones - respond in high (photopic) light
Differing wavelength responses (3 types); single cones feed retinal ganglion cells, so they give high spatial resolution but low sensitivity; highest sampling rate at the fovea.

9 Photoreceptors Rods respond in low (scotopic) light; none in the fovea;
one type of spectral response; several hundred feed each ganglion cell, so they give high sensitivity but low spatial resolution.

10 Optic nerve 130 million photoreceptors feed 1 million ganglion cells, whose output is the optic nerve. The optic nerve feeds the Lateral Geniculate Nucleus (LGN) approximately 1-to-1; the LGN feeds area V1 of visual cortex in complex ways.

11 Rods and cones Rods saturate at 100 cd/m², so only cones work at high (photopic) light levels. All rods have the same spectral sensitivity. The low-light condition is called scotopic. The three cone types differ in spectral sensitivity and somewhat in spatial distribution.

12 Cones L (long wave), M (medium), S (short)
describe the sensitivity curves. "Red", "Green", "Blue" is a misnomer; see the spectral sensitivities.

13 Spectrum should be displayed compressed from about 400 to 680 nm

14

15 Trichromacy Helmholtz thought three separate images went forward: R, G, B. Wrong, because retinal processing combines them into opponent channels. Hering proposed opponent models, which are close to right.

16 Opponent Models Three channels leave the retina:
Red-Green (L-M+S = L-(M-S)) Yellow-Blue(L+M-S) Achromatic (L+M+S) Note that chromatic channels can have negative response (inhibition). This is difficult to model with light.
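A minimal MATLAB sketch of these channel combinations; the cone responses below are made-up values, not data from the course:

    L = 0.6; M = 0.5; S = 0.2;   % made-up cone responses
    RG = L - M + S;              % red-green opponent channel
    YB = L + M - S;              % yellow-blue opponent channel
    A  = L + M + S;              % achromatic channel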

17 Adaptation Luminance adaptation allows greater sensitivity but over narrow ranges Chromatic adaptation supports color constancy by compensating for changes in illuminating spectra.

18

19 Contrast sensitivity vs. log spatial frequency in cycles per degree (figure), for the luminance, red-green, and blue-yellow channels.

20 Weber Fraction ΔI/I = c, where ΔI is the perceived (just-noticeable) change
log ΔI = log I + log c (perceived change vs. I). More generally, log ΔI = λ log I + a yields ΔI = c I^λ, a power law. Many perceptual responses follow power laws with λ < 1, i.e. a compressive non-linearity.
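A small MATLAB sketch of the compressive power law above; the constant c and the exponent are illustrative assumptions, not fitted values:

    I = logspace(0, 3, 50);     % stimulus intensities
    c = 0.02; lam = 0.5;        % illustrative constants
    DI = c * I.^lam;            % just-noticeable change, DI = c*I^lambda
    loglog(I, DI), xlabel('I'), ylabel('\DeltaI')   % straight line of slope lambda on log-log axes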

21 Other non-linearities

22 Color matching Grassmann laws of linearity: (r1 + r2)(λ) = r1(λ) + r2(λ); (k·r)(λ) = k·(r(λ)). Hence for any stimulus s(λ) and response r(λ), the total response is the integral of s(λ)r(λ) over all λ, or approximately Σ s(λ)r(λ).
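A MATLAB sketch of the discrete approximation Σ s(λ)r(λ); the two spectra here are made-up Gaussians standing in for a real stimulus and response curve:

    lambda = 400:10:700;                  % wavelength samples (nm)
    s = exp(-((lambda-550)/60).^2);       % made-up stimulus spectrum s(lambda)
    r = exp(-((lambda-560)/45).^2);       % made-up response curve r(lambda)
    total = sum(s .* r);                  % approximates the integral of s(lambda)*r(lambda)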

23 Color-matching setup (figure): the subject views a bipartite white screen within a surround field; the test light illuminates one half, the primary lights the other.

24 Color matching M(λ) = R·R(λ) + G·G(λ) + B·B(λ). Metamers are possible.
Good: the RGB matching functions are like cone responses. Bad: can't match all visible lights with any triple of monochromatic lights; some of the primaries must be added to the light being matched.

25 Color-matching setup (figure, as on slide 23): bipartite white screen, surround field, test light and primary lights.

26

27 Color matching Solution: XYZ basis functions

28

29 Color matching Note Y is V(λ), the luminous efficiency function. None of these are physical lights.
Euclidean distance in RGB and in XYZ is not perceptually useful, and says nothing about color appearance.

30 CIE L*a*b* Normalized to the white point. L* is (relative) lightness,
a* is (relative) redness-greenness, b* is (relative) yellowness-blueness. C* = length in the a*-b* plane is chroma, i.e. degree of colorfulness. h = tan⁻¹(b*/a*) is hue.
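A MATLAB sketch of chroma and hue angle computed from a* and b*; the sample point is arbitrary:

    astar = 40; bstar = 30;              % arbitrary sample point
    Cstar = sqrt(astar^2 + bstar^2);     % chroma: length in the a*-b* plane
    h = atan2(bstar, astar) * 180/pi;    % hue angle in degrees (atan2 handles all quadrants)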

31 CIE L*a*b*, L*u*v* Euclidean distance corresponds to judgments of color difference, especially lightness. Somewhat realistic nonlinearities are modeled.

32 MATLAB demos: Lightness.m; colorPatch.m (MATLAB image representation); umbColormatching.m

33 Color Appearance Absolute: brightness, colorfulness. Relative: lightness;
chroma, relative to the white point ("colorfulness/brightness(white)"); saturation, relative to its own brightness ("colorfulness/brightness").

34 Photoshop Calibration
File->Color->RGB RGB space: Gamma White point Primaries Reset to sRGB!!!

35 Photoshop color picker
Examine planes of fixed hue saturation lightness L* a* b*

36 CIE Lab space (figure): L* runs dark to light, a* runs green to red, b* runs blue to yellow.

37

38

39

40

41

42

43

44

45 xyz2displayrgb SPD of color [r,g,b] :

46 xyz2displayrgb SPD of color [r,g,b] : phosphor

47 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’

48 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’
XYZ tristimulus values

49 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’
XYZ tristimulus values xyz’=xyzbar’*phosphor*[r,g,b]’

50 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’
XYZ tristimulus values xyz’=xyzbar’*phosphor*[r,g,b]’ [r,g,b]’=

51 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’
XYZ tristimulus values xyz’=xyzbar’*phosphor*[r,g,b]’ [r,g,b]’= inv(xyzbar’*phosphor)*xyz’

52 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’
XYZ tristimulus values xyz’=xyzbar’*phosphor*[r,g,b]’ [r,g,b]’= inv(xyzbar’*phosphor)*xyz’ mon2XYZ

53 xyz2displayrgb SPD of color [r,g,b] : phosphor*[r,g,b]’
XYZ tristimulus values xyz’=xyzbar’*phosphor*[r,g,b]’ [r,g,b]’= inv(xyzbar’*phosphor) *xyz’ xyz2displayrgb
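A MATLAB sketch of the computation above. The xyzbar and phosphor matrices are random stand-ins here (real code would load measured color matching functions and phosphor SPDs), and the backslash solve is just a numerically safer spelling of the inv(...) form on the slide:

    N = 31;                                   % e.g. 400:10:700 nm sampling
    xyzbar = rand(N, 3);                      % placeholder CIE color matching functions
    phosphor = rand(N, 3);                    % placeholder monitor phosphor SPDs
    xyz = [0.35; 0.40; 0.30];                 % target XYZ tristimulus values (made up)
    rgb = (xyzbar' * phosphor) \ xyz;         % [r,g,b]' = inv(xyzbar'*phosphor)*xyz'
    xyz_check = (xyzbar' * phosphor) * rgb;   % forward model (mon2XYZ) recovers xyz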

54 Viewing Conditions Illuminant matters. Table 7-1 shows ΔE* using two different illuminants. ΔE* <= 2.5 is typically deemed a match. On the midterm: using chromaticities for the Munsell principal hues, calculate ΔE* for the hues with the Wandell monitor whitepoint and D65.

55 Viewing Modes Viewing mode = what we attribute the color to:
Illuminant: the illuminating light is colored. Illumination: prevailing changes to the illuminant, e.g. shading from obstruction. Surface: color belongs to the surface. Volume: color belongs to the volume. Aperture: pure color, absent an object.

56 Adaptation Light adaptation - quick Dark adaptation - slow

57 Chromatic Adaptation At all levels: cone, other retinal layers, LGN, cortex; including opponent mechanisms (e.g. green flash) Subserves discounting the illuminant when illuminant is spatially uniform

58 Adaptation mechanisms
Neural gain control: reduced sensitivity at high input, increased at low input. For cones this is photochemical dynamics; further up it is neurochemical dynamics. Temporal mechanisms: evidence for cortical adaptation mechanisms (e.g. the waterfall illusion).

59 Chromatic adaptation models
von Kries: chromatic adaptation is cone-mediated; independent mechanisms in L, M, S; linear. All are slightly wrong, but a good place to start.

60 Chromatic adaptation models
three independent gain controls: La = kL·L, Ma = kM·M, Sa = kS·S, where L is the L-cone response, La the adapted response of the L cones, etc.

61 Chromatic adaptation models
Choice of gain-control parameters depends on the model. Often simply defined to guarantee that the adapted response is 1 at the maximum of the unadapted response or at scene white: kL = 1/Lmax or kL = 1/Lwhite, so the adapted response at Lmax is kL·Lmax = 1, etc.

62 Chromatic adaptation models
If have two viewing conditions and M is transform for CIE XYZ to cone responses then can convert from adaptation in one condition to adaptation in the other by

63 Chromatic adaptation models
Conversion from one adaptation to another: XYZ2 = inv(M) * diag(Lmax,2/Lmax,1, Mmax,2/Mmax,1, Smax,2/Smax,1) * M * XYZ1. See Figure 9.2 for the predictions of such a model.
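A MATLAB sketch of this conversion. The XYZ-to-cone matrix and the two adapting whites below are assumptions (Hunt-Pointer-Estevez matrix, D65- and D50-like whites), standing in for whatever M and maxima the model actually uses:

    M = [ 0.38971  0.68898 -0.07868;          % an XYZ -> LMS matrix (Hunt-Pointer-Estevez)
         -0.22981  1.18340  0.04641;
          0.0      0.0      1.0    ];
    LMSmax1 = M * [0.9504; 1.0; 1.0888];      % cone responses to the white of condition 1 (D65-like)
    LMSmax2 = M * [0.9642; 1.0; 0.8249];      % cone responses to the white of condition 2 (D50-like)
    xyz1 = [0.40; 0.35; 0.30];                % a stimulus seen under condition 1 (made up)
    xyz2 = M \ (diag(LMSmax2 ./ LMSmax1) * (M * xyz1));   % corresponding color under condition 2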

64 Non-linear chromatic adaptation models
Nayatani: adds noise and a power law in brightness. La = aL·((L+Ln)/(L0+Ln))^bL, etc. La: adapted L-cone response; Ln: noise signal; L0: response to the adapting field; aL: fit from a color constancy hypothesis.
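A sketch of this formula with made-up constants; the real model fits aL and the exponent from data, so the numbers below are only illustrative:

    L  = 40;                                 % L-cone response to the stimulus (arbitrary)
    Ln = 1;                                  % noise signal
    L0 = 20;                                 % response to the adapting field
    aL = 1.0;  bL = 0.7;                     % illustrative gain and exponent
    La = aL * ((L + Ln)/(L0 + Ln))^bL;       % adapted L-cone response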

65 Nayatani Color Appearance Model
Model components Nonlinear chromatic adaptation One achromatic, two chromatic color opponent channels weighted by cone population ratios

66 Nayatani Color Appearance Model
Model outputs Brightness as a linear function of adapted cone responses (which are non-linear!). Lightness: achromatic channel with origin translated so that black = 0, white = 100. Brightness of "ideal white" (= perfect reflector). Hue angle (from the chromatic channels).

67 Nayatani Color Appearance Model
Model outputs Hue quadrature: interpolation between 4 hues defined by the chromatic channels: red (20.14), yellow (90.00), green (164.25), blue (231.00). Saturation: depends on hue and luminance (predicts changes of chromaticity with luminance). Chroma = saturation × lightness. Colorfulness = chroma × brightness of ideal white.

68 Nayatani Color Appearance Model Advantages
Invertible for many outputs, i.e. measure output quantities, predict inputs Accounts for changes in color appearance with chromatic adaptation and luminance

69 Nayatani Color Appearance Model Weaknesses
Doesn't predict: effects of changes in background color or relative luminance; incomplete chromatic adaptation; cognitive discounting of the illuminant; appearance of complex patches or backgrounds; mesopic color vision.

70 Color Appearance Absolute: brightness, colorfulness. Relative: lightness;
chroma, relative to the white point ("colorfulness/brightness(white)"); saturation, relative to its own brightness ("colorfulness/brightness").

71 Hunt Color Appearance Model
Inputs: chromaticity of the adapting field; chromaticity of the illuminant; chromaticity and reflectivity of the background; proximal field (up to 2° from the stimulus); reference white.

72 Hunt Color Appearance Model
Inputs: absolute luminance of the reference white; adapting-field scotopic luminance data; parameters for chromatic and brightness induction.

73 Hunt Color Appearance Model
Properties: non-linear responses; models incomplete chromatic adaptation; chromatic adaptation constants depend on luminance; models saturation; models brightness, lightness, chroma, and colorfulness.

74 Hunt Color Appearance Model
Good: predicts many color appearance phenomena; useful for unrelated or related colors; handles a large range of luminance levels of stimuli and background. Bad: complex, computationally expensive; not analytically invertible.

75 Testing Color Appearance Models
Qualitative tests; corresponding-colors data (colors which appear the same when viewed under different conditions); magnitude estimation tests; psychophysics.

76

77 Testing Color Appearance Models- Qualitative Tests
Predictions of color appearance phenomena, e.g. illuminant effects. Comparisons with color order systems, e.g. the Helson-Judd effect: the perceived hue of neutral Munsell colors is not neutral under strong chromatic illumination but depends on the hue of the illuminant and the relative brightness of test to background. The Hunt model successfully predicts this; the von Kries model does not.

78 Testing Color Appearance Models- Qualitative Tests
Magnitude estimation of appearance attributes. Comparisons with color order systems, e.g. the Helson-Judd effect: the perceived hue of neutral Munsell colors is not neutral under strong chromatic illumination but depends on the hue of the illuminant and the relative brightness of test to background.

79 Testing Color Appearance Models- Qualitative Tests
Adjust parameters to predict constancies in standard color order systems (e.g. constant L*a*b* chroma of Munsell colors), then test the model for related properties (e.g. hue shift under luminance change). Predict complex related-colors phenomena, e.g. local vs. global color filtering.

80 Testing Color Appearance Models- Corresponding Colors
Corresponding colors: two different colors, C1, C2, which appear the same under two different viewing conditions V1, V2. Test the model by transforming C1 to V2. Importance: correcting images made under the assumption of V1 but actually produced under V2, e.g. photos under D65 vs. F vs. A.

81 Testing Color Appearance Models- Magnitude Estimation
Observers assign numerical values to color appearance attributes. Examples of results: background and white point have the most influence on colorfulness, lightness, and hue. Magnitude estimation of lightness is predicted best by Hunt, next by CIELAB, then Nayatani. Estimation of colorfulness is badly predicted by all models.

82 Testing Color Appearance Models- Magnitude Estimation
Observers assign numerical values to color appearance attributes. Examples of results: estimation of hue is predicted best by the Hunt model, which was revised as suggested by the experiments, etc. See Chapter 15, Fairchild.

83 Testing Color Appearance Models- Psychophysics
Techniques starting with paired quality judgments can lead to a precise interval scale. (This is the way eyeglasses are prescribed.) Good for predicting media changes. (Review Fairchild 15.7)

84 MacAdam Ellipses JND of chromaticity
Bipartite equiluminant color matching to a given stimulus. The JND depends on chromaticity in both magnitude and direction.

85

86 MacAdam Ellipses For each observer, high correlation with the variance of repeated color matches in direction, shape, and size. 2-D normal distributions have elliptical level sets: neural noise? See Wyszecki and Stiles, Fig. 1(5.4.1), p. 307.

87

88 MacAdam Ellipses JND of chromaticity
Weak inter-observer correlation in size, shape, and orientation. No explanation in Wyszecki and Stiles 1982. More modern models that can normalize to the observer?

89 MacAdam Ellipses JND of chromaticity
Extension to varying luminance: ellipsoids in XYZ space which project appropriately for fixed luminance.

90 MacAdam Ellipses JND of chromaticity Technology applications:
Bit stealing: points inside the chromatic JND ellipsoid are not distinguishable chromatically but may be above the luminance JND. Using those points in RGB space can thus increase the luminance resolution; in turn, this has the appearance of increased spatial resolution ("anti-aliasing"). Microsoft ClearType.

91 Complementary Colors Colors which sum to white point are called complementary colors a*c1+b*c2 = wp Some monochromatic colors have complements, others don’t. See ComplementaryColors.m Complements may be out of gamut. See Photoshop.
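A MATLAB sketch of the condition a*c1 + b*c2 = wp, solved for the weights in the least-squares sense; the colors and white point are made up, and this is not ComplementaryColors.m:

    c1 = [0.45; 0.30; 0.05];                 % two candidate complements in XYZ (made up)
    c2 = [0.50; 0.70; 1.05];
    wp = [0.95; 1.00; 1.09];                 % white point (D65-like)
    w  = [c1 c2] \ wp;                       % weights [a; b]; true complements need a, b >= 0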

92

93 Printer/monitor incompatibilities
Gamut: colors in one that are not in the other. Different whitepoints. Complements of one not in the other. Luminance ranges have different quantization (especially gray).

94 Photography, Painting Photo printing is via filters.
Really multiplicative (e.g. 0.2 × 0.2 = 0.04), but the convention is to take the logarithm and regard it as subtractive. Oil paint mixing is additive, water color is subtractive.

95 Printing Inks are subtractive
Cyan (white - red), Magenta (white - green), Yellow (white - blue). In practice inks are opaque, so one can't mix them like oil paints. Black ink may be used on economic and physical grounds.

96 Halftoning The problem with ink: it’s opaque
Screening: the luminance range is achieved by printing with dots of varying size. Collections of big dots appear dark, small dots appear light; the percentage of area covered gives the darkness.

97

98 Halftoning references
A commercial but good set of tutorials; Digital Halftoning, by Robert Ulichney, MIT Press, 1987; stochastic halftoning.

99 Color halftoning Needs screens at different angles to avoid moiré.
Needs differential color weighting due to the nonlinear visual color response and its spatial frequency dependencies.

100 Right image is JPEG from digital photograph at 144 dpi, 1120x686 pixels, scaled by 25% in photoshop and imported as jpeg. Left image is Photoshop screening emulation with default screening parameters. Image imported as jpeg and reduced 25% in PowerPoint

101

102

103 Device Independence Calibration to standard space
typically CIE XYZ Coordinate transforms through standard space Gamut mapping

104 Device independence Stone et al., "Color Gamut Mapping and the Printing of Digital Color Images", ACM Transactions on Graphics, 7(4), October 1988. The following slides refer to their techniques.

105 Device to XYZ Sample gamut in device space on 8x8x8 mesh (7x7x7 = 343 cubes). Measure (or model) device on mesh. Interpolate with trilinear interpolation for small mesh and reasonable function XYZ=f(device1, device2, device3) this approximates interpolating to tangent.
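A sketch of the mesh-plus-trilinear-interpolation idea in MATLAB; the "measured" values come from a fake power-law device model rather than real measurements:

    g = linspace(0, 1, 8);                      % 8 samples per device coordinate (7x7x7 cubes)
    [R, G, B] = ndgrid(g, g, g);
    Xm = 0.4*R.^2.2 + 0.3*G.^2.2 + 0.2*B.^2.2;  % pretend measurements of X on the mesh
    dev = [0.25 0.50 0.75];                     % a device color to convert
    X = interpn(g, g, g, Xm, dev(1), dev(2), dev(3), 'linear');  % trilinear interpolation
    % Y and Z would be interpolated from their own measured tables in the same way.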

106 XYZ to Device Invert function XYZ=f(device1, device2, device3)
hard to do in general if f is ill-behaved. At least make f monotonic by throwing out distinct points with the same XYZ, e.g. for a CMY device: (continued)

107 XYZ to CMY Invert function XYZ=f(c,m,y)
Given XYZ=[x,y,z], want to find CMY=[c,m,y] such that f(CMY)=XYZ. Consider X(c,m,y), Y(c,m,y), Z(c,m,y). On each mesh cube the (trilinear) interpolant attains its max and min at the cube vertices. Also, if a continuous function has opposite signs at two points, it is zero somewhere in between.

108 XYZ to CMY Given X0, find [c,m,y] such that X(c,m,y) = X0:
if [ci,mi,yi] and [cj,mj,yj] are vertices of a given cube, and U = X(c,m,y) - X0 has opposite signs at them, then U is zero somewhere in the cube. Similarly for Y, Z. If such vertex pairs are found for all of X0, Y0, Z0, then that cube contains the desired point (use interpolation to locate it). Doing this recursively will find the desired point if there is one.
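A sketch of the sign test on one cube; the eight corner values are random stand-ins for X-X0, Y-Y0, Z-Z0 evaluated at the cube vertices:

    Uv = randn(8,1); Vv = randn(8,1); Wv = randn(8,1);   % stand-in corner values of X-X0, Y-Y0, Z-Z0
    bracketsAll = (min(Uv) <= 0 && max(Uv) >= 0) && ...
                  (min(Vv) <= 0 && max(Vv) >= 0) && ...
                  (min(Wv) <= 0 && max(Wv) >= 0);        % sign change in each coordinate?
    % If bracketsAll is true, this cube may contain the desired [c,m,y]; subdivide and recurse.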

109 Gamut Mapping Criteria: preserve the gray axis of the original image;
maximum luminance contrast; few colors map outside the destination gamut; hue and saturation shifts minimized; increase, rather than decrease, saturation; do not violate color knowledge, e.g. sky is blue, fruit colors, skin colors.

110 Gamut Mapping Special colors and problems
Highlights: this is a luminance issue, so it is about the gray axis. Colors near black: the locus of these colors in the image gamut must map into a region of reasonably similar shape, else contrast and saturation are wrong.

111 Gamut Mapping Special colors and problems
Highly saturated colors (far from the white point): printers are often incapable of reproducing them. Colors on the image gamut boundary that occupy large parts of the image should map inside the target gamut, else they must all be projected onto the target boundary.

112 Gamuts of a CRT and a printer (figure).

113 Gamut Mapping First try: map black points and fill destination gamut.

114 Device gamut and image gamut (figure).

115 Device gamut and image gamut (figure): translate Bi to Bd.

116 Device gamut and image gamut (figure): translate Bi to Bd, scale by csf.

117 Device gamut and image gamut (figure): translate Bi to Bd, scale by csf, rotate.

118 Gamut Mapping Xd = Bd + csf R (Xi - Bi)
Bi = image black, Bd = destination black; R = rotation matrix; csf = contrast scaling factor; Xi = image color, Xd = destination color. Problems: image colors near black that fall outside the destination are especially bad: loss of detail, hue shifts due to quantization error, ...

119 shift and scale along destination gray
Xd = Bd + csf R (Xi - Bi) + bs (Wd - Bd)
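A MATLAB sketch of the full mapping; every input below (black points, white point, rotation, csf, bs) is an arbitrary stand-in chosen only to make the formula above runnable:

    Bi = [0.02; 0.02; 0.02];   Bd = [0.05; 0.05; 0.06];   % image and destination black points
    Wd = [0.95; 1.00; 1.09];                              % destination white point
    R   = eye(3);                                         % rotation aligning the gray axes (identity here)
    csf = 0.85;   bs = 0.05;                              % contrast scale factor and black shift
    Xi = [0.30; 0.35; 0.30];                              % an image color
    Xd = Bd + csf * R * (Xi - Bi) + bs * (Wd - Bd);       % mapped destination color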

120 Fig 14a, bs>0, csf small, image gamut maps entirely into printer gamut, but contrast is low.
Fig 14b, bs=0, csf large, more contrast, more colors inside printer gamut, but also more outside.

121 Saturation control “Umbrella transformation”
[Rs Gs Bs] = monitor whitepoint; [Rn Gn Bn] = new RGB coordinates such that Rs + Gs + Bs = Rn + Gn + Bn and [Rn Gn Bn] maps inside the destination gamut. First map R·Rs + G·Gs + B·Bs to R·Rn + G·Gn + B·Bn, then map into printer coordinates. Makes minor hue changes, but "relative" colors are preserved; achromatic colors remain achromatic.

122 Projective Clipping After all this, some colors remain outside the printer gamut. Project these onto the gamut surface: try a perpendicular projection to the nearest triangular face of the printer gamut surface; if none, find a perpendicular projection to the nearest edge on the surface; if none, use the closest vertex.

123 Projective Clipping This is the closest point on the surface to the given color. The result is a continuous projection if the gamut is convex, but not otherwise. Bad: we want nearby image colors to be nearby in the destination gamut.

124 Projective Clipping Problems
Printer gamuts have their worst concavities near the black point, giving quantization errors. Nearest-point projection uses Euclidean distance in XYZ space, but that is not perceptually uniform. Try CIELAB? S-CIELAB? Keep out-of-gamut distances small at the cost of using less than the full printer gamut.

125 Color Management Systems
Problems: solve gamut matching issues; attempt uniform appearance. Solutions: image-dependent manipulations (e.g. Stone); device-independent image editors (e.g. Photoshop) with embedded CMS; ICC profiles.

126 ICC Color Profiles International Color Consortium. An ICC profile contains: device description text; characterization data; calibration data; invertible transforms to a fixed virtual color space, the Profile Connection Space (PCS).

127 Profile Connection Space
Presently only two PCS’s: CIELAB and CIEXYZ Both specified with D50 white point Device<-->PCS must account for viewing conditions, gamut mapping and tone (e.g. gamma) mapping.

128 ICC workflow (figure): input image and device -> input-device colorimetric characterization -> device-independent color space (e.g. XYZ, the PCS) -> chromatic adaptation and color appearance models -> viewing-condition independent space -> gamut mapping, tone control, etc. -> back through the color appearance models, the device-independent color space, and the output-device colorimetric characterization to the output image and device.

129 ICC Profiles Device profiles; Colorspace profiles (data conversion); Device Link profiles (concatenated D1 -> PCS -> D2); Abstract profiles (generic, for private purposes, e.g. special effects).

130 ICC Profiles Named color profile
Allows data described in the Pantone system (and others?) to map to other devices, e.g. for viewing. Supported in Photoshop. Photoshop demo of the nearest Pantone (etc.) colors to a selected color.

131 ICC Profile Data Tags Profile header tags:
administrative and descriptive: start of header; byte count of profile; profile version number; profile or device class (input, display, output, link, colorspace, abstract, named color profile); PCS target (CIEXYZ or CIELab).

132 ICC Profile Data Tags Profile header tags:
ICC-registered device manufacturer and model; media attributes (64 attribute bits, 32 reserved: reflective/transparent, glossy/matte, ...); XYZ of the illuminant; rendering intent (perceptual, relative colorimetry, saturation, absolute colorimetry).

133 ICC Profile Rendering Intents
perceptual: "full gamut of the image is compressed or expanded to fill the gamut of the destination device. Gray balance is preserved but colorimetric accuracy might not be preserved." (ICC Spec Clause 4.9). saturation: "specifies the saturation of the pixels in the image is preserved perhaps at the expense of accuracy in hue and lightness." (ICC Spec Clause 4.12). absolute colorimetry: relative to the illuminant only. relative colorimetry: relative to the illuminant and the media whitepoint.

134 ICC Profile Data Tags Tone Reproduction Curve (TRC) tags:
grayTRC, redTRC, greenTRC, blueTRC; a single number (gamma) if the TRC is exponential, or an array of samples of the TRC suitable for interpolation.

135 ICC Profile Data Tags Mapping tags (“AtoB0Tag”, “BtoA0Tag”, etc.)
Map between device and PCS. Includes a 3x3 matrix if the mapping is a linear map of CIEXYZ spaces, or a lookup table on sample points if not.

136 ICC Profile Special Goodies
Intimate with PostScript: support for PostScript Color Rendering Dictionaries reduces processing in the printer; support for argument lists to PostScript Level 2 color handling; halftone screen geometry and frequency; undercolor removal. Embedding profiles in pict, gif, tiff, jpeg, eps.

137 JPEG DCT Quantization FDCT of 8x8 blocks.
Order in increasing spatial frequency (zigzag). Low frequencies have more shape information, so they get finer quantization. Highs are often very small, so they go to zero after quantizing. If the source has 8-bit entries (s in [-2^7, 2^7-1]), one can show that the quantized DCT needs at most 11 bits (c in [-2^10, 2^10-1]). See the Wallace paper, p. 12. Note the high-frequency contributions are small.

138 JPEG DCT Quantization Quantize with a single 8x8 (64-entry) table of divisors.
The quantization table can be in the file or a reference to the standard one. The standard quantizer is based on JNDs. Note there can be one quantizer table for each image component. See Wallace, p. 12.
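A sketch of the quantize/dequantize step; the coefficients and the uniform divisor table are made up (real JPEG uses the standard JND-based tables, not a constant 16):

    C = round(100 * randn(8, 8));    % stand-in DCT coefficients of one 8x8 block
    Q = 16 * ones(8, 8);             % made-up uniform table of divisors (one per frequency)
    Cq = round(C ./ Q);              % quantized coefficients; many high frequencies become 0
    Crec = Cq .* Q;                  % dequantized values reconstructed by the decoder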

139 JPEG DCT Intermediate Entropy Coding
Variable length code (Huffman): high-occurrence symbols are coded with fewer bits. Intermediate code: symbol pairs. Symbol-1 is chosen from a table of symbols s(i,j): i is the run length of zeros preceding a quantized DCT amplitude, j is the number of bits needed to represent that amplitude; i = 0…15, j = 1…10, with s(0,0) = 'EOB' and s(15,0) = 'ZRL'. Symbol-2 is the amplitude itself. Finally, the 162 symbol-1 values are Huffman encoded.

140 JPEG components Y = 0.299R + 0.587G + 0.114B; Cb = -0.1687R - 0.3313G + 0.5B; Cr = 0.5R - 0.4187G - 0.0813B. Optionally subsample Cb, Cr: replace each pixel pair with its average. Not much loss of fidelity; reduces data by 1/2·1/3 + 1/2·1/3 = 1/3. More shape information in the achromatic than in the chromatic components (color vision is poor at localization).
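A MATLAB sketch of the component transform and horizontal 2:1 chroma subsampling, run on a made-up image:

    rgb = rand(4, 4, 3);                                % toy 4x4 RGB image in [0,1]
    R = rgb(:,:,1);  G = rgb(:,:,2);  B = rgb(:,:,3);
    Y  =  0.299*R   + 0.587*G  + 0.114*B;
    Cb = -0.1687*R  - 0.3313*G + 0.5*B;
    Cr =  0.5*R     - 0.4187*G - 0.0813*B;
    Cb2 = (Cb(:, 1:2:end) + Cb(:, 2:2:end)) / 2;        % average each horizontal pixel pair
    Cr2 = (Cr(:, 1:2:end) + Cr(:, 2:2:end)) / 2;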

141 JPEG goodies Progressive mode - multiple scans, e.g. in increasing spatial frequency, so decoding gives shapes first, then detail. Hierarchical encoding - multiple resolutions. Lossless coding mode. JFIF: user-embedded data; more than 3 components possible?

142 Huffman Encoding Binary code tree (figure); the resulting codewords: s1 = 00, s2 = 01, s3 = 11, s4 = 100, s5 = 1010, s6 = 1011

143 Huffman Encoding Decode by traversing from root to leaf, then repeating: the bit stream 11 1010 11 01 100 decodes to s3 s5 s3 s2 s4 (code tree as on the previous slide).
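A MATLAB sketch of decoding with the code table above; the table and bit string are hard-coded for illustration, and the greedy prefix match works because the code is prefix-free:

    codes   = {'00','01','11','100','1010','1011'};
    symbols = {'s1','s2','s3','s4','s5','s6'};
    bits = '1110101101100';                  % should decode to s3 s5 s3 s2 s4
    out = {};  buf = '';
    for b = bits
        buf = [buf b];                       % grow the current prefix
        idx = find(strcmp(buf, codes), 1);   % at most one codeword can match
        if ~isempty(idx)
            out{end+1} = symbols{idx};       % emit the decoded symbol
            buf = '';
        end
    end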

144 Charge Coupled Device (CCD)
< 10mm x 10mm. Silicon cells emit electrons when light falls on them.

145 Charge Coupled Device (CCD)
(figure) a cell (< 10mm x 10mm) accumulates charge over time in proportion to luminance.

146 Filters over cells More green than red, blue. Y = 0.299R + 0.587G + 0.114B (for color TV and…?)

147 CCD Cameras Good links: Some device specs:

148 Color TV Multiple standards: US, 2 in Europe, HDTV standards, Digital HDTV, Japanese analog. US: 525 lines. (US HDTV is digital, and the data stream defines the resolution; typically MPEG encoded to provide 1088 lines, of which 1080 are displayed.)

149 NTSC Analog Color TV 525 lines/frame Interlaced to reduce bandwidth
small interframe changes help. Primary chromaticities (table).

150 NTSC Analog Color TV These yield
RGB2XYZ, with Y = 0.299R + 0.587G + 0.114B (same as the luminance channel for JPEG!), which for R=G=B=1 gives the Y value of the white point. Cr = R-Y, Cb = B-Y, with chromaticities Cr: x=1.070, y=0; Cb: x=0.131, y=0; y(C)=0 => Y(C)=0 => achromatic.

151 NTSC Analog Color TV Signals are gamma corrected under the assumption of dim-surround viewing conditions (high saturation). Y, Cr, Cb signals (EY, Er, Eb) are sent per scan line; NTSC, SECAM, PAL do this in differing clever ways. EY typically has twice the bandwidth of Er, Eb.

152 NTSC Analog Color TV Y, Cr, Cb signals (EY, Er, Eb) are sent per scan line; NTSC, SECAM, PAL do this in differing clever ways. EY with 4-10 x bandwidth of Er, Eb “Blue saving”

153 Digital HDTV 1987 - FCC seeks proposals for advanced tv
The broadcast industry wants analog, 2x the lines of NTSC, for compatibility; the computer industry wants digital. 1993 (February): DHDTV demonstrated in four incompatible systems. 1993 (May): Grand Alliance formed.

154 Digital HDTV 1996 (Dec 26): FCC accepts the Grand Alliance proposal of the Advanced Television Systems Committee (ATSC). 1999: first DHDTV broadcasts.

155 Digital HDTV MPEG video compression Dolby AC-3 audio compression
Formats (table columns: lines, horizontal pixels, aspect ratio, scan type, frame rate): 16/9 progressive at 24, 30, or 60 fps; 16/9 interlaced; 16/9 progressive at 30 fps. MPEG video compression; Dolby AC-3 audio compression.

156 Some gamuts (figure): SWOP vs. ENCAD GA ink.

157 Color naming A Computational Model of Color Perception and Color Naming, Johann Lammens, Buffalo CS Ph.D. dissertation. Cross-language study of Berlin and Kay, 1969: "basic colors".

158 Color naming “Basic colors”
Meaning not predicted from parts (e.g. blue, yellow, but not bluish); not subsumed in another color category (e.g. red, but not crimson or scarlet); can apply to any object (e.g. brown, but not blond); highly meaningful across informants (red, but not chartreuse). Ask the audience to give the basic colors they can identify in the Berlin-Kay colors (adapted by Lammens).

159 Color naming “Basic colors” Vary with language

160 Color naming Berlin and Kay experiment:
Elicit all basic color terms from 329 Munsell chips (40 equally spaced hues x 8 values, plus 9 neutral chips). Find the best representative. Find the boundaries of that term.

161 Color naming Berlin and Kay experiment:
Representatives ("foci") are constant across languages; boundaries vary even across subjects and trials. Lammens fits a linear+sigmoid model to each of the R-G, B-Y, and brightness responses from the macaque monkey LGN data of De Valois et al. (1966) to get a color model. As usual this is two chromatic channels and one achromatic.

162 Color naming To account for boundaries Lammens used standard statistical pattern recognition with the feature set determined by the coordinates in his color space defined by macaque LGN opponent responses. Has some theoretical but no(?) experimental justification for the model.

163 Pantone Color Combo of the Month January 1999
Mecca Orange (Pantone 1675 C), Canteen (Pantone 405 C), Violet Quartz (Pantone 689 C). That's all for today. Pantone's RGB for Violet Quartz has some values greater than 100%, i.e. greater than 255. Converting all three from CMYK to RGB with Photoshop using the default monitor characteristics, including a D65 whitepoint, gives different RGB from Pantone's. Probably Pantone assumes a D50 whitepoint.

