Digital image basics: human vision perception system, image formation


Digital image basics
-- Human vision perception system
-- Image formation
-- Human vision property and model
-- Image acquisition
-- Image transform
-- Image quality
-- Connected components
-- Image sensing
-- Image formats

1. Human vision

1. Human vision (cont'd) Two types of receptors
-- Cones (concentrated in the fovea): sensitive to brightness and color - about 7 million per eye - cone vision (photopic, bright-light vision)
-- Rods: sensitive to low levels of illumination - about 100 million per eye - rod vision (scotopic, dim-light vision)

2. Image perception and formation

3. Vision property

3. Vision property (cont'd) Brightness adaptation
-- The human eye can adapt to a huge range of intensity levels - photopic: 10^(-3) mL to 10^(3) mL - scotopic: 10^(-6) mL to 10^(-1) mL
-- The eye works around a brightness adaptation level; it cannot cover the whole range simultaneously

3. Vision property (cont'd) Brightness discrimination
-- The ability to discriminate between different intensity levels - Weber ratio: the just-noticeable difference in intensity divided by the background intensity
-- The intensity stored in a digital image is not the real physical intensity; it is a value on a contrast scale (e.g., a gray scale)

3. Vision property (cont'd) Contrast
-- Absolute contrast: C = Bmax / Bmin, where Bmax is the maximum and Bmin the minimum brightness intensity
-- Relative contrast: Cr = (B - B0) / B0, where B is the brightness of the object and B0 the background brightness
-- Mach band: an overshoot effect perceived at the boundaries of uniform regions
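The two contrast definitions above can be sketched in Python (function names and the example values are illustrative):

```python
# Minimal sketch of the two contrast measures defined on the slide.

def absolute_contrast(b_max, b_min):
    """Absolute contrast C = Bmax / Bmin."""
    return b_max / b_min

def relative_contrast(b, b0):
    """Relative contrast Cr = (B - B0) / B0 for object brightness B on background B0."""
    return (b - b0) / b0

# Example: an object at brightness 150 on a background of 100,
# in a scene whose intensities span 20 to 200.
print(absolute_contrast(200, 20))   # 10.0
print(relative_contrast(150, 100))  # 0.5
```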

3. Vision property (cont'd) Spatial discrimination (SD)
-- the minimum view angle at which two points on the viewed object can still be discriminated: d / (2 * pi * L) = theta / 360, where d is the separation between the two points, L the viewing distance, and theta the view angle in degrees
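Solving the view-angle relation d / (2 * pi * L) = theta / 360 for d gives a quick estimate of the smallest resolvable detail; a minimal Python sketch, assuming theta is given in degrees:

```python
import math

def min_resolvable_detail(L, theta_deg):
    """Solve d / (2 * pi * L) = theta / 360 for d: the smallest separation
    on the object still discriminable at viewing distance L."""
    return 2 * math.pi * L * theta_deg / 360.0

# Example: viewing distance L = 250 mm and theta = 1/60 degree
# (about one arc-minute, a commonly quoted limit for the eye):
print(round(min_resolvable_detail(250.0, 1.0 / 60.0), 4))  # 0.0727 (mm)
```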

3. Vision property (cont'd) Spatial discrimination (SD)
-- low illumination: SD decreases
-- low contrast: SD decreases
-- very high illumination: SD does not increase much further
-- SD for color is weaker than SD for brightness
-- projection onto the fovea: SD increases

3. Vision property (cont'd) Human vision model
-- g(x, y) = T[f(x, y)]
-- T: transforms the input optical scene into the output image - linear or non-linear - H(u, v): low-pass filtering (limited spatial discrimination, linear) - logarithmic response to brightness (non-linear) - time-delay effect (the afterimage, or "image-remain", effect)
-- Block diagram: input image f(x, y) → optical system H(u, v) ~ h(x, y) → output image g(x, y)

4. Image acquisition Wavelength -- electromagnetic spectrum

4. Image acquisition (cont'd) Principle of imaging sensors
-- transform illumination energy into a digital image
-- the output voltage waveform is proportional to the incident light
-- e.g., a single sensor, a sensor strip (one strip, used in CT/MRI scanners), a 2-D sensor array (CCD)

4. Image acquisition (cont'd) Image digitizing
-- Sampling: digitizing the coordinate values (spatially) - Nyquist rate: 2*F(max) - limited by the number of sensors - spatial sampling can be uniform or non-uniform (e.g., fovea-based, fish-eye based)
-- Quantization: digitizing the amplitude values - uniform - non-uniform (based on image characteristics)
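Uniform quantization as described above can be sketched with NumPy (the function name and the [0, 1] input range are assumptions for illustration):

```python
import numpy as np

def uniform_quantize(f, k):
    """Uniformly quantize amplitude values in [0, 1] into 2**k levels."""
    levels = 2 ** k
    return np.clip(np.floor(f * levels), 0, levels - 1).astype(int)

# Eight uniformly spaced samples quantized to 2 bits (4 levels):
samples = np.linspace(0.0, 1.0, 8)
print(uniform_quantize(samples, 2))  # [0 0 1 1 2 2 3 3]
```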

4. Image acquisition (cont'd) Image digitizing
-- f(x, y) is the gray level at pixel location (x, y)
-- the gray level is not the real illumination intensity; it is an index on the gray scale
-- f(x, y) is in the range [0, 255] for an 8-bit image
-- an image of size M*N with k bits per pixel requires M*N*k bits in total
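The storage formula M*N*k can be checked with a one-line Python sketch (the function name is illustrative):

```python
def image_storage_bits(M, N, k):
    """Total bits for an M x N image with k bits per pixel: M * N * k."""
    return M * N * k

# A 512 x 512 image at 8 bits per pixel:
bits = image_storage_bits(512, 512, 8)
print(bits, bits // 8)  # 2097152 bits, i.e. 262144 bytes
```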

4. Image acquisition (cont’d) Spatial resolution -- number of pixels with respect to the image size -- line pair: smallest discernible detail per unit distance in an image - e.g., 100 lp/mm.

4. Image acquisition (cont'd) Relationship between spatial resolution N and gray-level resolution K
-- increasing N and K improves image quality
-- decreasing K reduces the gray-level contrast and can cause false contouring
-- when N (detail) is high, K (the number of gray levels) can be reduced (e.g., half-tone images)

4. Image acquisition (cont'd) Aliasing problem
-- jagged or staircase effect
-- occurs in image acquisition (image processing)
-- occurs in display (computer graphics)
-- Reason: the sampling or display resolution is lower than the minimum rate 2*F(max), the Nyquist rate
-- Possible solution: - smooth the image before sampling to reduce F(max) - side effect: the image is blurred, so quality decreases
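A small NumPy sketch of the suggested fix, smoothing before subsampling (the box filter and the test signal are illustrative choices): a sinusoid whose period is about 5 samples is subsampled by 4, well below its Nyquist rate, with and without pre-smoothing.

```python
import numpy as np

def box_smooth(f, width=3):
    """Crude low-pass filter: moving average of the given width."""
    return np.convolve(f, np.ones(width) / width, mode="same")

# A sinusoid with period ~5 samples, subsampled by a factor of 4.
x = np.arange(64)
f = np.sin(2 * np.pi * x / 5)
raw = f[::4]                   # aliased: the fast detail folds to a false low frequency
smoothed = box_smooth(f)[::4]  # pre-smoothing attenuates the too-fast component first
print(np.abs(raw).max(), np.abs(smoothed).max())
```

The smoothed-then-subsampled signal has clearly lower amplitude than the raw subsampled one, which is exactly the blur-versus-aliasing trade-off the slide describes.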

5. Image transform
Size change
-- Zoom-in - pixel replication - pixel interpolation - super-resolution
-- Zoom-out
Shape change
-- geometric transformation
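Zoom-in by pixel replication and by interpolation can be sketched in NumPy (function names are illustrative):

```python
import numpy as np

def zoom_replicate(img, factor):
    """Zoom in by pixel replication (nearest-neighbour)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def zoom_interp_1d(row, factor):
    """Zoom one row by linear interpolation between pixel values."""
    n = len(row)
    new_x = np.linspace(0, n - 1, n * factor)
    return np.interp(new_x, np.arange(n), row)

img = np.array([[1, 2],
                [3, 4]])
print(zoom_replicate(img, 2))                   # each pixel becomes a 2x2 block
print(zoom_interp_1d(np.array([0.0, 10.0]), 2)) # values ramp smoothly from 0 to 10
```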

6. Image quality Subjective
-- Rating (e.g., R = 1, 2, ..., 5) - mean rating over N evaluators: R = (sum of the Ji) / N, where Ji is the rating given by evaluator i
-- applications in image enhancement, restoration, compression, etc.

6. Image quality Objective
-- Mean square error E
-- dB value: -10*Log(E)
-- f(x, y) is the image to be evaluated; f^(x, y) is the reference image it is compared with
-- applications in image coding, etc.
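The two objective measures can be sketched in Python (names are illustrative; note the dB value -10*log10(E) is only defined for E > 0):

```python
import numpy as np

def mse(f, f_ref):
    """Mean square error E between the evaluated image f and the reference f_ref."""
    return np.mean((np.asarray(f, dtype=float) - np.asarray(f_ref, dtype=float)) ** 2)

def mse_db(f, f_ref):
    """The dB value -10 * log10(E); only meaningful when E > 0."""
    return -10.0 * np.log10(mse(f, f_ref))

a = np.array([[10, 20], [30, 40]])
b = np.array([[11, 19], [31, 39]])
print(mse(a, b))  # 1.0, so the dB value is 0
```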

7. Connected components Relationship of pixels
-- Four neighbors of pixel P - N4(P) (strong neighbors): the horizontal and vertical neighbors - ND(P) (weak neighbors): the diagonal neighbors
-- Eight neighbors of pixel P - N8(P) = N4(P) + ND(P)

7. Connected components (cont'd) Adjacency
-- 4-adjacency
-- 8-adjacency
-- m-adjacency (mixed adjacency): q is m-adjacent to p if q is in N4(p), or q is in ND(p) and N4(p) ∩ N4(q) contains no foreground pixels (= ∅)
-- (figure: a 4-connected pair; an 8-connected pair p, q that is not m-connected)
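The neighborhood sets and the m-adjacency test above can be sketched in Python; representing the foreground as a set of coordinates is an assumption for illustration:

```python
def n4(p):
    """Strong (4-) neighbours of pixel p = (x, y)."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """Weak (diagonal) neighbours of p."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(p):
    """Eight neighbours: N8(p) = N4(p) + ND(p)."""
    return n4(p) | nd(p)

def m_adjacent(p, q, fg):
    """m-adjacency over the foreground set fg: q in N4(p), or q in ND(p)
    with N4(p) & N4(q) containing no foreground pixel."""
    if q in n4(p):
        return True
    return q in nd(p) and not (n4(p) & n4(q) & fg)

# p and its diagonal neighbour share the foreground 4-neighbour (1, 0),
# so they are 8-adjacent but not m-adjacent.
fg = {(0, 0), (1, 1), (1, 0)}
print(m_adjacent((0, 0), (1, 1), fg))  # False
print(m_adjacent((0, 0), (1, 0), fg))  # True
```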

7. Connected components (cont'd) Path
-- If p and q are connected, there is a path between p and q
-- m-path: a path between p and q through m-connected pixels
-- closed path: a path whose starting pixel p and ending pixel q coincide

7. Connected components (cont'd) Connected set
-- a set of pixels that are connected to one another
-- such a set is called a connected set
Concepts
-- R is a region if R is a connected set
-- the boundary of R is a closed path
-- edge: a gray-level discontinuity at a point - linking edge points yields an edge segment

7. Connected components (cont'd) Distance
-- D(p, q) is defined as the distance between p and q, satisfying: D(p, q) >= 0; D(p, q) = D(q, p); D(p, q) <= D(p, z) + D(z, q)
-- Euclidean distance (disk-shaped contour): De(p, q) = sqrt[(xp - xq)^(2) + (yp - yq)^(2)]

7. Connected components (cont'd) Distance
-- D4 distance (city-block distance, diamond-shaped contour): D4(p, q) = |xp - xq| + |yp - yq|
-- pixels with D4 <= 2 from the center form a diamond:
        2
      2 1 2
    2 1 0 1 2
      2 1 2
        2

7. Connected components (cont'd) Distance
-- D8 distance (chessboard distance, square-shaped contour): D8(p, q) = max(|xp - xq|, |yp - yq|)
-- pixels with D8 <= 2 from the center form a square:
    2 2 2 2 2
    2 1 1 1 2
    2 1 0 1 2
    2 1 1 1 2
    2 2 2 2 2
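The three distance measures can be sketched in Python (function names are illustrative):

```python
import math

def de(p, q):
    """Euclidean distance: the equal-distance contour is a disk."""
    return math.sqrt((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

def d4(p, q):
    """City-block distance: diamond-shaped contour."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance: square-shaped contour."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(de(p, q), d4(p, q), d8(p, q))  # 5.0 7 4
```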

7. Connected components (cont'd) Distance
-- Dm distance: the length of the shortest m-path between the two points - unlike De, D4, and D8, it depends on the values of the pixels along the path (in the slide's example, Dm = 4)

8. Pixel operation Point-wise operation
-- an M*N image is represented as a bound matrix - (r, t): the coordinates of its upper-left component - each component is either defined (represented by an intensity value) or undefined (represented by "*")

8. Pixel operation (cont'd) Arithmetic operations
(1) ADD[f, g](i, j) = f(i, j) + g(i, j) IF f(i, j) ≠ * and g(i, j) ≠ * (condition C1); = * otherwise
(2) MULT[f, g](i, j) = f(i, j) · g(i, j) IF C1; = * otherwise
(3) SCALAR[t; f](i, j) = t · f(i, j) IF f(i, j) ≠ *; = * otherwise
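A minimal Python sketch of these pointwise operations, modelling the undefined value "*" as None and a bound matrix as a dict from coordinates to values (both representation choices are assumptions for illustration):

```python
def add(f, g):
    """ADD[f, g](i, j): defined only where both f and g are defined (condition C1)."""
    keys = set(f) | set(g)
    return {k: (f.get(k) + g.get(k))
            if f.get(k) is not None and g.get(k) is not None else None
            for k in keys}

def scalar(t, f):
    """SCALAR[t; f](i, j): t * f(i, j) where f is defined, undefined elsewhere."""
    return {k: (t * v if v is not None else None) for k, v in f.items()}

f = {(0, 0): 3, (0, 1): None}   # (0, 1) is undefined ("*")
g = {(0, 0): 4, (0, 1): 5}
print(add(f, g)[(0, 0)], add(f, g)[(0, 1)])      # 7 None
print(scalar(2, f)[(0, 0)], scalar(2, f)[(0, 1)])  # 6 None
```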

8. Pixel operation (cont'd) Arithmetic operations
(4) MAX[f, g](i, j) = max[f(i, j), g(i, j)] IF C1; = * otherwise
(5) MIN[f, g](i, j) = min[f(i, j), g(i, j)] IF C1; = * otherwise
(6) SUB[f](i, j) = -f(i, j) IF f(i, j) ≠ *; = * otherwise

8. Pixel operation (cont'd) Arithmetic operations
(7) EXTEND[f, g](i, j) = f(i, j) IF f(i, j) ≠ *; = g(i, j) otherwise
(8) EXTADD[f, g](i, j) = ADD[f, g](i, j) IF C1; = f(i, j) IF f(i, j) ≠ * and g(i, j) = *; = g(i, j) IF g(i, j) ≠ * and f(i, j) = *; = * where both f and g are undefined

8. Pixel operation (cont'd) Arithmetic operations
(9) THRESH[f, t](i, j) = 1 IF f(i, j) ≥ t; = 0 IF f(i, j) < t; = * IF f(i, j) = *
(10) TRUNC[f, t](i, j) = f(i, j) IF f(i, j) ≥ t; = 0 otherwise; equivalently, TRUNC[f, t] = MULT[f, THRESH[f, t]]

8. Pixel operation (cont'd) Arithmetic operations
(11) EQUAL[f, t](i, j) = 1 IF f(i, j) = t; = 0 otherwise; = * on the undefined domain
(12) similar definitions hold for GREATER[f, t](i, j) and BETWEEN[f, t1, t2](i, j)
(13) operations with masking: AND, OR, NOT

8. Pixel operation (cont'd) Arithmetic operations
(14) PIXSUM(f): the sum of all pixel values on the defined domain
(15) DOT(f, g) = SUM[f(i, j) · g(i, j)] over the common domain
(16) NORM(f) = [SUM[f(i, j)^2]]^(1/2) = (DOT(f, f))^(1/2)

8. Pixel operation (cont'd) Arithmetic operations
(17) REST[f, g](i, j) = f(i, j) IF g(i, j) ≠ *; = * IF g(i, j) = *
(18) Note on linearity: an operator H is linear if H(af + bg) = aH(f) + bH(g), where f and g are images and a, b are scalar values; otherwise it is non-linear (e.g., the |f - g| operation)

Image Sensing
-- single image sensor
-- line sensor (sensor strip)
-- array sensor

Image Sensing
-- linear motion (sensor strip)
-- rotation: a sensing ring for CT (X-ray), used to create cross-sectional images

Image Format TIF (LZW – lossless coding) GIF JPEG BMP

Image Format TIF (LZW lossless coding)
-- Tagged Image File Format
-- image header: fields = tags + values - image size - compression - color depth - location of data - bits per sample - ...

Image Format JPEG
-- 8*8 blocks → DCT → coefficient quantization → zig-zag run-length coding → Huffman coding
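The DCT step of the pipeline can be sketched in NumPy using the orthonormal 8x8 DCT-II matrix (a standard construction; the helper names are illustrative):

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (rows = frequencies, columns = samples)."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] /= np.sqrt(2)
    return C * np.sqrt(2.0 / n)

def block_dct(block):
    """2-D DCT of one square block: C @ block @ C.T."""
    C = dct_matrix(block.shape[0])
    return C @ block @ C.T

# A flat 8x8 block puts all of its energy into the single DC coefficient,
# which is why quantization and run-length coding then compress it so well.
coeffs = block_dct(np.full((8, 8), 100.0))
print(round(coeffs[0, 0]))  # 800
```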

Demo

Image Format BMP and the portable formats
-- PBM: portable bitmap file format (binary)
-- PGM: portable graymap (gray scale)
-- PPM: portable pixmap (color)