Digital Image Fundamentals


Chapter 2: Digital Image Fundamentals

Fundamentals of Digital Images
[Figure: the image "After snow storm" shown with x and y axes and the origin marked, illustrating f(x,y).]
- An image: a multidimensional function of spatial coordinates.
- Spatial coordinate: (x,y) for the 2D case such as a photograph, (x,y,z) for the 3D case such as CT scan images, (x,y,t) for movies.
- The function f may represent intensity (for monochrome images), color (for color images), or other associated values.

Digital Images
- Digital image: an image that has been discretized in both spatial coordinates and associated value.
- Consists of two sets: (1) a point set and (2) a value set.
- Can be represented in the form I = {(x, a(x)) : x ∈ X, a(x) ∈ F}, where X and F are the point set and the value set, respectively.
- An element of the image, (x, a(x)), is called a pixel, where
  - x is the pixel location and
  - a(x) is the pixel value at the location x.
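
To make the point-set and value-set view concrete, here is a minimal sketch in plain Python (the names X, a, and pixel are illustrative, not from the slides) that stores a tiny image as a mapping from locations x to values a(x):

```python
# A tiny digital image represented as I = {(x, a(x)) : x in X, a(x) in F}.
# X is a set of (row, col) locations; a(x) is a gray-level value.

# Point set X: all locations of a 3x3 image.
X = {(r, c) for r in range(3) for c in range(3)}

# The image as a mapping from locations to values (arbitrary gray levels 0..240).
a = {x: (x[0] * 3 + x[1]) * 30 for x in X}

# A pixel is the pair (x, a(x)): location plus value.
pixel = ((1, 2), a[(1, 2)])
print("pixel location:", pixel[0], "pixel value:", pixel[1])
```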

Image Sensor: Charge-Coupled Device (CCD)
- Used to convert a continuous image into a digital image.
- Contains an array of light sensors.
- Converts photons into electric charges accumulated in each sensor unit.
[Figure: CCD KAF-3200E from Kodak, 2184 x 1472 pixels, pixel size 6.8 microns.]

Image Sensor: Inside a Charge-Coupled Device
[Diagram: photosites feed vertical transport registers (with gates), which pass charge to a horizontal transport register; an output gate and amplifier produce the output signal.]

Image Sensor: How a CCD Works
[Diagram: image pixels a-i are shifted vertically from the photosites into the horizontal transport register, then shifted horizontally to the output.]

Image Types
Intensity image or monochrome image: each pixel corresponds to light intensity, normally represented as a gray-scale (gray-level) value.

Image Types (cont.)
Color image or RGB image: each pixel contains a vector representing the red, green, and blue components.

Image Types (cont.)
Binary image or black-and-white image: each pixel contains one bit, where 1 represents white and 0 represents black.

Image Types (cont.)
Index image: each pixel contains an index number pointing to a color in a color table.
Color table (red, green, and blue components per index), e.g. index 1 -> (0.1, 0.5, 0.3); the remaining table entries on the slide are not recoverable.
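
A small sketch of how an index image is expanded into an RGB image by table lookup, assuming NumPy; only the first color-table row comes from the legible slide entry, the rest are made-up values:

```python
import numpy as np

# Illustrative color table: each row is (R, G, B) in [0, 1].
color_table = np.array([
    [0.1, 0.5, 0.3],   # the one entry legible on the slide
    [1.0, 0.0, 0.0],   # made-up
    [0.0, 1.0, 0.0],   # made-up
    [0.2, 0.8, 0.9],   # made-up
])

# Index image: each pixel stores a row index into the color table.
index_image = np.array([[0, 1],
                        [2, 3]])

# Convert the index image to an RGB image by table lookup.
rgb_image = color_table[index_image]      # shape (2, 2, 3)
print(rgb_image.shape)
```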

Image Sampling
Image sampling: discretizing an image in the spatial domain.
Spatial resolution / image resolution: pixel size or number of pixels.

How to Choose the Spatial Resolution
[Figure: original image vs. sampled image, with the sampling locations marked; spatial resolution = spacing of the sampling locations.]
With under-sampling, we lose some image details!

How to Choose the Spatial Resolution: Nyquist Rate
[Figure: original image with minimum period 2 mm vs. sampled image with 1 mm spatial resolution (sampling rate); no detail is lost.]
Nyquist rate: the spatial resolution (sampling interval) must be less than or equal to half of the minimum period in the image; equivalently, the sampling frequency must be greater than or equal to twice the maximum frequency.

Effect of Spatial Resolution
[Figure: the same image at 256x256, 128x128, 64x64, and 32x32 pixels.]
Down-sampling is an irreversible process.
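
A minimal sketch of down-sampling by keeping every k-th pixel, assuming NumPy; the `downsample` helper is illustrative and applies no anti-aliasing filter, which is exactly why under-sampled detail is lost for good:

```python
import numpy as np

def downsample(image, factor):
    """Keep every `factor`-th pixel in each direction (no anti-alias filter).

    Detail with a period smaller than 2*factor pixels is aliased and
    cannot be recovered afterwards (the process is irreversible)."""
    return image[::factor, ::factor]

# Example: a 256x256 ramp image reduced to 64x64.
img = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
small = downsample(img, 4)
print(small.shape)   # (64, 64)
```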

Image Quantization
Image quantization: discretizing continuous pixel values into discrete numbers.
Color resolution / color depth / levels:
- number of colors or gray levels, or
- number of bits representing each pixel value.
The number of colors or gray levels Nc is given by Nc = 2^b, where b = number of bits.
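
A minimal sketch of uniform quantization to Nc = 2^b levels, assuming NumPy and 8-bit input in [0, 255]; the `quantize` helper is illustrative:

```python
import numpy as np

def quantize(image, bits):
    """Uniformly quantize gray levels in [0, 255] to Nc = 2**bits levels."""
    levels = 2 ** bits                      # Nc = 2^b
    step = 256 / levels                     # width of each quantization bin
    q = np.floor(image / step)              # bin index 0 .. levels-1
    return (q * step + step / 2).astype(np.uint8)   # mid-bin output value

img = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
print(len(np.unique(quantize(img, 2))))     # 4 levels -> false contours appear
```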

Image Quantization: Quantization Function
[Figure: a staircase quantization function mapping light intensity (darkest to brightest) to quantization levels 0, 1, 2, ...]

Effect of Quantization Levels

Effect of Quantization Levels (cont.)
[Figure: the same image at 4 levels and at 2 levels.]
In these images, it is easy to see false contours.

Basic Relationships of Pixels
[Figure: conventional indexing method, with the origin at (0,0); a pixel (x,y) is surrounded by (x-1,y), (x+1,y), (x,y-1), (x,y+1), (x-1,y-1), (x+1,y-1), (x-1,y+1), and (x+1,y+1).]

Neighbors of a Pixel
The neighborhood relation is used to identify adjacent pixels. It is useful for analyzing regions.
4-neighbors of p at (x,y): N4(p) = {(x-1,y), (x+1,y), (x,y-1), (x,y+1)}.
The 4-neighborhood relation considers only vertical and horizontal neighbors.
Note: q ∈ N4(p) implies p ∈ N4(q).

Neighbors of a Pixel (cont.)
8-neighbors of p: N8(p) = {(x-1,y-1), (x,y-1), (x+1,y-1), (x-1,y), (x+1,y), (x-1,y+1), (x,y+1), (x+1,y+1)}.
The 8-neighborhood relation considers all neighboring pixels.

Neighbors of a Pixel (cont.)
Diagonal neighbors of p: ND(p) = {(x-1,y-1), (x+1,y-1), (x-1,y+1), (x+1,y+1)}.
The diagonal-neighborhood relation considers only the diagonal neighbor pixels.
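
A minimal sketch of the neighborhood sets N4(p), ND(p), and N8(p) in plain Python; the helper names n4, nd, and n8 are illustrative:

```python
def n4(p):
    """4-neighbors of p = (x, y): horizontal and vertical neighbors."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """Diagonal neighbors of p."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(p):
    """8-neighbors of p: union of N4(p) and ND(p)."""
    return n4(p) | nd(p)

p, q = (3, 3), (3, 4)
print(q in n4(p), p in n4(q))   # True True -- the relation is symmetric
```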

Connectivity
Connectivity is adapted from the neighborhood relation. Two pixels are connected if they are in the same class (i.e. the same color or the same range of intensity) and they are neighbors of one another.
For p and q from the same class:
- 4-connectivity: p and q are 4-connected if q ∈ N4(p).
- 8-connectivity: p and q are 8-connected if q ∈ N8(p).
- mixed connectivity (m-connectivity): p and q are m-connected if q ∈ N4(p), or q ∈ ND(p) and N4(p) ∩ N4(q) = ∅.
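
A minimal sketch of the three connectivity tests in plain Python; it repeats the neighbor helpers so it runs on its own, and `same_class` stands in for whatever class test (same color or intensity range) is being used:

```python
def n4(p):
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def connected_4(p, q, same_class):
    """4-connectivity: same class and q is a 4-neighbor of p."""
    return same_class(p) and same_class(q) and q in n4(p)

def connected_8(p, q, same_class):
    """8-connectivity: same class and q is an 8-neighbor of p."""
    return same_class(p) and same_class(q) and q in (n4(p) | nd(p))

def connected_m(p, q, same_class):
    """m-connectivity: q in N4(p), or q in ND(p) and p, q share no
    same-class 4-neighbor (this removes the 8-path ambiguity)."""
    if not (same_class(p) and same_class(q)):
        return False
    if q in n4(p):
        return True
    shared = {r for r in n4(p) & n4(q) if same_class(r)}
    return q in nd(p) and len(shared) == 0

# Example: pixels with value 1 belong to the same class.
image = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 1}

def in_class(r):
    return image.get(r, 0) == 1

print(connected_m((0, 0), (1, 1), in_class))   # False: they share same-class 4-neighbors
```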

Adjacency
A pixel p is adjacent to a pixel q if they are connected.
Two image subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some pixel in S2.
We can define the type of adjacency (4-adjacency, 8-adjacency, or m-adjacency) depending on the type of connectivity.

Path
A path from pixel p at (x,y) to pixel q at (s,t) is a sequence of distinct pixels (x0,y0), (x1,y1), (x2,y2), ..., (xn,yn) such that (x0,y0) = (x,y), (xn,yn) = (s,t), and (xi,yi) is adjacent to (xi-1,yi-1) for i = 1, ..., n.
We can define the type of path (4-path, 8-path, or m-path) depending on the type of adjacency.

Path (cont.)
[Figure: the same pixels p and q connected by an 8-path and by an m-path.]
An 8-path from p to q can result in some ambiguity; an m-path from p to q resolves this ambiguity.

Distance
For pixels p, q, and z with coordinates (x,y), (s,t), and (u,v), D is a distance function or metric if:
- D(p,q) ≥ 0, with D(p,q) = 0 if and only if p = q
- D(p,q) = D(q,p)
- D(p,z) ≤ D(p,q) + D(q,z)
Example: Euclidean distance De(p,q) = sqrt((x-s)^2 + (y-t)^2).

Distance (cont.)
The D4 distance (city-block distance) is defined as D4(p,q) = |x-s| + |y-t|.
Pixels with D4 = 1 from p are the 4-neighbors of p.
[Figure: contours of constant D4 distance from p form a diamond of pixels at distance 1, then 2, and so on.]

Distance (cont.)
The D8 distance (chessboard distance) is defined as D8(p,q) = max(|x-s|, |y-t|).
Pixels with D8 = 1 from p are the 8-neighbors of p.
[Figure: contours of constant D8 distance from p form a square of pixels at distance 1, then 2, and so on.]
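
A minimal sketch of the three distance measures in plain Python; the helper names are illustrative:

```python
def d_euclidean(p, q):
    """Euclidean distance between p = (x, y) and q = (s, t)."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def d4(p, q):
    """City-block distance: D4(p, q) = |x - s| + |y - t|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d8(p, q):
    """Chessboard distance: D8(p, q) = max(|x - s|, |y - t|)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(d_euclidean(p, q), d4(p, q), d8(p, q))   # 5.0 7 4
```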

Template, Window, and Mask Operation
Sometimes we need to manipulate values obtained from neighboring pixels.
Example: How can we compute the average value of the pixels in a 3x3 region centered at a pixel z?
[Figure: a 6x6 example image with the pixel z marked.]

Template, Window, and Mask Operation (cont.)
Step 1: Select only the pixels that are needed (the 3x3 region around the pixel z).
[Figure: the 3x3 subimage around z extracted from the full image.]

Template, Window, and Mask Operation (cont.)
Step 2: Multiply every pixel by 1/9 and then sum up the values.
[Figure: the 3x3 subimage multiplied element-wise by a 3x3 mask (also called a window or template) of coefficients.]

Template, Window, and Mask Operation (cont.)
Question: How do we compute the 3x3 average value at every pixel?
Solution: Imagine that we have a 3x3 window that can be placed anywhere on the image.
[Figure: a 3x3 masking window placed over the example image.]

Template, Window, and Mask Operation (cont.)
Step 1: Move the window to the first location where we want to compute the average value and select only the pixels inside the window.
Step 2: Compute the average value of the selected subimage.
Step 3: Place the result at the corresponding pixel of the output image.
Step 4: Move the window to the next location and go to Step 2.
[Figure: the 3x3 subimage around pixel p in the original image and its average (e.g. 4.3) placed in the output image.]
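
A minimal sketch of these four steps as a 3x3 moving average, assuming NumPy; border pixels are simply copied here (border handling is a design choice), and the example grid is only loosely based on the slide's numbers:

```python
import numpy as np

def average_3x3(image):
    """3x3 moving average: for each interior pixel, place the mean of its
    3x3 neighborhood at the corresponding pixel of the output image."""
    out = image.astype(float).copy()
    rows, cols = image.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            out[r, c] = image[r - 1:r + 2, c - 1:c + 2].mean()
    return out

img = np.array([[2, 4, 1, 2],
                [9, 2, 3, 4],
                [7, 2, 9, 7],
                [5, 2, 3, 6]], dtype=float)
print(average_3x3(img))
```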

Template, Window, and Mask Operation (cont.)
The 3x3 averaging method is one example of a mask operation, or spatial filtering.
- The mask operation has a corresponding mask (sometimes called a window or template).
- The mask contains the coefficients to be multiplied with the pixel values.
Example: moving average. The mask of the 3x3 moving average filter has all coefficients equal to 1/9.
[Figure: a 3x3 mask with coefficients w(1,1) ... w(3,3).]

Template, Window, and Mask Operation (cont.)
The mask operation at each point is performed by:
1. Moving the reference point (center) of the mask to the location to be computed.
2. Computing the sum of products between the mask coefficients and the pixels in the subimage under the mask.
[Figure: the mask frame with coefficients w(1,1) ... w(3,3) placed over the subimage pixels p(1,1) ... p(3,3), with the reference point at the mask center.]

Template, Window, and Mask Operation (cont.)
The mask operation on the whole image is given by:
1. Move the mask over the image to each location.
2. Compute the sum of products between the mask coefficients and the pixels inside the subimage under the mask.
3. Store the result at the corresponding pixel of the output image.
4. Move the mask to the next location and go to step 2 until all pixel locations have been used.

Template, Window, and Mask Operation (cont.)
Examples of masks:
- 3x3 moving average filter: all coefficients equal to 1/9.
- Sobel operators: the vertical-edge mask with rows (-1, 0, 1), (-2, 0, 2), (-1, 0, 1) and the horizontal-edge mask with rows (-1, -2, -1), (0, 0, 0), (1, 2, 1).
- 3x3 sharpening filter: center coefficient 8 and all surrounding coefficients -1.
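
A minimal sketch of a generic mask (spatial filtering) operation, assuming NumPy, applied with the moving-average, Sobel, and sharpening masks listed above; the `apply_mask` helper is illustrative and leaves border pixels at zero:

```python
import numpy as np

def apply_mask(image, mask):
    """Slide the mask over the image and, at each location, store the sum of
    products of mask coefficients and the pixels under the mask."""
    mr, mc = mask.shape
    pr, pc = mr // 2, mc // 2
    out = np.zeros_like(image, dtype=float)
    rows, cols = image.shape
    for r in range(pr, rows - pr):
        for c in range(pc, cols - pc):
            region = image[r - pr:r + pr + 1, c - pc:c + pc + 1]
            out[r, c] = np.sum(region * mask)
    return out

average = np.full((3, 3), 1 / 9)                                   # 3x3 moving average
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)    # vertical edges
sharpen = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], float)

img = np.random.rand(8, 8)
print(apply_mask(img, average).shape, apply_mask(img, sobel_x).shape)
```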