Visual Computing Computer Vision 2 INFO410 & INFO350 S2 2015

1 Visual Computing Computer Vision 2 INFO410 & INFO350 S2 2015
Tavita Su’a INFORMATION SCIENCE

2 Recap
Analysing images and producing descriptions that can be used to interact with the environment (Horn, 1986).
Applications include face detection, automatic number plate recognition and facial recognition.
Typical tasks include:
Feature detection (using edges, corners or blobs as descriptors), e.g. FAST
Feature description, e.g. SIFT
Feature matching, e.g. cross-correlation

3 Overview
Images: binary, grayscale, color
Image processing
Potential exam questions
References

4 What is a (digital) image?

5 Images: Binary, Grayscale, Color

6 Binary
Each pixel is stored as a single bit, either 0 or 1. This type of image is also called a 1-bit, bi-level, two-level, black-and-white or monochromatic image.
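As a hedged illustration of the one-bit-per-pixel idea (NumPy is assumed; the slides do not prescribe any library), this sketch stores a small binary image as booleans and packs each row into actual bits:

```python
import numpy as np

# A tiny 4x8 binary image: 1 = white, 0 = black, one bit of information per pixel.
binary_img = np.array([
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 1, 0, 0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0, 0, 1, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
], dtype=bool)

# Each row of 8 pixels packs into a single byte when stored bit by bit.
packed = np.packbits(binary_img, axis=1)
print(binary_img.shape, packed.shape)       # (4, 8) -> (4, 1): 32 pixels in 4 bytes
```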

7 Grayscale
One channel: each pixel has a grey value between 0 (black) and 255 (white).
Example pixel values: 20 75 95 96 127 145 175 200 47 74

8 Color
3 channels: red, green, blue. A color image is formed by combining the 3 images captured by the red, green, and blue sensors; therefore a color image has 3 values per pixel, whereas a binary image has 1 value per pixel.
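A minimal sketch of the 3-values-per-pixel idea, assuming NumPy and an H x W x 3 array layout (both are illustration choices, not something the slides fix): split a color image into its red, green and blue channel images and recombine them.

```python
import numpy as np

# A tiny 2x2 color image, one 8-bit value per channel per pixel (H x W x 3).
color_img = np.array([
    [[255, 0, 0], [0, 255, 0]],
    [[0, 0, 255], [255, 255, 255]],
], dtype=np.uint8)

# Split into the three channel images captured by the R, G and B sensors...
r, g, b = color_img[..., 0], color_img[..., 1], color_img[..., 2]

# ...and recombine them into a single 3-values-per-pixel color image.
recombined = np.stack([r, g, b], axis=-1)
assert np.array_equal(recombined, color_img)
```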

9 Grayscale vs Color
A grayscale image has a single value per pixel, whereas a color image has 3 values (red, green, blue) per pixel.

10 Imaging Process
The scene is projected onto a 2D plane, sampled on a regular grid, and each sample is quantized (rounded to the nearest integer).

11 Digital Image
Three resolutions:
Spatial (number of pixels)
Intensity (number of grey levels)
Temporal (number of frames per second)
Computers have limited resolution: quantization is required because of the limited intensity resolution, and sampling is required because of the limited spatial and temporal resolution.

12 Digital Image: Sampling and Quantization
Sampling: the continuous scene is measured only at a regular grid of positions.
Quantization: continuous colors (continuous color input) are mapped to a finite, discrete set of colors (discrete color output).
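A short sketch of both operations on a grayscale array, with NumPy assumed: sampling keeps only every n-th grid position, and quantization rounds the remaining values down to a smaller set of grey levels.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(256, 256)).astype(np.uint8)   # stand-in grayscale image

# Spatial sampling: keep every 4th pixel in each direction (a coarser grid).
sampled = img[::4, ::4]                   # 256x256 -> 64x64

# Intensity quantization: map 256 grey levels down to 8 discrete levels.
levels = 8
step = 256 // levels
quantized = (img // step) * step          # values 0, 32, 64, ..., 224

print(sampled.shape, np.unique(quantized).size)
```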

13 Digital Image is…
A 2D rectilinear array of samples.
[Figure: the real image shown sampled, quantized, and both sampled & quantized]

14 Image Noise
Sources: light variations, camera electronics, surface reflectance, lens.
Given a camera and a still scene, how can you reduce noise?
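One standard answer to the question above is temporal averaging: with a fixed camera and a still scene the signal is identical in every frame while the noise varies, so averaging many frames reduces it. A hedged NumPy sketch (the frame size and Gaussian noise model are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((100, 100), 128.0)      # the true, noise-free still scene

# Capture N frames, each corrupted by independent zero-mean sensor noise.
N = 50
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(N)]

# Averaging keeps the constant signal and shrinks the noise (std / sqrt(N)).
average = np.mean(frames, axis=0)
print(np.std(frames[0] - scene), np.std(average - scene))   # roughly 10 vs 1.4
```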

15 Image as a function
We can think of an image as a function f: R² → R, where f(x, y) gives the intensity I at position (x, y).
A color image is just three functions pasted together; we can write this as a vector-valued function f(x, y) = (r(x, y), g(x, y), b(x, y)).
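Read as code, f(x, y) for a stored image is simply an array lookup, and a color pixel is a 3-vector of channel values; a minimal sketch, assuming NumPy and row-major [y, x] indexing:

```python
import numpy as np

gray = np.arange(25, dtype=np.uint8).reshape(5, 5)    # a sampled f: R^2 -> R
color = np.zeros((5, 5, 3), dtype=np.uint8)           # a vector-valued image

x, y = 2, 3
intensity = gray[y, x]        # f(x, y): the intensity I at position (x, y)
r, g, b = color[y, x]         # (r(x, y), g(x, y), b(x, y))
print(intensity, (r, g, b))
```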

16 Image Processing
An image processing operation typically defines a new image g in terms of an existing image f. We can transform either the range or the domain of f:
Range transformation: g(x, y) = t(f(x, y))
Domain transformation: g(x, y) = f(tx(x, y), ty(x, y))
Filtering also generates new images from an existing image.
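To make the two kinds of transformation concrete, a small sketch (NumPy assumed): the range transform changes pixel values but not positions, while the domain transform changes which position each output pixel reads from.

```python
import numpy as np

f = np.arange(16, dtype=np.float64).reshape(4, 4)

# Range transformation g(x, y) = t(f(x, y)): e.g. halve every intensity value.
g_range = 0.5 * f

# Domain transformation g(x, y) = f(tx(x, y), ty(x, y)): e.g. mirror left-right,
# so the output at column x reads the input at column (width - 1 - x).
g_domain = f[:, ::-1]

print(g_range[0], g_domain[0])
```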

17 Point Processing
The simplest kind of range transformation, independent of position (x, y): for each original image intensity value I, a function t() returns a transformed intensity value t(I). t() is applied to every pixel; the spatial information (x, y) is ignored.
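A hedged example of point processing: the same t() applied to every pixel with (x, y) ignored. The particular choices of t here (image negative and a linear brightness/contrast adjustment) are illustrations, not operations prescribed by the slides.

```python
import numpy as np

img = np.linspace(0, 255, 16, dtype=np.uint8).reshape(4, 4)

def t_negative(I):
    """t(I) = 255 - I: the image negative."""
    return 255 - I

def t_linear(I, a=1.5, b=-30.0):
    """t(I) = a*I + b, clipped back into the valid 0..255 range."""
    return np.clip(a * I.astype(np.float64) + b, 0, 255).astype(np.uint8)

negative = t_negative(img)     # t() sees only intensity values, never (x, y)
stretched = t_linear(img)
```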

18 Point Processing

19 Gamma Correction
Monitors have an intensity-to-voltage response curve that is roughly a 2.5 power function: send a value v and the monitor actually displays a pixel whose intensity is v^2.5, i.e. the light emitted is proportional to the voltage raised to the power of roughly 2.5. The actual value of the exponent, called gamma, varies somewhat, and the power law is only an approximate model of the real situation, albeit a good one. An additional optical effect, caused by viewing images against a dim surround, is that the effective gamma value is somewhat reduced, from a theoretical 2.5 to around 2.2.
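Gamma correction compensates for this response by raising stored values to the inverse power (1/gamma) before display. A minimal sketch with NumPy assumed, using the working value gamma = 2.2 mentioned above:

```python
import numpy as np

def gamma_correct(img_uint8, gamma=2.2):
    """Pre-correct an image so that a monitor whose output is (value)**gamma
    ends up emitting light roughly proportional to the original intensities."""
    v = img_uint8.astype(np.float64) / 255.0      # normalise to 0..1
    corrected = v ** (1.0 / gamma)                # apply the inverse power law
    return (corrected * 255.0 + 0.5).astype(np.uint8)

ramp = np.arange(256, dtype=np.uint8)
print(gamma_correct(ramp)[:8])                    # dark values are lifted the most
```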

20 Gamma Correction
This example shows the original, unprocessed image and subsequent processed images with varying values of gamma applied.

21 Image filtering
Modify the pixels in an image based on some function of a local neighborhood of each pixel.
[Figure: local image data, kernel, modified image data]
A common method is linear filtering (convolution and cross-correlation): replace each pixel by a linear combination (a weighted sum) of its neighbors. The prescription for the linear combination is called the kernel (aka mask or filter).

22 Why filter images?
Smoothing, sharpening, intensifying, enhancing, noise reduction.

23 Cross-correlation Method
Filtering an image: replace each pixel with a linear combination of its neighbors. The filter, kernel or mask H[u, v] is the prescription for the weights in the linear combination.
Denoted by G = H ⊗ F, where G[i, j] = Σ_{u=-k..k} Σ_{v=-k..k} H[u, v] · F[i + u, j + v].
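A direct, unoptimised implementation of the weighted-sum definition above, written as a sketch with NumPy; zero padding at the borders is an assumption made only to keep the example short.

```python
import numpy as np

def cross_correlate(F, H):
    """G[i, j] = sum over u, v of H[u, v] * F[i + u, j + v], with u, v in -k..k.
    Pixels falling outside F are treated as zero (simple border handling)."""
    k = H.shape[0] // 2                       # assumes a (2k+1) x (2k+1) kernel
    F_pad = np.pad(F.astype(np.float64), k)   # zero padding around the image
    G = np.zeros(F.shape, dtype=np.float64)
    for i in range(F.shape[0]):
        for j in range(F.shape[1]):
            window = F_pad[i:i + 2 * k + 1, j:j + 2 * k + 1]   # neighborhood of (i, j)
            G[i, j] = np.sum(H * window)      # weighted sum prescribed by the kernel
    return G

box = np.ones((3, 3)) / 9.0                   # 3x3 mean (box) filter
impulse = np.zeros((7, 7))
impulse[3, 3] = 1.0
print(cross_correlate(impulse, box))          # the kernel weights appear around the centre
```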

24 Convolution Method
Denoted by G = H ∗ F, where G[i, j] = Σ_{u=-k..k} Σ_{v=-k..k} H[u, v] · F[i − u, j − v].
Equivalently: flip the filter in both dimensions (top to bottom, left to right), then apply cross-correlation.
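Given the cross-correlation sketch above, convolution is the same operation after flipping the kernel; this fragment reuses that hypothetical cross_correlate helper and is not runnable on its own.

```python
def convolve(F, H):
    """Convolution: flip the kernel top-to-bottom and left-to-right,
    then apply cross-correlation (helper defined in the previous sketch)."""
    H_flipped = H[::-1, ::-1]
    return cross_correlate(F, H_flipped)

# For a symmetric kernel such as the 3x3 box filter, flipping changes nothing,
# so convolution and cross-correlation produce identical results.
```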

25 Convolution Method

26 Sample Exam Questions
What are quantization and sampling?
What is the difference between a range transformation and a domain transformation?
Given a camera and a still scene, how can you reduce noise?
List and explain two uses of image filtering.
Identify and explain two types of image resolution.
Briefly describe how a kernel or mask can be used to filter an image.

27 References

