
EE 4780: Introduction to Computer Vision Introduction

EE 4780. Instructor: Bahadir K. Gunturk. Office: EE. Tel: . Office Hours: MW 10:00 – 12:00.

EE 4780. We will learn the fundamentals of digital image processing and computer vision. Lecture slides, problem sets, solutions, study materials, etc. will be posted on the class website. A textbook is not required. References:
- Gonzalez/Woods, Digital Image Processing, Prentice-Hall, 2/e.
- Forsyth/Ponce, Computer Vision: A Modern Approach, Prentice-Hall.
- Duda/Hart/Stork, Pattern Classification, John Wiley & Sons.
- Shapiro/Stockman, Computer Vision, Prentice-Hall.
- Horn, Robot Vision, MIT Press, 1986.

Grading Policy. Your grade will be based on:
- Problem Sets: 30%
- Midterm: 30%
- Final: 40%
Problem sets are mini projects consisting of theoretical problems and MATLAB assignments. There will be 4-5 problem sets, done individually or in two-person teams.

Digital Image Acquisition. Sensor array: when photons strike, electron-hole pairs are generated at the sensor sites. The generated electrons are collected over a certain period of time (the exposure time), and the number of electrons is converted to a pixel value. (Pixel is short for picture element.)

Digital Image Acquisition. Two types of quantization: (1) there is a finite number of pixels (spatial resolution); (2) the amplitude of each pixel is represented by a finite number of bits (gray-scale resolution).
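As a rough illustration of both quantizations, the following Python/NumPy sketch (the image file name is only an example) reduces the spatial resolution and the gray-scale resolution of a grayscale image:

    import numpy as np
    from PIL import Image

    # Load a grayscale image (file name is hypothetical).
    img = np.asarray(Image.open("example.png").convert("L"), dtype=np.uint8)

    # Spatial quantization: keep every 4th pixel in each direction.
    low_res = img[::4, ::4]

    # Gray-scale quantization: keep only 4 bits per pixel (16 gray levels).
    bits = 4
    step = 256 // (2 ** bits)
    low_depth = (img // step) * step

    print(img.shape, low_res.shape, np.unique(low_depth).size)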

Digital Image Acquisition. Cross section of an image sensor (figure).

Digital Image Acquisition. Typical image sizes:
- 256x256: Found on very cheap cameras; this resolution is so low that the picture quality is almost always unacceptable. About 65,000 total pixels.
- 640x480: The low end on most "real" cameras. This resolution is suitable for emailing pictures or posting pictures on a web site.
- 1216x912: A "megapixel" image size (about 1,109,000 total pixels), good for printing pictures.
- 1600x1200: With almost 2 million total pixels, this is "high resolution." You can print a 4x5 inch print taken at this resolution with the same quality that you would get from a photo lab.
- 2240x1680: Found on 4-megapixel cameras, the current standard; this allows even larger printed photos, with good quality for prints up to 16x20 inches.
- 4064x2704: A top-of-the-line digital camera with 11.1 megapixels takes pictures at this resolution. At this setting, you can create 13.5x9 inch prints with no loss of picture quality.

Image Resolution. Don't confuse image size and resolution.

Bit Depth – Grayscale Resolution. Figures: the same image quantized to 8, 7, 6, and 5 bits per pixel.

Bit Depth – Grayscale Resolution. Figures: the same image quantized to 4, 3, 2, and 1 bits per pixel.

Matrix Representation of Images. A digital image can be written as a matrix:
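The matrix appears only as an image in the original slide; in the usual convention (e.g., Gonzalez/Woods), an M-by-N image f is written as

    f = \begin{bmatrix}
        f(0,0)   & f(0,1)   & \cdots & f(0,N-1)   \\
        f(1,0)   & f(1,1)   & \cdots & f(1,N-1)   \\
        \vdots   & \vdots   & \ddots & \vdots     \\
        f(M-1,0) & f(M-1,1) & \cdots & f(M-1,N-1)
        \end{bmatrix}

where f(x, y) is the intensity at row x and column y.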

Digital Color Images (figures).

Color Displays. CRT and LCD (figures). An LCD polarizes light to control the amount of light passed.

Video. A video is a sequence of images f(x, y, t), where x = horizontal position, y = vertical position, and t = frame number; typically ~24 frames per second.

Why do we process images?
- To facilitate their storage and transmission
- To prepare them for display or printing
- To enhance or restore them
- To extract information from them
- To hide information in them

Image Processing Example: Image Restoration. Figures: original image, blurred image, and image restored by a Wiener filter.
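A minimal sketch of Wiener deconvolution in the frequency domain (illustrative only; the blur kernel and the noise-to-signal ratio K are assumed known):

    import numpy as np

    def wiener_deconvolve(blurred, psf, K=0.01):
        """Restore a blurred image given its point spread function (PSF).

        blurred: 2-D float array, the degraded image.
        psf:     2-D float array, blur kernel (smaller than the image).
        K:       assumed noise-to-signal power ratio.
        """
        # Zero-pad the PSF to the image size and take its spectrum.
        psf_pad = np.zeros_like(blurred)
        psf_pad[:psf.shape[0], :psf.shape[1]] = psf
        H = np.fft.fft2(psf_pad)
        G = np.fft.fft2(blurred)
        # Wiener filter: conj(H) / (|H|^2 + K), applied in the frequency domain.
        F_hat = np.conj(H) / (np.abs(H) ** 2 + K) * G
        return np.real(np.fft.ifft2(F_hat))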

Image Processing Example: Noise Removal. Figures: noisy image and image denoised by a median filter.
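Median filtering can be done with SciPy, for example (a sketch; the random array stands in for a real noisy image):

    import numpy as np
    from scipy.ndimage import median_filter

    # Replace each pixel by the median of its 3x3 neighborhood;
    # effective against salt-and-pepper noise while preserving edges.
    noisy = np.random.randint(0, 256, (256, 256)).astype(np.uint8)  # stand-in image
    denoised = median_filter(noisy, size=3)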

Image Processing Example: Image Enhancement by histogram equalization (figures).
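A compact NumPy sketch of global histogram equalization for an 8-bit grayscale image (illustrative, not taken from the slides):

    import numpy as np

    def equalize(img):
        """Map gray levels so the cumulative histogram becomes roughly linear."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum()
        cdf = cdf / cdf[-1]                      # normalize to [0, 1]
        lut = np.round(255 * cdf).astype(np.uint8)
        return lut[img]                          # apply the lookup table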

Image Processing Example: Artifact Reduction in Digital Cameras. Figures: original scene, image captured by a digital camera, and image processed to reduce artifacts.

Image Processing Example: Image Compression. Figures: original image (64 KB), JPEG compressed (15 KB), JPEG compressed (9 KB).

Image Processing Example: Object Segmentation. Figures: the "Rice" image and its edges detected with a Canny filter.
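Edge detection of this kind can be reproduced with scikit-image, for example (a sketch; the random array stands in for the grayscale input):

    import numpy as np
    from skimage import feature

    # img: 2-D float array in [0, 1]; sigma controls the Gaussian smoothing
    # applied before gradient-based edge detection.
    img = np.random.rand(128, 128)          # stand-in for the "Rice" image
    edges = feature.canny(img, sigma=2.0)   # boolean edge map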

Image Processing Example: Resolution Enhancement (figures).

Image Processing Example: Watermarking. Figure: a watermark is generated from a hidden message and a secret key, then embedded in the original image to produce the watermarked image.

Image Processing Example: Face Recognition. Figures: a face from surveillance video is searched for in a database.

Image Processing Example: Fingerprint Matching (figures).

Image Processing Example: Segmentation (figures).

Image Processing Example: Texture Analysis and Synthesis. Figures: a repeated pattern, a computer-generated texture, and a photo.

Image Processing Example: Face Detection and Tracking (figures).

Image Processing Example: Face Tracking (figures).

Image Processing Example: Object Tracking (figures).

Image Processing Example: Virtual Controls (figures).

Image Processing Example: Visually Guided Surgery (figures).

Cameras. The first camera was invented in the 16th century. It used a pinhole to focus light rays onto a wall or translucent plate. Take a box, prick a small hole in one of its sides with a pin, and replace the opposite side with a translucent plate. Place a candle on the pinhole side, and you will see an inverted image of the candle on the translucent plate.

Perspective Projection. Perspective projection equations:
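The equations appear only as an image in the original slide; in the standard pinhole model with focal length f, a scene point (X, Y, Z) projects to the image point

    x' = \frac{f X}{Z}, \qquad y' = \frac{f Y}{Z}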

Pinhole Camera Model. If the pinhole were really reduced to a point, exactly one light ray would pass through each point in the image plane. In reality, each point in the image plane collects light from a cone of rays.

Pinhole Cameras. If the pinhole is too big, many directions are averaged, blurring the image. If the pinhole is too small, diffraction effects blur the image.

Cameras With Lenses. Most cameras are equipped with lenses, for two main reasons:
- To gather light. For an ideal pinhole, a single light ray would reach each point on the image plane. Real pinholes have a finite size, so each point in the image plane is illuminated by a cone of light rays; the larger the hole, the wider the cone and the brighter, but blurrier, the image. Shrinking the pinhole produces sharper images but reduces the amount of light and may introduce diffraction effects.
- To keep the picture in sharp focus while gathering light from a large area.

Compound Lens Systems (figure).

Real Lenses: Spherical Aberration. Rays may not focus at a single point. Spherical aberration can be eliminated completely by designing aspherical lenses.

Real Lenses: Chromatic Aberration. The index of refraction is a function of wavelength, so light at different wavelengths follows different paths.

Real Lenses: Chromatic Aberration (figures).

Real Lenses. Special lens systems using two or more pieces of glass with different refractive indices can reduce or eliminate this problem. However, even these lens systems are not completely perfect and can still produce visible chromatic aberrations.

Real Lenses: Barrel Distortion and Pincushion Distortion. Figure: causes of distortion; the position of the stop (aperture) relative to the lens changes the path of the chief ray.

Real Lenses: Barrel Distortion and Pincushion Distortion. Figures: distorted and corrected images.

Real Lenses: Vignetting. Vignetting effect in a two-lens system: the shaded part of the beam never reaches the second lens, so brightness drops toward the image perimeter.

Real Lenses. Optical vignetting example (figures). Left: f/1.4. Right: f/5.6. The f-number is the ratio of focal length to aperture diameter.

Real Lenses. Figures: long exposure time vs. short exposure time.

Real Lenses: Flare. A lens hood may prevent flare (figures).

Real Lenses: Flare (figure).

Compound Lens Systems (figure).

Digital Camera Pipeline. Auto-exposure algorithms measure brightness over discrete scene regions to compensate for overexposed or underexposed areas by manipulating shutter speed and/or aperture size. The net goals here are to maintain relative contrast between different regions in the image and to achieve a good overall quality. (From Katz and Gentile.)

Digital Camera Pipeline. Auto-focus algorithms fall into two categories. Active methods use infrared or ultrasonic emitters/receivers to estimate the distance between the camera and the object being photographed. Passive methods, on the other hand, make focusing decisions based on the image received by the camera.
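Passive, contrast-based autofocus can be sketched as follows (illustrative only, not a particular camera's algorithm): compute a focus measure such as the variance of the Laplacian for frames captured at different lens positions, and pick the position that maximizes it.

    import numpy as np
    from scipy.ndimage import laplace

    def focus_measure(img):
        """Higher values indicate a sharper (better focused) image."""
        return laplace(img.astype(float)).var()

    # candidate_frames: assumed list of frames taken at different lens positions.
    # best_frame = max(candidate_frames, key=focus_measure)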

Digital Camera Pipeline. Lens distortion correction: this set of algorithms accounts for the physical properties of lenses that warp the output image compared to the actual scene the user is viewing. Different lenses cause different distortions; for instance, wide-angle lenses create a "barrel distortion", while telephoto lenses create a "pincushion distortion".
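The slides do not give a model, but these distortions are commonly described by a radial polynomial in normalized image coordinates: a point at undistorted radius r from the optical center appears at the distorted radius

    r_d = r \left( 1 + k_1 r^2 + k_2 r^4 \right)

where negative coefficients give barrel distortion and positive coefficients give pincushion distortion; correction amounts to inverting this mapping and resampling the image.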

Digital Camera Pipeline. Vignetting (shading distortion) reduces image brightness toward the image perimeter, and chromatic aberration causes color fringes, especially near high-contrast edges. The media processor needs to mathematically transform the image in order to correct for these distortions.

Digital Camera Pipeline. The sensor's output needs to be gamma-corrected to account for the eventual display, as well as to compensate for nonlinearities in the sensor's capture response.
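A minimal sketch of gamma correction (the value 2.2 is just the common display assumption, not something specified in the slides):

    import numpy as np

    def gamma_correct(img, gamma=2.2):
        """Encode linear sensor values for display: out = in ** (1/gamma).

        img: float array with values in [0, 1] (linear light).
        """
        return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)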

Digital Camera Pipeline. Image stability compensation, or hand-shake correction, is another area of preprocessing. Here, the processor adjusts for the translational motion of the received image, often with the help of external transducers that relate the real-time motion profile of the sensor.

Digital Camera Pipeline. White balance is another important stage of preprocessing. When we look at a scene, regardless of lighting conditions, our eyes tend to normalize everything to the same set of natural colors. For instance, an apple looks deep red to us whether we're indoors under fluorescent lighting or outside in sunny weather. However, an image sensor's "perception" of color depends largely on lighting conditions, so it needs to map its acquired image to appear natural in its final output. This mapping can be done either manually or automatically.
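One simple automatic approach is the classic "gray world" assumption (a sketch, not necessarily what any particular camera uses): scale each channel so that the average color comes out neutral.

    import numpy as np

    def gray_world_white_balance(img):
        """img: float RGB array of shape (H, W, 3), values in [0, 1]."""
        channel_means = img.reshape(-1, 3).mean(axis=0)    # mean R, G, B
        gains = channel_means.mean() / channel_means       # push means toward gray
        return np.clip(img * gains, 0.0, 1.0)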

Digital Camera Pipeline. Demosaicking (Bayer interpolation) estimates the missing color samples in single-chip cameras, where each pixel location records only one of the three color channels.
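A very rough bilinear demosaicking sketch (illustrative only; it assumes an RGGB Bayer pattern, which is an assumption, and ignores border effects):

    import numpy as np
    from scipy.ndimage import convolve

    def demosaic_bilinear(raw):
        """raw: 2-D float array of Bayer samples, RGGB pattern assumed."""
        h, w = raw.shape
        r_mask = np.zeros((h, w))
        r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w))
        b_mask[1::2, 1::2] = 1
        g_mask = 1 - r_mask - b_mask

        # Bilinear interpolation kernels for the sparse channels.
        k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0

        g = convolve(raw * g_mask, k_g)
        r = convolve(raw * r_mask, k_rb)
        b = convolve(raw * b_mask, k_rb)
        return np.dstack([r, g, b])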

Digital Camera Pipeline. In this stage, the interpolated RGB image is transformed to the target output color space (if it is not already in the right space). For compression or display on a television, this usually involves an RGB to YCbCr matrix transformation, often with another gamma correction stage to accommodate the target display. The YCbCr outputs may also be chroma-subsampled at this stage to the standard 4:2:2 format for color bandwidth reduction with little visual impact.
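For reference, a sketch of the BT.601 full-range RGB to YCbCr conversion (one common variant; actual cameras may use scaled "studio range" versions instead):

    import numpy as np

    # BT.601 full-range conversion matrix (rows: Y, Cb, Cr).
    M = np.array([[ 0.299,     0.587,     0.114   ],
                  [-0.168736, -0.331264,  0.5     ],
                  [ 0.5,      -0.418688, -0.081312]])

    def rgb_to_ycbcr(rgb):
        """rgb: float array of shape (H, W, 3) with values in [0, 255]."""
        ycbcr = rgb @ M.T
        ycbcr[..., 1:] += 128.0      # center the chroma channels
        return ycbcr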

Digital Camera Pipeline. Postprocessing: in this phase, the image is refined via a variety of filtering operations before being sent to the display and/or storage media. For instance, edge enhancement, pixel thresholding for noise reduction, and color-artifact removal are all common at this stage.
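Edge enhancement is often implemented as unsharp masking; a minimal sketch (parameter values are illustrative):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(img, sigma=1.0, amount=0.5):
        """Boost detail by adding back a fraction of the high-pass component.

        img: 2-D float array in [0, 1].
        """
        blurred = gaussian_filter(img, sigma=sigma)
        return np.clip(img + amount * (img - blurred), 0.0, 1.0)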

Digital Camera Pipeline. Display / Compress / Store: once the image itself is ready for viewing, the image pipe branches off in two different directions. In the first, the postprocessed image is output to the target display, usually an integrated LCD screen (but sometimes an NTSC/PAL television monitor, in certain camera modes). In the second, the image is sent to the media processor's compression algorithm, where industry-standard compression techniques (JPEG, for instance) are applied before the picture is stored locally on some storage medium (e.g., a Flash memory card).