A Multi-Spectral Structured Light 3D Imaging System
Matthew Beardmore, Matthew Bowen


Origins of our Project
- Freshman Imaging Project 2011
- Tasked with creating a 3D imaging system over three quarters
- Presented at ImagineRIT 2012
- Contour distances along a person's face give information about the structure of that person's trachea
- Uses a technique known as structured light to scan subjects

A brief primer on digital imaging
Source: Digital Photography Presentation (Jeff Pelz, Joe Pow)

How are digital images captured?
- Light rays pass through an aperture
- Those rays are focused onto the sensor by a lens
- The sensor segments the light into individual boxes, known as pixels
- Each pixel interprets the intensity of the light striking it as a numerical value (see the sketch below)
[Image: a closeup of an imaging sensor, with an individual pixel element highlighted. Source: Digital Photography Presentation (Jeff Pelz, Joe Pow)]
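To make the idea concrete, here is a minimal sketch (not part of the original slides) that treats a digital image as a 2D array of quantized intensity values; the array contents and the 8-bit depth are illustrative assumptions.

```python
import numpy as np

# Simulated 4x4 sensor readout: each entry is the intensity one pixel measured,
# quantized to an 8-bit integer (0 = no light, 255 = saturated pixel).
sensor = np.array([
    [ 12,  40,  43,  15],
    [ 38, 200, 210,  42],
    [ 35, 205, 198,  40],
    [ 10,  41,  39,  14],
], dtype=np.uint8)

print(sensor.shape)   # (4, 4): the sensor is segmented into 16 pixels
print(sensor[1, 2])   # 210: the value recorded by the pixel at row 1, column 2
print(sensor.max())   # 210: the brightest pixel in this tiny "image"
```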

What is Structured Light?
- A 3D scanning technique
- Involves projecting a series of known patterns onto a subject
- A camera interprets the distortions in the patterns and calculates depth
- The series of patterns creates a temporal code for each pixel
- Each projected pixel is uniquely identified by this code (a sketch of the coding idea follows below)
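As an illustration of how a series of patterns forms a per-pixel temporal code, here is a sketch assuming binary Gray-code stripe patterns; the slides do not name a specific coding scheme, so the scheme, function names, and sizes here are assumptions.

```python
import numpy as np

def gray_code_patterns(width, n_bits):
    """Return n_bits one-row stripe patterns, each `width` pixels wide."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                       # Gray code of each column index
    return [((gray >> b) & 1).astype(np.uint8)      # bit b of every column's code
            for b in reversed(range(n_bits))]

def decode_temporal_code(observed_bits):
    """Recover a projector column index from the bits one camera pixel saw over time."""
    gray = 0
    for bit in observed_bits:                       # most-significant bit first
        gray = (gray << 1) | int(bit)
    col, shift = gray, 1
    while (gray >> shift) > 0:                      # convert Gray code back to binary
        col ^= gray >> shift
        shift += 1
    return col

patterns = gray_code_patterns(width=8, n_bits=3)
pixel_bits = [p[5] for p in patterns]               # bits a pixel lit by column 5 would see
print(decode_temporal_code(pixel_bits))             # -> 5
```

Each camera pixel collects one bit per projected pattern; decoding the resulting bit sequence identifies which projector column illuminated it.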

What is Structured Light? (continued)
- The camera can detect where each projected pixel falls upon the subject
- Interprets each pattern as a part of a temporal code
- Combines each part at the end of the scan to reconstruct the temporal code
- Depth can then be calculated by triangulation between the camera and projector (a simplified sketch follows below)
[Diagram: projector and camera viewing the object being scanned; the projected light ray and the reflected light ray meet at point p. Source: Structured Light: The Mathematics of 3D Triangulation Presentation (Gabriel Taubin, Douglas Lanman)]
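For intuition, here is a simplified depth calculation under an assumed rectified camera-projector geometry (parallel optical axes and a known horizontal baseline); the actual system's calibration and math may differ, and all numbers below are hypothetical.

```python
def depth_from_disparity(focal_px, baseline_m, projector_col, camera_col):
    """Depth (metres) of one decoded point; columns are in pixel units.

    With a rectified pair, similar triangles give Z = f * b / d, where f is the
    focal length in pixels, b the baseline, and d the disparity between where a
    pixel was projected and where the camera observed it.
    """
    disparity = projector_col - camera_col
    if disparity == 0:
        raise ValueError("zero disparity: point is effectively at infinity")
    return focal_px * baseline_m / disparity

# Hypothetical numbers, for illustration only.
print(depth_from_disparity(focal_px=1400, baseline_m=0.25,
                           projector_col=620, camera_col=480))   # 2.5 (metres)
```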

ImagineRIT 2012 Prototype

360° Scanning – Obstacles
- Simultaneous scanning with black and white can cause the scanners to interfere with each other
[Images: left projector only; right projector only; both projectors simultaneously]

360° Scanning – Utilizing multiple spectra
- Instead of black-and-white scanning, different colors can be assigned to each camera-projector pair and isolated using color filters
[Images: left projector displaying green; right projector displaying red; camera with green color filter; camera with red color filter]

Our goals
- Extend the original FIP2011 prototype to four cameras and projectors
  - Allows for 360° plus overhead scanning of the subject (2π steradians)
- Utilize the red, green, and blue portions of the visible spectrum
  - Allows for simultaneous projection, keeping scan times as short as possible
- Decrease the overall scan time of our system
  - Assists in scanning subjects, since a shorter scan gives them less time to move

First attempt
- Three color cameras, each with a Bayer filter on the sensor
- Separate the red, green, and blue channels into grayscale intensity maps (a sketch follows below)
- No color filters
- Suboptimal quality – the projectors did not display precise colors
  - E.g. when asked to display green, a projector would also emit a small amount of red
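Here is a sketch of this channel-separation step, assuming a debayered RGB capture; the file name, threshold rule, and use of NumPy/Pillow are illustrative, not the project's actual code.

```python
import numpy as np
from PIL import Image

# "capture.png" is a placeholder file name; any debayered RGB capture would do.
frame = np.asarray(Image.open("capture.png").convert("RGB"))

red_map   = frame[:, :, 0]   # grayscale intensity map for the red projector's pattern
green_map = frame[:, :, 1]   # ... for the green projector's pattern
blue_map  = frame[:, :, 2]   # ... for the blue projector's pattern

# A naive per-channel threshold to recover the projected stripe bits;
# the real system may use a different decision rule.
red_bits   = (red_map   > red_map.mean()).astype(np.uint8)
green_bits = (green_map > green_map.mean()).astype(np.uint8)
blue_bits  = (blue_map  > blue_map.mean()).astype(np.uint8)
```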

Use of color filters
- Instead of relying on the raw Bayer filter alone, color filters were placed in front of each camera and projector
- Restricts the projector's output and the camera's input to only the desired wavelengths
- Significantly improved scanning results with three camera-projector pairs
- Very little interference between scanners

Adding an overhead scanner
- Red, green, and blue make up the primary colors of light – what color should the fourth scanner use?
- Yellow was chosen because, compared with the other secondary colors, it lies farthest from any single primary
- In practice there was significant interference, due to the breadth of wavelengths that the red and green filters cover
- The yellow band overlaps too much with red and green to be viable

Summer 2012 Prototype

Results
PUT PHOTO HERE

Status of goals
- Successfully implemented a multi-spectral scanner that uses red, green, and blue to achieve 360° scanning
  - Scan time halved from ~8 seconds to ~4 seconds
- Addition of a fourth color – yellow – was not successful
  - The color filters were not narrow enough to be useful for four scanners
  - To achieve the goal of scanning a subject with the overhead view, the last camera-projector pair would not be able to scan simultaneously

What we’ve learned
- Matching the output of the projector to the color filters is difficult; narrow matches are optimal
- Grayscale cameras paired with color filters provide better contrast and resolution than the color (Bayer) cameras used in our project

Acknowledgements
- Chester F. Carlson Center for Imaging Science
- Joe Pow, Advisor
- Maria Helguera, Advisor
- Stefi Baum
- Class of Freshman Imaging Project 2011
- Gabriel Taubin and Douglas Lanman, Brown University