Polarization Processing for Low Light Imaging

Abstract: An important property of still picture cameras and video cameras is their ability to present an image that clearly depicts different objects. Normal cameras can discern between items in bright settings, and when ambient light is very dim, pictures can be taken with a flash to illuminate the environment. However, there are many scenarios where there is little light and the use of a flash is either prohibited or would compromise a position. Video cameras do not even have that option, since a flash cannot be triggered for every frame. In these situations, normal cameras lack the technology to distinguish one object from another. To overcome these constraints, this system processes polarization information collected by three cameras. Filters oriented at 0, 45, and 90 degrees are placed over the cameras, so polarization information is collected for each angle, yielding complete linear polarization knowledge for every pixel. From the collected data, an overall image is generated through image recombination and various algorithms. This image conveys far more information than that rendered by traditional cameras, which use only the intensity of light. The additional contrast lets the user differentiate between objects in low-light situations that were previously indistinguishable.

Authors: Darren Wang (EE ’08), Anujit Shastri (EE ’08)
Advisors: Professor Jan Van der Spiegel, Viktor Gruev, Zheng Yang

Polarization: [Figure: the electric field E of an electromagnetic wave k, drawn against x, y, and z axes.] Polarization is defined as the direction of the electric field (E) of an electromagnetic wave (k). It resides in the plane transverse to the direction of the wave's propagation and changes over time.

Degree of Polarization (p): If the polarization of a wave is unpredictable, the wave is said to be randomly polarized, and its degree of polarization is zero.
Conversely, if a wave’s polarization is completely predictable, the wave is perfectly polarized and has a degree of polarization of 1. Partially polarized waves fall between these two extremes.

Angle of Polarization (θ): The tip of the electric field of a polarized wave traces out an ellipse over time. The angle of polarization is the angle of the ellipse’s major axis with respect to the horizontal.

Demo Day Times: 10:30, 11:00, 1:30, 2:00, 2:30 (Group 3)

Project Setup:

Physical Setup: Light enters the system and passes through three lenses. Behind each lens is a polarization filter oriented at 0, 45, or 90 degrees. A sensor placed behind each filter records the intensity of the polarized light. This decomposition allows per-pixel calculation of the polarization parameters. The sensors output 8 bits of data per pixel in parallel format.

FPGA: The FPGA acts as an intermediary in this system. Its purpose is to transfer image data from the sensors to the computer. It also controls the sensors by providing the clock and programming their registers through a serial interface. Data traverses a series of FIFOs and is stored in the on-board SDRAM until it is ready for transfer to the computer. Once the FPGA has acquired the data, it signals the computer for collection.

Computer: The computer program perpetually scans for incoming FPGA signals; when one arrives, a separate rendering thread is invoked to visually display the captured sensor information. The three separate images are recombined via user matching of similar points among them, so that pixels correspond in space. The polarization parameters are then calculated and displayed.

PCB: Using three sensors required developing a printed circuit board (PCB) to mount the sensors and route their data to the FPGA. PCB design and layout were done with the OrCAD Cadence suite of tools, and the board was fabricated by a third-party production facility.
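The poster does not give explicit formulas for the per-pixel calculation, but the standard way to recover the degree and angle of linear polarization from three filtered intensities (0, 45, and 90 degrees) is via the linear Stokes parameters. The sketch below illustrates that math; it is an assumed reconstruction, not the project's actual code.

```python
import numpy as np

def polarization_params(i0, i45, i90):
    """Per-pixel degree (p) and angle (theta) of linear polarization
    from intensity images behind 0-, 45-, and 90-degree filters."""
    i0, i45, i90 = (np.asarray(a, dtype=float) for a in (i0, i45, i90))
    # Linear Stokes parameters from the three filtered intensities.
    s0 = i0 + i90            # total intensity
    s1 = i0 - i90            # 0 vs. 90 degree preference
    s2 = 2.0 * i45 - s0      # 45 vs. 135 degree preference
    eps = 1e-12              # guard against division by zero in dark pixels
    p = np.sqrt(s1**2 + s2**2) / np.maximum(s0, eps)  # degree of polarization, 0..1
    theta = 0.5 * np.arctan2(s2, s1)                  # angle of polarization, radians
    return p, theta
```

For a fully horizontally polarized pixel (i0 = 1, i45 = 0.5, i90 = 0) this gives p = 1 and theta = 0, while equal intensities at all three angles give p = 0 (randomly polarized), matching the definitions above.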
FPGA Logic: Coded in Verilog, the logic is the driving force behind the FPGA. Data coming from the sensors is binary and valid only at certain times; the FPGA stores only the valid data in the SDRAM on its board. A series of control signals sent from the computer control the FPGA.
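The host-side structure described above (a main loop waiting for FPGA signals, with a separate thread rendering each captured frame) can be sketched as follows. This is a minimal illustration of the producer/consumer pattern, not the project's actual program; `read_frame` and `display` are hypothetical stand-ins for the real FPGA-read and drawing routines.

```python
import queue
import threading

def run_capture(read_frame, display, n_frames):
    """Acquire n_frames from the FPGA (via read_frame, which blocks until
    the FPGA signals data ready) while a separate thread displays them."""
    frames = queue.Queue()

    def render_loop():
        # Rendering thread: display frames as they arrive.
        while True:
            frame = frames.get()
            if frame is None:      # sentinel tells the renderer to exit
                break
            display(frame)

    renderer = threading.Thread(target=render_loop)
    renderer.start()
    for _ in range(n_frames):
        frames.put(read_frame())   # hand each captured frame to the renderer
    frames.put(None)               # signal shutdown
    renderer.join()
```

Decoupling acquisition from rendering through a queue keeps the main loop free to service the FPGA while display work proceeds in parallel, mirroring the separate rendering thread the poster describes.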