Presentation transcript:

EyePoint (Slide 1)
Rohit Banerjee, Debraj Bose, Kevin Kassing, Kanupriya Tavri

How and Why (Slide 2)

Concept
EyePoint will allow a person to take a picture of the scene they are looking at without being attached to their camera. Head motion tracking determines the user's field of view, and the yaw and pitch of the camera mount are adjusted accordingly to frame the scene; a minimal sketch of this angle-to-step mapping follows below.

Motivation
Photographers today are tied to their cameras; even when it is possible to snap pictures from afar with a remote, repositioning the camera is still cumbersome. Some locations are too dangerous for a photographer to linger in, and some photographic opportunities would be lost because of the photographer's presence. With EyePoint, photographers will be able to remotely survey their scene and control the camera with simple gestures.
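The control idea in the Concept paragraph, turning a tracked head orientation into pan/tilt motion of the camera mount, can be illustrated with a short sketch. This is not the team's firmware: the steps-per-revolution figure, gear ratio, and stub driver call below are assumptions made only to show the angle-to-step conversion.

```c
/* Minimal sketch (assumptions, not the EyePoint firmware): convert a
 * tracked head yaw/pitch in degrees into absolute step targets for a
 * pan/tilt stepper mount. */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define STEPS_PER_REV 200.0f   /* assumed 1.8-degree stepper motors */
#define GEAR_RATIO    4.0f     /* assumed mount gearing (hypothetical) */

enum axis { PAN_AXIS, TILT_AXIS };

/* Stand-in for whatever step-generation routine the motor driver exposes. */
static void move_stepper_to(enum axis a, int32_t target_steps)
{
    printf("axis %d -> %ld steps\n", (int)a, (long)target_steps);
}

/* Convert an angle in degrees to an absolute motor step target. */
static int32_t angle_to_steps(float degrees)
{
    return (int32_t)lroundf(degrees * (STEPS_PER_REV / 360.0f) * GEAR_RATIO);
}

/* Called whenever the head tracker reports a new yaw/pitch estimate. */
static void on_head_pose(float yaw_deg, float pitch_deg)
{
    move_stepper_to(PAN_AXIS,  angle_to_steps(yaw_deg));
    move_stepper_to(TILT_AXIS, angle_to_steps(pitch_deg));
}

int main(void)
{
    on_head_pose(30.0f, -10.0f);  /* head turned 30 deg right, tilted 10 deg down */
    return 0;
}
```

In a real build the pose handler would run inside the mount's control loop and the print statements would be replaced by the actual stepper driver, but the angle-to-step conversion is the core of the idea.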

Competitors (Slide 3)

There are existing systems to capture images at varying angles:
- RODEON VR Head: introduced …, €… and up
- Orion TeleTrack: introduced 2007, $300
- Gigapan: introduced 2007, $300

These are standalone systems that use the camera you already own and mostly work with predefined parameters. No existing systems are integrated with head tracking, and existing head-tracking systems are very specialized.

Requirements (Slide 4)

Functional:
- Take pictures based on where the user is looking
- Camera mount rotates with 2 degrees of freedom
- Facilitate aiming the camera at an object with the user's head motion
- Images are captured and stored based on wireless user input
- User's gestures actuate shutter and zoom (see the command sketch after this list)

Non-functional:
- Wireless freedom: no power cables, no tether to the user
- Timing: real-time control with response within 5 seconds
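To make the wireless-input requirement concrete, here is a hedged sketch of one possible command format between the head-worn unit and the camera mount. The opcode values, packet layout, and XOR checksum are hypothetical; the slides do not specify the protocol actually carried over the RF transceiver.

```c
/* Hypothetical command packet and mount-side dispatch for the RF link;
 * the layout and opcodes are illustrative assumptions, not the team's
 * actual protocol. */
#include <stdint.h>
#include <stdio.h>

enum cmd { CMD_SHUTTER = 0x01, CMD_ZOOM_IN = 0x02, CMD_ZOOM_OUT = 0x03 };

struct rf_packet {
    uint8_t opcode;   /* one of enum cmd */
    uint8_t seq;      /* sequence number, lets the mount drop duplicates */
    uint8_t checksum; /* XOR of opcode and seq */
};

static uint8_t xor_checksum(const struct rf_packet *p)
{
    return (uint8_t)(p->opcode ^ p->seq);
}

/* Called for each packet pulled off the RF transceiver on the mount side. */
static void handle_packet(const struct rf_packet *p)
{
    if (xor_checksum(p) != p->checksum)
        return;                            /* drop corrupted packets */

    switch (p->opcode) {
    case CMD_SHUTTER:  printf("trigger shutter\n");   break;
    case CMD_ZOOM_IN:  printf("zoom in one step\n");  break;
    case CMD_ZOOM_OUT: printf("zoom out one step\n"); break;
    default:           break;              /* unknown command: ignore */
    }
}

int main(void)
{
    struct rf_packet shot = { CMD_SHUTTER, 1, 0 };
    shot.checksum = xor_checksum(&shot);
    handle_packet(&shot);                  /* simulate receiving one packet */
    return 0;
}
```

A three-byte packet like this also keeps airtime short, which helps meet the five-second response budget in the non-functional requirements.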

Technical Details (Slide 5)

- 32-bit microcontroller
- Stepper motors
- Accelerometer
- Positional sensors
- 8-bit microcontroller
- RF transceiver
- Head motion tracking
- Motorized camera tracking
- LibPTP2
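LibPTP2 in the list above is an open-source implementation of the Picture Transfer Protocol used to control cameras over USB. As a rough illustration only, the host side could fire the shutter by shelling out to the ptpcam utility that ships with libptp2; the --capture flag and the use of the command-line tool (rather than the library's C API) are assumptions, since the slides do not say how the library is invoked.

```c
/* Hedged sketch: trigger a PTP capture by invoking the ptpcam tool that
 * accompanies libptp2. The "--capture" flag is assumed; calling the C API
 * directly would avoid spawning an extra process. */
#include <stdio.h>
#include <stdlib.h>

static int trigger_shutter(void)
{
    int status = system("ptpcam --capture");   /* assumed ptpcam invocation */
    if (status != 0) {
        fprintf(stderr, "capture command failed (status %d)\n", status);
        return -1;
    }
    return 0;
}

int main(void)
{
    return (trigger_shutter() == 0) ? EXIT_SUCCESS : EXIT_FAILURE;
}
```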