Task Dependency of Eye Fixations & The Development of a Portable Eyetracker
Jeff Cunningham, Senior Research Project
Dr. Jeff Pelz, Visual Perception Laboratory
Center for Imaging Science, RIT

Overview
• Goals
  1. Demonstrate the relationship between eye fixations and the task at hand.
  2. Improve the portability of the video eye tracking system currently in use in the Visual Perception Lab.
• Results
  – Methods of data collection and analysis.
  – What were the factors associated with eye fixations?
  – About the portable eyetracker.
• Conclusions
• Questions

Goals
• A question of perception
  – Alfred Yarbus, 1960s: obtrusive instruments. Are Yarbus' results an artefact of his equipment?
  – How do results obtained with a video eyetracker compare to Yarbus'?
• Portable Eye Tracking
  – The ASL 501 eye tracker is designed to be a portable system.
  – Use some existing ASL hardware.
  – Reduce size & weight.
  – Increase comfort.
  – ≥ 2 hours on batteries.

Task Dependency -- Data Collection
• Video eyetrack 9 subjects.
• 3 Tasks
  – Free Viewing
  – Memorization
  – Ages
• Collect statistics on fixations (see the sketch below).
• 3 Images
  – Doc
  – Shoe
  – Ropes
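
For the "collect statistics on fixations" step, the following is a minimal sketch only, assuming the scored fixations end up in a long-format table; the file name "fixations.csv" and the column names (subject, task, image, duration_ms) are hypothetical, not the lab's actual data format.

```python
# Minimal sketch: tabulate per-condition fixation statistics with pandas.
# File and column names below are assumptions for illustration only.
import pandas as pd

# One row per fixation: which subject, task (Free Viewing / Memorization / Ages),
# and image (Doc / Shoe / Ropes) it came from, and how long it lasted.
fixations = pd.read_csv("fixations.csv")

summary = (
    fixations
    .groupby(["subject", "task", "image"])
    .agg(n_fixations=("duration_ms", "size"),
         mean_duration_ms=("duration_ms", "mean"),
         total_duration_ms=("duration_ms", "sum"))
    .reset_index()
)
print(summary.head())
```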

Task Dependency -- Data Collection
[The three stimulus images: Doc, Shoe, Ropes]

Task Dependency -- Fixations
[Fixation map for one subject, free viewing; legend bins: > 10%, 8 - 10%, 6 - 8%, 4 - 6%, 2 - 4%, < 2%]
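
As a rough illustration of the legend above (not the lab's scoring code), the sketch below maps one region's share of viewing onto the legend bins; the assumption that the percentages are shares of total fixation time, and the function name legend_bin, are mine.

```python
# Minimal sketch; the function name and millisecond inputs are assumptions.
from bisect import bisect_right

def legend_bin(region_ms: float, total_ms: float) -> str:
    """Return the legend label for one region's share of total fixation time."""
    pct = 100.0 * region_ms / total_ms
    edges = [2, 4, 6, 8, 10]  # bin boundaries in percent
    labels = ["< 2%", "2 - 4%", "4 - 6%", "6 - 8%", "8 - 10%", "> 10%"]
    return labels[bisect_right(edges, pct)]

# Example: a region fixated for 750 ms out of a 10 s trial falls in "6 - 8%".
print(legend_bin(750, 10_000))
```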

Task Dependency -- Fixation Summary

Task Dependency -- ANOVA
• General Linear Model with 2 factors: Image, Task (a minimal analysis sketch follows below)

                     Image       Task        Image*Task
  Head Fixations     p = 0.14    p = 0.00
  People Fixations   p = 0.04    p = 0.00    p = 0.05
  Object Fixations   p = 0.02    p = 0.00    p = 0.134
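
For anyone who wants to reproduce this kind of analysis, here is a minimal sketch of a two-factor general linear model with interaction fit in statsmodels; it is not the original analysis, and the file name and column names (image, task, fixation_pct) are hypothetical.

```python
# Minimal sketch; file and column names are assumptions for illustration only.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Long-format data: one row per subject/image/task with the percentage of
# fixation time on the region of interest (heads, people, or objects).
data = pd.read_csv("region_fixations.csv")

# Two-factor model with interaction, mirroring the Image, Task, Image*Task terms.
model = ols("fixation_pct ~ C(image) * C(task)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # p-values for image, task, image:task
```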

Portable Eye Tracking
[System overview photo: Camcorder, Batteries, Cap & Optics, ASL Control Box]

Portable Eye Tracking
[Headgear detail: Eye Camera, IR Illuminator, Scene Camera, IR Reflective Visor]

Portable Eye Tracking
• Changes:
  – Moved to baseball cap.
  – Swapped in smaller CCD cameras.
  – Used smaller visor.
  – Powered by 3 batteries.
  – Camcorder instead of VCR.
• Still the same: ASL control box.

Portable Eye Tracking u Pro’s –Battery time is approx. 3 hours. –Is lighter and smaller. –Is more comfortable. u Con’s –Set-up takes longer. –Eye image is less stable. –Eye image is too large.

Portable Eye Tracking
• Remaining work
  – A comparison to the existing eye tracker has not yet been done.
  – For more stability, move to a new cap.
  – Move the optics for a smaller eye image.
  – Wire for sound.

Conclusions
• Task Dependency
  – Eye fixation patterns change significantly with the task.
  – A dependency on image was not as clear.
• Portability
  – A smaller, lighter, and more comfortable tracker was built.
  – Battery life is good.
  – More work remains to be done.

Questions?