LiGaze: Ultra-Low Power Gaze Tracking for Virtual Reality


LiGaze: Ultra-Low Power Gaze Tracking for Virtual Reality
Presented by Alex Rowden and Noel Warford

Background
- Virtual Reality (VR) is considered by some to be the wave of the future.
- VR is computationally expensive: double the render calls, a potentially wider field of view, and high energy costs.
- Interaction in VR requires external sensors.
- VR headsets are difficult or impossible to use with glasses.

Notes (Alex):
- VR has many benefits: gaming immersion, education, culture, etc. It still has many flaws.
- External sensors require pre-modeled room environments and wands/controllers.
- High rendering costs can lead to high monetary costs, specifically for production.
- VR headsets pose security risks: verifying users is difficult without reliable interaction.

Eye Tracking and Its Benefits
- Foveated rendering
- Eye-based interaction
- Varifocal displays
- User verification
- Increased social avatar immersion
- Health monitoring
- User intent prediction

Notes (Alex):
- Eye tracking is the process of following the user's pupil.
- Currently implemented in the FOVE and proposed for the upcoming HTC Vive Pro Eye.
- Foveated rendering reduces the number of samples per pixel in the periphery.
- Varifocal displays adjust the headset for the user, potentially eliminating the need for glasses inside the headset.
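Foveated rendering, the first benefit above, cuts shading work by sampling densely only where the eye is looking. A minimal sketch of the idea follows; the foveal radius and falloff curve are illustrative assumptions, not values from the LiGaze paper or any shipping renderer.

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, fovea_radius=0.1):
    """Relative sample rate for a pixel given the current gaze point.

    All coordinates are in normalized [0, 1] screen space. Pixels within
    the (assumed) foveal radius are shaded at full rate; outside it the
    rate falls off inversely with eccentricity, floored at 1/8.
    """
    ecc = math.hypot(px - gaze_x, py - gaze_y)
    if ecc <= fovea_radius:
        return 1.0
    return max(0.125, fovea_radius / ecc)
```

A renderer would query this per tile rather than per pixel; the point is only that the savings scale with how far the periphery extends, which is why accurate gaze tracking matters.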

Motivation
- Eye tracking is expensive or inaccessible (the FOVE is currently developer-only).
- Eye tracking is power-hungry.
- Eye tracking isn't built in; currently only specialized headsets offer it.

Notes (Alex):
- State-of-the-art systems use IR emitters and IR cameras to track pupil position with computer vision.
- Emitters drain power, and cameras are relatively expensive.

LiGaze
- A design for adding eye tracking to any headset, cheaply and without consuming much power.
- Uses photosensors to take advantage of light reflected from the display and the absorption properties of the pupil.
- Key advantage: photosensors are relatively cheap and energy-inexpensive.
- Eliminates the need for internal cameras (expensive) and active IR LEDs (energy-inefficient).

Presenter: Alex
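Because the pupil absorbs light while the surrounding eye reflects it, the vector of photosensor readings shifts as the pupil moves, and a lightweight model can map readings to gaze. The sketch below uses ridge regression on synthetic data purely to illustrate that sensing-to-gaze pipeline; LiGaze's actual inference model is more sophisticated, and every value here is made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend ground truth: an unknown linear mapping from 32 photodiode
# readings to a 2-D gaze point (illustrative assumption only).
true_W = rng.normal(size=(32, 2))
readings = rng.normal(size=(200, 32))   # calibration frames
gaze = readings @ true_W                # gaze labels (e.g. from a FOVE)

# Fit the mapping with ridge regression: W = (X^T X + lam I)^-1 X^T y.
lam = 1e-3
W = np.linalg.solve(readings.T @ readings + lam * np.eye(32),
                    readings.T @ gaze)

est = readings @ W
resid = np.abs(est - gaze).mean()       # near zero on this synthetic data
```

The appeal of such a model on a microcontroller is that inference is a single 32x2 matrix multiply, which is far cheaper than running computer vision on camera frames.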

Hardware
- Printed circuit board with 32 photodiodes
- Microcontroller with two 4-channel multiplexers
- Photodiodes divided into eight groups; each group transmits data serially
- Solar cell at the top harvests ambient energy; only 4 V required

Presenter: Noel
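Grouping the diodes and multiplexing them lets a small microcontroller read all 32 sensors with only a few ADC inputs. The sketch below simulates that read-out loop in Python; the group/channel layout and the `sample_diode` callback are assumptions for illustration, standing in for real multiplexer select lines and an ADC read.

```python
def read_frame(sample_diode):
    """Read all 32 photodiodes, one group at a time.

    `sample_diode(group, channel)` stands in for an ADC read of one
    diode after the multiplexer select lines have been set; a real
    driver would toggle GPIO pins here instead.
    """
    frame = []
    for group in range(8):          # select one of the 8 diode groups
        for channel in range(4):    # step through a 4-channel multiplexer
            frame.append(sample_diode(group, channel))
    return frame

# Dummy ADC that just reports each diode's index, to show the ordering.
frame = read_frame(lambda g, c: g * 4 + c)
```

Serializing the groups this way trades a little latency for pin count, which fits the slide's theme of minimizing hardware cost and power.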

Experimental Design
- 30 participants (19 male, 11 female)
- Divided based on skin color
- Tested within-user and cross-user accuracy
- Tested across various scenes
- Compared LiGaze to the FOVE as a source of ground truth

Notes (Noel): Within-user evaluation trains the inference model on the participant's own data; cross-user evaluation trains half on the participant's data and half on other participants' data.
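The within-user versus cross-user distinction comes down to how the training set is assembled. A minimal sketch of the two splits, assuming the half-and-half mix described in the notes (array shapes and the 50/50 train/test boundary are illustrative choices, not the paper's exact protocol):

```python
import numpy as np

def make_split(user_frames, other_frames, mode):
    """Build train/test sets for one participant.

    'within': train on the first half of the participant's own frames.
    'cross':  train half on the participant and half on other users,
              as described in the slide notes.
    Test data is always the held-out half of the participant's frames.
    """
    half = len(user_frames) // 2
    if mode == "within":
        train = user_frames[:half]
    else:
        quarter = half // 2
        train = np.concatenate([user_frames[:quarter],
                                other_frames[:quarter]])
    test = user_frames[half:]
    return train, test

rng = np.random.default_rng(1)
user = rng.normal(size=(100, 32))      # one participant's sensor frames
others = rng.normal(size=(400, 32))    # pooled frames from other users
train_w, test_w = make_split(user, others, "within")
train_c, test_c = make_split(user, others, "cross")
```

Holding the test half fixed across both modes is what makes the accuracy numbers in the next slides directly comparable.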

Results
Notes (Noel), from the top left:
- Light intensity has an impact on LiGaze performance.
- Error: within-user models are trained on your own data; cross-user models are trained on others', with two models based on skin color.
- Error distributions: error grows toward the edges because the training set focuses on the center.
- Average accuracy is the same across skin colors. Green eyes are slightly worse because few participants had green eyes (within-user is much better), so representative eye colors in the training set are essential.
- Content: games tend to be darker, and eyes move rapidly.
- Reflected light: mean error is 0.9 lux.

Results (continued)
Notes (Noel):
- Precision: similarly difficult at the edge of the field of view; indicates reproducibility during calibration.
- Energy versus error, including potential: the LiGaze entries are based on downsampling from 32 to 16 to 8 photodiodes; the other entries are example systems.

Results (continued)
Notes (Noel):
- Consumption: 791 microwatts, with good solar-cell conversion.
- Blink detection is fairly good (blinks affect reflection) and could be tested with higher sampling rates; darker skin reflects less light.
- Latency is 7.8 milliseconds, which could support higher sampling rates.

Discussion
- Data quality of ground truth
- Improving tracking accuracy
- Prior knowledge of VR content
- Scenarios beyond VR

Notes (Noel):
- Ground truth: questionable results in the field from the FOVE, beyond its reported error margins.
- Accuracy: more data would help, as would a higher sampling rate from the ADC.
- VR content: prior knowledge could potentially be integrated (or manipulated, as in our project).
- Beyond VR: could be used for AR, combined with other eye-tracking technology.

Conclusion
- Lower cost than the state of the art
- Lower energy than the state of the art
- Performs better with more light in the headset
- Works better for specific users
- Can be fitted to most modern headsets (VR, not AR)

Notes (Alex):
- Lower energy and cost allow LiGaze to be ported to new devices (and designed into new headsets).
- AR would allow external light into the headset; if that light is not constant, it will throw off LiGaze's tracking.