EyePhone Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell, "EyePhone: Activating Mobile Phones With Your Eyes," in Proc. of the ACM SIGCOMM Workshop on Networking, Systems, and Applications on Mobile Handhelds (MobiHeld'10), August 30, 2010. Class Code: EEL 6788 Instructor: Dr. Damla Turgut Date: 03-16-2011 Presenter: Taranjeet Singh Bhatia

Content Technology Introduction HPI Evolution Terminologies Paper Abstract EyePhone Design Evaluation Applications Demo Future Work Improvement on Paper Related Work

Technology Introduction Human Computer Interaction (HCI) “HCI (human-computer interaction) is the study of how people interact with computers and to what extent computers are or are not developed for successful interaction with human beings.” Most HCI technology addresses the interaction between people and computers in “ideal” environments, i.e., where people sit in front of a desktop machine with specialized sensors and cameras centered on them.

Technology Introduction Human Phone Interaction (HPI) “Human-Computer Interaction (HCI) researchers and phone vendors are continuously searching for new approaches to reduce the effort users exert when accessing applications on limited form factor devices such as mobile phones.” Human-phone interaction (HPI) extends HCI with challenges not typically found in HCI research, specifically related to the phone and how we use it. To address these goals, HPI technology should be less intrusive, i.e., it should not rely on any external devices other than the mobile phone itself; it should be readily usable with as little user dependency as possible; it should be fast in the inference phase; it should be lightweight in terms of computation; and it should preserve the phone user experience, e.g., it should not deplete the phone battery over normal operations.

HPI Evolution – Keypad 1983 –

HPI Evolution – Touch Screen 1993 –

HPI Evolution – Trackball 2006 –

HPI Evolution – Voice Operated 2010 –

HPI Evolution – What Next? EyePhone – Controlling the Phone by Eyes

Terminologies Intrusive methods: require direct contact with the eyes. Non-intrusive methods: avoid any physical contact with the user; they include model-based, appearance-based, and feature-based methods.

Paper Abstract EyePhone tracks the user’s eye movement across the phone’s display using the camera mounted on the front of the phone: it tracks the eye and infers its position on the mobile phone display as the user views a particular application, and it detects eye blinks that emulate mouse clicks to activate the target application under view.

EyePhone Design 1) An eye detection phase 2) An open eye template creation phase 3) An eye tracking phase 4) A blink detection phase

Eye detection phase By applying a motion analysis technique that operates on consecutive frames, this phase finds the contour of the eyes; the eye pair is identified by the left and right eye contours. Figure: results of the original algorithm running on a desktop with a USB camera (ref: Eye Tracking and Blink Detection Library, http://tinyurl.com/yb9v4f2) versus results of EyePhone running on a Nokia N810, where the smaller dots are erroneously interpreted as eye contours.
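The paper adapts the eye detector from the Eye Tracking and Blink Detection Library [17], whose source is not reproduced here. As a rough illustration of the idea, a minimal OpenCV sketch of motion analysis on consecutive frames, assuming frame differencing followed by contour extraction (the function name and the difference threshold are illustrative, not from the paper):

```python
import cv2

def candidate_eye_contours(prev_gray, cur_gray, diff_thresh=25):
    """Difference two consecutive grayscale frames, threshold the result,
    and return the contours of the moving regions; a blink produces motion
    at both eye locations, so these contours are eye-pair candidates."""
    diff = cv2.absdiff(prev_gray, cur_gray)          # pixel-wise frame difference
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)      # merge fragmented motion blobs
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours
```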

Eye detection Based on the previous experimental observations, they modify the original algorithm in two ways: reducing the image resolution, which reduces the eye detection error rate, and adding two more criteria to the original heuristics to filter out false eye contours: width_min ≤ width ≤ width_max and height_min ≤ height ≤ height_max. The thresholds width_min, width_max, height_min, and height_max identify the possible sizes of a true eye contour; they are determined under various experimental conditions (e.g., bright, dark, moving, not moving) and with different people. This design approach boosts the eye tracking accuracy considerably (see the sketch below).
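A minimal sketch of the two added filtering criteria; the four thresholds are left as parameters because the paper determines their values experimentally and does not publish them:

```python
import cv2

def filter_eye_contours(contours, w_min, w_max, h_min, h_max):
    """Keep only contours whose bounding box satisfies
    width_min <= width <= width_max and height_min <= height <= height_max,
    i.e., falls inside the plausible size range for a true eye contour."""
    plausible = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w_min <= w <= w_max and h_min <= h <= h_max:
            plausible.append((x, y, w, h))
    return plausible
```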

Open eye template creation EyePhone creates a template of a user’s open eye once, at the beginning, when a person uses the system for the first time, using the eye detection algorithm. The template is saved in the persistent memory of the device and fetched whenever EyePhone is invoked. The downside of this off-line template creation approach is that a template created under certain lighting conditions might not be perfectly suitable for other environments.
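A sketch of how this one-time, persisted template might be handled; the file path and function name are hypothetical, since the paper only states that the template is saved in persistent memory and fetched on invocation:

```python
import os
import cv2

TEMPLATE_PATH = "open_eye_template.png"  # hypothetical storage location

def get_template(gray_frame, eye_box):
    """Load the saved open-eye template if one exists; otherwise crop it
    from the detected eye region, persist it once, and return it."""
    if os.path.exists(TEMPLATE_PATH):
        return cv2.imread(TEMPLATE_PATH, cv2.IMREAD_GRAYSCALE)
    x, y, w, h = eye_box
    template = gray_frame[y:y + h, x:x + w]   # crop the detected open eye
    cv2.imwrite(TEMPLATE_PATH, template)      # created once, reused afterwards
    return template
```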

Eye tracking The eye tracking algorithm is based on template matching. The template matching function calculates a correlation score between the open eye template, created the first time the application is used, and a search window; the outer box around the left eye encloses the region where the correlation score is calculated. If the normalized correlation coefficient is at least 0.4, we conclude that there is an eye in the search window. This threshold has been verified to be accurate by means of multiple experiments under different conditions (e.g., bright, dark, moving, not moving).
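The paper does not publish its tracking code; a minimal sketch using OpenCV's normalized correlation coefficient template matching, with the 0.4 threshold from the paper (function name is illustrative):

```python
import cv2

MATCH_THRESHOLD = 0.4  # the paper's experimentally verified threshold

def track_eye(search_window, open_eye_template):
    """Slide the open-eye template over the search window and report the
    best-match location when the normalized correlation coefficient
    reaches the threshold."""
    scores = cv2.matchTemplate(search_window, open_eye_template,
                               cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    if max_score >= MATCH_THRESHOLD:
        return max_loc, max_score  # (x, y) of the template's top-left corner
    return None, max_score         # no eye found in this search window
```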

Eye tracking Correlation: correlation is a measure of the degree to which two variables agree, not necessarily in actual value but in general behavior. Here the two variables are the corresponding pixel values in two images, the template and the source. The value cor lies between −1 and +1, with larger values representing a stronger relationship between the two images:

cor = Σ_{i=1..N} (x_i − x̄)(y_i − ȳ) / √( Σ_{i=1..N} (x_i − x̄)² · Σ_{i=1..N} (y_i − ȳ)² )

where x is the template gray-level image, x̄ is the average gray level in the template image, y is the source image section, ȳ is the average gray level in the source image section, and N is the number of pixels in the section (N = template image size = columns × rows).
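For reference, the same correlation can be computed directly from the formula above; a small NumPy sketch (function name is illustrative):

```python
import numpy as np

def normalized_correlation(template, section):
    """Compute cor for a template and an equally sized source section,
    following the formula above; the result lies in [-1, +1]."""
    x = template.astype(np.float64).ravel()  # template gray levels
    y = section.astype(np.float64).ravel()   # source section gray levels
    x_dev = x - x.mean()                     # deviation from average gray level
    y_dev = y - y.mean()
    denom = np.sqrt(np.sum(x_dev ** 2) * np.sum(y_dev ** 2))
    return float(np.sum(x_dev * y_dev) / denom) if denom else 0.0
```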

Blink Detection To detect blinks, they apply a thresholding technique to the normalized correlation coefficient returned by the template matching function. Problem: the quality of the mobile camera is not the same as a good USB camera, and the phone’s camera is generally closer to the person’s face than in the desktop/USB-camera case. Because of this, the camera can pick up iris movements, i.e., movements of the interior of the eye due to eyeball rotation. In particular, when the iris is turned towards the corner of the eye, upwards or downwards, a blink is inferred even though the eye remains open; this occurs because the majority of the visible eyeball surface turns white, which is confused with the color of the skin. Four thresholds are used: T1_min = 0.64, T1_max = 0.75, T2_min = −0.53, T2_max = −0.45. If c_min and c_max denote, respectively, the minimum and maximum normalized correlation coefficient values returned by the template matching function, the eye is inferred to be closed if T1_min ≤ c_max ≤ T1_max and T2_min ≤ c_min ≤ T2_max; it is inferred to be open otherwise.
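The stated decision rule translates directly into code; a minimal sketch using the paper's four threshold values:

```python
# Four thresholds from the paper; c_min and c_max are the minimum and maximum
# normalized correlation coefficients returned by the template matching step.
T1_MIN, T1_MAX = 0.64, 0.75
T2_MIN, T2_MAX = -0.53, -0.45

def eye_is_closed(c_min, c_max):
    """Infer a closed eye (blink) only when both correlation extrema fall
    inside their respective bands; otherwise the eye is inferred open."""
    return T1_MIN <= c_max <= T1_MAX and T2_MIN <= c_min <= T2_MAX
```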

EVALUATION DS = eye tracking accuracy measured under daylight exposure while steady; AS = eye tracking accuracy measured under artificial light exposure while steady; DM = eye tracking accuracy measured under daylight exposure while walking; BD = blink detection accuracy under daylight exposure. Device used: Nokia N810 (400 MHz processor, 128 MB of RAM, Maemo 4.1 OS) with the OpenCV (Open Source Computer Vision) library; the algorithms were compiled in the Maemo scratchbox, and video frames were captured from the camera using GStreamer.

EVALUATION Table: average CPU usage, RAM usage, and computation time for one video frame. The front camera supports up to 15 frames per second. The last column reports the percentage of battery used by EyePhone after a three-hour run of the system.

Applications EyeMenu Car driver safety: EyePhone could also be used to detect driver drowsiness and distraction in cars.

Demo http://sensorlab.cs.dartmouth.edu/eyephone/demo.m4v

FUTURE WORK Improving the creation of the open eye template and the filtering algorithm for wrong eye contours Improving robustness to variations in lighting conditions and to movement of the phone in a person’s hand Using a learning approach instead of a fixed thresholding policy

Improvement On The Paper A very basic approach is used; higher-speed and better DSP libraries are available Needs to be implemented on iPhone and Android phones Can be incorporated with face detection Iris tracking Improvement for various lighting conditions and face orientations Online template creation

RELATED WORK There have been a number of developments in HCI related to mobile phones over the last several years, such as the use of accelerometers on the phone to infer gestures. The PhonePoint Pen project transforms the phone into a pen to write in the air, in order to quickly take small notes without needing to either write on paper or type on a computer [4]. The uWave project exploits a 3-D accelerometer on a Wii-remote-based prototype for personalized gesture recognition [7]. Eye movement has recently been used for activity recognition: by tracking the eye, researchers show that it is possible to infer the actions of watching a video, browsing the web, or reading notes [24]. The eyeSight technology exploits the phone camera to control the mobile phone with simple hand gestures over the camera [5].

PAPER REFERENCES
[4] S. Agrawal, I. Constandache, S. Gaonkar, and R. R. Choudhury. PhonePoint Pen: Using Mobile Phones to Write in Air. In MobiHeld ’09, pages 1–6. ACM, 2009.
[5] eyeSight. http://www.eyesight-tech.com
[7] J. Liu, L. Zhong, J. Wickramasuriya, and V. Vasudevan. uWave: Accelerometer-Based Personalized Gesture Recognition and its Applications. In Pervasive and Mobile Computing, vol. 5, issue 6, pp. 657–675, December 2009.
[8] Accelerometer-Based Gesture Recognition with the iPhone. http://tinyurl.com/yjddr8q
[9] The Mobile Phone that Could “Read Lips”. http://www.cellular-news.com/story/42211.php
[10] Nokia Mobile Radar. http://tinyurl.com/yfo2dtd
[11] The Eye Mouse project. http://www.arts.ac.uk/research/eyemouse/index.htm
[12] F. Karray, M. Alemzadeh, J. A. Saleh, and M. N. Arab. Human-Computer Interaction: Overview on State of the Art. In International Journal on Smart Sensing and Intelligent Systems, Vol. 1, No. 1, March 2008.
[13] M. Chau and M. Betke. Real Time Eye Tracking and Blink Detection with USB Cameras. Boston University Computer Science Technical Report No. 2005-12, 2005.
[14] E. Miluzzo, C. T. Cornelius, A. Ramaswamy, T. Choudhury, Z. Liu, and A. T. Campbell. Darwin Phones: The Evolution of Sensing and Inference on Mobile Phones. In Eighth International ACM Conference on Mobile Systems, Applications, and Services (MobiSys ’10), San Francisco, CA, USA, June 15–18, 2010.
[15] EyePhone demo video. http://sensorlab.cs.dartmouth.edu/eyephone/demo.m4v
[16] Nokia N900. http://maemo.nokia.com/n900
[17] Eye Tracking and Blink Detection Library. http://tinyurl.com/yb9v4f2

Thank You Questions?