A novel depth-based head tracking and facial gesture recognition system by Dr. Farzin Deravi – EDA UoK; Dr. Konstantinos Sirlantzis – EDA UoK; Shivanand Guness – EDA UoK; Dr. Mohamed Sakel – EKHUFT; Dr. Matthew Pepper – EKHUFT

Overview Clinical Background Objectives Kinect : RGB-D sensor Technical Approach Evaluation Technique Experimentation Results Conclusion Future Work

Clinical Background Individuals with conditions such as Motor Neurone Disease (MND), Cerebral Palsy (CP) and Multiple Sclerosis (MS): – can lose the ability to speak; – may be able to make only small head/facial movements, particularly in cases of MND or MS. The aim is to capture and interpret the intentions and messages of these patients from their limited movement.

Objectives Develop a reliable automatic gesture recognition system that: – tracks head movement; – detects facial gestures such as eye blinks and winks. Build an adaptive system that adjusts to the user and their condition over time. Develop a low-cost assistive device.

Kinect: RGB-D sensor

Kinect: RGB-D sensor (cont.) The Kinect projects a known speckle pattern in near-infrared light, generated by passing an IR beam through a diffuser and a diffractive optical element. A CMOS IR camera observes the scene. The calibration between the projector and the camera has already been carried out and is known, allowing depth to be recovered from the observed pattern.

Technical Approach

Technical Approach (cont.) (Figures: depth map and corresponding RGB image; the highlighted region is the area of the object nearest to the sensor.)
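The nearest-object step above (isolating the region of the depth map closest to the sensor, here the user's head) can be sketched as follows. This is a minimal illustration, not the authors' code; the function name, the millimetre units and the tolerance band are assumptions.

```python
import numpy as np

def nearest_object_mask(depth_mm, band_mm=100):
    """Mask the region nearest the sensor in a Kinect-style depth map.

    depth_mm: 2-D array of depths in millimetres; 0 marks invalid pixels.
    band_mm:  depth tolerance kept around the nearest valid pixel.
    """
    valid = depth_mm > 0                      # Kinect reports 0 where depth is unknown
    nearest = depth_mm[valid].min()           # closest valid depth in the frame
    return valid & (depth_mm <= nearest + band_mm)

# Toy 4x4 depth map: a "head" at ~800 mm, background at ~2000 mm, one invalid pixel
depth = np.array([[2000, 2000, 2000, 2000],
                  [2000,  800,  820, 2000],
                  [2000,  810,  805, 2000],
                  [   0, 2000, 2000, 2000]])
mask = nearest_object_mask(depth)             # True only over the four "head" pixels
```

Thresholding on depth alone like this is what makes an RGB-D tracker insensitive to lighting, in contrast to the webcam-based tracker evaluated later.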

Experimentation Setup (Figure: Fitts' Test target screen)

Evaluation Technique Fitts' test for HCI is used to evaluate the tracking algorithms. The underlying model of human movement was developed by Paul Fitts in 1954 and adapted to HCI by Scott MacKenzie in 1992. ISO 9241-9:2000 (Ergonomic requirements for office work with visual display terminals (VDTs)—Part 9—Requirements for non-keyboard input devices) is based on Fitts' test.

Fitts' Test Two key parameters: the distance to the target (D) and the width of the target (W).

Fitts' Test (cont.) Effective index of difficulty: ID_e = log2(D/W_e + 1), where D is the distance from the home position to the target and W_e is the effective width of the target. The effective width is calculated as W_e = 4.133 × SD, where SD is the standard deviation of the selection coordinates.
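The two quantities above can be worked through numerically. A minimal sketch with illustrative numbers (not data from the study); the 4.133 constant is MacKenzie's standard effective-width adjustment:

```python
import math
import statistics

def effective_width(selection_coords):
    """W_e = 4.133 * SD of the selection coordinates."""
    return 4.133 * statistics.stdev(selection_coords)

def effective_id(D, W_e):
    """Effective index of difficulty, ID_e = log2(D/W_e + 1), in bits."""
    return math.log2(D / W_e + 1)

# Illustrative trial: target centre 200 px from home, six selections near it
selections = [198, 203, 195, 201, 204, 199]
W_e = effective_width(selections)   # ~13.8 px
ID_e = effective_id(200, W_e)       # ~3.95 bits
```

Using W_e instead of the nominal width W credits (or penalises) the device for the accuracy users actually achieved, which is why the later results report effective rather than nominal throughput.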

Fitts’ Test (cont)

Fitts' Test Evaluation (Table: Width (W), Distance (D) and Index of Difficulty (ID) for each test condition)

Result – Effective Throughput (TP_e) by device and selection method

Device                          Dwell             Blink             Eyebrows
Standard Mouse (ms)             0.84              n/a               n/a
CameraMouse (cm)                0.48              n/a               n/a
SmartNav (sn)                   0.42              n/a               n/a
Vision head tracker (webcam)    0.21 (ht-dwell)   0.15 (ht-blink)   0.08 (ht-brows)
RGB-D head tracker (Kinect)     0.30 (kht-dwell)  0.28 (kht-blink)  0.09 (kht-brows)
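The effective throughput figures in the table follow the standard ISO 9241-9 definition, TP_e = ID_e / MT. A minimal sketch with illustrative numbers (not the study's data):

```python
def effective_throughput(ID_e_bits, movement_time_s):
    """TP_e = ID_e / MT, in bits per second."""
    return ID_e_bits / movement_time_s

# Illustrative: a 3.0-bit task completed in 10 s yields 0.30 bits/s,
# the same order of magnitude as the head-tracker figures above.
tp = effective_throughput(3.0, 10.0)
```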

Result Throughput and Effective Throughput

Result – Index of Difficulty

Result – Effective Index of Difficulty

Conclusion The RGB-D head tracking system is shown to outperform the vision-based head tracking system: – TP_e for dwell clicking increased from 0.21 to 0.30 bits per second; – TP_e for blink clicking nearly doubled, from 0.15 to 0.28 bits per second. The eye blink detection algorithm approaches the performance of the dwell click switch.

Future Work Increase the robustness of the HeadTracker. Investigate the impact of the different facial gesture switches on performance. Investigate additional facial gesture switches, such as mouth open/close and tongue movement. Conduct Fitts' Test experimentation with healthy volunteer participants.

THANK YOU Project Website :