
The Implementation of a Glove-Based User Interface
Chris Carey

Abstract
• Multi-touch interfaces offer task simplification through more natural commands
• A glove-based interface provides the utility of a multi-touch interface without the proximity restriction
• Glove commands that merely match the complexity of mouse commands are less efficient
• Instead, efficiency is found by simplifying tasks, since gestures provide more degrees of input

Background
• Why now?
• Accessibility of Technology
• Increased Application Sophistication
• Usage in Restrictive Environments
Figures: Minority Report (2002) – DreamWorks; Ayo Technology (2007) – Aftermath Entertainment

Past and Current Glove Systems
• Glove Systems
– Haptic Gloves and VR Systems
– Full Motion Capture Glove Systems
– Basic Wiimote Glove Systems
• Non-Glove Systems
– Neural Network Hand Gesture Recognition
– 3D Model Reconstruction Gesture Recognition

Project Goals
• Task Simplification
• Improved User Experience
• Overcoming Command Inaccuracy
• Creative Applications for Usage

Hardware Implementation
• Logitech Webcam
– IR-blocking filter removed
– Visible-light-blocking filter added
• IR LED Glove
– 3 IR LEDs
– 3 × 1.5 V AAA batteries

Software Implementation (Glove Interface)
• Java and Java Media Framework
• Custom LED Detection
• LED Tracking
• Gesture Recognition
• Command Execution
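
The slides name the pipeline stages but include no code. Below is a minimal per-frame sketch of how those stages might chain together; every identifier here (GlovePipeline, LedDetector, and so on) is a hypothetical name for illustration, not taken from the original implementation.

    import java.awt.Point;
    import java.util.List;

    // Hypothetical outline of the per-frame pipeline: detection -> tracking ->
    // gesture recognition -> command execution. The capture layer (JMF in the
    // original) would call processFrame once per webcam frame.
    public class GlovePipeline {
        interface LedDetector { List<Point> detect(int[] argbPixels, int w, int h); }
        interface LedTracker  { List<Point> update(List<Point> detections); }
        interface Recognizer  { Runnable recognize(List<Point> trackedLeds); }

        private final LedDetector detector;
        private final LedTracker tracker;
        private final Recognizer recognizer;

        public GlovePipeline(LedDetector d, LedTracker t, Recognizer r) {
            detector = d; tracker = t; recognizer = r;
        }

        public void processFrame(int[] argbPixels, int w, int h) {
            List<Point> detections = detector.detect(argbPixels, w, h); // custom LED detection
            List<Point> tracked = tracker.update(detections);           // LED tracking
            Runnable command = recognizer.recognize(tracked);           // gesture recognition
            if (command != null) command.run();                         // command execution
        }
    }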

Software Implementation (Photo Manipulation Application)
• Manipulation of photos with both mouse and glove interfaces (drag, rescale, rotate)
• Pre-defined tasks in which photos must be manipulated to reach a final state
• Space and time data collection during task
• Data export to CSV files
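
The slides say space and time data are collected during each task and exported to CSV, but do not show the format. A minimal logging sketch, assuming a hypothetical column set of timestamp, cursor position, and drag state:

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    // Sketch of per-task data logging; the columns are assumptions based on
    // the "space and time" measurements described in the slides.
    public class TaskLogger {
        private final PrintWriter out;

        public TaskLogger(String csvPath) throws IOException {
            out = new PrintWriter(new FileWriter(csvPath));
            out.println("timestampMillis,cursorX,cursorY,dragging"); // header row
        }

        // Record one sample of the cursor state during a task.
        public void logSample(long timestampMillis, int x, int y, boolean dragging) {
            out.printf("%d,%d,%d,%b%n", timestampMillis, x, y, dragging);
        }

        public void close() { out.close(); }
    }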

Software Implementation (Photo Manipulation Application)
[Screenshot of the photo manipulation application]

LED Detection
• Binary Rasterization
• Brightness Threshold Determination
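
A minimal sketch of binary rasterization with a brightness threshold: every pixel at or above the threshold becomes a candidate LED pixel. The luma weights and the threshold-as-parameter design are assumptions; the slides mention threshold determination but not the method.

    // Converts an ARGB frame to a binary mask of "bright" pixels.
    public class BinaryRasterizer {
        public static boolean[] threshold(int[] argbPixels, int thresholdLuma) {
            boolean[] binary = new boolean[argbPixels.length];
            for (int i = 0; i < argbPixels.length; i++) {
                int p = argbPixels[i];
                int r = (p >> 16) & 0xFF, g = (p >> 8) & 0xFF, b = p & 0xFF;
                int luma = (r * 299 + g * 587 + b * 114) / 1000; // integer Rec. 601 luma
                binary[i] = luma >= thresholdLuma; // bright pixel -> candidate LED
            }
            return binary;
        }
    }

With the visible-light-blocking filter on the camera, almost everything below the threshold is background, so even a fixed threshold can work; determining it adaptively from the frame's brightness histogram is one plausible refinement.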

LED Detection
• Blob Detection
• Finding centers of two overlapping LEDs
Figures: Equal Distribution, Unequal Distribution
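
A sketch of the blob-detection step: connected bright regions in the binary mask are gathered by flood fill and reduced to centroids. Note this returns one center per blob; splitting a single blob into two centers for overlapping LEDs (the equal/unequal distribution cases above) would require an extra step that this sketch omits.

    import java.awt.Point;
    import java.util.ArrayDeque;
    import java.util.ArrayList;
    import java.util.List;

    // Finds the centroid of each 4-connected blob in a binary image of width w.
    public class BlobDetector {
        public static List<Point> findCenters(boolean[] binary, int w) {
            boolean[] visited = new boolean[binary.length];
            List<Point> centers = new ArrayList<>();
            for (int start = 0; start < binary.length; start++) {
                if (!binary[start] || visited[start]) continue;
                long sumX = 0, sumY = 0, count = 0;
                ArrayDeque<Integer> stack = new ArrayDeque<>();
                stack.push(start);
                visited[start] = true;
                while (!stack.isEmpty()) {
                    int i = stack.pop();
                    int x = i % w, y = i / w;
                    sumX += x; sumY += y; count++;
                    int[] neighbors = { i - 1, i + 1, i - w, i + w };
                    for (int n : neighbors) {
                        if (n < 0 || n >= binary.length || visited[n] || !binary[n]) continue;
                        if ((n == i - 1 && x == 0) || (n == i + 1 && x == w - 1)) continue; // row wrap
                        visited[n] = true;
                        stack.push(n);
                    }
                }
                centers.add(new Point((int) (sumX / count), (int) (sumY / count))); // blob centroid
            }
            return centers;
        }
    }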

LED Tracking
• Initial Classification Required
• Identifies left/right pointer/clicker/aux LEDs
• Logic-Based Reclassification of new LEDs
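
The slides do not spell out the reclassification logic; a simple nearest-neighbor rule is one plausible reading, sketched below. The role names and the distance cutoff are assumptions for illustration.

    import java.awt.Point;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // After the initial classification assigns roles (e.g. "left-pointer",
    // "right-clicker", "aux"), each new detection inherits the role of the
    // nearest LED position from the previous frame.
    public class LedTracker {
        private final Map<String, Point> lastKnown = new HashMap<>();
        private static final double MAX_JUMP = 80.0; // px; larger jumps are ignored

        public void initialClassification(Map<String, Point> roles) {
            lastKnown.clear();
            lastKnown.putAll(roles);
        }

        public Map<String, Point> update(List<Point> detections) {
            for (Point p : detections) {
                String bestRole = null;
                double bestDist = MAX_JUMP;
                for (Map.Entry<String, Point> e : lastKnown.entrySet()) {
                    double d = e.getValue().distance(p);
                    if (d < bestDist) { bestDist = d; bestRole = e.getKey(); }
                }
                if (bestRole != null) lastKnown.put(bestRole, p); // reclassify by proximity
            }
            return lastKnown;
        }
    }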

Gesture Recognition
• Application-Specific Gestures
– Photo drag, rescale, and rotate
• Current status:
– Single pointer and clicker that matches mouse commands
Figures: Drag, Rescale, Rotate

Gesture Recognition
• Two-Handed Gestures: Rescale/Rotate
– Executed when both hands pinch while their respective cursors are holding an image
– Rescale: moving cursors closer/farther
– Rotate: rotating cursors about midpoint
• Still in development/evaluation phase
Figures: Rescale, Rotate
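
The rescale and rotate gestures reduce to plane geometry on the two cursor positions: the rescale factor is the ratio of the current to the initial distance between the cursors, and the rotation is the change in the angle of the line joining them. A sketch under those assumptions (class and method names are illustrative):

    import java.awt.geom.Point2D;

    // Geometry for the two-handed gestures; a0/b0 are the cursor positions
    // when the pinch began, a1/b1 are the current positions.
    public class TwoHandGesture {
        // Scale factor to apply to the held image.
        public static double rescaleFactor(Point2D a0, Point2D b0, Point2D a1, Point2D b1) {
            return a1.distance(b1) / a0.distance(b0);
        }

        // Rotation of the line joining the two cursors, in radians.
        public static double rotationRadians(Point2D a0, Point2D b0, Point2D a1, Point2D b1) {
            double before = Math.atan2(b0.getY() - a0.getY(), b0.getX() - a0.getX());
            double after  = Math.atan2(b1.getY() - a1.getY(), b1.getX() - a1.getX());
            return after - before; // positive = clockwise in y-down screen coordinates
        }
    }

Applying the rotation about the midpoint of the two cursors matches the slide's description; a real implementation would also normalize the angle difference into (-π, π] to handle wrap-around.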

Experiment
• Three single-handed gesture commands were executed with both a mouse interface and the glove interface
• Relevant space and time data were collected for comparison

Task: Drag Photo 500 Pixels to the Right
• Time dragging, mouse interface vs. glove interface: the glove interface took 42.6% (0.785 s) more time

Task: Rescale Photo from 25% to 100%
• Time dragging, mouse interface vs. glove interface: the glove interface took 12.7% (0.282 s) more time

Task: Rotate Photo 2.0 Radians Clockwise
• Time dragging, mouse interface vs. glove interface: the glove interface took 72.3% (2.283 s) more time

Analysis
• The glove interface consistently spent more dragging time than the mouse interface
• Evaluation of the glove interface:
– Smooth during movement
– Corrections during placement
• Indicates lower accuracy, especially during non-movement

Conclusion
• The glove interface gestures evaluated only matched the mouse interface commands
• The glove interface has no advantage for input of the same complexity
• The glove interface is expected to outperform the mouse interface only when it can simplify the command through its more versatile input capabilities