The Implementation of a Glove-Based User Interface Chris Carey.

1 The Implementation of a Glove-Based User Interface Chris Carey

2 Abstract
The advantages of multi-touch interfaces are being utilized in numerous applications. This project aims to go a step further: a glove-based interface provides the utility of a multi-touch interface without the proximity restriction. More natural human-computer interaction can also improve efficiency on complicated tasks.

3 Background
Why now?
– Accessibility of Technology
– Increased Application Sophistication
– Usage in Restrictive Environments
Why not?

4 Past and Current Systems
Glove Systems
– Haptic Gloves and VR Systems
– Full Motion-Capture Glove Systems
– Basic Wiimote Glove Systems
Non-Glove Systems
– Neural-Network Hand Gesture Recognition
– 3D Model-Reconstruction Gesture Recognition

5 Project Goals
Focuses:
– Speed
– Accuracy
– Task Simplification
– Improved User Experience

6 Hardware Implementation
Logitech Webcam
– IR-blocking filter removed
– Visible-light-blocking filter added
IR LED Glove
– 3 IR LEDs
– 2 × 1.5 V button-cell batteries

7 Software Implementation
Java and the Java Media Framework
Custom LED Detection
LED Tracking
Gesture Recognition
Command Execution

8 LED Detection
Binary Rasterization
Brightness-Threshold Determination
Blob Comparison
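The detection steps above could be sketched roughly as follows. This is an illustrative reconstruction, not the project's actual code: the class and method names are invented, and a real implementation would operate on camera frames from the Java Media Framework rather than plain integer arrays.

```java
// Illustrative sketch of the detection pipeline: threshold a grayscale
// frame into a binary raster, then group bright pixels into blobs.
public class LedDetector {
    /** Binary rasterization: mark pixels at or above the brightness threshold. */
    public static boolean[][] rasterize(int[][] gray, int threshold) {
        int h = gray.length, w = gray[0].length;
        boolean[][] bin = new boolean[h][w];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                bin[y][x] = gray[y][x] >= threshold;
        return bin;
    }

    /** Count connected blobs (4-connectivity) with an iterative flood fill. */
    public static int countBlobs(boolean[][] bin) {
        int h = bin.length, w = bin[0].length, blobs = 0;
        boolean[][] seen = new boolean[h][w];
        java.util.Deque<int[]> stack = new java.util.ArrayDeque<>();
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                if (!bin[y][x] || seen[y][x]) continue;
                blobs++;                       // found the seed of a new blob
                stack.push(new int[]{y, x});
                while (!stack.isEmpty()) {
                    int[] p = stack.pop();
                    int py = p[0], px = p[1];
                    if (py < 0 || py >= h || px < 0 || px >= w) continue;
                    if (!bin[py][px] || seen[py][px]) continue;
                    seen[py][px] = true;
                    stack.push(new int[]{py + 1, px});
                    stack.push(new int[]{py - 1, px});
                    stack.push(new int[]{py, px + 1});
                    stack.push(new int[]{py, px - 1});
                }
            }
        }
        return blobs;
    }
}
```

In this sketch, blob comparison against the expected LED count (three per glove) would reject frames where noise produces too many or too few blobs.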

9 LED Tracking
LED object class
– Records previous positions and velocities
– Predicts the next position for faster location
Balance between detected LEDs and tracked LEDs
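A minimal sketch of such an LED object, assuming a simple constant-velocity predictor (the project's actual class may record a longer history):

```java
// Sketch of a tracked LED: stores the last position and frame-to-frame
// velocity, and predicts where to search for the LED in the next frame.
public class TrackedLed {
    double x, y, vx, vy;

    public TrackedLed(double x, double y) {
        this.x = x;
        this.y = y;
    }

    /** Update with a newly detected position; velocity is the per-frame delta. */
    public void update(double nx, double ny) {
        vx = nx - x;
        vy = ny - y;
        x = nx;
        y = ny;
    }

    /** Constant-velocity prediction of the next position, narrowing the search window. */
    public double[] predict() {
        return new double[]{x + vx, y + vy};
    }
}
```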

10 Gesture Recognition
Static Gestures
– Do not depend on absolute location
– Performed and executed once
Dynamic Gestures
– Do depend on absolute location
– Performed and executed continuously

11 Static Gesture: Minimize
Triggered by a decreasing distance between the three LEDs
The java.awt.Robot class executes the keystroke sequence Alt+Space, then 'n'
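The minimize gesture could look roughly like this. The pinch test (comparing the summed pairwise distances of the three LEDs across frames) is an assumed detail, and the Alt+Space, 'n' keystroke sequence is Windows-specific, as the slide describes:

```java
import java.awt.AWTException;
import java.awt.Robot;
import java.awt.event.KeyEvent;

// Hypothetical sketch: detect a three-LED "pinch" by comparing summed
// pairwise distances across frames, then fire the minimize keystrokes.
public class MinimizeGesture {
    /** Sum of pairwise distances between three points, packed as {x0,y0,x1,y1,x2,y2}. */
    public static double spread(double[] p) {
        return dist(p[0], p[1], p[2], p[3])
             + dist(p[0], p[1], p[4], p[5])
             + dist(p[2], p[3], p[4], p[5]);
    }

    static double dist(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    /** True when the three LEDs have moved closer together since the last frame. */
    public static boolean isPinch(double[] prev, double[] cur) {
        return spread(cur) < spread(prev);
    }

    /** Alt+Space opens the window menu; 'n' selects Minimize (Windows). */
    public static void minimizeActiveWindow() throws AWTException {
        Robot robot = new Robot();
        robot.keyPress(KeyEvent.VK_ALT);
        robot.keyPress(KeyEvent.VK_SPACE);
        robot.keyRelease(KeyEvent.VK_SPACE);
        robot.keyRelease(KeyEvent.VK_ALT);
        robot.keyPress(KeyEvent.VK_N);
        robot.keyRelease(KeyEvent.VK_N);
    }
}
```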

12 Dynamic Gesture: Mouse Pointer
Tracks the LED with the greatest y-value
Executed when no other gesture is recognized
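A sketch of the pointer gesture, under stated assumptions: `pointerLed` is an invented helper name, and the camera-to-screen coordinate mapping is omitted (here the LED position is passed to `mouseMove` directly):

```java
import java.awt.AWTException;
import java.awt.Point;
import java.awt.Robot;
import java.util.List;

// Sketch: among the detected LEDs, the one with the greatest y-value
// drives the cursor via Robot.mouseMove. Illustrative, not project code.
public class PointerGesture {
    /** Select the LED with the greatest y-value (assumes a non-empty list). */
    public static Point pointerLed(List<Point> leds) {
        Point best = leds.get(0);
        for (Point p : leds)
            if (p.y > best.y) best = p;
        return best;
    }

    /** Move the cursor to the selected LED's position (coordinate mapping omitted). */
    public static void moveCursor(List<Point> leds) throws AWTException {
        Point p = pointerLed(leds);
        new Robot().mouseMove(p.x, p.y);
    }
}
```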

13 Dynamic Gesture: Drag and Drop
Triggered when the distance between two LEDs decreases at a minimum velocity
DRAG: the minimum distance between the LEDs is maintained
DROP: the distance between the LEDs exceeds the minimum distance
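The drag-and-drop logic above can be read as a small state machine. This is a sketch under assumptions: the threshold constants are invented for illustration, and the project's actual conditions may differ.

```java
// Illustrative state machine: a drag starts when the two-LED distance
// shrinks faster than a minimum closing velocity into pinch range, and
// a drop fires when the distance grows back past the pinch threshold.
public class DragDropGesture {
    static final double MIN_CLOSING_VELOCITY = 3.0; // pixels per frame (made-up)
    static final double PINCH_DISTANCE = 25.0;      // pixels (made-up)

    private boolean dragging = false;
    private double lastDistance = Double.NaN;

    /** Feed the current inter-LED distance; returns "DRAG", "DROP", or "NONE". */
    public String step(double distance) {
        String event = "NONE";
        if (!dragging) {
            double closing = Double.isNaN(lastDistance) ? 0 : lastDistance - distance;
            if (closing >= MIN_CLOSING_VELOCITY && distance <= PINCH_DISTANCE) {
                dragging = true;
                event = "DRAG";
            }
        } else if (distance > PINCH_DISTANCE) {
            dragging = false;
            event = "DROP";
        }
        lastDistance = distance;
        return event;
    }
}
```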

14 Planned Gestures
Single-Hand Gestures
– Mouse Click
– Mouse Scroll
– Window Maximize/Restore
Two-Handed Gestures
– Window Selection
– Object Resize
– Object Zoom
– Object Rotate

15 Preliminary Analysis
Speed
– Mode length of time per iteration: 47 ms
– Slower than necessary: 24 fps (about 41 ms per frame) is required for motion to appear continuous to human perception
Accuracy
– Poor LED detection has led to poor gesture recognition
– Brighter LEDs or a more sensitive camera are necessary

16 Possible Solutions
Brighter IR LEDs
LED pulse-driving circuit
Webcam with night vision
IR narrow-bandpass filter

17 Work Remaining
Improved hardware
Refined LED detection/tracking
Quicker processing
Increased gesture support
Application control

18 Conclusions
Speed and accuracy are still an issue
The Minimize static gesture simplifies the task compared to a mouse interface
The glove interface constantly receives IR light, so gestures activate whenever the IR light does
The requirement of direct contact in multi-touch interfaces ensures consistency

