Investigating the Usability of the Leap Motion Controller: Gesture-Based Interaction with a 3D Virtual Environment
Anthony Bracegirdle
Supervisors: Tanja Mitrovic and Moffat Mathews
Background ICTG project Strokes Prospective memory
Leap Motion New: released in 2013 Cheap: about $100 Two monochromatic IR cameras and three IR LEDs FOV: 150 degrees, to a distance of about two feet Claimed to accurately track all 10 fingers to within 1/100th of a millimetre at 200 FPS
Why? Gesturing is a natural part of everyday life A more intuitive form of interaction The technology is now available Previous study – Oculus Rift, mouse + keyboard, joystick and Razer Hydra Prior research – some on gestures, some on the Leap Motion Because I want to Because I was told to
Goals Investigate gestures as a method of interaction and navigation within the environment Investigate the viability of the Leap Motion Identify issues with the Leap Motion, e.g. physical fatigue
VR Environment Navigation Object interaction Crouch Check time and inventory
Implementation Unity and MonoDevelop C# Leap Motion SDK Converts the low-level data from the Leap Motion into movement and actions
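The conversion step above can be sketched as mapping palm orientation angles into movement commands. The real implementation is C# against the Leap Motion SDK inside Unity; this is a minimal, language-agnostic Python sketch in which the data structure, deadzone size, and sign conventions are all illustrative assumptions, not the SDK's actual types.

```python
# Illustrative sketch: converting low-level palm orientation into
# movement commands. Field names and thresholds are assumptions for
# illustration; the actual project code is C# using the Leap SDK.
from dataclasses import dataclass

@dataclass
class PalmSample:
    pitch: float  # radians: hand tilted forward/back
    roll: float   # radians: hand tilted left/right

DEADZONE = 0.15  # radians of tilt ignored, so sensor jitter does not cause drift

def to_command(palm):
    """Map one palm sample to a forward speed and a turn rate in [-1, 1]."""
    def shaped(angle):
        if abs(angle) < DEADZONE:
            return 0.0
        sign = 1.0 if angle > 0 else -1.0
        # Ramp up smoothly from the deadzone edge instead of jumping,
        # and clamp so extreme tilts saturate at full speed.
        return sign * min(1.0, (abs(angle) - DEADZONE) / (1.0 - DEADZONE))

    return {
        "forward": shaped(-palm.pitch),  # pitching the hand down moves forward
        "turn": shaped(palm.roll),       # rolling the hand rotates the view
    }
```

A level hand inside the deadzone produces no motion, which is one way to keep the avatar still while the user's hand rests.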
Actions Object interaction Ability to check time and inventory Ability to crouch
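Of the actions above, crouching is the kind of state toggle that benefits from hysteresis: two separate thresholds stop noise near a single cut-off from making the avatar bounce between standing and crouching. A minimal Python sketch, with palm heights and threshold values as illustrative assumptions (the real implementation is C#/Unity):

```python
# Illustrative sketch of a crouch toggle driven by palm height above the
# sensor (mm), with hysteresis so jitter near the threshold does not
# flip the state rapidly. The specific heights are assumptions.
CROUCH_BELOW = 120.0  # start crouching when the palm drops below this
STAND_ABOVE = 160.0   # stand again only once the palm rises above this

def next_crouch_state(crouching, palm_height):
    """Return the new crouch state given the current one and the palm height."""
    if not crouching and palm_height < CROUCH_BELOW:
        return True
    if crouching and palm_height > STAND_ABOVE:
        return False
    return crouching  # inside the hysteresis band: keep the current state
```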
Navigation Need to be able to move forward and backward, and to rotate Airplane metaphor Airplane and gas pedal metaphor, two-handed Positional mode Videos
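The positional mode listed above can be sketched as mapping the palm's offset from a calibrated neutral point to a velocity, with a small neutral box in which the avatar stays still. The axis conventions, box size, and gain here are illustrative assumptions (the real code is C# in Unity):

```python
# Illustrative sketch of a positional navigation mode: the palm's offset
# from a neutral point drives movement; inside a small neutral box the
# avatar does not move. Gain and box size are assumed values.
NEUTRAL_BOX = 30.0   # mm of palm travel treated as "at rest"
GAIN = 0.02          # velocity units per mm of offset beyond the box

def positional_velocity(palm_pos, neutral_pos):
    """Return (forward, strafe) velocity from a (z, x) palm position in mm."""
    def axis(offset):
        if abs(offset) <= NEUTRAL_BOX:
            return 0.0
        sign = 1.0 if offset > 0 else -1.0
        return sign * (abs(offset) - NEUTRAL_BOX) * GAIN

    dz = neutral_pos[0] - palm_pos[0]  # pushing the hand away -> forward
    dx = palm_pos[1] - neutral_pos[1]  # moving the hand sideways -> strafe
    return axis(dz), axis(dx)
```

Because both axes are computed independently, a scheme like this also permits moving and rotating (or strafing) at the same time, which the preliminary feedback asked for.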
Issues Spurious multiple hand detection – faces and watches misread as hands SDK built-in gestures unreliable Finger tracking unreliable – discrete methods used instead
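One common way to damp spurious detections like those above is to only trust a hand once its id has persisted for several consecutive frames, so a face or watch briefly classified as a hand never triggers an action. A minimal Python sketch; the frame shape and persistence threshold are assumptions, not what the project necessarily did:

```python
# Illustrative sketch: ignore hand ids until they have survived several
# consecutive frames, filtering out one-frame false positives such as a
# face or watch misdetected as a hand. Threshold is an assumed value.
from collections import defaultdict

PERSIST_FRAMES = 5  # frames a hand id must survive before it is trusted

class HandFilter:
    def __init__(self):
        self.seen = defaultdict(int)  # hand id -> consecutive frames seen

    def update(self, hand_ids):
        """Feed one frame's detected hand ids; return the trusted ids."""
        for hid in list(self.seen):
            if hid not in hand_ids:
                del self.seen[hid]  # detection lapsed: reset its count
        for hid in hand_ids:
            self.seen[hid] += 1
        return {hid for hid, n in self.seen.items() if n >= PERSIST_FRAMES}
```

The trade-off is a few frames of added latency before a genuine hand starts controlling the avatar.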
Study Preliminary pilot study 30 participants Trialing the modes Data recorded Questionnaire
Preliminary Results Positive feedback Positional mode considered the most user-friendly Crouching difficult for some Participants wanted to be able to move and rotate simultaneously
Future Work Analyse the recorded data Compare with the previous study Remove the keyboard Trial with a stroke patient