Investigating the Usability of Leap Motion Controller: Gesture-Based Interaction with a 3D Virtual Environment
Anthony Bracegirdle
Supervisors: Tanja Mitrovic and Moffat Mathews

Background
- ICTG project
- Strokes
- Prospective memory

Leap Motion
- New: released in 2013
- Cheap: about $100
- Two monochromatic IR cameras and three IR LEDs
- FOV: 150 degrees, to a distance of about two feet
- Claimed to accurately detect all 10 fingers to within one hundredth of a millimetre, at 200 FPS

Why?
- Gesturing is a natural part of everyday life
- A more intuitive form of interaction
- The technology is now available
- Previous study: Oculus Rift, mouse + keyboard, joystick, and Razer Hydra
- Prior research: some on gestures, some on the Leap Motion
- Because I want to
- Because I was told to

Goals
- Investigate gestures as a method of interaction and navigation within the environment
- Investigate the viability of the Leap Motion
- Identify issues with the Leap Motion, e.g. physical fatigue

VR Environment
- Navigation
- Object interaction
- Crouch
- Check time and inventory

Implementation
- Unity and MonoDevelop (C#)
- Leap Motion SDK
- Convert low-level data from the Leap Motion into movement and actions
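The project's actual conversion code is written in C# against the Leap Motion SDK and is not shown in the slides. As a language-neutral illustration of the first step in turning low-level tracker data into movement, the sketch below (Python; the class and parameter names are hypothetical) smooths noisy per-frame palm positions over a short sliding window before they are mapped to actions.

```python
# Illustrative sketch only, not the project's C# code. Names here
# (PalmSmoother, window size) are assumptions: average the last few
# palm positions to damp per-frame tracking jitter.
from collections import deque

class PalmSmoother:
    def __init__(self, window=5):
        # Sliding window of the most recent palm positions
        self._frames = deque(maxlen=window)

    def add(self, palm_pos):
        """palm_pos: (x, y, z) in millimetres from the tracker."""
        self._frames.append(palm_pos)

    def smoothed(self):
        """Per-axis average of the window, or None if no frames yet."""
        if not self._frames:
            return None
        n = len(self._frames)
        return tuple(sum(axis) / n for axis in zip(*self._frames))
```

A small window trades a few frames of latency for much steadier input; at the Leap's claimed 200 FPS, a 5-frame window adds only about 25 ms.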

Actions
- Object interaction
- Ability to check time and inventory
- Ability to crouch

Navigation
- Need to be able to move forward and backward, and to rotate
- Airplane metaphor
- Airplane and gas pedal metaphor (two-handed)
- Positional mode
- Videos
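The slides do not show how positional mode maps hand position to movement, so everything below is an assumption for illustration. One plausible scheme, sketched in Python: treat palm offsets from a neutral rest point as forward/rotate input, with a dead zone so a resting hand causes no drift.

```python
# Illustrative sketch of a "positional mode" mapping. All names and
# thresholds (HandFrame, 20 mm dead zone, 100 mm full-scale) are
# hypothetical, not taken from the project.
from dataclasses import dataclass

@dataclass
class HandFrame:
    palm_x: float  # mm right (+) / left (-) of the neutral point
    palm_z: float  # mm toward (+) / away from (-) the user

@dataclass
class MoveCommand:
    forward: float  # -1.0 .. 1.0
    rotate: float   # -1.0 .. 1.0

def frame_to_command(frame, dead_zone=20.0, scale=100.0):
    """Map palm offsets into a clamped movement command."""
    def axis(offset_mm):
        if abs(offset_mm) < dead_zone:
            return 0.0  # inside the dead zone: no movement
        sign = 1.0 if offset_mm > 0 else -1.0
        return sign * min((abs(offset_mm) - dead_zone) / scale, 1.0)

    return MoveCommand(
        forward=axis(-frame.palm_z),  # push the hand away to move forward
        rotate=axis(frame.palm_x),    # move the hand right to turn right
    )
```

Note that this mapping drives forward motion and rotation from independent axes of one hand, which is one way to allow the simultaneous move-and-rotate that participants asked for.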

Issues
- Multiple hand detection: faces and watches misdetected as hands
- SDK built-in gestures unreliable
- Finger tracking unreliable, motivating discrete methods
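The slides do not say which discrete methods were used, so the following is only an illustration of the general idea: when the per-frame finger count flickers, debounce it with a majority vote over a sliding window of recent frames (all names below are hypothetical).

```python
# Illustrative sketch (hypothetical names): stabilise an unreliable
# per-frame finger count by taking the most common value over the
# last few frames, so single-frame glitches are ignored.
from collections import Counter, deque

class FingerCountDebouncer:
    def __init__(self, window=7):
        self._counts = deque(maxlen=window)

    def update(self, raw_count):
        """Feed the raw per-frame finger count; return the stable count
        (most common value in the window; ties go to the value that
        appears earliest in the window)."""
        self._counts.append(raw_count)
        return Counter(self._counts).most_common(1)[0][0]
```

The cost is that a genuine change of finger count is only reported once it dominates the window, i.e. after a few frames of delay.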

Study
- Preliminary pilot study
- 30 participants
- Trialling the modes
- Data recorded
- Questionnaire

Preliminary Results
- Positive feedback
- Positional mode considered the most user-friendly
- Crouching difficult for some
- Participants wanted the ability to move and rotate simultaneously

Future Work
- Analyse the collected data
- Compare with the previous study
- Remove the keyboard
- Trial with a stroke patient