Body Tracking and Gesture Recognition - Aaron Pulver

M.A.C.S. Body Tracking and Gesture Recognition - Aaron Pulver

Overview
The M.A.C.S. needs to be able to successfully track users and recognize various hand gestures and/or body poses. Several sensor and software options were investigated to determine the most efficient combination.

Related Marketing Requirements

1. Must autonomously follow an individual throughout a flat environment with obstacles.
2. Must adapt in height for different individuals.
3. Must identify an individual and follow only him or her.
4. Must maintain a reasonable distance from the individual.
5. Must respond to up to four hand gestures to aid in mobility.

Engineering Specifications

Spec | Related Marketing Requirement | Description
1 | 1, 4 | Must be able to move at up to 5 mph, with a desired average speed of 3 mph, while continuously tracking the user's position.
2 | 4 | The M.A.C.S. should be no more than two feet from the followed individual at any time. The Kinect/sensor will be mounted on the back of the cart.
3 | 5 | The M.A.C.S. should have at least 65% accuracy when classifying gestures.

Additional specifications: the M.A.C.S. should identify the correct user by combining smartphone data with the Kinect sensor; should the user be out of view, recalibration should occur within 3 seconds of seeing the person again; body tracking should handle children (4 ft) through large adults (6 ft 6 in).

Risk Investigation (Hardware)

Sensor | Ease of use | Availability | Cost
Webcam/Camera | Difficult | Yes | $25-200 new
Xtion Pro | Moderate | No (back-ordered) | $179 new
MS Kinect | - | - | $109.99 new, $50 used (free for us for now)

Risk Investigation (Software)

Library | Ease of use | Platforms | Skeleton Tracking | Languages
OpenCV | Difficult | Linux, Windows | No | C++, Python, Java (wrappers for other languages)
MS SDK | Easy | Windows | Yes | C#, VB.NET
Libfreenect | Moderate | - | - | C, C++, C#, VB.NET, Java
OpenNI | - | Multi-platform | Yes (high-level libraries) | C++ (wrappers for .NET and Java)

Why the Kinect and OpenNI?

- Price and availability.
- The Kinect has very similar attributes to the Xtion Pro, but it is more widely used.
- OpenNI has high-level libraries for skeleton tracking.
- OpenNI is multi-platform and multi-language.
- OpenNI is actively supported and developed by PrimeSense.
- OpenNI has good documentation and examples.

Tracking Risk Mitigation

- Mount the Kinect high on the cart, at the back.
- Skeleton tracking.
- Use accelerometer data from the phone (one possible matching approach is sketched below).
- Communicate to the user if tracking is lost.

Block diagram labels: Tracking Module, Motor Controller, Proximity Sensors, Smartphone user data, Kinect location of user(s)/joint(s), Warning Message/Light, Gesture Recognition.
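The slide above pairs smartphone accelerometer data with Kinect skeletons to keep following the right person. Purely as an illustration (not the team's implementation), the following Python sketch correlates the phone's acceleration profile with the motion of each tracked torso and picks the best match; all function names and the lag-0 correlation score are assumptions.

```python
import numpy as np

def accel_magnitude(samples):
    """Overall acceleration magnitude from raw phone accelerometer samples (N x 3)."""
    return np.linalg.norm(np.asarray(samples, float), axis=1)

def torso_accel_magnitude(positions, dt):
    """Approximate acceleration magnitude of a tracked torso from its 3-D positions (N x 3)."""
    velocity = np.diff(np.asarray(positions, float), axis=0) / dt
    accel = np.diff(velocity, axis=0) / dt
    return np.linalg.norm(accel, axis=1)

def match_user(phone_samples, skeleton_tracks, dt=1.0 / 30.0):
    """Return the Kinect user ID whose torso motion best matches the phone's motion.

    skeleton_tracks maps a Kinect user ID to an (N, 3) array of torso positions
    recorded over the same time window as the phone samples.
    """
    phone = accel_magnitude(phone_samples)
    phone = (phone - phone.mean()) / (phone.std() + 1e-9)
    best_id, best_score = None, -np.inf
    for user_id, positions in skeleton_tracks.items():
        accel = torso_accel_magnitude(positions, dt)
        n = min(len(phone), len(accel))
        if n < 2:
            continue
        a = (accel[:n] - accel[:n].mean()) / (accel[:n].std() + 1e-9)
        score = float(np.dot(phone[:n], a) / n)  # normalized correlation at lag 0
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id, best_score
```

In practice the phone and Kinect streams would need time alignment and resampling to a common rate before such a comparison; the sketch assumes that has already been done.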

Kinect Placement

[Placement diagram: the Kinect and the proximity sensors on the cart, with the Kinect's 43° vertical field of view indicated.]
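Purely as a geometric sanity check (not from the slides), the sketch below uses the 43° vertical field of view to estimate how much vertical extent the Kinect covers at a given range, and how far away a person of a given height must stand to be fully visible. The 4 ft and 6 ft 6 in heights come from the engineering specification; the assumption that the sensor is aimed at the person's mid-height is mine.

```python
import math

VERTICAL_FOV_DEG = 43.0  # Kinect vertical field of view, per the placement slide
FT_TO_M = 0.3048

def visible_extent_m(distance_m):
    """Vertical extent (in meters) covered by the sensor at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(VERTICAL_FOV_DEG / 2.0))

def min_full_body_distance_m(height_m):
    """Distance at which a person of the given height just fits in the vertical view,
    assuming the sensor points at the person's mid-height."""
    return height_m / (2.0 * math.tan(math.radians(VERTICAL_FOV_DEG / 2.0)))

for height_ft in (4.0, 6.5):  # child through large adult, per the specification
    d = min_full_body_distance_m(height_ft * FT_TO_M)
    print(f"{height_ft:.1f} ft person fully visible from about {d:.2f} m ({d / FT_TO_M:.1f} ft)")
```

For a 6 ft 6 in person this works out to roughly 2.5 m, so at the specified two-foot following distance the sensor would see only part of the body; that is an observation from the geometry above, not a claim from the slides.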

Gesture Recognition Risk Mitigation

- Skeleton tracking of joint positions/rotations.
- Machine learning of gestures: Support Vector Machine or Dynamic Time Warping (a generic DTW sketch follows below).
- LIBSVM, an open-source SVM library.
- Cross-validation to verify correct learning.

Processing pipeline: Raw Kinect Sensor Data -> OpenNI -> More Preprocessing -> LIBSVM and Classification -> Controller.
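Dynamic time warping is named on the slide as one classification option; a minimal, textbook DTW distance in Python is sketched below for comparing a recorded joint trajectory against labeled gesture templates. The per-frame feature representation (e.g. joint coordinates) and the nearest-template classifier are assumptions, not the project's actual design.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two gesture sequences.

    Each sequence is an (N, D) array of per-frame features, e.g. joint coordinates.
    """
    a, b = np.asarray(seq_a, float), np.asarray(seq_b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

def classify(sample, templates):
    """Return the label of the template nearest to the sample under DTW distance."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

An SVM over fixed-length features (the other option named on the slide) would instead require resampling each gesture to a fixed number of frames before training with LIBSVM.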

Parts List

Part | Description | Cost | Our Cost | Availability
MS Kinect (Xbox 360) | A depth, image, and IR sensor. | $109.99 new, $50.00 used | $40.00 (used) | Amazon, Newegg, Craigslist (used)
Android Smartphone | Smartphone to stream accelerometer data. | $199.99 | Free | N/A
Laptop (Linux) | A laptop to process the Kinect data. | $299.99 | - | -

Testing Strategy (Tracking)

- User walking down a hallway: turn left; walk straight; turn left, then left again (around a wall or similar obstacle). Requires a robot that moves.
- Multiple users in the field of view: track the user -> another user enters -> users cross paths several times -> users obstruct each other for 3 seconds -> the initial user leaves the FOV -> the initial user returns to the FOV.
- If the initial user leaves the FOV for too long, the M.A.C.S. must recalibrate with a gesture.

Testing Strategy (Gestures)

- Gesture recognition testing is embedded in the machine learning process, via cross-validation (see the sketch below).
- If the recorded test data meets or exceeds the 65% classification rate, the recognition is considered successful.
- Real-time testing can also be done, but it should yield results similar to those on the recorded test data.
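Below is a sketch of how that threshold check might look using LIBSVM's cross-validation option, assuming LIBSVM's bundled Python interface (svmutil) and a hypothetical training file in LIBSVM format; the kernel and parameter values are placeholders, not the project's settings.

```python
# Assumes LIBSVM's Python interface is importable; the file name and the
# RBF-kernel parameters below are illustrative placeholders.
from svmutil import svm_read_problem, svm_train

labels, features = svm_read_problem("gesture_features.libsvm")  # hypothetical file

# With '-v N', svm_train performs N-fold cross-validation and returns the
# accuracy (in percent) instead of a trained model.
cv_accuracy = svm_train(labels, features, "-t 2 -c 1 -g 0.125 -v 5")

print(f"Cross-validation accuracy: {cv_accuracy:.1f}%")
print("PASS" if cv_accuracy >= 65.0 else "FAIL: below the 65% specification")
```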

Uncertainties

- Gesture recognition: time and effort required.
- Not as critical as other pieces of the project.
- Data collection and machine learning can be done while the drive base is being built.

Sources

[1] R. C. Browning, E. A. Baker, J. A. Herron, and R. Kram, "Effects of obesity and sex on the energetic cost and preferred speed of walking," Journal of Applied Physiology, vol. 100, no. 2, pp. 390-398, 2006.
[2] E. T. Hall, The Hidden Dimension. Anchor Books, 1966.
[3] "Kinect for Windows Sensor Components and Specifications," Microsoft. Retrieved October 30, 2013. http://msdn.microsoft.com/en-us/library/jj131033.aspx
[4] S. Bhattacharya, B. Czejdo, and N. Perez, "Gesture classification with machine learning using Kinect sensor data," in Proc. Third International Conference on Emerging Applications of Information Technology (EAIT), pp. 348-351, Nov. 30-Dec. 1, 2012.
[5] S. Bodiroza, G. Doisy, and V. V. Hafner, "Position-invariant, real-time gesture recognition based on dynamic time warping," in Proc. 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 87-88, March 3-6, 2013.
[6] "Xtion PRO," Asus. Retrieved October 31, 2013. http://www.asus.com/Multimedia/Xtion_PRO/#specifications

Questions?