Robotic Application of Human Hand Gestures
Ali El-Gabri, Al-Noor Academy
Nathaniel Mahowald, Mass Academy
Grad Students: Dimitri Kanoulas and Andreas Ten Pas
PI: Robert Platt
Introduction
The goals of this project are:
1. Making gestures that cause the robot to pick up objects
2. Pointing at the object the user wants the robot to pick up
How?
1. Creating an interface between the computer and the sensor
2. Creating an interface between the sensor and the robot
Materials
The materials used were:
1. ROS Hydro Medusa
   - Robot Operating System; provides several helpful tools
   - Hydro Medusa is the 7th ROS release
2. Xtion Pro
   - Depth sensor, similar to a Kinect
   - Makes gesture tracking precise
3. Baxter Robot
   - Two-armed manipulator
Methods
1. Install ROS Hydro Medusa
2. Install openni_launch (the camera driver)
3. Install openni_tracker (builds a skeleton for any person in front of the camera)
4. Set up a catkin workspace
5. Write Python code to communicate between the components (a minimal node skeleton is sketched below)
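As a concrete illustration of step 5, a minimal rospy node might look like the sketch below; the topic names tracker_events and robot_commands are placeholders for illustration, not the project's actual topics.

#!/usr/bin/env python
# Minimal rospy node skeleton of the kind step 5 refers to: once the catkin
# workspace is set up, a script like this can listen to the tracker's output
# and publish commands for the robot. Topic names are placeholders.
import rospy
from std_msgs.msg import String

def callback(msg):
    # React to whatever the tracker side publishes.
    rospy.loginfo('Received: %s', msg.data)

def main():
    rospy.init_node('gesture_bridge')
    rospy.Subscriber('tracker_events', String, callback)
    pub = rospy.Publisher('robot_commands', String)
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish('idle')
        rate.sleep()

if __name__ == '__main__':
    main()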
Sub-project 1: Directional Pointer
1. Work with Rviz
   - ROS visualization tool; visualizes the camera feed
2. Set up transforms (TF) in Rviz
   - TF keeps track of how 3D frames change over time
   - Operates in a distributed system
3. Created a TF listener in Python
   - Receives coordinate frames
   - Queries for specific transforms between frames
4. Functional code
   - Informs the user of the direction the arm is pointing in x, y, and z (see the sketch below)
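The sketch below shows one way such a TF listener could be written, assuming openni_tracker publishes skeleton frames named left_elbow_1 and left_hand_1 relative to the camera frame openni_depth_frame; the actual project code may differ.

#!/usr/bin/env python
# A minimal sketch of the directional pointer (Sub-project 1), assuming
# openni_tracker publishes skeleton frames such as left_elbow_1 and
# left_hand_1 relative to openni_depth_frame; exact frame names depend on
# the tracker configuration.
import rospy
import tf

def main():
    rospy.init_node('directional_pointer')
    listener = tf.TransformListener()
    rate = rospy.Rate(10)  # poll the skeleton at 10 Hz
    while not rospy.is_shutdown():
        try:
            # Positions of the elbow and hand in the camera frame.
            (elbow, _) = listener.lookupTransform('openni_depth_frame',
                                                  'left_elbow_1', rospy.Time(0))
            (hand, _) = listener.lookupTransform('openni_depth_frame',
                                                 'left_hand_1', rospy.Time(0))
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rate.sleep()
            continue
        # The forearm vector; report the axis it points along most strongly.
        forearm = [hand[i] - elbow[i] for i in range(3)]
        axes = ['x', 'y', 'z']
        dominant = max(range(3), key=lambda i: abs(forearm[i]))
        sign = '+' if forearm[dominant] >= 0 else '-'
        rospy.loginfo('Arm is pointing along %s%s', sign, axes[dominant])
        rate.sleep()

if __name__ == '__main__':
    main()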
Sub-project 2: Body Part Pointer
1. Have two users on screen
2. Point with the left hand
   - Display the body part being pointed at, for either user
   - Display which user is pointing at which
3. How this helps:
   - More work with Rviz, which already recognizes human bodies
   - Experimented with dot products, matrices, and the construction of vectors (see the sketch below)
   - A first step toward pointing at other things
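A sketch of the dot-product idea: take user 1's left forearm as a pointing ray and pick whichever joint of user 2 lines up with it best. The joint list and frame names follow the usual openni_tracker conventions and are assumptions here, not the project's exact code.

#!/usr/bin/env python
# Sketch of the body part pointer (Sub-project 2): score each of user 2's
# joints by how well it lines up with user 1's left forearm ray.
import rospy
import tf
import numpy as np

TARGET_JOINTS = ['head_2', 'torso_2', 'left_hand_2', 'right_hand_2',
                 'left_knee_2', 'right_knee_2']

def position(listener, frame):
    """Position of a skeleton frame in the camera frame as a numpy array."""
    (trans, _) = listener.lookupTransform('openni_depth_frame', frame,
                                          rospy.Time(0))
    return np.array(trans)

def main():
    rospy.init_node('body_part_pointer')
    listener = tf.TransformListener()
    rate = rospy.Rate(5)
    while not rospy.is_shutdown():
        try:
            elbow = position(listener, 'left_elbow_1')
            hand = position(listener, 'left_hand_1')
            # Unit vector along user 1's forearm: the pointing ray.
            ray = (hand - elbow) / np.linalg.norm(hand - elbow)
            # Score each joint by the cosine of the angle between the ray and
            # the direction from the pointing hand to that joint (the dot
            # product of the two unit vectors).
            best_joint, best_score = None, -2.0
            for joint in TARGET_JOINTS:
                to_joint = position(listener, joint) - hand
                score = np.dot(ray, to_joint / np.linalg.norm(to_joint))
                if score > best_score:
                    best_joint, best_score = joint, score
            rospy.loginfo('User 1 is pointing at %s', best_joint)
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            pass
        rate.sleep()

if __name__ == '__main__':
    main()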
Sub-project 3: Gesture Control
1. A method of gesture-based control that requires no fixed frame
2. The first place where we fixed our user build-up problem
3. Went through several drafts of which positions worked
4. The first project in which we worked with the robot (a minimal gesture check is sketched below)
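One way to detect gestures without a fixed frame is to compare body frames directly against each other, as in the sketch below; the two example gestures, the thresholds, and the user-1 frame names are illustrative assumptions, not the project's actual gesture set.

#!/usr/bin/env python
# Sketch of fixed-frame-free gesture detection (Sub-project 3): every test is
# a transform between two body frames, so no camera-fixed frame is needed.
import rospy
import tf

def distance(trans):
    """Length of a translation vector returned by lookupTransform."""
    return sum(v * v for v in trans) ** 0.5

def main():
    rospy.init_node('gesture_control')
    listener = tf.TransformListener()
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        try:
            # Right hand expressed in the head frame and left hand expressed
            # in the torso frame: both are user-relative, never camera-relative.
            (hand_head, _) = listener.lookupTransform('head_1', 'right_hand_1',
                                                      rospy.Time(0))
            (hand_torso, _) = listener.lookupTransform('torso_1', 'left_hand_1',
                                                       rospy.Time(0))
            if distance(hand_head) < 0.20:
                rospy.loginfo('Gesture: open gripper')
            elif distance(hand_torso) < 0.25:
                rospy.loginfo('Gesture: close gripper')
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            pass
        rate.sleep()

if __name__ == '__main__':
    main()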
Final Project: Any Frame Pointer
1. We could not get a "true" pointer without creating a fixed frame; our solution was calibration (sketched below)
2. Extremely accurate
3. Uses the left hand as a signal that the user is pointing
4. Potential extensions
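The sketch below illustrates one possible calibration scheme of this kind: snapshot a body frame once at startup and report all later pointing rays relative to it. The pointing trigger, frame names, and thresholds are assumptions for illustration, not the project's actual implementation.

#!/usr/bin/env python
# Sketch of the calibration idea behind the any-frame pointer: remember the
# torso position once at startup and treat it as a fixed reference for the
# rest of the run, so no camera-fixed frame is required. Holding the left
# hand out is used here as the "I am pointing" signal.
import rospy
import tf
import numpy as np

def position(listener, parent, child):
    """Position of the child frame in the parent frame as a numpy array."""
    (trans, _) = listener.lookupTransform(parent, child, rospy.Time(0))
    return np.array(trans)

def main():
    rospy.init_node('any_frame_pointer')
    listener = tf.TransformListener()
    listener.waitForTransform('openni_depth_frame', 'torso_1',
                              rospy.Time(0), rospy.Duration(10.0))
    # Calibration: remember where the torso is now; all later pointing rays
    # are reported relative to this reference point.
    reference = position(listener, 'openni_depth_frame', 'torso_1')
    rospy.loginfo('Calibrated reference point: %s', reference)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        try:
            # Left hand extended away from the torso signals "I am pointing".
            left = position(listener, 'torso_1', 'left_hand_1')
            if np.linalg.norm(left) < 0.45:
                rate.sleep()
                continue
            elbow = position(listener, 'openni_depth_frame', 'right_elbow_1')
            hand = position(listener, 'openni_depth_frame', 'right_hand_1')
            ray = (hand - elbow) / np.linalg.norm(hand - elbow)
            origin = hand - reference
            rospy.loginfo('Pointing ray: origin %s, direction %s', origin, ray)
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            pass
        rate.sleep()

if __name__ == '__main__':
    main()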
Video of Final Product
Next Steps
1. Use voice-recognition software to interact with the user
2. Create a pointer that runs in any frame without requiring calibration
3. Compile all the code onto a usable device, so that a disabled person could use a robotic arm to pick up the objects they need
Acknowledgements
A special thanks to our very helpful grad students, Dimitri Kanoulas and Andreas Ten Pas.
A very warm appreciation to Robert Platt, our ever-wise PI.
And, of course, to those who made this possible and walked with us every step of the way:
Claire Duggan, Program Director
Kassi Stein, Program Coordinator
Chi-Yin Tse, Program Coordinator