Robotic Application of Human Hand Gestures
Ali El-Gabri, Al-Noor Academy
Nathaniel Mahowald, Mass Academy
Grad Students: Dimitri Kanoulas and Andreas Ten Pas
PI: Robert Platt

Introduction
The fundamental goal of this project is:
1. Making gestures that command the robot to pick up objects.
2. Pointing at the object the user wants the robot to pick up.
How?
1. Creating an interface between the computer and the sensor.
2. Creating an interface between the sensor and the robot.

Materials
1. ROS Hydro Medusa
   Robot Operating System; provides several helpful tools. Hydro Medusa is the 7th ROS release.
2. ASUS Xtion PRO
   A depth sensor similar to the Microsoft Kinect; makes gesture tracking precise.
3. Baxter robot
   A two-armed manipulator.

Methods
1. Install ROS Hydro Medusa.
2. Install openni_launch, the camera driver.
3. Install openni_tracker, which builds a skeleton for any person in front of the sensor.
4. Set up a catkin workspace.
5. Write Python code to communicate with the tracker and the robot (a minimal sketch follows this list).
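openni_tracker publishes the skeleton as TF frames, so "communicating" in step 5 largely means listening to those frames. Below is a minimal sketch of such a node, assuming openni_tracker's user-numbered frame names (torso_1, left_hand_1, and so on); adjust the names to match your setup.

```python
#!/usr/bin/env python
# Minimal sketch: poll the transform from the user's torso to their
# left hand, as published by openni_tracker (frame names assumed).
import rospy
import tf

if __name__ == '__main__':
    rospy.init_node('skeleton_listener')
    listener = tf.TransformListener()
    rate = rospy.Rate(10.0)  # poll at 10 Hz
    while not rospy.is_shutdown():
        try:
            # Position and orientation of the hand relative to the torso.
            (trans, rot) = listener.lookupTransform(
                '/torso_1', '/left_hand_1', rospy.Time(0))
            rospy.loginfo('left hand at %s', trans)
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            pass  # skeleton not being tracked yet
        rate.sleep()
```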

Sub-project 1: Directional Pointer
1. Work with RViz (ROS visualization), which visualizes the camera feed.
2. Set up the transforms (TF) in RViz. TF keeps track of how 3D frames change over time and operates as a distributed system.
3. Create a TF listener in Python that receives coordinate frames and can query for specific transforms between frames.
4. Functional code: informs the user of the direction the arm points along the x, y, and z axes (sketched after this list).
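To illustrate step 4, one way to get the arm's direction out of TF is to subtract the elbow position from the hand position and report the dominant component of that vector. The frame names below follow openni_tracker's convention but are assumptions about the actual code:

```python
# Sketch: estimate which axis the right forearm points along by
# comparing the elbow and hand positions in the camera frame.
# '/openni_depth_frame' and the joint frame names are assumptions.
import rospy

def arm_direction(listener, base='/openni_depth_frame'):
    (elbow, _) = listener.lookupTransform(base, '/right_elbow_1', rospy.Time(0))
    (hand, _) = listener.lookupTransform(base, '/right_hand_1', rospy.Time(0))
    v = [h - e for h, e in zip(hand, elbow)]       # forearm vector
    axis = max(range(3), key=lambda i: abs(v[i]))  # dominant component
    sign = '+' if v[axis] > 0 else '-'
    return sign + 'xyz'[axis]                      # e.g. '+x' or '-z'
```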

Sub-project 2: Body Part Pointer
1. Have two users on screen.
2. Point with the left hand; display the body part being pointed at on either user, and display which user is pointing at which.
3. How this helps:
   More work with RViz, which already recognized human bodies.
   Experimented with dot products, matrices, and the creation of vectors (see the sketch after this list).
   A first step toward pointing at other things.
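A sketch of the dot-product experiment: treat the elbow-to-hand vector as a pointing ray, and pick the body part whose direction from the hand is most aligned with that ray (highest cosine similarity). The function and frame names are illustrative, not the project's actual code:

```python
# Sketch: find the body-part frame best aligned with a pointing ray.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def pointed_body_part(hand, elbow, part_positions):
    """part_positions: dict of frame name (e.g. 'head_2') -> xyz position."""
    ray = normalize([h - e for h, e in zip(hand, elbow)])
    def alignment(part):
        # Cosine of the angle between the ray and the hand-to-part direction.
        to_part = normalize([p - h for p, h in zip(part_positions[part], hand)])
        return dot(ray, to_part)
    return max(part_positions, key=alignment)
```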

Sub-project 3: Gesture Control
1. A method of gesture-based control that does not require a fixed frame (illustrated after this list).
2. The first place where we fixed our user build-up problem.
3. Went through a few drafts of which positions worked.
4. The first sub-project we ran on the robot itself.
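One way gesture checks can work without a fixed frame is to test a joint's position relative to another joint of the same user, which TF supplies directly. The "hand above head" gesture and threshold below are hypothetical stand-ins for whichever positions the project settled on:

```python
# Sketch: frame-independent gesture test. Because the hand is measured
# in the head's own frame, no fixed world frame is needed. The axis
# convention and 0.1 m threshold are assumptions to tune per tracker.
import rospy

def hand_raised(listener, user=1):
    (trans, _) = listener.lookupTransform(
        '/head_%d' % user, '/left_hand_%d' % user, rospy.Time(0))
    return trans[2] > 0.1  # hand roughly 10 cm above the head
```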

Final Project: Any Frame Pointer
1. We couldn't get a "true" pointer without creating a fixed frame; our solution was calibration.
2. Extremely accurate.
3. Uses the left hand as a signal that the user is pointing (sketched after this list).
4. Potential extensions.
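Reusing the hypothetical helpers from the earlier sketches, the left-hand signal might gate the pointer like this:

```python
# Sketch: only compute the pointing direction while the left hand is
# raised. hand_raised() and arm_direction() are the illustrative
# helpers sketched above, not the project's actual functions.
import rospy

def update(listener):
    if hand_raised(listener, user=1):        # left hand = "I'm pointing"
        direction = arm_direction(listener)  # e.g. '+x'
        rospy.loginfo('user 1 is pointing %s', direction)
```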

Video of Final Product

Next Steps
1. Use voice-recognition software to interact with the user.
2. Create a pointer that does not require calibration or a fixed frame to run.
3. Compile all the code onto a usable device, so that a disabled person could use a robotic arm to pick up the objects they need.

Acknowledgements
A special thanks to our very helpful grad students, Dimitri Kanoulas and Andreas Ten Pas, and a very warm appreciation to Robert Platt, our ever-wise PI. And, of course, to those who made it possible and walked us through every step of the way:
Claire Duggan, Program Director
Kassi Stein, Program Coordinator
Chi-Yin Tse, Program Coordinator