First-Person Tele-Operation of a Humanoid Robot
Lars Fritsche, Felix Unverzagt, Jan Peters and Roberto Calandra

OUTLINE
- Introduction
- Hardware Setting
- Control Architecture
- Operator-Robot Correspondence
- Experimental Results
- Discussion & Conclusion

INTRODUCTION
Augmented reality: a live view of a real-world environment whose elements are augmented by computer-generated sensory input. It provides first-person vision for the robot operator.
Tele-operation of robots: operating a robot at a distance. It makes it possible to perform complex tasks in environments that are inaccessible or too dangerous for humans. Examples:
- Epidemic areas
- Surgical applications
- Damaged nuclear plants (e.g., Fukushima)
- Space and underwater exploration

HARDWARE SETTING
- Microsoft Kinect v2: collects human motion data
- Oculus Rift Headset: augmented-reality headset
- SensorGlove: detects grasping motions and provides haptic feedback
- iCub: the humanoid robot

Microsoft Kinect v2
Combines a camera and a depth sensor to retrieve 3D skeleton data from the operator. (Image from "First-person tele-operation of a humanoid robot".)

Oculus Rift Headset
A virtual and augmented reality headset with OLED displays and accurate, low-latency positional head tracking.
Sensor Glove
Tracks the operator's finger motions and uses vibration motors to provide haptic feedback at the operator's fingertips, according to the pressure measured at the fingertips of the robot.
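As a rough illustration of the pressure-to-vibration coupling just described, the Python sketch below maps a fingertip pressure reading to a vibration intensity; the glove interface (`set_vibration`) and the scaling constants are hypothetical, not the actual SensorGlove driver from the paper.

```python
# Hypothetical sketch: map iCub fingertip pressure to glove vibration intensity.
# The glove API used here (set_vibration) is an assumption for illustration.

def pressure_to_vibration(pressure, p_max=255.0, duty_max=100):
    """Linearly map a fingertip pressure reading in [0, p_max] to a PWM duty cycle."""
    p = max(0.0, min(pressure, p_max))      # clamp noisy or out-of-range readings
    return int(duty_max * p / p_max)

def update_haptics(glove, fingertip_pressures):
    """Send one vibration command per finger (thumb .. pinky)."""
    for finger, pressure in enumerate(fingertip_pressures):
        glove.set_vibration(finger, pressure_to_vibration(pressure))
```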

iCub
The iCub is a humanoid robot about 1 meter tall and weighing 24 kilograms. It has 53 degrees of freedom (DOFs), 30 of which are used during tele-operation. The robot has one camera in each eye, providing images at a maximum resolution of 640 x 480 pixels. Additionally, the robot has tactile sensors on its fingertips to measure pressure.

CONTROL ARCHITECTURE
(Figure from "First-person tele-operation of a humanoid robot".)

YARP: Yet Another Robot Platform
YARP is a software package for interconnecting the sensors, processors, and actuators of a robot. The iCub uses YARP to define the input and output ports for its control.
Advantage: modular and easily extensible.
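For context, a minimal sketch of how a control module can publish data over a YARP port using the official Python bindings; the port name and joint values are illustrative assumptions, not the ones used in the paper.

```python
# Minimal sketch: publish joint targets over a YARP port.
import yarp

yarp.Network.init()

out_port = yarp.BufferedPortBottle()
out_port.open("/teleop/left_arm/command")          # hypothetical port name

def send_joint_targets(angles_deg):
    """Write one bottle containing the target joint angles (degrees)."""
    bottle = out_port.prepare()
    bottle.clear()
    for angle in angles_deg:
        bottle.addFloat64(angle)                   # addDouble in older bindings
    out_port.write()

send_joint_targets([0.0, 15.0, -10.0, 45.0])       # e.g. 4 arm DOFs
out_port.close()
yarp.Network.fini()
```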

Control Signal Filtering
The retrieved signals are noisy and, if applied directly to the robot, would lead to very jerky movements that could damage the motors. Butterworth filters are therefore used. Two individual filters are designed (sketched below):
- The filter for the Kinect is of eighth order.
- The filters for the SensorGlove and the Oculus Rift are of fourth order.
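The sketch below shows how such filtering could be done in Python with SciPy; a low-pass design is assumed, and the cutoff and sampling frequencies are illustrative guesses, since the slides only specify the filter orders.

```python
# Sketch of Butterworth filtering of the tele-operation signals (SciPy).
# Filter orders follow the slide (8th for Kinect, 4th for glove/Oculus);
# cutoff and sampling frequencies here are illustrative assumptions.
import numpy as np
from scipy.signal import butter, lfilter

def make_lowpass(order, cutoff_hz, fs_hz):
    """Design a digital low-pass Butterworth filter (returns b, a coefficients)."""
    return butter(order, cutoff_hz, btype="low", fs=fs_hz)

b_kinect, a_kinect = make_lowpass(order=8, cutoff_hz=2.0, fs_hz=30.0)   # Kinect skeleton stream
b_glove,  a_glove  = make_lowpass(order=4, cutoff_hz=5.0, fs_hz=100.0)  # SensorGlove / Oculus

# Example: smooth a noisy joint-angle trajectory before sending it to the robot.
raw = 30.0 + np.random.randn(300)            # fake noisy signal around 30 degrees
smoothed = lfilter(b_kinect, a_kinect, raw)
```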

Safety Routines
Multiple safety routines had to be implemented to guarantee robust tele-operation under different conditions, even with non-expert operators (a sketch follows below).
- Thresholds are implemented for each joint to limit the workspace.
- A maximum step size limits the command signal so that the robot cannot be forced beyond its own bounds.
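A minimal sketch of these two safety routines, with illustrative limit values rather than the iCub's actual joint bounds:

```python
# Sketch of the two safety routines: per-joint workspace limits and a
# maximum step size. The limit values below are illustrative only.

def safe_command(target, previous, joint_min, joint_max, max_step):
    """Clamp a joint target to its workspace limits and rate-limit the change."""
    # 1) keep the target inside the allowed joint range
    target = max(joint_min, min(target, joint_max))
    # 2) never move more than max_step away from the previous command
    step = max(-max_step, min(target - previous, max_step))
    return previous + step

# Example: elbow limited to [0, 105] degrees, at most 2 degrees per control tick.
cmd = safe_command(target=140.0, previous=90.0,
                   joint_min=0.0, joint_max=105.0, max_step=2.0)
# cmd == 92.0
```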

Operator-Robot Correspondence
- Mapping of the arm joints: 4 DOFs on each arm
- Mapping of the torso joints: 3 DOFs
- Mapping of the head joints: 5 DOFs
- Mapping of the hand joints: 7 DOFs on each hand
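One way to picture this correspondence is as a simple lookup table from body part to the tele-operated DOFs; the joint names below are illustrative assumptions, only the DOF counts come from the slide.

```python
# Illustrative representation of the operator-to-robot joint correspondence.
# DOF counts follow the slide; the joint names are placeholders.
ARM_JOINTS  = ["shoulder_pitch", "shoulder_roll", "shoulder_yaw", "elbow"]          # 4 per arm
HAND_JOINTS = ["thumb_opp", "thumb_prox", "thumb_dist", "index",
               "middle", "ring_pinky", "spread"]                                    # 7 per hand

TELEOP_DOFS = {
    "left_arm": ARM_JOINTS, "right_arm": ARM_JOINTS,
    "torso": ["yaw", "roll", "pitch"],                                              # 3 DOFs
    "head":  ["neck_pitch", "neck_roll", "neck_yaw", "eyes_tilt", "eyes_pan"],      # 5 DOFs
    "left_hand": HAND_JOINTS, "right_hand": HAND_JOINTS,
}

# Consistency check: 4 + 4 + 3 + 5 + 7 + 7 = 30 DOFs used in tele-operation.
assert sum(len(v) for v in TELEOP_DOFS.values()) == 30
```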

Experimental Results: Head Motion

Head Motion
(Figure from "First-person tele-operation of a humanoid robot".)

Experimental Results

Time delay for the Kinect, the Oculus Rift, and the Sensor Glove. (Figures from "First-person tele-operation of a humanoid robot".)

DISCUSSION & CONCLUSIONS
- This paper presented an affordable approach to first-person tele-operation of a humanoid robot through augmented-reality motion tracking and tactile feedback, using the Microsoft Kinect, Oculus Rift and SensorGlove.
- Augmented reality on the humanoid robot may help achieve more natural and efficient tele-operation.
- Shortcoming: high latency for arm movements during tele-operation.

REFERENCES
- Fritsche, Lars, et al. "First-person tele-operation of a humanoid robot." Humanoid Robots (Humanoids), 2015 IEEE-RAS 15th International Conference on. IEEE, 2015.
- "Official Kinect website", accessed:
- "First-Person Tele-Operation of the Humanoid Robot iCub (HUMANOIDS 2015)"
- "Controlling the head of the iCub using the Oculus Rift"
- Metta, Giorgio, Paul Fitzpatrick, and Lorenzo Natale. "YARP: yet another robot platform." International Journal on Advanced Robotics Systems 3.1 (2006).
- "Official YARP website", accessed:
- "Wikipedia: Augmented reality", accessed:
- "Wikipedia: Butterworth filter", accessed:

Thank you.