First-person Teleoperation of Humanoid Robots

Presentation transcript:

First-person Teleoperation of Humanoid Robots Lars Fritsche, Felix Unverzagt, Jan Peters, Roberto Calandra Presented by Antong Liu

Motivation for Teleoperation Research
- Encoding joint movements for learning
- Military and industry: unstructured tasks in inaccessible environments (radioactive sites, disaster sites, deep oceans)
- Medicine: long-distance medicine, minimally invasive surgery

Tools for Teleoperation
- Direct manipulation: joystick control of individual joints
- Joint tracking: motion tracking through the Microsoft Kinect, or full-body tracking suits

Limitation of Current Methods
- Direct visual contact between operator and robot is required

Proposed Method
- Oculus Rift virtual reality goggles
- SensorGlove haptic feedback gloves

Experiment Setup Blue arrows show the flow of information from the operator to the robot; red arrows show feedback from the robot to the operator

Components: Microsoft Kinect
- Collects motion data of the operator's body
- Uses a camera with a depth sensor to generate a 3D skeleton
- Low price; requires no special cameras or markers
- Prone to noise and cannot handle occlusion
- Update frequency limited to 30 Hz

Components: Oculus Rift
- Head-mounted virtual reality display
- High-resolution display split vertically, one half per eye
- Gyroscopic tracking of head movement at 1 kHz

Components: SensorGlove
- Haptic feedback sensor gloves
- Track finger motion at 350 Hz

Components: iCub Robot
- 104 cm tall, 24 kg humanoid robot
- 53 total degrees of freedom, 30 used in this experiment:
  - 3 in the torso
  - 5 in the head
  - 4 in each arm
  - 7 in each hand
- Tactile sensors in the fingertips
- 640x480 camera in each eye

Components: Controller
- 7 DoF arms calculated from the positions of the operator's shoulder, elbow, and wrist
- 3 DoF torso, derived from the spine, hips, and shoulders, controlled under the assumption that the operator is upright (spine aligned with the gravity vector)
- 5 DoF head controlled by the Oculus Rift orientation; movement exceeding the iCub's head limits is outsourced to the eye DoF
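The arm mapping above works from three tracked 3D points. As an illustration of the kind of geometry involved (not the paper's actual solver), the sketch below recovers the elbow flexion angle from the shoulder, elbow, and wrist positions; the function name and interface are assumptions.

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Elbow flexion angle (radians) from three tracked 3D points:
    the angle at the elbow between the upper-arm and forearm vectors.
    A simplified stand-in for one component of the arm mapping."""
    u = [s - e for s, e in zip(shoulder, elbow)]   # elbow -> shoulder
    f = [w - e for w, e in zip(wrist, elbow)]      # elbow -> wrist
    dot = sum(a * b for a, b in zip(u, f))
    nu = math.sqrt(sum(a * a for a in u))
    nf = math.sqrt(sum(a * a for a in f))
    # Clamp to guard against rounding error before acos
    return math.acos(max(-1.0, min(1.0, dot / (nu * nf))))
```

The same pattern (angles between limb-segment vectors) extends to the shoulder and torso joints.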

Components: Controller
- 7 DoF hands controlled by finger bending sensed by the glove
- Thumb, index, and middle fingers are independent
- Ring and pinky fingers are controlled by a single DoF
- The SensorGlove returns haptic feedback proportional to the largest pressure on the iCub's 12 tactile sensors
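A minimal sketch of the two mappings on this slide, under stated assumptions: glove readings are normalized bend values per finger, the grouping collapses ring and pinky into one command as described, and the feedback signal is simply proportional to the largest tactile reading. Names and scaling are illustrative, not the paper's API.

```python
def hand_command(bend):
    """Group five glove bend readings (0..1) per the scheme above:
    thumb/index/middle independent, ring+pinky share one DoF.
    (The real hand uses 7 DoF; this shows only the grouping.)"""
    return {
        "thumb": bend["thumb"],
        "index": bend["index"],
        "middle": bend["middle"],
        "ring_pinky": (bend["ring"] + bend["pinky"]) / 2.0,
    }

def haptic_feedback(pressures, gain=1.0):
    """Feedback proportional to the largest of the 12 fingertip
    tactile readings, as the slide describes."""
    return gain * max(pressures)
```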

Components: Controller
- The iCub head alone has a ±35° line of sight
- Controlling the eyes as well extends the line of sight to ±65°
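The head/eye split can be sketched as a saturation cascade: the neck takes as much of the desired gaze angle as it can, and the remainder is outsourced to the eyes. The ±35° and ±65° figures are from the slides; the per-axis split policy and the ±30° eye range it implies are assumptions.

```python
def split_gaze(target_deg, neck_limit=35.0, eye_limit=30.0):
    """Split a desired horizontal gaze angle between neck and eyes.
    The neck saturates at +/-35 deg; the residual goes to the eyes,
    giving the +/-65 deg total reach described above."""
    neck = max(-neck_limit, min(neck_limit, target_deg))
    eyes = max(-eye_limit, min(eye_limit, target_deg - neck))
    return neck, eyes
```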

Control Signals
Safety routines implemented to prevent damage to the robot:
- Suppression of jerking at maximal joint values
- Control suspended when the Kinect detects more than one person in the scene
- Maximum joint step size to guard against abrupt changes in position
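The maximum-joint-step rule amounts to clamping how far a joint may move per control tick, so a glitch in the tracked pose cannot command a violent jump. A minimal sketch (function name and units assumed):

```python
def limit_step(current, target, max_step):
    """Clamp the commanded joint change so that an abrupt target
    jump (e.g. a Kinect tracking glitch) moves the joint by at
    most max_step per control cycle."""
    delta = target - current
    if delta > max_step:
        delta = max_step
    elif delta < -max_step:
        delta = -max_step
    return current + delta
```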

Control Signals
- Signals from all components required filtering
- A Butterworth filter was used: maximally flat in the passband
- Maximum sample rate of 100 Hz (hardware limit)
- Cutoff frequencies: 1.5 Hz for the Kinect; 5 Hz for the Oculus Rift and SensorGlove
- Tradeoff between delay and signal smoothness
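To make the delay/smoothness tradeoff concrete, here is a self-contained second-order Butterworth low-pass (bilinear-transform design) applied with a direct-form difference equation. The cutoff and sample-rate values match the slides; the filter order and implementation details are assumptions, not taken from the paper.

```python
import math

def butter2_lowpass(fc, fs):
    """Coefficients (b, a) of a 2nd-order Butterworth low-pass via
    the bilinear transform. fc: cutoff in Hz (1.5 Hz Kinect, 5 Hz
    Rift/glove); fs: sample rate in Hz (100 Hz here)."""
    k = math.tan(math.pi * fc / fs)
    a0 = k * k + math.sqrt(2.0) * k + 1.0
    b = [k * k / a0, 2.0 * k * k / a0, k * k / a0]
    a = [1.0, 2.0 * (k * k - 1.0) / a0,
         (k * k - math.sqrt(2.0) * k + 1.0) / a0]
    return b, a

def filt(b, a, x):
    """Apply the filter sample by sample (direct form I)."""
    y = []
    for n, xn in enumerate(x):
        x1 = x[n - 1] if n >= 1 else 0.0
        x2 = x[n - 2] if n >= 2 else 0.0
        y1 = y[n - 1] if n >= 1 else 0.0
        y2 = y[n - 2] if n >= 2 else 0.0
        y.append(b[0] * xn + b[1] * x1 + b[2] * x2
                 - a[1] * y1 - a[2] * y2)
    return y
```

Feeding a step input through this filter shows the cost directly: the output rises slowly toward the target, which is exactly the latency discussed on the next slide.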

Control Latency
- Latency is an unavoidable cost of the filtering needed for safe operation
- High delay can be disorienting to the operator
- After filtering: Kinect delay 600 ms; iCub operation delay 200 ms; Oculus Rift and SensorGlove delay 100 ms
- Delay is proportional to how noisy the raw data is

Experiments: Mimic

Experiments: Pick and place

Conclusion
- The Oculus VR headset removes the need for the operator to have visual contact with the robot
- The robot is safe enough to interact directly with humans
- First-person controls are intuitive for human operators
- Latency may be a hindering factor
- The Kinect limits trackable poses because it cannot tolerate occlusion