First-person Teleoperation of Humanoid Robots
Lars Fritsche, Felix Unverzagt, Jan Peters, Roberto Calandra
Presented by Antong Liu
Motivation for Teleoperation Research
- Encoding joint movements for learning
- Military and industry: unstructured tasks in inaccessible environments (radioactive sites, disaster sites, deep oceans)
- Medicine: long-distance medicine, minimally invasive surgery
Tools for Teleoperation
- Direct manipulation: joystick control of individual joints
- Joint tracking: motion tracking through Microsoft Kinect, or full-body tracking suits
Limitation of Current Methods
- Direct visual contact between operator and robot is required
Proposed Method
- Oculus Rift virtual reality goggles
- SensorGlove haptic feedback gloves
Experiment Setup
- Blue arrows show the flow of information from operator to robot; red arrows show feedback from robot to operator
Components: Microsoft Kinect
- Collects motion data of the operator's body
- Uses a camera with a depth sensor to generate a 3D skeleton
- Low price; requires no special cameras or markers
- Prone to noise and cannot handle occlusion
- Update frequency limited to 30 Hz
Components: Oculus Rift
- Head-mounted virtual reality display
- High-resolution display split vertically, one half per eye
- Gyroscopic tracking of head movement at 1 kHz
Components: SensorGlove
- Haptic feedback sensor gloves
- Track finger motion at 350 Hz
Components: iCub Robot
- 104 cm tall, 24 kg humanoid robot
- 53 total degrees of freedom, 30 used in this experiment:
  - 3 in torso
  - 5 in head
  - 4 in each arm
  - 7 in each hand
- Tactile sensors in fingertips
- 640x480 camera in each eye
Components: Controller
- 7 DoF arms: calculated from the positions of the operator's shoulder, elbow, and wrist
- 3 DoF torso: computed from spine, hip, and shoulder positions, under the assumption that the operator is upright (spine aligned with the gravity vector)
- 5 DoF head: controlled by the Oculus Rift orientation; movement exceeding the iCub's boundaries is outsourced to the eye DoF
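The slides do not spell out how joint angles are extracted from the skeleton, so the following is only a hypothetical Python sketch (NumPy assumed, function names invented) of how an elbow flexion angle and a torso yaw could be read off Kinect skeleton points under the upright-operator assumption above.

```python
import numpy as np

def elbow_angle(shoulder, elbow, wrist):
    """Elbow flexion angle (radians) from three 3-D skeleton points."""
    upper_arm = shoulder - elbow
    forearm = wrist - elbow
    cos_a = np.dot(upper_arm, forearm) / (
        np.linalg.norm(upper_arm) * np.linalg.norm(forearm))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))  # clip guards rounding noise

def torso_yaw(left_shoulder, right_shoulder):
    """Torso rotation about the vertical axis, assuming the operator is
    upright and a Kinect-style frame with y up (an assumption here)."""
    d = right_shoulder - left_shoulder
    return np.arctan2(d[2], d[0])  # yaw of the shoulder line in the x-z plane
```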
Components: Controller
- 7 DoF hands: controlled by finger bending sensed from the glove
  - Thumb, index, and middle fingers are independent
  - Ring and pinky are controlled by a single DoF
- SensorGlove returns haptic feedback proportional to the largest pressure on the iCub's 12 tactile sensors
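As a rough illustration of the finger coupling and the feedback rule above, here is a simplified Python sketch; the grouping to one flexion value per finger group (rather than the full 7 hand DoF) and the gain are assumptions, not the paper's implementation.

```python
import numpy as np

def glove_to_hand(bend):
    """Map 5 per-finger bend values (0..1) to hand commands.
    Thumb, index, and middle are independent; ring and pinky are
    coupled into a single DoF, as on the slide. Each finger group is
    collapsed to one flexion value here for brevity."""
    thumb, index, middle, ring, pinky = bend
    ring_pinky = 0.5 * (ring + pinky)  # one shared DoF for both fingers
    return np.array([thumb, index, middle, ring_pinky])

def haptic_feedback(pressures, gain=1.0):
    """Feedback proportional to the largest of the 12 fingertip
    tactile-sensor pressures (gain is a made-up scaling factor)."""
    return gain * max(pressures)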
Components: Controller
- The iCub head alone has a ±35° line of sight
- Controlling the eyes as well extends the line of sight to ±65°
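A minimal sketch of how such a split could work, assuming the eyes cover the remaining ±30° implied by the slide's numbers (65° − 35°); the actual gaze controller is not described in this detail.

```python
import numpy as np

HEAD_LIMIT = np.deg2rad(35.0)  # head range from the slide
EYE_LIMIT = np.deg2rad(30.0)   # assumed: the extra range the eyes provide

def split_gaze(target_angle):
    """Send as much of the desired rotation as possible to the head;
    outsource the remainder to the eye DoF."""
    head = np.clip(target_angle, -HEAD_LIMIT, HEAD_LIMIT)
    eyes = np.clip(target_angle - head, -EYE_LIMIT, EYE_LIMIT)
    return head, eyes
```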
Control Signals
Safety routines implemented to prevent damage to the robot:
- Prevent jerking at maximal joint values
- Control suspended when the Kinect detects more than one person in the scene
- Maximum joint step size to guard against abrupt changes in position
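A sketch of these three safety rules in Python; the per-joint limits, step size, and function name are placeholders, since the slides give no concrete thresholds.

```python
import numpy as np

def safe_command(q_prev, q_target, q_min, q_max, max_step, num_people):
    """Clamp commanded joint values to their ranges and to a maximum
    per-tick step; suspend control if more than one person is visible."""
    if num_people > 1:
        return q_prev                       # freeze: ambiguous operator
    q = np.clip(q_target, q_min, q_max)     # no jerking at joint limits
    step = np.clip(q - q_prev, -max_step, max_step)
    return q_prev + step                    # bounded change per control tick
```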
Control Signals
- Signals from all components needed filtering
- Butterworth filter used: maximally flat in the passband
- Maximum sample rate of 100 Hz (hardware limit)
- Cutoff frequencies: Kinect 1.5 Hz; Oculus Rift and SensorGlove 5 Hz
- Tradeoff between delay and signal smoothness
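A minimal sketch of this filtering step, assuming SciPy and the filter order of 2 (the order is not given on the slide); the cutoff values and the 100 Hz sample rate follow the slide.

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 100.0  # maximum sample rate of the control loop (Hz)

def make_lowpass(cutoff_hz, order=2):
    """Design a Butterworth low-pass filter (maximally flat passband)."""
    return butter(order, cutoff_hz / (FS / 2.0), btype="low")

# One filter per source; a noisier source gets a lower cutoff
# (smoother output) at the price of more delay.
kinect_filter = make_lowpass(1.5)   # joint positions from the Kinect
oculus_filter = make_lowpass(5.0)   # head orientation from the Rift
glove_filter = make_lowpass(5.0)    # finger bends from the SensorGlove

# Example: smooth a noisy 1-D Kinect joint trajectory.
t = np.arange(0, 5, 1 / FS)
raw = np.sin(t) + 0.1 * np.random.randn(t.size)
smoothed = lfilter(*kinect_filter, raw)
```

In a live control loop one would filter sample-by-sample (e.g. carrying filter state across calls) rather than over a whole recorded trajectory as in this offline example.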
Control Latency
- Some latency is the necessary cost of the filtering that ensures safe operation
- High delay can be disorienting to the operator
- After filtering: Kinect delay 600 ms, iCub operation delay 200 ms, Oculus Rift and SensorGlove delay 100 ms
- Delay is proportional to how noisy the raw data is
Experiments: Mimic
Experiments: Pick and place
Conclusion
- Using an Oculus VR device removes the need for the operator to have visual contact with the robot
- The robot is safe enough to interact directly with humans
- First-person controls are intuitive for human operators
- Latency may be a hindering factor
- The Kinect limits the trackable poses because it cannot handle occlusion