1
Vision-Based Reach-To-Grasp Movements: From the Human Example to an Autonomous Robotic System
Alexa Hauck
2
Context
Special Research Program "Sensorimotor", C1: Human and Robotic Hand-Eye Coordination
Neurological Clinic (Großhadern), LMU München
Institute for Real-Time Computer Systems, TU München
MODEL of Hand-Eye Coordination
ANALYSIS of human reaching movements
SYNTHESIS of a robotic system
3
The Question is...
How to use which visual information for motion control?
(keywords: control strategy, representation, catching, reaching)
4
State-of-the-art Robotics
Look-then-move (visual feedforward control):
+ easy integration with path planning
+ only little visual information needed
– sensitive to model errors
Visual servoing (visual feedback control):
+ model errors can be compensated
– convergence not assured
– high-rate vision needed
Impressive results... but nowhere near human performance!
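The contrast can be made concrete with a toy sketch (a hypothetical 2D example with made-up numbers, not the thesis system): look-then-move executes a plan computed once from a possibly erroneous model, so its residual error equals the model error, while visual servoing keeps measuring the hand-target error and drives it to zero, at the cost of needing vision in every control cycle.

```python
import numpy as np

# Toy 2D illustration (assumed numbers, not from the thesis).
target_true = np.array([0.30, 0.20])                    # true target position [m]
target_model = target_true + np.array([0.02, -0.01])    # model-based estimate with error

# Look-then-move: plan once from the model estimate, execute open-loop.
hand = target_model                                      # residual error = model error
print("look-then-move error:", np.linalg.norm(hand - target_true))   # ~0.022 m

# Visual servoing: measure the hand-target error each cycle and step towards it.
hand = np.zeros(2)
for _ in range(30):                                      # needs high-rate vision
    error = target_true - hand                           # measured, so model error is irrelevant
    hand = hand + 0.5 * error                            # proportional feedback step
print("visual servoing error:", np.linalg.norm(hand - target_true))  # -> ~0
```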
5
The Human Example
Separately controlled hand transport:
almost straight path
bell-shaped velocity profile
Experiments with target jump:
smooth on-line correction of the trajectory
Experiments with prism glasses:
on-line correction using visual feedback
off-line recalibration of internal models
Use of visual information in spatial representation
Combination of visual feedforward and feedback... but how?
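The two kinematic signatures of hand transport noted above (almost straight path, bell-shaped velocity profile) are reproduced by the classical minimum-jerk model of point-to-point reaching; the sketch below is a generic illustration of that model, not necessarily the trajectory generator used in this work.

```python
import numpy as np

def minimum_jerk(x0, x1, T, n=100):
    """Minimum-jerk point-to-point trajectory: straight-line path with a
    bell-shaped speed profile (classical model of human reaching)."""
    t = np.linspace(0.0, T, n)
    s = t / T
    pos = x0 + (x1 - x0) * (10*s**3 - 15*s**4 + 6*s**5)[:, None]
    vel = (x1 - x0) * (30*s**2 - 60*s**3 + 30*s**4)[:, None] / T
    return t, pos, vel

t, pos, vel = minimum_jerk(np.zeros(2), np.array([0.3, 0.2]), T=1.0)
speed = np.linalg.norm(vel, axis=1)   # rises from 0 to a single peak and back to 0
```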
6
New Control Strategy
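This slide is an image in the original deck; based on the neighbouring slides (smooth on-line correction after target jumps, unification of look-then-move and visual servoing), one plausible reading is that a feedforward trajectory is continuously re-aimed at the latest visual target estimate. The sketch below illustrates that idea using the time-to-go feedback form of the minimum-jerk model as a stand-in controller; it is not claimed to be the controller developed in the thesis.

```python
import numpy as np

def reach_with_replanning(jumps, T=1.5, dt=0.001):
    """Minimum-jerk reaching written as a time-to-go feedback law, so a new
    target estimate simply replaces the goal and the remaining trajectory is
    corrected smoothly. `jumps` maps times [s] to new goals (hypothetical
    visual updates, e.g. a detected target jump)."""
    x, v, a = np.zeros(2), np.zeros(2), np.zeros(2)
    goal = np.array([0.30, 0.20])
    path = []
    for k in range(int(T / dt)):
        t = k * dt
        if round(t, 6) in jumps:                       # target jump observed
            goal = np.asarray(jumps[round(t, 6)], dtype=float)
        tau = max(T - t, 5 * dt)                       # time to go
        jerk = -60 * (x - goal) / tau**3 - 36 * v / tau**2 - 9 * a / tau
        a = a + jerk * dt
        v = v + a * dt
        x = x + v * dt
        path.append(x.copy())
    return np.array(path), goal

path, goal = reach_with_replanning({0.5: [0.20, 0.30]})
print("final error:", np.linalg.norm(path[-1] - goal))   # small despite the jump
```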
7
Example: Point-to-point
8
Example: Target Jump
11
Example: Multiple Jumps
13
Example: Double Jump
14
Hand-Eye System
Processing pipeline: Robot → images → Image Processing → features → Image Interpretation → position of target & hand → Motion Planning → trajectory → Robot Control → commands → Robot
Models of the hand-eye system & objects: object model, sensor model, arm model
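Read as a processing pipeline, the block diagram can be sketched as a chain of typed stages. All names, signatures, and data types below are hypothetical placeholders; only the data flow and the models that parameterize each stage mirror the slide.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Features:                 # output of image processing
    target_points: np.ndarray
    hand_points: np.ndarray

@dataclass
class ScenePositions:           # output of image interpretation
    target_xyz: np.ndarray
    hand_xyz: np.ndarray

# Stage stubs (bodies omitted); each stage is parameterized by a model.
def image_processing(images, object_model) -> Features: ...
def image_interpretation(features, sensor_model, object_model) -> ScenePositions: ...
def motion_planning(positions, arm_model) -> np.ndarray: ...   # joint-space trajectory
def robot_control(trajectory, arm_model) -> list: ...          # motor commands

def hand_eye_cycle(images, models):
    """One pass through the pipeline: images -> features -> positions of
    target & hand -> trajectory -> commands."""
    features = image_processing(images, models["object"])
    positions = image_interpretation(features, models["sensor"], models["object"])
    trajectory = motion_planning(positions, models["arm"])
    return robot_control(trajectory, models["arm"])
```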
15
The Robot: MinERVA
manipulator with 6 joints
CCD cameras on a pan-tilt head
16
Robot Vision
3D reconstruction by binocular stereo: target and hand positions computed from corresponding points in the two camera images
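Binocular-stereo reconstruction of this kind amounts to triangulating corresponding image points from the two calibrated cameras. The sketch below uses generic linear (DLT) triangulation with made-up intrinsics and a made-up 10 cm baseline, not the actual stereo geometry of MinERVA.

```python
import numpy as np

def triangulate(P_l, P_r, x_l, x_r):
    """Linear (DLT) triangulation of one pair of corresponding points.
    P_l, P_r: 3x4 projection matrices; x_l, x_r: pixel coordinates (u, v)."""
    A = np.vstack([
        x_l[0] * P_l[2] - P_l[0],
        x_l[1] * P_l[2] - P_l[1],
        x_r[0] * P_r[2] - P_r[0],
        x_r[1] * P_r[2] - P_r[1],
    ])
    X = np.linalg.svd(A)[2][-1]            # null-space (homogeneous) solution
    return X[:3] / X[3]                    # inhomogeneous 3D point

# Made-up stereo rig: identical intrinsics, 10 cm horizontal baseline.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P_l = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_r = K @ np.hstack([np.eye(3), np.array([[-0.10], [0.0], [0.0]])])

Xw = np.array([0.05, 0.02, 0.80, 1.0])     # e.g. the target (or the hand)
xl = (P_l @ Xw)[:2] / (P_l @ Xw)[2]
xr = (P_r @ Xw)[:2] / (P_r @ Xw)[2]
print(triangulate(P_l, P_r, xl, xr))       # -> approx. [0.05, 0.02, 0.80]
```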
17
Example: Reaching
20
Model Parameters
Arm: geometry, kinematics (3 parameters)
Arm-Head Relation: coordinate transformation (3 parameters)
Head-Camera Relations: coordinate transformations (4 parameters)
Cameras: pinhole camera model, 4 parameters (+ radial distortion)
Calibration: manufacturer data, measuring tape, HALCON
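The parameter groups above correspond to a chain of coordinate transformations that ends in a pinhole projection. The minimal sketch below shows that chain with assumed numeric values and an assumed form of the radial distortion term; it does not reproduce HALCON's exact camera model.

```python
import numpy as np

def transform(R, t):
    """4x4 homogeneous transform from rotation matrix R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def project(X_arm, T_cam_head, T_head_arm, fx, fy, cx, cy, kappa=0.0):
    """Map a point from the arm base frame into one camera image:
    arm base -> head -> camera -> pinhole projection (+ simple radial term)."""
    Xc = T_cam_head @ T_head_arm @ np.append(X_arm, 1.0)
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]        # normalized image coordinates
    d = 1.0 + kappa * (x * x + y * y)          # assumed radial distortion form
    return np.array([fx * d * x + cx, fy * d * y + cy])

# Assumed arm-head and head-camera relations (pure translations for brevity).
T_head_arm = transform(np.eye(3), [0.0, -0.2, 0.6])
T_cam_head = transform(np.eye(3), [0.05, 0.0, 0.0])
uv = project(np.array([0.1, 0.15, 0.2]), T_cam_head, T_head_arm,
             fx=800, fy=800, cx=320, cy=240, kappa=-0.05)
```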
21
Use of Visual Feedback
Correction rate    Mean error    Max error
0 (no feedback)    8.9 cm        20 cm
1 Hz               0.4 cm        1 cm
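The effect in the table can be illustrated with a toy simulation (all numbers assumed, not the experimental data): an open-loop reach inherits the bias of the model-based target estimate, whereas visual corrections at 1 Hz replace the biased goal with a freshly measured, nearly unbiased one.

```python
import numpy as np

def reach(correction_hz=0.0, T=2.0, dt=0.01, rng=np.random.default_rng(0)):
    """Toy reach: steer toward the current goal estimate within time T.
    Without feedback the goal keeps the model bias; with feedback the goal
    is replaced by a noisy but unbiased visual measurement at the given rate."""
    target = np.array([0.40, 0.10])               # true target position [m]
    goal = target + np.array([0.06, -0.065])      # biased model estimate (~9 cm off)
    hand = np.zeros(2)
    steps = int(T / dt)
    period = int(round(1.0 / (correction_hz * dt))) if correction_hz > 0 else None
    for k in range(steps):
        if period and k > 0 and k % period == 0:          # visual correction
            goal = target + rng.normal(0.0, 0.002, 2)     # ~2 mm measurement noise
        hand = hand + (goal - hand) / (steps - k)         # reach the goal by the deadline
    return np.linalg.norm(hand - target)

print("no feedback  :", reach(0.0))   # on the order of the model bias (centimetres)
print("1 Hz feedback:", reach(1.0))   # on the order of the measurement noise (millimetres)
```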
22
Example: Vergence Error
23
Example: Compensation
24
Summary
New control strategy for hand-eye coordination
Extension of a biological model
Unification of look-then-move & visual servoing
Flexible, economic use of visual information
Validation in simulation
Implementation on a real hand-eye system