Vision-Based Reach-To-Grasp Movements: From the Human Example to an Autonomous Robotic System. Alexa Hauck.

Context
Special Research Program "Sensorimotor", Project C1: Human and Robotic Hand-Eye Coordination
- Neurological Clinic (Großhadern), LMU München
- Institute for Real-Time Computer Systems, TU München
Approach: MODEL of hand-eye coordination, ANALYSIS of human reaching movements, SYNTHESIS of a robotic system

The Question is...
How to use which visual information for motion control?
Aspects: control strategy, representation; tasks: reaching, catching

State of the Art in Robotics
Look-then-move (visual feedforward control):
+ easy integration with path planning
+ only little visual information needed
- sensitive to model errors
Visual servoing (visual feedback control):
+ model errors can be compensated
- convergence not assured
- high-rate vision needed
Impressive results... but nowhere near human performance!
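The trade-off on this slide can be made concrete with a minimal sketch (illustrative only, not code from the thesis): a 1-D reaching task where the feedforward strategy inherits any calibration error, while the feedback strategy corrects it at the cost of a vision measurement every cycle.

```python
def look_then_move(x0, target_estimate):
    # Feedforward: one look, then an open-loop move to the estimated
    # target. Any error in the model/estimate persists in the result.
    return target_estimate

def visual_servo(x0, true_target, gain=0.3, steps=40):
    # Feedback: measure the remaining visual error every cycle and
    # correct proportionally; converges despite a biased model.
    x = x0
    for _ in range(steps):
        x += gain * (true_target - x)
    return x

true_target = 1.0
biased_estimate = 0.9   # 10 % calibration/model error

residual_ff = abs(look_then_move(0.0, biased_estimate) - true_target)
residual_fb = abs(visual_servo(0.0, true_target) - true_target)
```

Here `residual_ff` stays at the full 0.1 estimation error, while `residual_fb` is driven to essentially zero, which is exactly the "+ model errors can be compensated / - high-rate vision needed" trade listed above.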

The Human Example
Separately controlled hand transport:
- almost straight path
- bell-shaped velocity profile
Experiments with target jump:
- smooth on-line correction of the trajectory
Experiments with prism glasses:
- on-line correction using visual feedback
- off-line recalibration of internal models
=> Use of visual information in spatial representation
=> Combination of visual feedforward and feedback... but how?
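The "almost straight path, bell-shaped velocity profile" observation is commonly modeled by the minimum-jerk trajectory (Flash & Hogan); a sketch of that standard model, not of the thesis's own formulation:

```python
def min_jerk(x0, xf, T, t):
    # Minimum-jerk position profile from x0 to xf over duration T.
    tau = t / T
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

def min_jerk_velocity(x0, xf, T, t):
    # Its time derivative: zero at both endpoints, bell-shaped,
    # symmetric with the peak at t = T/2.
    tau = t / T
    return (xf - x0) / T * (30 * tau**2 - 60 * tau**3 + 30 * tau**4)
```

Applied componentwise to a 3-D start and end point, this yields a straight spatial path with the bell-shaped speed profile seen in the human data.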

New Control Strategy

Example: Point-to-point

Example: Target Jump

Example: Multiple Jumps

Example: Double Jump

Hand-Eye System
Robot -> images -> Image Processing -> features -> Image Interpretation -> target & hand position -> Motion Planning -> trajectory -> Robot Control -> commands -> Robot
Models of the hand-eye system & objects: object model, sensor model, arm model
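The dataflow on this slide can be sketched as a chain of stubs (all function names and values here are hypothetical placeholders, not the system's actual API):

```python
def image_processing(images):
    # images -> 2-D features (here: pass through pre-detected points).
    return images["feature_points"]

def image_interpretation(features, models):
    # features + sensor/object models -> 3-D positions of target and hand.
    return features["target"], features["hand"]

def motion_planning(target, hand, models):
    # positions -> trajectory (placeholder: 5 via-points on a straight line).
    n = 5
    return [tuple(h + (t - h) * i / (n - 1) for h, t in zip(hand, target))
            for i in range(n)]

def robot_control(trajectory, models):
    # trajectory -> commands (identity placeholder; arm model omitted).
    return list(trajectory)

images = {"feature_points": {"target": (0.5, 0.2, 0.4),
                             "hand": (0.0, 0.0, 0.0)}}
features = image_processing(images)
target, hand = image_interpretation(features, None)
trajectory = motion_planning(target, hand, None)
commands = robot_control(trajectory, None)
```

The point is the interface structure: each stage consumes the previous stage's output plus the shared models, and the final command sequence ends at the interpreted target position.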

The Robot: MinERVA
- manipulator with 6 joints
- CCD cameras on a pan-tilt head

Robot Vision
- 3-D reconstruction by binocular stereo
- target: corresponding points in both images
- hand: corresponding points in both images
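For intuition, binocular stereo in the simplest rectified, parallel-axis geometry (an idealization, not MinERVA's actual calibrated head) recovers depth from the disparity of a pair of corresponding points:

```python
def triangulate(x_left, x_right, f, b):
    # Rectified parallel-axis stereo: f = focal length in pixels,
    # b = baseline in metres, x_left/x_right = corresponding image
    # x-coordinates of the same 3-D point.
    disparity = x_left - x_right
    Z = f * b / disparity      # depth from disparity
    X = x_left * Z / f         # lateral position, left-camera frame
    return X, Z
```

With a converging pan-tilt head the same principle applies, but the two camera poses must come from the calibrated head-camera transformations rather than a fixed baseline.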

Example: Reaching

Model Parameters
- Arm: geometry, kinematics; 3 parameters; calibration: manufacturer data
- Arm-head relation: coordinate transformation; 3 parameters; calibration: measuring tape
- Head-camera relations: coordinate transformations; 4 parameters; calibration: HALCON
- Cameras: pinhole camera model; 4 parameters (+ radial distortion); calibration: HALCON
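The pinhole camera model referred to above maps a 3-D point in the camera frame to pixel coordinates; assuming the slide's 4 parameters are the usual intrinsics (focal lengths fx, fy and principal point cx, cy), a minimal sketch with distortion omitted:

```python
def project(point, fx, fy, cx, cy):
    # Pinhole projection: perspective division by depth Z, then
    # scaling by the focal lengths and shifting by the principal point.
    X, Y, Z = point
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v
```

A point on the optical axis projects to the principal point; radial distortion, mentioned in parentheses on the slide, would be applied to the normalized coordinates X/Z, Y/Z before this scaling step.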

Use of Visual Feedback (positioning error)
- correction rate 0 (no feedback): mean 8.9 cm, max 20 cm
- correction rate 1 Hz: mean 0.4 cm, max 1 cm

Example: Vergence Error

Example: Compensation

Summary
- New control strategy for hand-eye coordination:
  - extension of a biological model
  - unification of look-then-move & visual servoing
  - flexible, economic use of visual information
- Validation in simulation
- Implementation on a real hand-eye system