Vision-Based Motion Control of Robots

Presentation transcript:

Vision-Based Motion Control of Robots
Azad Shademan, Guest Lecturer
CMPUT 412 – Experimental Robotics
Computing Science, University of Alberta, Edmonton, Alberta, Canada

Vision-Based Control
(figure: current and desired positions of features A and B in the left and right images of a stereo pair)
Outline: 1. Overview of visual servoing; 2. What are the current approaches and their problems? 3. Which problem have we addressed? (our motivation for global model estimation)
A. Shademan. CMPUT 412, Vision-based motion control of robots

Vision-Based Control
(figure: the same stereo pair, showing feature B in the left and right images)

Vision-Based Control
Feedback from a visual sensor (camera) is used to control a robot; this is also called "visual servoing."
Why is it difficult? Images are 2D, but the robot workspace is 3D: 2D data must be related to 3D geometry.

Where is the camera located?
Eye-to-hand: the camera observes the robot from outside, e.g., hand/eye coordination.
Eye-in-hand: the camera is mounted on the robot itself.

Visual Servo Control Law
Position-based: robust, real-time pose estimation combined with the robot's world-space (Cartesian) controller.
Image-based: desired image features as seen from the camera; the control law is defined entirely in terms of image features.

Position-Based
(figure: control loop comparing the desired pose with the estimated pose)

Image-Based
(figure: control loop comparing the desired image feature with the extracted image feature)

Visual-Motor Equation
Image features x = [x1 x2 x3 x4] are a function of the joint angles q = [q1 … q6]: the visual-motor equation x = F(q).
The Jacobian of this equation is important for motion control.

Visual-Motor Jacobian
The Jacobian J(q) relates joint-space velocity to image-space velocity: ẋ = J(q) q̇.

Image-Based Control Law
1. Measure the error in image space.
2. Calculate or estimate the inverse (pseudo-inverse) Jacobian.
3. Update the joint values.
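The three steps above can be sketched as a single servo iteration. This is a minimal illustration, not the lecture's exact controller: the function name, the fixed gain, and the use of a Moore-Penrose pseudo-inverse are assumptions.

```python
import numpy as np

def ibvs_step(q, x, x_star, J, gain=0.1):
    """One iteration of an image-based visual servoing loop (sketch).

    q      : current joint angles
    x      : current image features
    x_star : desired image features
    J      : visual-motor Jacobian estimate (d features / d joints)
    """
    error = x_star - x                       # 1. error measured in image space
    dq = gain * np.linalg.pinv(J) @ error    # 2. pseudo-inverse maps image error to joint motion
    return q + dq                            # 3. update joint values
```

Iterating this step drives the image-space error toward zero as long as the Jacobian estimate is good enough and the gain is small.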

Image-Based Control Law
(figure: the control loop driving the extracted image feature toward the desired image feature)

Jacobian Calculation
An analytic form is available if the model is known: known model → calibrated.
The Jacobian must be estimated if the model is not known: unknown model → uncalibrated.

Image Jacobian (Calibrated)
The analytic form maps camera velocity to feature velocity and depends on depth estimates.
The camera/robot transform is also required, so there is no flexibility.
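For one point feature, the standard analytic interaction matrix (in normalized image coordinates) makes the depth dependence explicit. A minimal sketch; the function name is an assumption, but the matrix itself is the classic point-feature interaction matrix from the visual servoing literature:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Analytic image Jacobian (interaction matrix) for one point feature.

    (x, y) are normalized image coordinates and Z is the point's depth in
    the camera frame. The 2x6 matrix maps the camera velocity twist
    (vx, vy, vz, wx, wy, wz) to the feature velocity (xdot, ydot).
    Note every translational column depends on the depth Z.
    """
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,        -(1.0 + x * x),  y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y * y,  -x * y,         -x],
    ])
```

For a feature at the image center (x = y = 0), only the x/y translation and rotation columns are active, which matches the geometric intuition that a pure z-translation does not move a centered point.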

Image Jacobian (Uncalibrated)
A popular local estimator is the recursive secant method (Broyden update):
Ĵ(k+1) = Ĵ(k) + λ (Δx − Ĵ(k) Δq) Δqᵀ / (Δqᵀ Δq)
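The Broyden update above is a rank-one correction chosen so that the new estimate exactly explains the most recently observed motion (the secant condition). A minimal sketch, with the function name and the `lam` gain parameter as assumptions:

```python
import numpy as np

def broyden_update(J, dq, dx, lam=1.0):
    """Rank-one (secant) Broyden update of a Jacobian estimate.

    J   : current Jacobian estimate
    dq  : observed joint-space displacement
    dx  : observed image-space displacement
    lam : update gain (lam = 1 enforces the secant condition exactly)
    """
    residual = dx - J @ dq  # the part of the motion the current J fails to explain
    return J + lam * np.outer(residual, dq) / (dq @ dq)
```

With `lam = 1`, the updated estimate satisfies Ĵ(k+1) Δq = Δx by construction, which is why a single observed motion suffices for each update.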

Calibrated vs. Uncalibrated
Calibrated: the model is derived analytically; global asymptotic stability can be shown (+) and optimal planning is possible (+), but a lot of prior knowledge of the model is required (−).
Uncalibrated: relaxed model assumptions (+), but traditionally local methods: no global planning (−) and it is difficult to show that the asymptotic stability condition is ensured (−). The problem of traditional methods is their locality.
Global model estimation (research result): optimal trajectory planning (+) and a global stability guarantee (+) for the visual-motor kinematics.

Synopsis of Global Visual Servoing Model Estimation (Uncalibrated)
Goal: a global model of the visual-motor kinematics, extending linear estimation (the visual-motor Jacobian) to nonlinear estimation.
Our contributions: K-NN regression-based estimation and locally least squares estimation.

Local vs. Global
Local methods use only the previous estimate to update the Jacobian: RLS with forgetting factor (Hosoda and Asada '94); 1st-rank Broyden update (Jägersand et al. '97); exploratory motion (Sutanto et al. '98); quasi-Newton Jacobian estimation of a moving object (Piepmeier et al. '04).
Global methods use all of the interaction history to estimate the Jacobian, which enables globally stable controller design and optimal path planning; local methods do not.

K-NN Regression-Based Method
(figure: the estimate at a query configuration q = [q1, q2] is formed from its 3 nearest neighbors in joint space, with x1 the feature coordinate)

Locally Least Squares Method
(figure: the stored visual-motor samples (X, q); a local linear model is fit over the nearest neighbors KNN(q) of the query configuration)
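The locally least squares idea can be sketched directly from stored (q, x) pairs: select the K nearest joint-space neighbors and fit a linear map to the local differences in the least-squares sense. A minimal sketch under those assumptions; the function name and argument layout are hypothetical:

```python
import numpy as np

def lls_jacobian(q, x, samples, k=8):
    """Locally least squares Jacobian estimate at (q, x) (sketch).

    samples : list of (q_i, x_i) visual-motor pairs collected during motion
    k       : number of nearest joint-space neighbors used in the fit
    Fits the local linear model  x_i - x  ≈  J (q_i - q)  for J.
    """
    dists = [np.linalg.norm(q - qi) for qi, _ in samples]
    nearest = np.argsort(dists)[:k]
    dQ = np.array([samples[i][0] - q for i in nearest])  # (k, n_joints)
    dX = np.array([samples[i][1] - x for i in nearest])  # (k, n_features)
    # Least squares: dX ≈ dQ @ J.T, so solve for J.T column-wise.
    Jt, *_ = np.linalg.lstsq(dQ, dX, rcond=None)
    return Jt.T
```

Unlike averaging, the fit uses all K neighbors jointly, which is one way to understand why a locally least squares estimate can beat a K-NN average of biased local estimates.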

Experimental Setup
Puma 560, eye-to-hand configuration, stereo vision.
Features: projections of the end-effector's position on the two image planes (4-dimensional).
3 DOF are used for control.

Measuring the Estimation Error

Global Estimation Error
Local estimation and KNN errors are of the same order, but the variance of the KNN method is much lower.

Effect of Noise on Estimation Quality
(figures: KNN and LLS estimation error vs. noise level)
With increasing noise level, the error decreases.

Effect of Number of Neighbors
(figure: estimation error vs. number of neighbors K)

Conclusions
Presented two global methods to learn the visual-motor function.
LLS (global) works better than KNN (global) and local updates.
KNN suffers from the bias in the local estimates.
Noise helps system identification.

Eye-in-Hand Simulator
(figures: simulator screenshots)

Mean-Squared Error
(figure: MSE plots)

Task Errors
(figure: task error plots)

Questions?

Position-Based
Robust, real-time relative pose estimation: an Extended Kalman Filter solves the nonlinear relative pose equations.
Cons: the EKF is not an optimal estimator for nonlinear systems, and the performance and convergence of the pose estimates are highly sensitive to the EKF parameters.

Overview of PBVS
2D-3D nonlinear point correspondences. What kind of nonlinearity? The measurement model is nonlinear, motivating the (iterated) EKF.
T. Lefebvre et al., "Kalman Filters for Nonlinear Systems: A Comparison of Performance," Intl. J. of Control, vol. 77, no. 7, pp. 639-653, May 2004.

EKF Pose Estimation
State variable: the relative pose (position and roll, pitch, yaw angles), with process noise on the state and measurement noise on the image observations.
The measurement equation is nonlinear and must be linearized.
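The linearization step can be sketched as a generic EKF measurement update that differentiates the nonlinear measurement function numerically around the current estimate. This is a sketch, not the lecture's filter: the function name, the finite-difference Jacobian (an analytic Jacobian would normally be used), and the interface are assumptions.

```python
import numpy as np

def ekf_update(mu, P, z, h, R, eps=1e-6):
    """One EKF measurement update with a numerically linearized model (sketch).

    mu : state estimate (e.g. relative pose [x, y, z, roll, pitch, yaw])
    P  : state covariance
    z  : measurement (e.g. stacked 2D image points of known 3D targets)
    h  : nonlinear measurement function, h(mu) -> predicted measurement
    R  : measurement noise covariance
    """
    n, m = mu.size, z.size
    # Linearize h around the current estimate (central finite differences).
    H = np.zeros((m, n))
    for j in range(n):
        d = np.zeros(n)
        d[j] = eps
        H[:, j] = (h(mu + d) - h(mu - d)) / (2 * eps)
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    mu_new = mu + K @ (z - h(mu))       # correct the state with the innovation
    P_new = (np.eye(n) - K @ H) @ P     # covariance update
    return mu_new, P_new
```

The sensitivity to EKF parameters mentioned on the previous slide enters here through the choice of R (and the process noise in the prediction step, not shown).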

Visual Servoing Based on the Estimated Global Model

Control Based on Local Models
See the textbook by Spong et al.

Estimation for Local Methods
We estimate the Jacobian by solving a least-squares minimization: in effect, we fit a plane to the local neighborhood of the current point q, since at each point a plane is fit to the nonlinear visual-motor model.
In practice this is done recursively, e.g., by Broyden's first-rank method or by RLS with a forgetting factor (see the paper).
