Presentation transcript:

1 Towards Practical Visual Servoing in Robotics. R. Tatsambon Fomena, Computer Vision and Robotics Research Group, Dept. of Computing Science, University of Alberta.

2 Example of a fully integrated system: Manus ARM / iARM (Exact Dynamics). Joysticks and keypads are the common user interfaces.

3 Example of a fully integrated system: click-grasp application with the intelligent Manus ARM.

4 Visual servoing: example of a fully integrated system (Tsui et al., JABB, 2011).
1. The user selects an object; its position is computed using stereo vision from the shoulder camera.
2. The robot arm moves to that position, expressed in its base frame.
3. With the object in view, the arm computes a more precise target and adjusts its orientation.
4. Using the left gripper camera, the robot searches a database for the best object match.
5. Using the object template, the arm moves to align the feature points.
6. Once aligned, the arm moves forward and closes its gripper.
7. The robot returns the object to the user.

5 Joystick control. 3 control modes:
1. Move the robot's hand in three-dimensional space while maintaining its orientation.
2. Modify the orientation of the hand while keeping it centered at the same point in space.
3. Grasp and release with the hand using either two or three fingers.
Kinova Robotics also aims for a similar system.

6 Visual servoing: the control concept. [Block diagram: HRI specifies the goal S*; the error between S* and the perceived S drives the ACTION of the Robot + Camera(s) on the World; PERCEPTION closes the loop.] Perception for action. (Espiau et al., TRA, 92) (Hutchinson et al., TRA, 96)

7 Visual servoing: why visual sensing? How to control the position of the end-effector of a robot with respect to an object of unknown location in the robot base frame? How to track a moving target? A visual sensor provides relative position information.

8 Visual servoing: how can you use visual data in control? Look then move, or a visual feedback control loop. [Block diagrams: open loop, where PERCEPTION of S and S* feeds a one-shot ACTION of the Robot + Camera(s); closed loop, where the error S - S* continuously drives the Robot + Camera(s).]

9 Quiz: what are the advantages of closed-loop control over an open-loop control approach?

10 Visual servoing: ingredients for a fully integrated system: HRI (specification of the goal S*), a visual tracking method, and a motion control algorithm.

11 Visual servoing: visual tracking. Crucial, as it provides the necessary visual feedback: coordinates of image points or lines. Should give a reliable and accurate target position in the image. Example: the CAMShift color tracker provides the 2D (x, y) coordinates of the tracked objects, S = (x, y). [Diagram: PERCEPTION; the tracker searches the current image for the end-effector; selection of the set of measurements to use for control.]
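As an aside, a CAMShift tracker of this kind is available in OpenCV; a minimal sketch of how such a tracker could supply S = (x, y) to the control loop, where the camera index and the initial target region are assumed placeholders:

```python
# A minimal sketch of CAMShift tracking in OpenCV; camera index and the
# initial target region (x, y, w, h) are illustrative placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)          # assumed camera index
ok, frame = cap.read()
x, y, w, h = 300, 200, 80, 80      # assumed initial region around the target

# Build a hue histogram of the target region once.
roi = frame[y:y+h, x:x+w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

track_window = (x, y, w, h)
term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(backproj, track_window, term)
    (cx, cy), _, _ = rot_rect      # S = (x, y): the tracked object's centre
```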

12 Visual tracking applications: watching a moving target. A camera + computer can determine how things move in a scene over time. Uses: security, e.g., monitoring people moving in a subway station or store; measurement, e.g., speed, alerts on colliding trajectories.

13 Visual tracking applications: human-computer interfaces. A camera + computer tracks the motions of a human user and interprets them in an on-line interaction. The user can interact with menus, buttons, and, e.g., drawing programs, using hand movements as mouse movements and gestures as clicks. Furthermore, physical interactions can be interpreted.

14 Visual tracking applications: human-machine interfaces. A camera + computer tracks the motions of a human user and interprets them, and a machine/robot carries out the task. Examples: remote manipulation; service robotics for the handicapped and elderly.

15 Visual servoing: example of visual tracking. Registration-based tracking: Nearest Neighbor (NN) tracker vs. Efficient Second-order Minimization (ESM).

16 Visual servoing: motion control algorithm. 3 possible control methods depending on the selection of S: 2D, 3D, and 2 ½ D visual servoing (Corke, PhD, 94). High bandwidth requires precise calibration: camera and robot-camera.

17 Visual servoing: motion control algorithm. The key element is the model of the system. [Diagram: 2D VS, 2 ½ D VS, and 3D VS ordered by the abstraction used for control; robustness to image noise and calibration errors, and suitability for unstructured environments, increase towards the 2D end.] (Corke, PhD, 94)

18 3D visual servoing. How to sense the position and orientation of an object? (Wilson et al., TRA, 1996)
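A common way to obtain such a pose measurement (not necessarily the method of Wilson et al.) is PnP from tracked image points of a known object model; a minimal sketch, where the model points, image points, and intrinsics are placeholder values:

```python
# A sketch of pose sensing for 3D (position-based) visual servoing via PnP;
# the model points, image points and intrinsics below are placeholders.
import cv2
import numpy as np

object_pts = np.array([[0, 0, 0], [0.1, 0, 0],
                       [0.1, 0.1, 0], [0, 0.1, 0]], dtype=np.float64)  # known 3D model (m)
image_pts = np.array([[320, 240], [400, 242],
                      [398, 318], [318, 316]], dtype=np.float64)       # tracked 2D features (px)
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)  # intrinsics

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)   # rotation matrix: object orientation in the camera frame
# (R, tvec) give the object pose used as S in a 3D visual servoing loop.
```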

19 2 ½ D = homography-based visual servoing. Euclidean homography? (Malis et al., TRA, 1999)
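For intuition, one plausible sketch of the homography machinery underlying 2 ½ D servoing, using OpenCV with placeholder point correspondences and intrinsics K:

```python
# A sketch of the Euclidean homography decomposition behind 2-1/2 D servoing;
# the point correspondences and intrinsics K are placeholders.
import cv2
import numpy as np

pts_cur = np.array([[100, 100], [200, 98], [205, 210], [98, 208]], dtype=np.float64)
pts_goal = np.array([[120, 110], [215, 112], [210, 220], [115, 215]], dtype=np.float64)
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

H, _ = cv2.findHomography(pts_cur, pts_goal)
# Decompose into candidate (R, t/d, n): the rotation feeds the 3D part of
# the control law while image coordinates feed the 2D part.
n_sol, Rs, ts, normals = cv2.decomposeHomographyMat(H, K)
```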

20 2D visual servoing. Example of 2D features? (Espiau et al., TRA, 1992) (Jagersand et al., ICRA, 1997)

21 Quiz: what are the pros and cons of each approach? 1-a) 2D 1-b) 3D 1-c) 2 ½ D

22 Visual servoing: motion control algorithm. The key element is the model of the system: how do the image measurements S change with respect to changes in the robot configuration q? The Jacobian J = ∂S/∂q can be seen as a sensitivity matrix.
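A minimal statement of this sensitivity model in LaTeX, consistent with the slides' notation:

$$ \delta S \;\approx\; J(q)\,\delta q, \qquad J(q) \;=\; \frac{\partial S}{\partial q}. $$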

23 Visual servoing: motion control algorithm. How to obtain J? 1) Machine learning technique: estimation using numerical methods, for example Broyden updating. 2) Model-based approach: analytical expression using the robot model and the camera projection model. Example: for S = (x, y), how to derive the Jacobian or interaction matrix L? (Jagersand et al., ICRA, 1997)
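A minimal sketch of the numerical route, a Broyden rank-one update of the kind used in uncalibrated visual servoing; alpha and the initialization are assumed tuning choices:

```python
# A minimal sketch of a Broyden rank-one Jacobian update for uncalibrated
# visual servoing; alpha and the initial estimate are assumed tuning choices.
import numpy as np

def broyden_update(J, dS, dq, alpha=1.0):
    """Update the Jacobian estimate J so that it better explains the
    observed feature change dS caused by the joint change dq."""
    dq = dq.reshape(-1, 1)
    dS = dS.reshape(-1, 1)
    denom = float(dq.T @ dq)
    if denom < 1e-12:          # no motion: nothing to learn
        return J
    return J + alpha * (dS - J @ dq) @ dq.T / denom
```

After each motion dq and observed feature change dS, the estimate is refined on-line, without requiring camera or robot models.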

24 Visual servoing: motion stability. How to move the robot knowing e = S - S* and the Jacobian estimate? Classical approach: the control law imposes an exponential decay of the error.
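Reconstructing the classical law from this requirement (the standard derivation, e.g., Espiau et al., TRA, 92):

$$ e = S - S^{*}, \qquad \dot{e} = L\,v; \quad \text{imposing } \dot{e} = -\lambda e \text{ gives } v = -\lambda\,\widehat{L}^{+}\,e, $$

where $\widehat{L}^{+}$ is the pseudo-inverse of the estimated interaction matrix.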

25 Visual servoing: motion control algorithm.
S := VisualTracker(InitImage)
e := S - S*
While ( ||e|| > T ) {
  CurrentImage := GrabImage(camera)
  S := VisualTracker(CurrentImage)
  Compute e = S - S*
  Estimate J
  Compute Δq = -λ J⁺ e
  Change robot configuration with Δq
}
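A hedged, runnable Python sketch of this loop; grab_image, visual_tracker, and move_robot are hypothetical interfaces standing in for the real system, and broyden_update is the sketch from slide 23:

```python
# A runnable sketch of the servo loop above; grab_image, visual_tracker and
# move_robot are hypothetical interfaces standing in for the real hardware.
import numpy as np

def servo(S_star, J0, grab_image, visual_tracker, move_robot,
          gain=0.1, tol=1.0, max_iters=1000):
    """Drive the tracked features S towards the goal S* (classical IBVS)."""
    J = J0.copy()
    S = visual_tracker(grab_image())
    for _ in range(max_iters):
        e = S - S_star
        if np.linalg.norm(e) <= tol:          # while (||e|| > T)
            break
        dq = -gain * np.linalg.pinv(J) @ e    # v = -lambda * J^+ e
        move_robot(dq)                        # change robot configuration
        S_new = visual_tracker(grab_image())
        J = broyden_update(J, S_new - S, dq)  # refine the Jacobian estimate
        S = S_new
    return S
```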

26 Visual servoing: HRI. Important for task specification: point-to-point alignment for gross motions, point-to-line alignment for fine motions. Should be easy and intuitive. Is user dependent.

27 (Hager, TRA, 1997)

28 (Kragic and Christensen, 2002) What does the error function look like?

29 (Kragic and Christensen, 2002) What does the error function look like?

30 (Kragic and Christensen, 2002) What does the error function look like?

31 Visual servoing: HRI, point-to-point task error. Point-to-point task "error": why 16 elements?

32 Visual servoing: HRI, point-to-line task error. Point-to-line task error: alignment of a point y with a line. Note: y is in homogeneous coordinates.
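A minimal sketch of this error in homogeneous image coordinates, assuming the point-to-line error is the incidence product l · y (zero exactly when the point lies on the line):

```python
# A minimal sketch of the point-to-line error in homogeneous image
# coordinates: e = l . y vanishes exactly when the point lies on the line.
import numpy as np

def line_through(p1, p2):
    """Homogeneous line through two homogeneous image points (cross product)."""
    return np.cross(p1, p2)

def point_to_line_error(y, l):
    return float(np.dot(l, y))

y = np.array([320.0, 240.0, 1.0])            # tracked point (homogeneous)
l = line_through(np.array([0.0, 0.0, 1.0]),  # line defined by two points
                 np.array([640.0, 480.0, 1.0]))
e = point_to_line_error(y, l)                # drives the fine-motion servo
```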

33 Visual servoing: HRI, parallel composition example. E(y) stacks the component task errors, e.g., a point-to-point term y2 - y1 together with a point-to-line term y3 · (y4 × y5) (plus end-point checks).

34 Maintaining visibility. (Li, PhD thesis, 2013)

35 Visual servoing: HRI with virtual visual fixtures. Motivation: virtual fixtures can be used for motion constraints. Potential applications: improvements in vision-based power-line or pipeline inspection, e.g., flying over a power line or pipeline while keeping a constant yaw angle relative to the line (line tracking from the top), or hovering over a power pole and moving towards its top for a closer inspection.

36 Where can virtual fixtures be useful? Robot-assistant method for microsurgery (steady-hand eye robot, EyeRobot1). "Here the extreme challenge of physical scale accentuate the need for dexterity enhancement, but the unstructured nature of the task dictates that the human be directly 'in the loop'"

37 Where can virtual fixtures be useful? How to assist the surgeon? Cooperative control with the robot; incorporate virtual fixtures to help protect the patient and to eliminate the surgeon's hand tremor during surgery.

38 Where can virtual fixtures be useful? Central retinal vein occlusion; solution: retinal vein cannulation. Free-hand vein cannulation vs. robot-assisted vein cannulation. What to prove? "robot can increase success rate of cannulation and increase the time the micropipette is maintained in the retinal vein during infusion"

39 Virtual fixture example: the JHU VRCM (Virtual Remote Center of Motion).

40 What is a virtual fixture? "Like a real fixture, provides surface that confines and/or guides a motion" (Hager, IROS, 2002). Its role is typically to enhance physical dexterity. (Bettini et al., TRO, 2004)

41 What is a virtual fixture? "Software helper control routine" (A. Hernandez Herdocia, Master thesis, 2012). Examples: line constraint, plane constraint.

42 What is a virtual visual fixture? Vision-based motion constraints. A geometric virtual linkage between the sensor and the target, for one-camera visual servoing (Chaumette et al., WS, 1994): an extension of the basic kinematics of contacts. Image-based task specification, for two-camera visual servoing (Dodds et al., ICRA, 1999): the task's geometric constraint defines a virtual fixture.

43 What is a virtual visual fixture? Vision-based motion constraints: the geometric virtual linkage (Chaumette et al., WS, 1994).

44 What is a virtual visual fixture? Vision-based motion constraints: image-based task specification (Dodds et al., ICRA, 1999).

45 Mathematical insight into virtual fixtures. As a control law: filtered motion in a preferred direction. As a geometric constraint: a virtual linkage. As a condition for observer design: persistency of excitation. Prove it? (Hager, IROS, 1997)

46 Mathematical insight into virtual visual fixtures. Take-home message: the fixture lives in the kernel of the Jacobian or interaction matrix in visual servoing; motions in that kernel leave the visual features unchanged, so the constraint is maintained. (Tatsambon, PhD, 2008)
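To make the take-home message concrete, a small sketch extracting those feature-preserving directions numerically; the 4×6 interaction matrix here is a random stand-in:

```python
# A sketch of extracting the fixture directions: the null space of the
# interaction matrix L contains velocities that do not move the features.
import numpy as np

def kernel(L, tol=1e-10):
    """Orthonormal basis of the null space of L via SVD."""
    _, s, Vt = np.linalg.svd(L)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T           # columns span {v : L v = 0}

L = np.random.randn(4, 6)        # e.g. 2 image points, 6-DOF camera motion
N = kernel(L)                    # allowed (feature-preserving) motions
assert np.allclose(L @ N, 0, atol=1e-8)
```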

47 Summary: what are the next steps to move forward? The conclusion is clear: so far there are only a handful of existing fully integrated and tested visual servoing systems; there has been more mechatronics and theoretical development than actual practical software development; and our natural environment is complex, making it hard to design an adequate representation for robot navigation. It is time to free visual servoing from its restrictions and apply it to solving real-world problems: the tracking issue (reliability and robustness to light variation, occlusions, ...) and the HRI problem (image-based task specification; new sensing modalities should be exploited).

48 Short-term research goal: new virtual visual fixtures. Line constraint: to keep the tool on the line; can be done with point-to-line alignment. Ellipse constraint: to keep the tool on the image of a circle; has never been done.

49 Short-term research goal: grasping using visual servoing. Where to grasp, instead of relying on predefined grasping points in a database of objects?