Formation Control of Nonholonomic Mobile Robots with Omnidirectional Visual Servoing and Motion Segmentation. René Vidal, Omid Shakernia, Shankar Sastry.


Formation Control of Nonholonomic Mobile Robots with Omnidirectional Visual Servoing and Motion Segmentation. René Vidal, Omid Shakernia, Shankar Sastry. Department of EECS, UC Berkeley.

Motivation
Formation control is ubiquitous, both in nature and in man-made systems, and has received much attention in the control and robotics communities in recent years.
Safety in numbers: a fish swimming in a large school is less likely to be eaten than one swimming alone.
Decreased aerodynamic drag: geese fly in a V-formation, riding the vortex shed by the bird ahead; a goose in the formation can fly 71% further than a goose flying alone.
Higher traffic throughput: a platoon of cars on an automated highway increases throughput while maintaining safety.
Applications in defense (the commander flies in the middle, protected by the rest of the team), space exploration (a cluster of satellites with telescopes can explore space better by moving in formation), etc.
Examples from nature: ducklings following a duck in a string formation, schools of fish, geese in a V-formation.

Previous Work
Formation control has a very rich literature (short list):
String stability for line formations [Swaroop et al., TAC 1996]: formations can become unstable due to error propagation.
Mesh stability of UAVs [Pant et al., ACC 2001]: generalization of string stability to a planar mesh.
Input-to-state stability [Tanner et al., ICRA 2002]: the structure of the interconnections and the amount of information communicated affect ISS.
Feasible formations [Tabuada et al., ACC 2001]: differential-geometric conditions for the feasibility of formations under the kinematic constraints of mobile robots.
Vision-based formation control [Das et al., TAC 2002]: leader position estimated by vision; formation control in task space.

Our Approach
Distributed formation control (no explicit communication).
Formation specified in the image plane of each follower.
Multi-body motion segmentation to estimate the leader's position.
Followers employ a tracking controller in the image plane.
Collision avoidance incorporated naturally by exploiting the geometry of omnidirectional images.

Outline
Omnidirectional vision: central panoramic cameras and back-projection rays; central panoramic optical flow equations; multi-body motion segmentation.
Distributed formation control: leader-follower dynamics in the image plane; feedback linearization control design; collision avoidance using navigation functions.
Experimental results: motion segmentation of robots in a real sequence; vision-based formation control simulations.

Central Panoramic Camera
A catadioptric camera is a lens-mirror combination.
Central panoramic: single effective focal point.
Two such combinations: parabolic mirror with orthographic lens, or hyperbolic mirror with perspective lens.
The back-projection ray associated with each pixel in the image can be computed efficiently.
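The back-projection computation can be sketched with the unified catadioptric model of Geyer and Daniilidis. For a parabolic mirror with unit parameters (an assumption made here purely for illustration; the slides do not give the calibration), the calibrated image point (x, y) lifts to the unit viewing sphere by inverse stereographic projection, giving the ray direction b ∝ (2x, 2y, x² + y² − 1). A minimal sketch:

```python
import numpy as np

def backproject(x, y):
    """Back-projection ray for a calibrated parabolic catadioptric camera
    (unified model, unit mirror parameter): inverse stereographic lift of
    the image point (x, y) onto the unit viewing sphere."""
    r2 = x * x + y * y
    return np.array([2.0 * x, 2.0 * y, r2 - 1.0]) / (r2 + 1.0)

def project(b):
    """Forward projection of a unit ray b: stereographic projection from
    the sphere's north pole back onto the image plane."""
    return b[0] / (1.0 - b[2]), b[1] / (1.0 - b[2])
```

Round-tripping `project(backproject(x, y))` returns the original pixel, which is a quick consistency check on the model.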

Central Panoramic Optical Flow
Central panoramic projection model.
Optical flow induced by a planar camera motion with linear velocity (vx, vy) and angular velocity ω.
Central panoramic optical flow equations.

Central Panoramic Motion Segmentation
The optical flows of the pixels across frames live in a 5-dimensional subspace.
The optical flow matrix can be factorized into structure and motion.
Given n independent motions, the flow subspaces add up, so the number of independent motions is obtained as n = rank(W) / 5.
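The rank condition can be illustrated numerically: stacking the flows of two independently moving pixel groups (synthetic rank-5 blocks here, standing in for real optical flow measurements) yields a matrix of rank 5n, so the motion count drops out of an SVD. A sketch under these assumptions:

```python
import numpy as np

def estimate_num_motions(W, tol=1e-8):
    """Estimate the number of independent motions as rank(W) / 5, using
    the fact that each motion's flows span a 5-dimensional subspace."""
    s = np.linalg.svd(W, compute_uv=False)
    rank = int(np.sum(s > tol * s[0]))
    return rank // 5

# Synthetic flow matrix: two independent motions, each a rank-5 block.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 40))
W2 = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 40))
W = np.hstack([W1, W2])  # generically rank 10 => two motions
```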

Central Panoramic Motion Segmentation
Independent motions lie in orthogonal subspaces.
The block-diagonal structure of W implies that the leading singular vectors take the same value for pixels corresponding to the same rigid motion, and different values for pixels corresponding to different motions.
The largest group of pixels (the static background) carries the flow induced by the camera's own motion.
The segmentation is obtained from the roots of a polynomial.
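The block-diagonal argument can be demonstrated on a toy matrix (rank-1 blocks here, standing in for the rank-5 flow blocks; this is a simplified illustration, not the polynomial-root method of the slides): for a strictly block-diagonal W, each top singular vector is supported on a single block, so a pixel can be assigned to the singular vector with the largest magnitude at its row.

```python
import numpy as np

def segment_by_singular_vectors(W, n):
    """Toy segmentation: with W block diagonal (one block per motion),
    each of the top-n left singular vectors is supported on one block,
    so each row (pixel) is assigned to the vector where it has the
    largest magnitude."""
    U, _, _ = np.linalg.svd(W)
    return np.argmax(np.abs(U[:, :n]), axis=1)

# Block-diagonal toy matrix: 30 pixels from motion A, 20 from motion B.
rng = np.random.default_rng(1)
B1 = 10.0 * np.outer(rng.random(30) + 0.5, rng.random(20) + 0.5)
B2 = 5.0 * np.outer(rng.random(20) + 0.5, rng.random(15) + 0.5)
W = np.block([[B1, np.zeros((30, 15))],
              [np.zeros((20, 20)), B2]])
labels = segment_by_singular_vectors(W, n=2)
```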

Image Leader-Follower Dynamics
Kinematic model of the mobile robot, with linear and angular velocity as inputs.
Leader position measured in the follower's camera.
The dynamics are written as a drift-free control system.
The leader's velocity is recovered using the optical flow of the background.
Central panoramic leader-follower dynamics.
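The drift-free kinematic model referred to here is the unicycle q̇ = g₁(q)v + g₂(q)ω with q = (x, y, θ). A minimal Euler-integration sketch (time step and inputs are illustrative only):

```python
import numpy as np

def unicycle_step(q, v, omega, dt=0.01):
    """One Euler step of the drift-free unicycle: the robot drives
    forward along its heading and rotates, but cannot slide sideways."""
    x, y, theta = q
    g1 = np.array([np.cos(theta), np.sin(theta), 0.0])  # forward motion
    g2 = np.array([0.0, 0.0, 1.0])                      # rotation
    return q + dt * (v * g1 + omega * g2)

# Pure rotation leaves the position fixed, reflecting the nonholonomic
# constraint: no instantaneous sideways motion.
q = np.array([0.0, 0.0, 0.0])
for _ in range(100):
    q = unicycle_step(q, v=0.0, omega=1.0)
```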

Omnidirectional Visual Servoing
Cartesian coordinates vs. polar coordinates in the image.
Controlling in Cartesian coordinates, the leader's trajectory intersects the inner circle.
Controlling in polar coordinates, the follower mostly rotates.
A trajectory passing through the inner circle is a collision.

Omnidirectional Visual Servoing
Feedback linearization control law in polar coordinates: drive the leader's position in the image to the desired leader position.
Degenerate configurations arise from two sources:
Due to the nonholonomy of the mobile robot, the robot cannot move sideways instantaneously.
Due to the geometry of central panoramic cameras, the horizon corresponds to points at infinity.
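Feedback linearization for a unicycle can be illustrated with the classic off-axis (look-ahead) point trick. This is a generic stand-in, not the polar-coordinate image-space law of the slides: the point p = (x + d cos θ, y + d sin θ) satisfies ṗ = A(θ)(v, ω)ᵀ with det A = d ≠ 0, so choosing (v, ω) = A⁻¹u with u = −k(p − p*) makes the point dynamics linear and exponentially stable. The offset d and gain k below are illustrative.

```python
import numpy as np

def fl_control(q, p_des, d=0.1, k=2.0):
    """Feedback-linearizing control for the off-axis point of a unicycle."""
    x, y, th = q
    p = np.array([x + d * np.cos(th), y + d * np.sin(th)])
    A = np.array([[np.cos(th), -d * np.sin(th)],
                  [np.sin(th),  d * np.cos(th)]])  # decoupling matrix, det = d
    u = -k * (p - p_des)              # desired velocity of the point
    v, omega = np.linalg.solve(A, u)  # invert the decoupling matrix
    return v, omega

# Track a fixed target: the off-axis point converges exponentially.
q = np.array([0.0, 0.0, 0.0])
p_des = np.array([1.0, 1.0])
dt = 0.01
for _ in range(2000):
    v, omega = fl_control(q, p_des)
    q = q + dt * np.array([v * np.cos(q[2]), v * np.sin(q[2]), omega])
p_final = np.array([q[0] + 0.1 * np.cos(q[2]), q[1] + 0.1 * np.sin(q[2])])
```

Note the degeneracy discussed above reappears here as d → 0: the decoupling matrix becomes singular, mirroring the fact that the robot cannot move sideways instantaneously.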

Omnidirectional Visual Servoing
The degenerate configurations can be avoided with a pseudo-feedback-linearizing control law.
However, the formation is then only input-to-state stable (ISS).
The control law is easily modified to achieve collision avoidance by using a navigation function.
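The collision-avoidance modification can be sketched with a plain artificial potential field (a simpler stand-in for the navigation functions referenced on the slide; the goal, obstacle, and gains below are all illustrative): an attractive quadratic toward the goal plus a repulsive barrier that activates within an influence distance of the obstacle, with the robot following the negative gradient.

```python
import numpy as np

GOAL = np.array([5.0, 0.0])
OBST = np.array([2.5, 0.3])   # obstacle center
R_OBS, D_INF = 0.3, 1.0       # obstacle radius, influence distance
ETA = 0.5                     # repulsive gain

def grad_potential(p):
    """Gradient of U(p) = 0.5*||p - GOAL||^2 + 0.5*ETA*(1/d - 1/D_INF)^2,
    with the repulsive term active only when d = ||p - OBST|| < D_INF."""
    g = p - GOAL                                            # attractive part
    d = np.linalg.norm(p - OBST)
    if d < D_INF:                                           # repulsive part
        g += -ETA * (1.0 / d - 1.0 / D_INF) * (p - OBST) / d**3
    return g

# Gradient descent: the robot detours around the obstacle to the goal.
p = np.array([0.0, 0.0])
min_clearance = np.inf
for _ in range(4000):
    p = p - 0.005 * grad_potential(p)
    min_clearance = min(min_clearance, np.linalg.norm(p - OBST))
```

Unlike a true navigation function, such a potential field can have spurious local minima; the construction on the slide avoids them by design.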

Experimental Results Multi-body motion segmentation

Wedge Formation Green follows red Blue follows red

String Formation Green follows red Blue follows green

Conclusions
A framework for distributed formation control in the omnidirectional image plane.
An algorithm for multi-body motion segmentation in omnidirectional images.
Future work: generalize formation control to UAV dynamics; hybrid-theoretic formation-switching control; implementation on the BEAR fleet of UGVs and UAVs.