Omnidirectional Vision-Based Formation Control


Omnidirectional Vision-Based Formation Control
René Vidal, Omid Shakernia, Shankar Sastry
Department of EECS, UC Berkeley

Motivation
Formation control is ubiquitous:
- Safety in numbers
- Decreased aerodynamic drag
- Higher traffic throughput
- Applications in defense, space exploration, etc.
Formation control has received a great deal of attention in the control and robotics communities over the past few years, and the problem is ubiquitous both in nature and in man-made systems. Top left: a duck followed by its ducklings in a straight line (a string formation). Bottom left: a school of fish swimming together; an individual fish in a large school is less likely to be eaten than one swimming alone (safety in numbers). Middle left: geese flying in a V-formation; a goose inside the formation can fly 71% farther than one flying alone, because each bird rides the upwash vortex shed by the bird ahead of it, which reduces aerodynamic drag. Right: man-made applications. In defense, a commander can fly in the middle of the formation, protected by the rest of the team. On automated highways, a platoon of cars can increase traffic throughput while maintaining safety. In space exploration, a cluster of satellites carrying telescopes can survey space more effectively by moving in formation.

Previous Work
Formation control has a very rich literature (a short list):
- String stability for line formations [Swaroop et al., TAC96]: a formation can become unstable due to error propagation
- Mesh stability of UAVs [Pant et al., ACC01]: a generalization of string stability to a planar mesh
- Input-to-state stability [Tanner et al., ICRA02]: the structure of the interconnections and the amount of information communicated affect ISS
- Feasible formations [Tabuada et al., ACC01]: differential-geometric conditions on the feasibility of formations under the kinematic constraints of mobile robots
- Vision-based formation control [Das et al., TAC02]: the leader's position is estimated by vision; formation control is done in task space

Our Approach
- Distributed formation control (no explicit communication)
- Formation specified in the image plane of each follower
- Multi-body motion segmentation to estimate the leader's position
- Followers employ a tracking controller in the image plane
- Collision avoidance incorporated naturally, by exploiting the geometry of omnidirectional images

Outline
Omnidirectional vision
- Catadioptric cameras and the back-projection ray
- Paracatadioptric optical flow equations
- Multi-body motion segmentation
Distributed formation control
- Leader-follower dynamics in the image plane
- Feedback linearization control design
- Collision avoidance using navigation functions
Experimental results
- Motion segmentation of robots in a real sequence
- Vision-based formation control simulations

Catadioptric Camera
A catadioptric camera is a lens-mirror combination with a single effective focal point:
- Parabolic mirror with an orthographic lens
- Hyperbolic mirror with a perspective lens
It allows the back-projection ray associated with each pixel in the image to be computed efficiently.

Paracatadioptric Projection Model
The equations on this slide did not survive the transcript; in the standard normalized model (unit focal parameter) they are:
- Paraboloidal mirror surface: z = (x^2 + y^2 - 1) / 2
- Projection model: a 3D point q = (X, Y, Z), with range rho = ||q||, projects to the image point (x, y) = (X, Y) / (rho - Z)
- Back-projection ray for pixel (x, y): b(x, y) = (x, y, (x^2 + y^2 - 1) / 2)
- Compare with perspective projection: (x, y) = (X, Y) / Z
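
The projection and back-projection maps can be sketched in a few lines of code. This is a minimal sketch assuming the normalized model with unit focal parameter; the function names are illustrative, not from the authors' implementation:

```python
import numpy as np

def para_project(q):
    """Paracatadioptric projection of a 3D point q = (X, Y, Z)
    (normalized model, unit focal parameter)."""
    X, Y, Z = q
    rho = np.linalg.norm(q)          # range of the point
    return np.array([X, Y]) / (rho - Z)

def back_projection_ray(x, y):
    """Direction of the back-projection ray through pixel (x, y):
    the point on the paraboloid z = (x^2 + y^2 - 1)/2 above the pixel."""
    return np.array([x, y, (x**2 + y**2 - 1.0) / 2.0])

# Round trip: back-projecting a pixel and re-projecting recovers it.
q = np.array([1.0, 2.0, 0.5])
x, y = para_project(q)
b = back_projection_ray(x, y)
assert np.allclose(para_project(b), [x, y])   # b projects back to (x, y)
assert np.allclose(np.cross(b, q), 0.0)       # b is parallel to q
```

The round trip works because every point on the paraboloid satisfies rho - z = 1, so the mirror point above a pixel projects exactly to that pixel.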

Paracatadioptric Optical Flow
Optical flow induced by camera motion with angular velocity omega and linear velocity v
Camera motion is restricted to the X-Y plane (ground robots)
The paracatadioptric optical flow equations are compared against the familiar perspective optical flow
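
The flow equations themselves were lost from the transcript. They can be rederived from the projection model; the following is a derivation sketch for the planar-motion case under the normalized model (rederived here, not copied from the slides):

```latex
% Planar camera motion: \omega = (0,0,\omega)^\top, v = (v_1, v_2, 0)^\top.
% A static point q = (X,Y,Z) seen from the moving camera satisfies
%   \dot q = -\,\omega \times q - v .
% Differentiating the paracatadioptric projection (x,y) = (X,Y)/s,
% with s = \rho - Z and \rho = \|q\|, gives
\begin{aligned}
\dot x &= \phantom{-}\omega\,y + \frac{1}{s}\left(-v_1 + \frac{2x\,(x v_1 + y v_2)}{1 + x^2 + y^2}\right),\\[2pt]
\dot y &= -\omega\,x + \frac{1}{s}\left(-v_2 + \frac{2y\,(x v_1 + y v_2)}{1 + x^2 + y^2}\right),
\end{aligned}
% versus the perspective flow \dot x = \omega y - v_1/Z,\; \dot y = -\omega x - v_2/Z.
```

At every pixel the flow is linear in the motion parameters (omega, v1, v2), with coefficients that depend only on the pixel and the depth-like scale s; this linearity is what the factorization on the next slide exploits.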

Multi-body Motion Segmentation
The optical flows of the pixels across all frames live in a 5-dimensional subspace, so the matrix W of optical flows can be factorized into a structure part and a motion part.
Given n independent motions, the flow subspaces combine, and the number of independent motions is obtained from the rank of W.
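
Under this subspace model, n independent motions stack into a flow matrix W of rank 5n (generically), so n can be read off from the singular values of W. A minimal numerical sketch with synthetic data (the sizes, seed, and threshold are illustrative, not from the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_flow_block(n_pixels, n_frames, d=5):
    """Flows of one rigid motion: a rank-d matrix factored into
    structure (pixels x d) times motion (d x frames)."""
    structure = rng.standard_normal((n_pixels, d))
    motion = rng.standard_normal((d, n_frames))
    return structure @ motion

# Two independent motions, 30 frames -> W has rank 2 * 5 = 10
W = np.vstack([synthetic_flow_block(60, 30),
               synthetic_flow_block(40, 30)])

sv = np.linalg.svd(W, compute_uv=False)
rank = int(np.sum(sv > 1e-8 * sv[0]))   # numerical rank of W
n_motions = rank // 5
print(n_motions)  # 2
```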

Motion Segmentation
Independent motions lie in orthogonal subspaces. The block-diagonal structure of W implies that the leading singular vector takes the same value for pixels corresponding to the same rigid motion, and different values for pixels corresponding to different motions.
The largest group of pixels (the static background) carries the flow induced by the camera's own motion.
The segmentation can also be computed algebraically, from the roots of a polynomial.
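
The role of the leading singular vector can be checked numerically. In this sketch (all data synthetic and illustrative) the two motions are given orthogonal motion subspaces, which makes W Wᵀ block diagonal; the leading left singular vector of W is then supported only on the pixels of one motion, which peels that group off:

```python
import numpy as np

rng = np.random.default_rng(1)

# Orthogonal motion subspaces for the two rigid motions
Q, _ = np.linalg.qr(rng.standard_normal((30, 30)))
M1, M2 = Q[:5], Q[5:10]                    # mutually orthogonal row spaces
S1 = 10.0 * rng.standard_normal((60, 5))   # scaled so motion 1 dominates
S2 = rng.standard_normal((40, 5))
W = np.vstack([S1 @ M1, S2 @ M2])          # rows: pixels, columns: frames

U, sv, Vt = np.linalg.svd(W)
u = U[:, 0]                                # leading left singular vector
labels = np.abs(u) > 1e-8                  # support of u = one pixel group
print(labels[:60].all(), labels[60:].any())  # True False
```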

Leader-Follower Dynamics
- Kinematic model of each robot, with linear and angular velocity as inputs
- The leader's position is measured in the follower's camera
- Write the system as a drift-free control system
- The leader's velocity is recovered using the optical flow of the background
- This yields the paracatadioptric leader-follower dynamics
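
The kinematic model lost from this slide is presumably the standard unicycle; written as a drift-free control system it reads:

```latex
% Unicycle kinematics with inputs v (linear) and \omega (angular velocity)
\dot x = v\cos\theta, \qquad \dot y = v\sin\theta, \qquad \dot\theta = \omega .
% Drift-free control-affine form \dot q = g_1(q)\,v + g_2(q)\,\omega :
\dot q \;=\;
\begin{pmatrix} \cos\theta \\ \sin\theta \\ 0 \end{pmatrix} v
\;+\;
\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \omega,
\qquad q = (x, y, \theta)^\top .
```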

Control in the Image Plane
Cartesian coordinates vs. polar coordinates
- Controlling in Cartesian coordinates, the leader's trajectory intersects the circle
- Controlling in polar coordinates, the follower mostly rotates
- A trajectory passing through the inner circle is a collision

Feedback Linearization Control Law in Polar Coordinates
The leader's position is driven to the desired leader position, both expressed in polar image coordinates.
Degenerate configurations arise from the nonholonomy of the mobile robot: it cannot move sideways instantaneously. Due to the geometry of catadioptric projection, these configurations correspond to horizon points at infinity.
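
The slides' polar-coordinate control law is not reproduced in the transcript. As a generic illustration of feedback linearization for a unicycle, here is the same idea in Cartesian coordinates, using an invented look-ahead offset d and gain k (not the authors' law):

```python
import numpy as np

d, k, dt = 0.2, 1.0, 0.02   # look-ahead offset, gain, time step

def step(state, target):
    """One feedback-linearization step for a unicycle.
    Output p = point at distance d ahead of the robot; its dynamics are
    dp/dt = A(theta) @ (v, w), with A invertible whenever d != 0,
    so (v, w) = A^{-1} k (target - p) linearizes the output dynamics."""
    x, y, th = state
    p = np.array([x + d * np.cos(th), y + d * np.sin(th)])
    A = np.array([[np.cos(th), -d * np.sin(th)],
                  [np.sin(th),  d * np.cos(th)]])
    v, w = np.linalg.solve(A, k * (target - p))
    return np.array([x + dt * v * np.cos(th),
                     y + dt * v * np.sin(th),
                     th + dt * w])

state = np.array([0.0, 0.0, np.pi / 2])   # start at origin, facing +y
target = np.array([2.0, 1.0])             # desired position of the point p
for _ in range(1000):
    state = step(state, target)
p = state[:2] + d * np.array([np.cos(state[2]), np.sin(state[2])])
assert np.linalg.norm(p - target) < 1e-3  # exponential convergence
```

Because det A = d, the linearization fails only at d = 0; the slides apply the same idea in polar image coordinates, where the degenerate configurations are the ones described above.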

Distributed Formation Controller
- Collision avoidance is enhanced using a navigation function
- The control law incorporates the collision-avoidance term
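
The navigation function itself is not shown in the transcript. The following is a generic potential-field sketch of the idea (goal attraction plus a barrier near an obstacle); the geometry, gains, and barrier form are all invented for illustration:

```python
import numpy as np

goal = np.array([2.0, 0.0])
obstacle, r, d0 = np.array([0.0, 0.0]), 0.3, 1.0  # center, radius, influence range
eta = 0.05

def grad_U(q):
    """Gradient of U(q) = 0.5||q - goal||^2 + barrier term near the obstacle."""
    g = q - goal                                   # attractive part
    db = np.linalg.norm(q - obstacle) - r          # distance to obstacle boundary
    if db < d0:
        # repulsive part 0.5*eta*(1/db - 1/d0)^2 pushes away from the boundary
        n = (q - obstacle) / np.linalg.norm(q - obstacle)
        g += -eta * (1.0 / db - 1.0 / d0) / db**2 * n
    return g

q = np.array([-2.0, 0.4])
min_db = np.inf
for _ in range(20000):
    g = grad_U(q)
    q = q - 0.01 * g / max(np.linalg.norm(g), 1.0)  # capped gradient step
    min_db = min(min_db, np.linalg.norm(q - obstacle) - r)

assert min_db > 0.0                     # never entered the obstacle
assert np.linalg.norm(q - goal) < 0.05  # settled at the goal
```

A true navigation function additionally guarantees a unique minimum at the goal; a plain potential field like this one can in general have local minima, which is precisely what navigation functions are designed to avoid.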

Experimental Results Multi-body motion segmentation

Wedge Formation Green follows red Blue follows red

String Formation Green follows red Blue follows green

Conclusions
- A framework for distributed formation control in the omnidirectional image plane
- An algorithm for multi-body motion segmentation in omnidirectional images
Future work
- Generalize the formation control to UAV dynamics
- Hybrid-theoretic formation switching control
- Implementation on the BEAR fleet of UGVs and UAVs