SENSOR BASED CONTROL OF AUTONOMOUS ROBOTS


SENSOR BASED CONTROL OF AUTONOMOUS ROBOTS
Robert Mahony, Department of Engineering, Australian National University
Department of Engineering Research Forum, November 2006


Active topics in robotics
- Physical control of vehicle/machine
- Sensory perception of environment
- Interaction of vehicle with environment
- AI (artificial intelligence)
- Autonomy
- SENSOR BASED CONTROL
- Co-ordination
- Data interpretation
- Multiple robots
- Multiple tasks
- Human-robot interfaces
- Inter-robot interfaces
- Data fusion
- SLAM (simultaneous localisation and mapping)
- Object recognition, sensor segmentation
Sensor based control of autonomous robots 13-May-19

Autonomy in robotic systems
An autonomous robot is capable of moving about independently within an unstructured (or partially structured) environment.
- Unstructured environment: no map is available.
- Partially structured environment: a map exists but does not contain all objects, and is not necessarily accurate.
In all cases the robot must regulate its motion with respect to the local environment.

Example: aerial robot
A common task in aerial robotics is regulation of the vehicle relative to an observed feature. Other important tasks:
- Obstacle avoidance
- Close approach and landing

Classical control approach
Classical control theory provides a standard approach to regulation problems:
1. Model the dynamics of the system.
2. Represent the dynamics in terms of a minimal state.
3. Represent the task in terms of a state error.
4. Design a control algorithm to drive the state error to zero.
5. Measure something.
6. Estimate the system state on-line.
7. Input the state estimate into the control algorithm to close the loop.
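The loop above can be sketched for a toy 1D vehicle. Everything here (the double-integrator model, the gains, the noise level, the crude low-pass estimator) is an illustrative assumption, not from the talk:

```python
import numpy as np

def observe(x, rng=np.random.default_rng(0)):
    """Noisy position measurement (stand-in for a real sensor)."""
    return x + rng.normal(scale=0.01)

def classical_loop(x=5.0, v=0.0, x_goal=0.0, dt=0.01, steps=2000):
    x_hat = 0.0                      # on-line state estimate
    for _ in range(steps):
        z = observe(x)
        x_hat += 0.5 * (z - x_hat)   # crude low-pass state estimator
        err = x_hat - x_goal         # task expressed as a state error
        u = -2.0 * err - 1.5 * v     # PD law (uses true velocity for brevity)
        v += u * dt                  # double-integrator vehicle model
        x += v * dt
    return x

print(classical_loop())              # position settles near the goal
```

Note that the controller never sees the raw measurement: it acts on the state estimate, which is exactly where the conditioning problems discussed below arise.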

Issues with the classical control approach
[Block diagram: real world → sensors → observations → state estimates → task error]
The mapping from observations to state estimate is non-linear, over-determined and ill-conditioned. Computing a state estimate from the observations requires:
- a model of the environment (SLAM)
- a model of the system dynamics
Estimates tend to be ill-conditioned when the vehicle is distant from local features. The task error, by contrast, is naturally conditioned relative to proximity to the environment, and is easy to represent in terms of sensor measurements.

Sensor based control
Sensor based control is a paradigm only subtly different from the classical approach:
1. Model the dynamics of the system.
2. Use this model to determine the dynamic response of the sensor signals, based on the expected environment.
3. Represent the task in terms of a sensor error.
4. Design a control algorithm to drive the sensor error to zero, based on analysis of the sensor dynamics.
5. Input the sensor measurements into the control algorithm to close the loop.
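For contrast with the classical loop, a minimal sketch of the same kind of regulation written directly on a sensor signal, using an assumed pinhole-camera measurement s = x/Z of a feature at lateral offset x and depth Z (the model and gain are illustrative, not from the talk):

```python
def sensor_loop(x=1.0, Z=2.0, s_goal=0.0, dt=0.01, steps=1000):
    """Drive the image coordinate s = x/Z to s_goal without ever
    reconstructing the vehicle state (x, Z)."""
    for _ in range(steps):
        s = x / Z                # sensor measurement (image coordinate)
        e = s - s_goal           # task written directly as a sensor error
        u = -1.0 * e             # fixed gain: the unknown depth Z only
                                 # scales the convergence rate, not the sign
        x += u * dt              # first-order lateral vehicle model
    return x / Z

print(sensor_loop())             # sensor error decays toward zero
```

No state is estimated anywhere in the loop: the control acts on the image coordinate itself.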

Bio-mimetic systems
One of the major motivations for sensor based control of autonomous robots is the growing evidence for simple sensor based control algorithms in biological systems. A honey bee regulates its thrust during landing approach in proportion to a measure of the divergence of the observed optic flow (Srinivasan et al. 2000, Moffit et al. 2006).
[Figure: optical flow field of a textured surface under direct approach]
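The honey bee strategy can be sketched as a one-line control law: during a direct approach at speed V toward a surface at distance d, the observed flow divergence is V/d, and regulating it toward a setpoint makes the approach speed shrink with distance, giving a smooth touchdown. The dynamics, gain and setpoint below are illustrative assumptions, not values from the cited papers:

```python
def land(d=10.0, w_ref=0.5, dt=0.01, t_max=30.0):
    """Descend until d is small, commanding thrust from the observed
    flow divergence only (d itself is never used by the controller
    except through the measured divergence w = V/d)."""
    V = w_ref * d                        # initial speed consistent with w_ref
    t = 0.0
    while d > 0.01 and t < t_max:
        w = V / d                        # observed flow divergence
        V += -2.0 * (w - w_ref) * d * dt # proportional correction; divergence
                                         # settles to a constant, so V ~ d
        d -= V * dt
        t += dt
    return d, V

d_final, V_final = land()
print(d_final, V_final)                  # both small: gentle touchdown
```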

Challenges to sensor based control
- Sensor dynamics tend to be highly non-linear, giving very challenging control problems.
- Sensor data tends to be high dimensional, much higher dimensional than the state vector, leading to non-minimal system representations.
Early work in this area has depended on finding good features (e.g. the average flow divergence s_div) that provide a low dimensional "sensor state" representation. Overcoming these problems leads to highly robust and effective task based control of autonomous systems.
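As an example of such a feature, the average divergence of an optic flow field sampled on a pixel grid can be computed in a few lines. The name s_div follows the slide; the synthetic flow field and grid are assumptions for illustration:

```python
import numpy as np

def average_flow_divergence(u, v, spacing=1.0):
    """u, v: 2-D arrays of flow components on a regular pixel grid.
    Returns the mean of du/dx + dv/dy, a scalar 'sensor state'."""
    du_dx = np.gradient(u, spacing, axis=1)
    dv_dy = np.gradient(v, spacing, axis=0)
    return float(np.mean(du_dx + dv_dy))

# Pure expansion flow about the image centre, u = a*x, v = a*y,
# has divergence 2*a everywhere
y, x = np.mgrid[-10:11, -10:11].astype(float)
s_div = average_flow_divergence(0.1 * x, 0.1 * y)
print(s_div)  # close to 0.2
```

The full flow field has hundreds of dimensions; the feature reduces it to one scalar suitable for a control law.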

Stabilisation of an aerial robot relative to an image
[Plot: observed closed-loop error evolution in the sensor based task criterion]
[Plot: regulation of position in task space, computed from an inverse pose algorithm]

Collaborators
- Peter Corke
- Tarek Hamel
- Francois Chaumette
- Odile Bourquardez
- Nicolas Guenard
(and many other honours and stagiaire students)

Dynamic image based visual servo control
Consider the problem of stabilising an aerial robot relative to some physical object whose image is easily segmented.
[Figure: observed object and its image on the spherical image plane]
The spherical centroid is the integral of the observed image over the sphere.
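A sketch of the spherical centroid computation under an assumed pinhole camera model (the focal length and principal point are made-up values, and the integral is approximated by a sum over segmented pixels):

```python
import numpy as np

def spherical_centroid(pixels, f=500.0, cx=320.0, cy=240.0):
    """pixels: (N, 2) array of segmented image coordinates (u, v).
    Back-projects each pixel onto the unit sphere and sums the
    resulting directions to form the spherical centroid vector."""
    u, v = pixels[:, 0], pixels[:, 1]
    rays = np.stack([(u - cx) / f, (v - cy) / f, np.ones_like(u)], axis=1)
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)  # points on sphere
    return rays.sum(axis=0)

# A blob symmetric about the principal point gives a centroid
# along the optical axis
q = spherical_centroid(np.array([[315., 240.], [325., 240.],
                                 [320., 235.], [320., 245.]]))
print(q)
```

Unlike a flat-image centroid, this feature remains well defined as the target moves toward the edge of the field of view, which is one reason the spherical image plane is used.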

Sensor space dynamics and control