Computer Vision Group Prof. Daniel Cremers Autonomous Navigation for Flying Robots Lecture 5.3: Reasoning with Bayes Law Jürgen Sturm Technische Universität München

The State Estimation Problem • We want to estimate the world state x from 1. sensor measurements z and 2. controls (or odometry readings) u • We need to model the relationship between these random variables, i.e., the conditional probability p(x | z, u)

Causal vs. Diagnostic Reasoning • p(x | z) is diagnostic • p(z | x) is causal • Diagnostic reasoning is typically what we need • Often causal knowledge is easier to obtain • Bayes rule allows us to use causal knowledge in diagnostic reasoning

Bayes Rule • Definition of conditional probability: p(x, z) = p(x | z) p(z) = p(z | x) p(x) • Bayes rule: p(x | z) = p(z | x) p(x) / p(z), where p(z | x) is the observation likelihood, p(x) is the prior on the world state, and p(z) is the prior on sensor observations

Normalization • Direct computation of p(z) can be difficult • Idea: compute an improper (unnormalized) distribution, then normalize afterwards • Step 1: aux(x | z) = p(z | x) p(x) • Step 2: η = 1 / Σ_x aux(x | z) • Step 3: p(x | z) = η aux(x | z)
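To make the three normalization steps concrete, here is a minimal Python sketch for a discrete state space; the two states and the probability values are invented for illustration and are not taken from the lecture.

# Bayes rule via normalization, p(x | z) = eta * p(z | x) * p(x),
# for a discrete state space represented as a dict.

def bayes_update(prior, likelihood):
    """prior: p(x) per state; likelihood: p(z | x) per state."""
    # Step 1: improper (unnormalized) distribution aux(x | z) = p(z | x) p(x)
    aux = {x: likelihood[x] * prior[x] for x in prior}
    # Step 2: normalizer eta = 1 / sum_x aux(x | z)
    eta = 1.0 / sum(aux.values())
    # Step 3: posterior p(x | z) = eta * aux(x | z)
    return {x: eta * aux[x] for x in prior}

# Hypothetical two-state example (values are assumptions):
print(bayes_update(prior={"A": 0.5, "B": 0.5},
                   likelihood={"A": 0.6, "B": 0.3}))
# {'A': 0.666..., 'B': 0.333...}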

Background Knowledge • The same derivation also works in the presence of background knowledge y • Bayes rule with background knowledge: p(x | z, y) = p(z | x, y) p(x | y) / p(z | y)

Example: Sensor Measurement • Quadrotor seeks the landing zone • Landing zone is marked with many bright lamps • Quadrotor has a brightness sensor

Example: Sensor Measurement • Binary sensor z ∈ {bright, dark} • Binary world state x ∈ {home, ¬home} (above the landing zone or not) • Sensor model p(z | x) • Prior on world state p(x) • Assume: the robot observes light, i.e., z = bright • What is the probability p(home | bright) that the robot is above the landing zone?

Example: Sensor Measurement • Sensor model p(z | x) • Prior on world state p(x) • Probability after observation (using Bayes): p(home | bright) = p(bright | home) p(home) / p(bright)
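The transcript does not preserve the slide's numeric values, so the following sketch uses assumed numbers for the sensor model and the prior purely to show how the Bayes computation goes through.

# Assumed sensor model p(z = bright | x) and prior p(x = home); these are
# illustrative values, not the numbers used in the lecture.
p_bright_given_home = 0.6
p_bright_given_not_home = 0.3
p_home = 0.5

# Total probability: p(bright) = p(bright | home) p(home) + p(bright | ~home) p(~home)
p_bright = p_bright_given_home * p_home + p_bright_given_not_home * (1.0 - p_home)

# Bayes rule: p(home | bright) = p(bright | home) p(home) / p(bright)
p_home_given_bright = p_bright_given_home * p_home / p_bright
print(p_home_given_bright)  # ~0.67 > 0.5: observing light raises the belief of being home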

Combining Evidence • Suppose our robot obtains another observation (either from the same or a different sensor) • How can we integrate this new information? • More generally, how can we estimate p(x | z_1, ..., z_n)?

Combining Evidence • Suppose our robot obtains another observation (either from the same or a different sensor) • How can we integrate this new information? • More generally, how can we estimate p(x | z_1, ..., z_n)? • Bayes formula (with background knowledge) gives us p(x | z_1, ..., z_n) = p(z_n | x, z_1, ..., z_{n-1}) p(x | z_1, ..., z_{n-1}) / p(z_n | z_1, ..., z_{n-1})

Recursive Bayesian Updates Markov Assumption: z_n is independent of z_1, ..., z_{n-1} if we know x, i.e., p(z_n | x, z_1, ..., z_{n-1}) = p(z_n | x)
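Written out, the Markov assumption turns the combined-evidence formula from the previous slide into a recursive update; the standard derivation, sketched in LaTeX:

\begin{align*}
p(x \mid z_1, \ldots, z_n)
  &= \frac{p(z_n \mid x, z_1, \ldots, z_{n-1})\, p(x \mid z_1, \ldots, z_{n-1})}{p(z_n \mid z_1, \ldots, z_{n-1})} \\
  &= \eta\, p(z_n \mid x)\, p(x \mid z_1, \ldots, z_{n-1})
     && \text{(Markov assumption)} \\
  &= \eta_{1:n} \Big[\prod_{i=1}^{n} p(z_i \mid x)\Big] p(x)
     && \text{(applied recursively)}
\end{align*}

So each new observation only requires multiplying the current belief by the new observation likelihood and renormalizing.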

Example: Sensor Measurement • Quadrotor seeks the landing zone • Landing zone is marked with many bright lamps and a visual marker

Example: Second Measurement • Sensor model p(z_2 | x) for the marker detector • Previous estimate p(x | z_1) • Assume the robot does not observe the marker, i.e., z_2 = ¬marker • What is the probability of being home, p(home | z_1, z_2)? The second observation lowers the probability that the robot is above the landing zone!
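With the assumed numbers from the earlier sketch, the second, negative observation can be folded in by the same multiply-and-normalize step; the marker-detector model below is again a hypothetical value, not the lecture's.

# Belief after the (assumed) brightness observation from the sketch above
belief = {"home": 0.667, "not_home": 0.333}

# Assumed marker detector model p(z2 = no_marker | x); illustrative values only
p_no_marker = {"home": 0.2, "not_home": 0.9}

# Recursive Bayes update: multiply by the new likelihood, then normalize
unnormalized = {x: p_no_marker[x] * belief[x] for x in belief}
eta = 1.0 / sum(unnormalized.values())
belief = {x: eta * p for x, p in unnormalized.items()}
print(belief["home"])  # ~0.31, down from 0.667: missing the marker lowers the belief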

Actions/Controls (Motions) • The world state changes over time because of • actions carried out by the robot, • actions carried out by other agents, • or just time passing by • How can we incorporate actions?

Typical Actions • A quadrotor accelerates by changing the speed of its motors • Its position also changes when the quadrotor does "nothing" (it drifts) • Actions are never carried out with absolute certainty • In contrast to measurements, actions generally increase the uncertainty of the state estimate

Action Models • To incorporate the outcome of an action u into the current state estimate ("belief"), we use the conditional pdf p(x' | u, x) • This term specifies the probability that executing the action u in state x will lead to state x'

Example: Take-Off • Action: u = take-off • World state: x ∈ {ground, air} • The action model p(x' | u, x) specifies the probabilities of transitioning between ground and air when the take-off action is executed

Integrating the Outcome of Actions • Discrete case: p(x' | u) = Σ_x p(x' | u, x) p(x) • Continuous case: p(x' | u) = ∫ p(x' | u, x) p(x) dx
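A minimal Python sketch of the discrete case, summing over all states the robot could have come from; the dict-based representation is an illustrative choice, not from the lecture.

def predict(belief, action_model):
    """Discrete prediction step: p(x') = sum_x p(x' | u, x) p(x).

    belief:       dict mapping state -> p(x)
    action_model: dict mapping (x_new, x_old) -> p(x' = x_new | u, x = x_old)
    """
    return {x_new: sum(action_model[(x_new, x_old)] * p_old
                       for x_old, p_old in belief.items())
            for x_new in belief}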

Example: Take-Off • Prior belief on the robot state: p(ground) = 1 (robot is located on the ground) • Robot executes the "take-off" action • What is the robot's belief after one time step?
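Putting the take-off example in code, with assumed transition probabilities (the slide's actual numbers are not preserved in the transcript); this mirrors the predict() sketch above, inlined for the two-state ground/air case.

# Prior belief: the robot is on the ground with certainty
belief = {"ground": 1.0, "air": 0.0}

# Assumed action model p(x' | u = take-off, x); illustrative values only
p_takeoff = {("air", "ground"): 0.9, ("ground", "ground"): 0.1,
             ("air", "air"): 1.0, ("ground", "air"): 0.0}

# Prediction step: p(x') = sum_x p(x' | take-off, x) p(x)
belief = {x_new: sum(p_takeoff[(x_new, x_old)] * p_old
                     for x_old, p_old in belief.items())
          for x_new in ("ground", "air")}
print(belief)  # with these assumed values: {'ground': 0.1, 'air': 0.9}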

Lessons Learned • Bayes rule • Data fusion of sensor observations • Data fusion of actions/motion commands • Next: Bayes filter