Course: Autonomous Machine Learning


Course: Autonomous Machine Learning
Chapter 4: Bayesian Filtering for State Estimation of the Environment (Cognitive Dynamic Systems, S. Haykin)
Nguyen Duc Lam
Social and Computer Networks Lab, School of Computer Science and Engineering, Seoul National University
http://incpaper.snu.ac.kr/

Outline
Introduction
Bayesian Filter
Conclusion
9/22/2018 IoT & SDN

Outline
Introduction
  What is Bayesian?
  Problem Statement
Bayesian Filter
Conclusion

Introduction [1/3]
What is Bayesian theory?
"In probability theory and statistics, Bayes' theorem describes the probability of an event based on conditions that might be related to the event."
Bayes' theorem gives the conditional probability of an event, P(A | B), when the "reverse" conditional probability P(B | A) is the one that is known:
P(A | B) = P(B | A) P(A) / P(B), where A and B are events and P(B) ≠ 0.
P(A | B), a conditional probability, is the probability of observing event A given that B is true.
P(B | A) is the probability of observing event B given that A is true.
P(A) and P(B) are the probabilities of observing A and B without regard to each other.

Introduction [2/3]
A worked example of Bayes' rule. Given:
Likelihood: a doctor knows that flu causes a stiff neck 50% of the time, P(S | M) = 0.5.
Prior: the probability of any patient having flu is 1/50000, P(M) = 1/50000.
Evidence: the probability of any patient having a stiff neck is 1/20, P(S) = 1/20.
If a patient has a stiff neck, what is the probability that he or she has flu?
P(M | S) = P(S | M) P(M) / P(S) = (0.5 × 1/50000) / (1/20) = 0.0002
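The flu example above can be checked in a few lines of Python; the variable names are illustrative, and the numbers are taken directly from the slide.

```python
# Bayes' rule on the slide's flu / stiff-neck example:
# P(M | S) = P(S | M) * P(M) / P(S)
p_stiff_given_flu = 0.5        # likelihood P(S | M)
p_flu = 1 / 50000              # prior P(M)
p_stiff = 1 / 20               # evidence P(S)

p_flu_given_stiff = p_stiff_given_flu * p_flu / p_stiff
print(p_flu_given_stiff)       # ≈ 0.0002
```

Note how small the posterior remains despite the 50% likelihood: the very small prior dominates.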

Introduction [3/3]
Given a state-space model of the environment:
A system equation
A measurement equation
Practical issues:
The state of the environment is hidden from the observer; information about the state is available only indirectly, through the dependence of the observables (measurements) on the state.
Both the evolution of the state across time and the measurements on the environment are corrupted by the unavoidable presence of physical uncertainties in the environment.
Solution: the Bayesian framework.
Goal: develop algorithms for solving the state-estimation problem.

Outline
Introduction
Bayesian Filter
  State-Space Model
  Sequential State Estimation
  Extended Kalman Filter
Conclusion

Bayesian Filter [1/8]
State-Space Model
System equation: x_{n+1} = a_n(x_n, ω_n), which, formulated as a first-order Markov chain, describes the evolution of the state as a function of time, where:
  n denotes discrete time;
  the vector x_n denotes the current value of the state, and x_{n+1} denotes its immediate future value;
  the vector ω_n denotes the system noise;
  a_n(., .) is a vector function of its two arguments, representing the transition from state x_n to state x_{n+1}.
Measurement equation: y_n = b_n(x_n, v_n), where:
  the vector y_n denotes a set of measurements (observables);
  the vector v_n denotes the measurement noise;
  b_n(., .) denotes another vector function.
The material on Bayesian inference presented previously provides the right background for Bayesian filtering, which is aimed at estimating the state of a dynamic system. The state of a dynamic system is defined as the minimal amount of information about the effects of past inputs applied to the system that is sufficient to completely describe its future behavior. Typically, the state is not measurable directly, as it is hidden from the perceptor. Rather, in an indirect manner, the state makes its effect on the environment (the outside world) estimatable through a set of observables. Characterization of the dynamic system is therefore given by a state-space model, which embodies this pair of equations.
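A minimal simulation sketch of the generic nonlinear state-space model above. The particular choices of a(.) and b(.) and the noise levels are illustrative assumptions, not taken from the slides; the slides only specify the general form x_{n+1} = a_n(x_n, ω_n), y_n = b_n(x_n, v_n).

```python
import math
import random

# Illustrative scalar nonlinear state-space model with additive Gaussian noise:
#   system equation:      x[n+1] = a(x[n]) + w[n]
#   measurement equation: y[n]   = b(x[n]) + v[n]

def a(x):                       # hypothetical transition function a_n(.)
    return 0.9 * x + math.sin(x)

def b(x):                       # hypothetical measurement function b_n(.)
    return x ** 2 / 20.0

def simulate(steps, x0=0.1, q=0.1, r=0.5, seed=0):
    rng = random.Random(seed)
    x, xs, ys = x0, [], []
    for _ in range(steps):
        x = a(x) + rng.gauss(0.0, q)   # system noise w[n]
        y = b(x) + rng.gauss(0.0, r)   # measurement noise v[n]
        xs.append(x)
        ys.append(y)
    return xs, ys

states, measurements = simulate(50)
print(len(states), len(measurements))
```

The filter's task is to recover the hidden sequence `states` given only `measurements`.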

Bayesian Filter [2/8]
State-Space Model
Assumptions:
The initial state x_0 is uncorrelated with the system noise ω_n for all n.
The two sources of noise, ω_n and v_n, are statistically independent of each other: E[ω_n v_k^T] = 0 for all n and k. This condition is sufficient for independence when v_n and ω_n are jointly Gaussian.
[Figure: generic state-space model of a time-varying, nonlinear dynamic system, where z^{-1} denotes a block of unit-time delays.]
Although the state is indeed hidden from the perceptor, the environment does provide information about the state through measurements (observables), which prompts the following statement: given a record of measurements y_1, y_2, ..., y_n, the requirement is to compute an estimate of the unknown state x_k that is optimal in some statistical sense, with the estimation being performed in a sequential manner. This statement embodies two systems:
The unknown dynamic system, whose observable y_n is a function of the hidden state.
The sequential state-estimator, or filter, which exploits the information about the state that is contained in the observables.

Bayesian Filter [3/8]
The Sequential State-Estimation Problem
Given the observations y_1, ..., y_n, estimate the state x_k. The problem is commonly referred to as:
Prediction if k > n
Filtering if k = n
Smoothing if k < n
Typically, a smoother is statistically more accurate than both the predictor and the filter, as it uses more observables, past and present. On the other hand, both prediction and filtering can be performed in real time, whereas smoothing cannot.

Bayesian Filters [4/8]
Framework
Given:
A stream of observations z and action data u: d_t = {u_1, z_1, ..., u_t, z_t}
Sensor model P(z | x)
Action model P(x | u, x')
Prior probability of the system state P(x)
Wanted: an estimate of the state x of the dynamical system.
The posterior of the state is also called the belief:
Bel(x_t) = P(x_t | u_1, z_1, ..., u_t, z_t)

Bayesian Filters [5/8]
Markov Assumption
The measurement z_t depends only on the current state: P(z_t | x_{0:t}, z_{1:t-1}, u_{1:t}) = P(z_t | x_t).
The state x_t depends only on the previous state and action: P(x_t | x_{0:t-1}, z_{1:t-1}, u_{1:t}) = P(x_t | x_{t-1}, u_t).
Underlying assumptions:
Static world
Independent noise
Perfect model, no approximation errors

Bayesian Filters [6/8]
Notation: z = observation, u = action, x = state.
The recursive belief update follows from Bayes' rule, the Markov assumption, and the theorem of total probability:
Bel(x_t) = P(x_t | u_1, z_1, ..., u_t, z_t)
         = η P(z_t | x_t, u_1, z_1, ..., u_t) P(x_t | u_1, z_1, ..., u_t)          (Bayes)
         = η P(z_t | x_t) P(x_t | u_1, z_1, ..., u_t)                              (Markov)
         = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, ..., z_{t-1}) dx_{t-1}   (Total prob., Markov)
         = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}
9/22/2018 IoT & SDN
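The recursion Bel(x_t) = η P(z_t | x_t) Σ_{x'} P(x_t | u_t, x') Bel(x') can be implemented directly when the state space is discrete. The following sketch uses a made-up three-state world with a hypothetical "move right" action model and a hypothetical sensor model; the numbers are illustrative assumptions, not from the slides.

```python
states = [0, 1, 2]

# Action model P(x | u, x'): a single "move right" action that succeeds
# with probability 0.8 and leaves the state unchanged with probability 0.2.
def p_motion(x, x_prev):
    if x == (x_prev + 1) % 3:
        return 0.8
    if x == x_prev:
        return 0.2
    return 0.0

# Sensor model P(z | x): reports the true state with probability 0.7.
def p_sensor(z, x):
    return 0.7 if z == x else 0.15

def bayes_filter_step(bel, z):
    # Time update (prediction): bel_bar(x) = sum_x' P(x | u, x') * bel(x')
    bel_bar = [sum(p_motion(x, xp) * bel[xp] for xp in states) for x in states]
    # Measurement update: bel(x) = eta * P(z | x) * bel_bar(x)
    unnorm = [p_sensor(z, x) * bel_bar[x] for x in states]
    eta = 1.0 / sum(unnorm)
    return [eta * u for u in unnorm]

bel = [1 / 3, 1 / 3, 1 / 3]          # uniform prior
bel = bayes_filter_step(bel, z=1)    # observe state 1
print(bel)
```

Starting from a uniform prior, the prediction step leaves the belief uniform, so the observation z = 1 dominates and the posterior concentrates on state 1.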

Bayesian Filter [7/8]
The Bayesian Filter
Optimality of the Bayesian filter:
The adoption of a Bayesian filter to solve the state estimation of a dynamic system, be it linear or nonlinear, is motivated by the fact that it provides a general unifying framework for sequential state estimation, at least in a conceptual sense.
The filter operates in a recursive manner by propagating the posterior p(x_n | Y_n) from one recursion to the next.
Knowledge about the state x_n, extracted from the entire observations process Y_n by the filter, is completely contained in the posterior p(x_n | Y_n), which is the "best" that can be achieved, at least in a conceptual sense.
The time update and the measurement update are both carried out at every time step throughout the computation of the Bayesian filter. In effect, they constitute a computational recursion of the filter, as depicted in Figure 4.6; the normalizing factor Z_n has been left out of the figure for convenience of presentation.

Bayesian Filter [8/8]
Approximations of the Bayesian Filter
Direct numerical approximation of the posterior: Kalman filter theory
Indirect numerical approximation of the posterior: Monte Carlo methods, i.e., particle filters
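As a taste of the indirect (Monte Carlo) approximation, here is a minimal bootstrap particle filter for a scalar random-walk state observed in Gaussian noise. The model, noise levels, and particle count are all illustrative assumptions; the slides only name particle filters as one approximation family.

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    # Gaussian density, used as the likelihood P(z | x)
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def particle_filter_step(particles, z, rng, q=0.5, r=1.0):
    # Time update: propagate each particle through the system model (random walk)
    moved = [p + rng.gauss(0.0, q) for p in particles]
    # Measurement update: weight each particle by the likelihood P(z | x)
    weights = [gauss_pdf(z, p, r) for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resampling: draw a new, equally weighted particle set
    return rng.choices(moved, weights=weights, k=len(moved))

rng = random.Random(1)
particles = [rng.gauss(0.0, 2.0) for _ in range(500)]   # samples from the prior
for z in [0.5, 0.7, 0.9]:                               # a short measurement stream
    particles = particle_filter_step(particles, z, rng)
estimate = sum(particles) / len(particles)              # posterior mean estimate
print(round(estimate, 2))
```

After a few measurements near 0.7, the particle cloud (and hence the posterior-mean estimate) contracts toward that region of the state space.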

Outline
Introduction
Bayesian Filter
Conclusion

Conclusion [1/1]
Overview of Bayes' theorem
State-space model
Bayesian filter for state estimation

THANK YOU
Q&A

Appendix
The Bayesian filter recursion comprises two updates:
Time update: compute the predictive distribution of x_n given the observation sequence Y_{n-1}:
p(x_n | Y_{n-1}) = ∫ p(x_n | x_{n-1}) p(x_{n-1} | Y_{n-1}) dx_{n-1}
Measurement update: correct the prediction using the new observation y_n:
p(x_n | Y_n) = p(y_n | x_n) p(x_n | Y_{n-1}) / Z_n, where Z_n = ∫ p(y_n | x_n) p(x_n | Y_{n-1}) dx_n is the normalizing factor.
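For the linear-Gaussian special case, both appendix updates have closed form, which is exactly the Kalman filter (the "direct" approximation named earlier). The scalar model constants below are illustrative assumptions.

```python
# Scalar linear-Gaussian model:  x[n] = a*x[n-1] + w,  y[n] = h*x[n] + v,
# with w ~ N(0, q) and v ~ N(0, r). Both Bayesian-filter updates stay Gaussian.
def kalman_step(mean, var, z, a=1.0, q=0.1, h=1.0, r=0.5):
    # Time update: p(x_n | Y_{n-1}) is Gaussian with these moments
    mean_pred = a * mean
    var_pred = a * a * var + q
    # Measurement update: combine the prediction with the likelihood p(y_n | x_n)
    k = var_pred * h / (h * h * var_pred + r)   # Kalman gain
    mean_post = mean_pred + k * (z - h * mean_pred)
    var_post = (1.0 - k * h) * var_pred
    return mean_post, var_post

mean, var = 0.0, 1.0                 # prior p(x_0)
for z in [1.0, 0.9, 1.1]:            # a short measurement stream
    mean, var = kalman_step(mean, var, z)
print(round(mean, 3), round(var, 3))
```

Each measurement pulls the mean toward the observations and shrinks the posterior variance, mirroring the time-update/measurement-update cycle of the general Bayesian filter.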