Part 2 of 3: Bayesian Network and Dynamic Bayesian Network.

1 Part 2 of 3: Bayesian Network and Dynamic Bayesian Network

2 References and Sources of Figures
Part 1: Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, 2nd ed., Prentice Hall, Chapter 13.
Part 2: Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, 2nd ed., Prentice Hall, Chapters 14, 15 & 25; Sebastian Thrun, Wolfram Burgard, and Dieter Fox, Probabilistic Robotics, Chapters 2 & 7.
Part 3: Sebastian Thrun, Wolfram Burgard, and Dieter Fox, Probabilistic Robotics, Chapter 2.

3 Bayesian Network A data structure to represent dependencies among variables; a directed graph in which each node is annotated with quantitative probability information.

4 An Example of a Simple Bayesian Network Nodes: Weather, Cavity, Toothache, Catch, with arcs Cavity → Toothache and Cavity → Catch. Weather is independent of the other three variables. Toothache and Catch are conditionally independent, given Cavity.

5 Bayesian Network Full specification: 1. A set of random variables makes up the nodes of the network. 2. A set of directed links or arrows connects pairs of nodes (parent → child). 3. Each node X_i has a conditional probability distribution P(X_i | Parents(X_i)) that quantifies the effect of the parents on the node. 4. The graph is a directed acyclic graph (DAG), i.e. it has no directed cycles.
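
The specification above can be sketched as a small data structure. The sketch below uses the Cavity/Toothache/Catch network from the earlier slide; the CPT numbers are illustrative assumptions, not from the slides.

```python
# A minimal Bayesian network: each node stores its parents and a CPT
# mapping a tuple of parent values to P(node = True).
# All probability values here are invented for illustration.
network = {
    "Cavity":    {"parents": [], "cpt": {(): 0.2}},
    "Toothache": {"parents": ["Cavity"], "cpt": {(True,): 0.6, (False,): 0.1}},
    "Catch":     {"parents": ["Cavity"], "cpt": {(True,): 0.9, (False,): 0.2}},
}

def joint_probability(network, assignment):
    """Chain rule for BNs: P(x1..xn) = product over i of P(xi | Parents(xi))."""
    p = 1.0
    for node, spec in network.items():
        parent_vals = tuple(assignment[par] for par in spec["parents"])
        p_true = spec["cpt"][parent_vals]
        p *= p_true if assignment[node] else 1.0 - p_true
    return p

print(round(joint_probability(
    network, {"Cavity": True, "Toothache": True, "Catch": True}), 3))
# 0.2 * 0.6 * 0.9 = 0.108
```

The full joint distribution factors into the local conditional distributions, which is what makes the representation compact: each node only needs a table over its own parents.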

6 Implementation of BN Open source BN software: Java Bayes. Commercial BN software: MS Bayes, Netica.

7 Teaching and Research Tools in Academic Environments GeNIe: developed at the Decision Systems Laboratory, University of Pittsburgh; runs only on Windows computers.

8 Demo

9 An Example of Bayesian Network Nodes: Burglary, Lightning, Alarm, JohnCalls, MaryCalls. Demo using GeNIe

10 An Application of Bayesian Network Horvitz et al. (Microsoft Research), The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users. ftp://ftp.research.microsoft.com/pub/ejh/lum.pdf

12 Dynamic Bayesian Network: Probabilistic Reasoning Over Time

13 Basic Ideas The process of change can be viewed as a series of snapshots. Each snapshot (called a time slice) describes the state of the world at a particular time. Each time slice contains a set of random variables, some of which are observable and some of which are not.

15 DBN with Evolution of States, Controls, and Measurements

16 Terminology

17 State Environments are characterized by state Think of the state as the collection of all aspects of the robot and its environment that can impact the future

18 State Examples: the robot's pose (location and orientation); variables for the configuration of the robot's actuators (e.g. joint angles); robot velocity and velocities of its joints; location and features of surrounding objects in the environment; location and velocities of moving objects and people; whether or not a sensor is broken; and the level of its battery charge.

19 State Denoted as x. x_t: the state at time t.

20 Environment measurement data Also called evidence: information about a momentary state of the environment. Examples: –camera images –range scans

21 Environment measurement data Denoted as z. z_t: the measurement data at time t. z_{t1:t2} denotes the set of all measurements acquired from time t1 to time t2.

22 Control data convey information regarding change of state in the environment related to actuation Examples: –velocity of a robot (suggests the robot's pose after executing a motion command) –odometry (measure of the effect of a control action)

23 Control data Denoted as u. u_t: the change of state in the time interval (t-1, t]. u_{t1:t2} denotes a sequence of control data from time t1 to time t2.

24 Scenario A mobile robot uses its camera to detect the state of the door (open or closed). The camera is noisy: –if the door is in fact open, the probability of detecting it open is 0.6 –if the door is in fact closed, the probability of detecting it closed is 0.8

25 Scenario The robot can use its manipulator to push open the door. If the door is in fact closed, the probability of the robot opening it is 0.8.

26 Scenario At time t0, the probability of the door being open is 0.5. Suppose at t1 the robot takes no control action but it senses an open door; what is the probability that the door is open?

27 Scenario Using the Bayes filter, we will see that: –at time t1, the probability that the door is open is 0.75 after taking a measurement –at time t2, the probability that the door is open is ≈ 0.983 after the robot pushes open the door and takes another measurement
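
These numbers can be checked with a few lines of arithmetic, using only the sensor and action models stated in the scenario; the second value works out to 0.57/0.58 ≈ 0.983.

```python
# Checking the door-scenario numbers with a two-state Bayes filter by hand.
# Sensor model from the slides: P(sense open | open) = 0.6, P(sense closed | closed) = 0.8
p_z_open = {"open": 0.6, "closed": 0.2}          # P(z = open | state)
# Action model: pushing keeps an open door open, opens a closed one with prob. 0.8
p_open_after_push = {"open": 1.0, "closed": 0.8}

prior_open = 0.5                                  # belief at t0

# t1: no control action, sensor reads "open" -> measurement update only
num = p_z_open["open"] * prior_open
bel_t1 = num / (num + p_z_open["closed"] * (1 - prior_open))
print(round(bel_t1, 3))                           # 0.75

# t2: push the door (prediction step), then sense "open" again (update step)
bel_bar = p_open_after_push["open"] * bel_t1 + p_open_after_push["closed"] * (1 - bel_t1)
num = p_z_open["open"] * bel_bar
bel_t2 = num / (num + p_z_open["closed"] * (1 - bel_bar))
print(round(bel_t2, 3))                           # 0.983
```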

28 DBN with Evolution of States, Controls, and Measurements for the Mobile Robot Example x_t: state of the door (open or closed) at time t; u_t: control data (robot's manipulator pushes open or does nothing) at time t; z_t: evidence or measurement by sensors at time t.

29 Demo using GeNIe

30 Basic Idea of the Algorithm of Bayes Filter
Bayes_filter(bel(x_{t-1}), u_t, z_t):
  for all x_t do
    bel_bar(x_t) = Σ_{x_{t-1}} p(x_t | u_t, x_{t-1}) bel(x_{t-1})   // predict x_t after exerting u_t
    bel(x_t) = η p(z_t | x_t) bel_bar(x_t)                          // update belief of x_t after making a measurement z_t
  endfor
  return bel(x_t)
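
A minimal discrete-state sketch of this loop; the dictionary layout of the transition and measurement models is my own, not the book's code.

```python
def bayes_filter(bel, u, z, states, p_trans, p_meas):
    """One step of the discrete Bayes filter.

    bel:     dict state -> belief at time t-1
    p_trans: p_trans[u][x_prev][x] = P(x_t = x | u_t = u, x_{t-1} = x_prev)
    p_meas:  p_meas[x][z] = P(z_t = z | x_t = x)
    """
    # Prediction: incorporate the control u_t
    bel_bar = {x: sum(p_trans[u][xp][x] * bel[xp] for xp in states)
               for x in states}
    # Correction: incorporate the measurement z_t, then normalize (eta)
    unnorm = {x: p_meas[x][z] * bel_bar[x] for x in states}
    eta = sum(unnorm.values())
    return {x: v / eta for x, v in unnorm.items()}

# Door example, step t2: start from bel(open) = 0.75, push, then sense "open"
states = ("open", "closed")
p_trans = {"push": {"open": {"open": 1.0, "closed": 0.0},
                    "closed": {"open": 0.8, "closed": 0.2}}}
p_meas = {"open": {"open": 0.6, "closed": 0.4},
          "closed": {"open": 0.2, "closed": 0.8}}
bel = bayes_filter({"open": 0.75, "closed": 0.25}, "push", "open",
                   states, p_trans, p_meas)
print(round(bel["open"], 3))   # 0.983
```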

31 The subsequent slides explain how Bayes' rule is applied in this filter.

32 Inference in Temporal Models

33 Basic Inference Tasks in Probabilistic Reasoning Filtering or monitoring Prediction Smoothing or hindsight Most likely explanation

34 Filtering, or monitoring The task of computing the belief state: the posterior distribution over the current state, given all evidence to date, i.e. compute P(X_t | z_{1:t}).

35 Prediction The task of computing the posterior distribution over the future state, given all evidence to date, i.e. compute P(X_{t+k} | z_{1:t}) for some k > 0, or P(X_{t+1} | z_{1:t}) for one-step prediction.

36 Smoothing, or hindsight The task of computing the posterior distribution over a past state, given all evidence to date, i.e. compute P(X_k | z_{1:t}) for some 0 ≤ k < t.

37 Most likely explanation Given a sequence of observations, we want to find the sequence of states that is most likely to have generated those observations, i.e. compute argmax_{x_{1:t}} P(x_{1:t} | z_{1:t}).
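
This task is solved by the Viterbi algorithm. Here is a sketch for the two-state door model; the transition probabilities below are invented for illustration (the slides give none), and only the sensor model matches the scenario slides.

```python
def viterbi(states, prior, p_trans, p_meas, observations):
    """Most likely state sequence: argmax over x_{1:t} of P(x_{1:t} | z_{1:t})."""
    # delta[x] = probability of the best path ending in state x
    delta = {x: prior[x] * p_meas[x][observations[0]] for x in states}
    back = []                                   # back[t][x] = best predecessor of x
    for z in observations[1:]:
        prev, delta, ptr = delta, {}, {}
        for x in states:
            best = max(states, key=lambda xp: prev[xp] * p_trans[xp][x])
            delta[x] = prev[best] * p_trans[best][x] * p_meas[x][z]
            ptr[x] = best
        back.append(ptr)
    # Reconstruct the best path backwards from the most likely final state
    last = max(states, key=lambda x: delta[x])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

states = ("open", "closed")
prior = {"open": 0.5, "closed": 0.5}
p_trans = {"open": {"open": 0.9, "closed": 0.1},      # assumed "sticky" door
           "closed": {"open": 0.1, "closed": 0.9}}
p_meas = {"open": {"open": 0.6, "closed": 0.4},       # sensor model from the slides
          "closed": {"open": 0.2, "closed": 0.8}}
path = viterbi(states, prior, p_trans, p_meas, ["open", "closed", "closed"])
print(path)   # ['closed', 'closed', 'closed']
```

Note that the two "closed" readings outweigh the single "open" reading: the most likely explanation treats the first observation as sensor noise rather than paying the cost of a state change.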

38 Review Bayes' rule: P(x | y) = P(y | x) P(x) / P(y).

39 Review Conditioning For any sets of variables Y and Z: P(Y) = Σ_z P(Y | z) P(z). Read as: Y is conditioned on the variable Z. Often referred to as the theorem of total probability.
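
A quick numeric check of the identity, using the door sensor model and the 0.5 prior from the scenario slides:

```python
# Theorem of total probability: P(z) = sum over x of P(z | x) P(x)
p_state = {"open": 0.5, "closed": 0.5}        # prior over door states
p_sense_open = {"open": 0.6, "closed": 0.2}   # P(z = open | x), from the slides

p_z_open = sum(p_sense_open[x] * p_state[x] for x in p_state)
print(round(p_z_open, 3))   # 0.6*0.5 + 0.2*0.5 = 0.4
```

This is exactly the normalizer that appears in the denominator of the Bayes filter's measurement update.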

40 DBN with Evolution of States and Measurements (to be used in the explanation of filtering and prediction tasks in the subsequent slides) x_t: state of the door (open or closed) at time t; z_t: evidence or measurement by sensors at time t.

41 The Task of Filtering To update the belief state P(X_{t+1} | z_{1:t+1}) by computing it from the current state estimate P(X_t | z_{1:t}).

42 The Task of Filtering P(X_{t+1} | z_{1:t+1}) = P(X_{t+1} | z_{1:t}, z_{t+1}) = α P(z_{t+1} | X_{t+1}) P(X_{t+1} | z_{1:t}), by Bayes' rule and the Markov assumption on the measurement.

45 The probability distribution of the state at t+1, given the measurements (evidence) to date, P(X_{t+1} | z_{1:t}), i.e. it is a one-step prediction for the next state.

46 The Task of Filtering By conditioning on the current state x_t, this term becomes: P(X_{t+1} | z_{1:t}) = Σ_{x_t} P(X_{t+1} | x_t) P(x_t | z_{1:t}).

47 The Task of Filtering To update the belief state P(X_{t+1} | z_{1:t+1}) by computing it from the current state estimate P(X_t | z_{1:t}).

48 The Task of Filtering Putting the steps together: P(X_{t+1} | z_{1:t+1}) = α P(z_{t+1} | X_{t+1}) Σ_{x_t} P(X_{t+1} | x_t) P(x_t | z_{1:t}).

49 The robot's belief state: The posterior over the state variables X at time t+1 is calculated recursively from the corresponding estimate one time step earlier

50 The Task of Filtering Most modern localization algorithms use one of two representations of the robot's belief: the Kalman filter, or the particle filter (in localization called Monte Carlo localization, MCL).

51 The Task of Filtering Kalman filter: represents the belief state as a single multivariate Gaussian.
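
A one-dimensional sketch of the two Kalman steps; the scalar state and the noise variances are assumed for illustration (real localization uses the multivariate form with matrices in place of the scalars).

```python
def kalman_predict(mu, var, u, motion_var):
    """Prediction: shift the Gaussian mean by the control, inflate the variance."""
    return mu + u, var + motion_var

def kalman_update(mu, var, z, meas_var):
    """Correction: fuse the predicted Gaussian with a Gaussian measurement."""
    k = var / (var + meas_var)                 # Kalman gain
    return mu + k * (z - mu), (1 - k) * var

mu, var = kalman_predict(0.0, 1.0, 2.0, 0.5)   # move +2 with noisy motion
mu, var = kalman_update(mu, var, 2.4, 1.5)     # then observe position 2.4
print(round(mu, 3), round(var, 3))             # 2.2 0.75
```

The belief stays a single Gaussian throughout, so only the mean and variance are stored; the gain k balances the prediction against the measurement according to their variances.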

52 The Task of Filtering Particle filter: represents the belief state as a collection of particles that correspond to states.
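
A sketch of one predict-weight-resample cycle; the helper names and the particle count are assumptions for illustration, with the door scenario's models plugged in.

```python
import random

def particle_filter_step(particles, u, z, sample_motion, measurement_prob):
    """One particle filter step: move, weight, resample.

    sample_motion(x, u) draws the next state from the motion model;
    measurement_prob(x, z) returns P(z_t | x_t).
    """
    moved = [sample_motion(x, u) for x in particles]               # prediction
    weights = [measurement_prob(x, z) for x in moved]              # correction
    return random.choices(moved, weights=weights, k=len(moved))    # resampling

# Door example: 1000 particles with 75% initially "open" (as after slide 27's t1)
random.seed(1)
particles = ["open"] * 750 + ["closed"] * 250
push = lambda x, u: "open" if x == "open" or random.random() < 0.8 else "closed"
sense = lambda x, z: ({"open": 0.6, "closed": 0.2}[x] if z == "open"
                      else {"open": 0.4, "closed": 0.8}[x])
particles = particle_filter_step(particles, "push", "open", push, sense)
print(particles.count("open") / len(particles))   # close to 0.983, up to sampling noise
```

Instead of a closed-form density, the belief is the particle set itself; the fraction of "open" particles approximates the same posterior the exact Bayes filter computes.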

53 The Task of Filtering in Localization To update the belief of the robot's state, given the map m, the controls, and the measurements. Shaded nodes: information that is given; white nodes: information you want to find.

54 The Task of Filtering in Mapping To update the belief of the map m, given the robot's states and measurements. Shaded nodes: information that is given; white nodes: information you want to find.

55 The Task of Filtering

56 An Example Graphical Solution of Extended Kalman Filter

57 An Example Graphical Solution of Particle Filter

58 An Example Implementation of EKF

