Temporal Models Template Models Representation Probabilistic Graphical

Distributions over Trajectories
Pick time granularity Δ
X(t) – variable X at time t
X(t:t') = {X(t), …, X(t')}  (t ≤ t')
Want to represent P(X(t:t')) for any t, t'

Markov Assumption
(X(t+1) ⊥ X(0:(t−1)) | X(t)), i.e., P(X(t+1) | X(0:t)) = P(X(t+1) | X(t))
Counter-example: location and velocity — if the state records only location, the next location also depends on velocity, which is only recoverable from earlier locations.
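The Markov assumption can be illustrated with a small sketch: under it, a trajectory's probability factors as P(X(0:T)) = P(X(0)) · ∏ₜ P(X(t+1) | X(t)), each factor conditioning only on the immediately preceding state. The two-state weather chain below is invented for illustration (note that reusing the same TRANS table at every step is exactly the time-invariance assumption discussed next).

```python
# Invented two-state weather chain, for illustration only.
INIT = {"sun": 0.6, "rain": 0.4}
TRANS = {"sun": {"sun": 0.8, "rain": 0.2},
         "rain": {"sun": 0.4, "rain": 0.6}}

def trajectory_prob(traj):
    """P(X(0:T)) under the Markov assumption: each factor conditions
    only on the immediately preceding state."""
    p = INIT[traj[0]]
    for prev, nxt in zip(traj, traj[1:]):
        p *= TRANS[prev][nxt]  # P(X(t+1) | X(t)), same table at every t
    return p
```

For example, the trajectory sun → sun → rain has probability 0.6 × 0.8 × 0.2 = 0.096.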

Time Invariance
Template probability model P(X' | X)
For all t: P(X(t+1) | X(t)) = P(X' | X)
Counter-example: day of week, time of day — e.g., traffic dynamics differ between weekdays and weekends, so the transition model is not the same at every t.

Template Transition Model
[2TBN fragment — time slice t: Weather, Velocity, Location, Failure; time slice t+1: Weather', Velocity', Location', Failure', Obs']

Initial State Distribution
[Time slice 0: Weather0, Velocity0, Location0, Failure0, Obs0]

Ground Bayesian Network
[Unrolled network over time slices 0–2: Weather, Velocity, Location, Failure, Obs copied in each slice t = 0, 1, 2]

2-Time-Slice Bayesian Network
A transition model (2TBN) over X1,…,Xn is specified as a BN fragment such that:
- the nodes include X1',…,Xn' and a subset of X1,…,Xn
- only the nodes X1',…,Xn' have parents and a CPD
The 2TBN defines a conditional distribution P(X' | X) = ∏i P(Xi' | Pa(Xi'))
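The product form of the 2TBN distribution can be sketched directly: one CPD factor per next-slice variable, with each parent drawn from either the previous or the current slice. The dict-of-tuples CPD encoding and the one-variable weather example below are invented toy representations, not any real library's API.

```python
# Example 2TBN with a single template variable W: W' depends on the
# previous-slice W. Each CPD is (parent list, table); every parent is
# tagged with the slice it lives in ("prev" or "next").
CPDS = {
    "W": (
        [("prev", "W")],
        {("sun",): {"sun": 0.8, "rain": 0.2},
         ("rain",): {"sun": 0.4, "rain": 0.6}},
    ),
}

def twotbn_prob(prev, nxt, cpds):
    """P(X' = nxt | X = prev) = product over i of P(Xi' | Pa(Xi'))."""
    p = 1.0
    for var, (parent_list, table) in cpds.items():
        # Look up each parent's value in the slice it belongs to.
        pa = tuple(prev[n] if s == "prev" else nxt[n] for s, n in parent_list)
        p *= table[pa][nxt[var]]
    return p
```

With this toy model, twotbn_prob({"W": "sun"}, {"W": "rain"}, CPDS) evaluates the single factor P(W' = rain | W = sun) = 0.2.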

Dynamic Bayesian Network
A dynamic Bayesian network (DBN) over X1,…,Xn is defined by:
- a 2TBN BN over X1,…,Xn
- a Bayesian network BN(0) over X1(0),…,Xn(0)

Ground Network
For a trajectory over 0,…,T we define a ground (unrolled) network such that:
- the dependency model for X1(0),…,Xn(0) is copied from BN(0)
- the dependency model for X1(t),…,Xn(t) for all t > 0 is copied from the 2TBN BN
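The unrolling step above is mechanical and can be sketched as a short function: slice 0 copies BN(0), and each later slice copies the 2TBN, renaming a template variable X to (X, t) and its previous-slice parents to (X, t−1). The structure encodings (bn0_parents, tbn_parents) are invented for this sketch, not a real API.

```python
def unroll(bn0_parents, tbn_parents, T):
    """Return the ground network's parent sets over variables (name, t).

    bn0_parents: var -> list of parent names (all within slice 0).
    tbn_parents: var -> list of ("prev"/"next", name) template parents.
    """
    ground = {}
    # Slice 0 is copied from BN(0).
    for var, pas in bn0_parents.items():
        ground[(var, 0)] = [(p, 0) for p in pas]
    # Slices 1..T are copies of the 2TBN with indices shifted.
    for t in range(1, T + 1):
        for var, pas in tbn_parents.items():
            ground[(var, t)] = [
                (n, t - 1) if s == "prev" else (n, t) for s, n in pas
            ]
    return ground
```

For instance, unrolling a model with a state W and an observation O (O depends on the same-slice W, W on the previous-slice W) for T = 2 yields six ground variables, with (W, 2) having parent (W, 1).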

Tim Huang, Dieter Koller, Jitendra Malik, Gary Ogasawara, Bobby Rao, Stuart Russell, J. Weber

Hidden Markov Models
[Figure: state transition diagram over s1–s4 with transition probabilities 0.5, 0.7, 0.3, 0.4, 0.6, 0.1, 0.9; the 2TBN fragment S → S', S' → O'; and the unrolled network S0 → S1 → S2 → S3 with observations O1, O2, O3]
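An HMM is a DBN with a single hidden state chain and a per-slice observation, so forward sampling follows the unrolled structure directly: sample S(t) from S(t−1), then O(t) from S(t). The transition and emission numbers below are invented for illustration and are not the values from the slide's diagram.

```python
import random

# Invented toy HMM: two hidden states, two observation symbols.
TRANS = {"s1": {"s1": 0.5, "s2": 0.5}, "s2": {"s2": 0.7, "s1": 0.3}}
EMIT = {"s1": {"a": 0.9, "b": 0.1}, "s2": {"a": 0.2, "b": 0.8}}

def draw(dist, rng):
    """Sample one value from a {value: probability} dict."""
    r, cum = rng.random(), 0.0
    for val, p in dist.items():
        cum += p
        if r < cum:
            return val
    return val  # guard against floating-point round-off

def sample_hmm(s0, T, seed=0):
    """Sample hidden states S(0:T) and observations O(1:T) from S(0)=s0."""
    rng = random.Random(seed)
    states, obs = [s0], []
    for _ in range(T):
        states.append(draw(TRANS[states[-1]], rng))  # S(t) | S(t-1)
        obs.append(draw(EMIT[states[-1]], rng))      # O(t) | S(t)
    return states, obs
```

Note the two-step loop body mirrors the 2TBN: a transition CPD for S' and an emission CPD for O'.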

Consider a smoke-detection tracking application with 3 rooms connected in a row. Each room has a true smoke level (X) and a smoke level (Y) measured by a smoke detector situated in the middle of the room. Which of the following is the best DBN structure for this problem?
[Four candidate DBN structures over X1, X2, X3 and Y'1, Y'2, Y'3 shown as diagrams]

Summary
DBNs are a compact representation for encoding structured distributions over arbitrarily long temporal trajectories. They make assumptions that may require appropriate model (re)design:
- Markov assumption
- Time invariance