Knowledge Repn. & Reasoning Lec #11: Partitioning & Treewidth UIUC CS 498: Section EA Professor: Eyal Amir Fall Semester 2004.

Last Time
Resolution strategies:
–Deletion of clauses
–Restricting resolution to some pairs of clauses
–Ordering resolution between clauses
Some strategies are refutation complete in general; others are refutation complete only for Horn theories

Today
1. We can partition reasoning while not hurting soundness and completeness
2. How to partition a KB with the best computational benefit, still maintaining soundness & completeness
3. Application du jour: Planning

High-Level Structure in First-Order Logic
key ∧ ¬locked → can_open
can_open ∧ open → opened
opened ∧ fetch → broom
key ∨ opened
open ∨ opened ∨ broom ∨ fetch
broom ∧ dry → can_clean
can_clean → cleaned
broom ∧ let_dry → dry
time → let_dry
drier → let_dry
time ∨ drier

key ∧ ¬locked → can_open
can_open ∧ open → opened
opened ∧ fetch → broom
key ∨ opened
open ∨ opened ∨ broom ∨ fetch
broom ∧ dry → can_clean
can_clean → cleaned
broom ∧ let_dry → dry
time → let_dry
drier → let_dry
time ∨ drier
broom

Structured First-Order Reasoning
Craig's interpolation theorem (First-Order Logic):
–If A ⊢ B, then there is a formula C including only symbols from L(A) ∩ L(B) such that A ⊢ C and C ⊢ B
(Figure: the partitioned example theory, with goal clean)
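
A small propositional instance (not from the slides) may help make the theorem concrete:

```latex
% Illustrative propositional instance of Craig interpolation (not from the slides).
% A = p \land (p \to q),  B = q \lor r,  so  L(A) = \{p,q\},  L(B) = \{q,r\}.
% A \vdash B, and the interpolant C = q uses only L(A) \cap L(B) = \{q\}:
\[
  \underbrace{p \land (p \to q)}_{A} \;\vdash\; \underbrace{q}_{C} \;\vdash\; \underbrace{q \lor r}_{B},
  \qquad L(C) \subseteq L(A) \cap L(B) = \{q\}.
\]
```

In the lecture's running example, a formula such as broom (or Has(broom) in the first-order version) plays exactly this role over the link between the two partitions.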

High-Level Structure in First-Order Logic
key ∧ ¬locked → can_open
can_open ∧ open → opened
opened ∧ fetch → broom
key ∨ opened
open ∨ opened ∨ broom ∨ fetch
broom ∧ dry → can_clean
can_clean → cleaned
broom ∧ let_dry → dry
time → let_dry
drier → let_dry
time ∨ drier
broom

Structured First-Order Reasoning
Craig's interpolation theorem (First-Order Logic):
–If A ⊢ B, then there is a formula C including only symbols from L(A) ∩ L(B) such that A ⊢ C and C ⊢ B
(Figure: example with goal clean; message passed over the link: broom)

Reasoning with Partitions Using MP
MP Algorithm:
1. Start with a tree-decomposition partition graph
2. Identify goal partition
3. Direct edges toward goal (fixing outbound link language Li for each partition)
4. Concurrently, in each partition:
–Generate consequences in Li
–Pass messages in Li toward goal
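
Here is a minimal Python sketch (not from the slides) of this loop; the data layout and the find_consequences placeholder are illustrative assumptions, standing in for a sound and complete Li-consequence finder in each partition.

```python
def message_passing(partitions, tree_edges, links, goal, find_consequences):
    """Sketch of the MP loop: push consequences in the link languages toward the goal.

    partitions: dict partition_id -> set of formulas (e.g., clauses)
    tree_edges: list of (i, j) pairs forming a tree over partition ids
    links:      dict frozenset({i, j}) -> link language (set of shared symbols)
    goal:       id of the partition containing the query
    find_consequences: callable (formulas, language) -> consequences over `language`
    """
    # Direct every tree edge toward the goal partition.
    parent = {goal: None}
    stack = [goal]
    while stack:
        node = stack.pop()
        for i, j in tree_edges:
            if node in (i, j):
                child = j if node == i else i
                if child not in parent:
                    parent[child] = node
                    stack.append(child)

    def depth(p):
        d = 0
        while parent[p] is not None:
            p, d = parent[p], d + 1
        return d

    # Process leaves first, so each partition receives all incoming messages before it sends.
    for p in sorted(parent, key=depth, reverse=True):
        if parent[p] is None:
            continue  # the goal partition only receives
        language = links[frozenset({p, parent[p]})]
        messages = find_consequences(partitions[p], language)
        partitions[parent[p]] = partitions[parent[p]] | set(messages)

    return partitions[goal]
```

find_consequences stands in for whatever local reasoner each partition uses; the overall procedure is sound and complete as long as each local reasoner is sound and complete for consequence finding in its outgoing link language.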

Another Example
Message Passing: Espresso machine
SAT via partitioning

Benefits of Message-Passing
–Search space is restricted
–Allows parallel processing
–Sound and complete
–Can use different reasoners for each partition
–Small links imply short proofs
–Small partitions imply short proofs

High-Level Structure in First-Order Logic
Has(key(x)) ∧ ¬locked(x) → can_open(x)
can_open(x) ∧ open(x) → opened(x)
opened(x) ∧ fetch(y,x) ∧ in(y,x) → has(y)
Has(key(closet)) ∨ opened(closet)
open(closet) ∨ opened(closet)
In(broom,closet)
fetch(broom,closet)
Has(broom) ∧ dry(broom) → can_clean(x)
can_clean(x) → cleaned(x)
Has(y) ∧ let_dry(y) → dry(y)
Has(time) → let_dry(y)
Has(drier) → let_dry(y)
Has(time) ∨ Has(drier)
Has(broom)

Structured First-Order Reasoning
Craig's interpolation theorem (First-Order Logic):
–If A ⊢ B, then there is a formula C including only symbols from L(A) ∩ L(B) such that A ⊢ C and C ⊢ B
(Figure: example with goal clean(room); message: Has(broom))

Structured First-Order Reasoning
Craig's interpolation theorem (First-Order Logic):
–If A ⊢ B, then there is a formula C including only symbols from L(A) ∩ L(B) such that A ⊢ C and C ⊢ B
(Figure: example with goal clean(room); messages in the link language: Has(broom), ∃x Has(x))

Contents
1. We can partition reasoning while not hurting soundness and completeness
2. How to partition a KB with the best computational benefit, still maintaining soundness & completeness
3. Applications: Planning

Automatic Decomposition of a Theory
key ∧ ¬locked → can_open
can_open ∧ open → opened
opened ∧ fetch → broom
key ∨ opened
open ∨ opened ∨ broom ∨ fetch
broom ∧ dry → can_clean
can_clean → cleaned
broom ∧ let_dry → dry
time → let_dry
drier → let_dry
time ∨ drier

(Figure: symbol graph of the example theory, with nodes key, locked, can_open, open, opened, fetch, broom, dry, can_clean, cleaned, let_dry, time, drier)

Automatic Decomposition of a Theory
(Figure: symbol graph with nodes key, locked, can_open, open, opened, fetch, broom, dry, can_clean, cleaned, let_dry, time, drier)

Automatic Decomposition of a Theory
(Figure: the symbol graph split into two partitions; the cut separates them at the shared symbol broom)

Automatic Decomposition of a Theory
(Figure: the partitioned symbol graph; the symbol broom labels the link and appears in both partitions)

Automatic Decomposition of a Theory
key ∧ ¬locked → can_open
can_open ∧ open → opened
opened ∧ fetch → broom
key ∨ opened
open ∨ opened ∨ broom ∨ fetch
broom ∧ dry → can_clean
can_clean → cleaned
broom ∧ let_dry → dry
time → let_dry
drier → let_dry
time ∨ drier
broom

Automatic Partitioning
Begin with a KB in PL or FOL
Construct symbol graph
–Edges join symbols which appear together in an axiom
Find a tree decomposition of low width
–Roughly, generalizes balanced vertex cut
Partition axioms correspondingly
–Each partition has its own vocabulary
–Edge labels defined by shared vocabulary
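
A minimal sketch (not from the slides) of the first step, constructing the symbol graph; representing each axiom simply by the set of symbols it mentions is an assumption made for illustration.

```python
from itertools import combinations

def symbol_graph(axioms):
    """Build the symbol graph of a theory.

    axioms: iterable of axioms, each given as the set of symbols it mentions
            (e.g., {"key", "locked", "can_open"}).
    Returns (vertices, edges); edges join symbols that co-occur in some axiom.
    """
    vertices, edges = set(), set()
    for symbols in axioms:
        vertices |= set(symbols)
        for u, v in combinations(sorted(symbols), 2):
            edges.add(frozenset((u, v)))
    return vertices, edges

# A few axioms of the lecture's example theory, reduced to symbol co-occurrence:
theory = [
    {"key", "locked", "can_open"},
    {"can_open", "open", "opened"},
    {"opened", "fetch", "broom"},
    {"broom", "dry", "can_clean"},
    {"can_clean", "cleaned"},
    {"broom", "let_dry", "dry"},
    {"time", "let_dry"},
    {"drier", "let_dry"},
    {"time", "drier"},
]
V, E = symbol_graph(theory)
```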

Automatic Partitioning
Find a tree decomposition of minimum width:
–A tree in which each node corresponds to a set of vertices from the original graph
–Every vertex of the graph appears in some tree node, and every edge of the graph has both endpoints together in some tree node
–The tree satisfies the running intersection property: if v appears in two nodes in the tree, then v appears in all the nodes on the path connecting them
–The width of the tree is the size of its largest node (minus one, by the standard convention)
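
To make these conditions concrete, here is a small checking sketch (again not from the slides); encoding bags as sets and the tree as an edge list is an illustrative assumption.

```python
def is_tree_decomposition(graph_vertices, graph_edges, bags, tree_edges):
    """Check the tree-decomposition conditions for a candidate decomposition.

    bags:       dict node_id -> set of graph vertices (the "bag" of that tree node)
    tree_edges: list of (node_id, node_id) pairs; assumed to form a tree
    """
    # 1. Every graph vertex is covered by some bag.
    if not all(any(v in bag for bag in bags.values()) for v in graph_vertices):
        return False
    # 2. Every graph edge is contained in some bag.
    if not all(any(set(e) <= bag for bag in bags.values()) for e in graph_edges):
        return False
    # 3. Running intersection: the bags containing v form a connected subtree.
    for v in graph_vertices:
        nodes = {n for n, bag in bags.items() if v in bag}
        seen, frontier = set(), [next(iter(nodes))]
        while frontier:
            n = frontier.pop()
            if n in seen:
                continue
            seen.add(n)
            for a, b in tree_edges:
                if n in (a, b):
                    other = b if n == a else a
                    if other in nodes and other not in seen:
                        frontier.append(other)
        if seen != nodes:
            return False
    return True

def width(bags):
    """Width of a decomposition: largest bag size minus one (standard convention)."""
    return max(len(bag) for bag in bags.values()) - 1
```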

Why Tree Decomposition?
Example: BREAK-CYCLES

Automatic Partitioning
Treewidth: [Robertson & Seymour '86], …
Approximation algorithms:
–General theories: [A. & McIlraith '00]
–O(log(OPT))-approximation for general graphs: [A. '01]
–Constant-factor approximation for planar graphs: [Seymour & Thomas '94], [A., Krauthgamer & Rao '03]

Automatic Partitioning: Heuristics
Heuristic: min-degree
1. Given a graph G; list L, initially empty
2. Add to L a node v with the minimum number of neighbors
3. Make a clique from v's neighbors
4. Remove v from G
5. If G is empty, return L
6. Go to 2

Automatic Partitioning: Heuristics
Heuristic: min-fill
1. Given a graph G; list L, initially empty
2. Add to L a node v with the minimum number of edges missing between its neighbors
3. Make a clique from v's neighbors
4. Remove v from G
5. If G is empty, return L
6. Go to 2
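
The two heuristics on this and the previous slide differ only in how the next node is scored. A rough Python sketch of the shared greedy elimination loop (not from the slides) might look as follows; the graph encoding is an assumption, and the returned induced width (largest clique formed, minus one) upper-bounds the treewidth.

```python
from itertools import combinations

def eliminate(graph, score):
    """Greedy elimination ordering (covers both min-degree and min-fill).

    graph: dict vertex -> set of neighbors (modified destructively)
    score: function (graph, v) -> cost of eliminating v next
    Returns (order, induced_width); induced_width upper-bounds the treewidth.
    """
    order, induced_width = [], 0
    while graph:
        v = min(graph, key=lambda u: score(graph, u))       # pick the cheapest node
        neighbors = graph[v]
        induced_width = max(induced_width, len(neighbors))  # bag = v plus its neighbors
        for a, b in combinations(neighbors, 2):             # make a clique from v's neighbors
            graph[a].add(b)
            graph[b].add(a)
        for u in neighbors:                                 # remove v from the graph
            graph[u].discard(v)
        del graph[v]
        order.append(v)
    return order, induced_width

def min_degree(graph, v):
    return len(graph[v])

def min_fill(graph, v):
    nbrs = graph[v]
    return sum(1 for a, b in combinations(nbrs, 2) if b not in graph[a])

# Hypothetical usage on a copy of a symbol graph given as an adjacency dict:
# order, w = eliminate({u: set(vs) for u, vs in adjacency.items()}, min_fill)
```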

Summary: Characteristics of MP
–Reasoning is performed locally in each partition
–Specialized reasoning procedures in every partition
–Globally sound & complete … provided each local reasoner is sound & complete for Li-consequence finding
–Performance is worst-case exponential within partitions, but linear in tree structure
–Minimizes between-partition deduction
–Focuses within-partition deduction
–Supports parallel processing
–Different reasoners in different partitions

Contents
1. We can partition reasoning while not hurting soundness and completeness
2. How to partition a KB with the best computational benefit, still maintaining soundness & completeness
3. Applications: Planning

Application: Planning
General-purpose planning problem:
–Given: domain features (fluents); action descriptions (effects, preconditions); initial state; goal condition
–Find: a sequence of actions that is guaranteed to achieve the goal starting from the initial state
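
Before the factored version on the next slides, a minimal non-factored planner helps fix the problem statement. The following Python sketch (not from the lecture) encodes actions as precondition/add/delete sets and searches breadth-first; the toy domain is loosely modeled on the lecture's closet-and-broom story and is purely illustrative.

```python
from collections import deque

# Each action: name -> (preconditions, add effects, delete effects), all sets of fluents.
# Toy domain, illustrative only.
ACTIONS = {
    "open_closet": ({"key"}, {"opened"}, set()),
    "fetch_broom": ({"opened"}, {"broom"}, set()),
    "let_dry":     ({"broom", "time"}, {"dry"}, set()),
    "clean":       ({"broom", "dry"}, {"cleaned"}, set()),
}

def plan(initial, goal, actions=ACTIONS):
    """Breadth-first search over states; returns a shortest action sequence or None."""
    start = frozenset(initial)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:
            return steps
        for name, (pre, add, delete) in actions.items():
            if pre <= state:
                nxt = frozenset((state - delete) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None

print(plan({"key", "time"}, {"cleaned"}))
# -> ['open_closet', 'fetch_broom', 'let_dry', 'clean']
```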

Application: Planning with partitions
PartPlan Algorithm
Start with a tree-structured partition graph
Identify goal partition
Direct edges toward goal
In each partition:
–Generate all plans possible with depth d and width k
–Pass messages toward goal

Planning with partitions
PartPlan Algorithm
Start with a tree-structured partition graph
Identify goal partition
Direct edges toward goal
In each partition:
–Generate all plans possible with depth d and width k: "if you give me a block, I can return it to you painted", "if you give me a block, let me do a few things, and then give me another block, then I can return the two painted and glued together."
–Pass messages toward goal: all preconditions/effects for which there are feasible action sequences

Factored Planning: Analysis
–Planner is sound and complete
–Running time for finding plans of width w with m partitions of treewidth k is O(m · w · 2^(2w+2k))
–Factoring can be done in polynomial time
–Goal can be distributed over partitions by adding at most 2 features per partition

Next Time
Probabilistic graphical models:
–Directed models: Bayesian networks
–Undirected models: Markov random fields
Requires prior knowledge of:
–Treewidth and graph algorithms
–Probability theory