1 Directional consistency Chapter 4 ICS-275 Spring 2007

ICS Constraint Networks 2 Backtrack-free search, or: what level of consistency guarantees global consistency?

Spring 2007 ICS Constraint Networks 3 Directional arc-consistency: another restriction on propagation. D4={white,blue,black}, D3={red,white,blue}, D2={green,white,black}, D1={red,white,black}. Constraints: x1=x2, x1=x3, x3=x4.

Spring 2007 ICS Constraint Networks 4 Directional arc-consistency: another restriction on propagation. D4={white,blue,black}, D3={red,white,blue}, D2={green,white,black}, D1={red,white,black}. Constraints: x1=x2, x1=x3, x3=x4.

Spring 2007 ICS Constraint Networks 5 Directional arc-consistency: another restriction on propagation. D4={white,blue,black}, D3={red,white,blue}, D2={green,white,black}, D1={red,white,black}. Constraints: x1=x2, x1=x3, x3=x4. After DAC: D1={white}, D2={green,white,black}, D3={white,blue}, D4={white,blue,black}.

Spring 2007 ICS Constraint Networks 6 Algorithm for directional arc-consistency (DAC). Complexity: O(e·k²), where e is the number of binary constraints and k bounds the domain size.
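
To make the procedure concrete, here is a minimal Python sketch of DAC on the slides' coloring example. It is not the slide's pseudocode; the data layout (domains as sets, binary constraints as sets of allowed pairs keyed by ordered variable indices) is an assumption for illustration.

```python
# Minimal sketch of directional arc-consistency (DAC), assuming variables are
# indexed 0..n-1 along the ordering d, domains[i] is a set of values, and
# constraints[(i, j)] with i < j is the set of allowed (value_i, value_j) pairs.

def revise(domains, constraints, i, j):
    """Remove values of x_i that have no support in D_j under R_ij."""
    allowed = constraints[(i, j)]
    domains[i] = {a for a in domains[i]
                  if any((a, b) in allowed for b in domains[j])}

def dac(domains, constraints):
    n = len(domains)
    for i in range(n - 1, -1, -1):        # from the last variable down to the first
        for j in range(n - 1, i, -1):     # only later variables constrain x_i
            if (i, j) in constraints:
                revise(domains, constraints, i, j)
    return domains

# The slides' coloring example: x1 = x2, x1 = x3, x3 = x4.
domains = [{"red", "white", "black"},      # D1
           {"green", "white", "black"},    # D2
           {"red", "white", "blue"},       # D3
           {"white", "blue", "black"}]     # D4
eq = lambda di, dj: {(v, v) for v in di & dj}
constraints = {(0, 1): eq(domains[0], domains[1]),
               (0, 2): eq(domains[0], domains[2]),
               (2, 3): eq(domains[2], domains[3])}
dac(domains, constraints)                  # D1 -> {white}, D3 -> {white, blue}
```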

Spring 2007 ICS Constraint Networks 7 Example of DAC

Spring 2007 ICS Constraint Networks 8 Is DAC sufficient?

Spring 2007 ICS Constraint Networks 9 Directional arc-consistency may not be enough → directional path-consistency.

Spring 2007 ICS Constraint Networks 10 Directional path-consistency

Spring 2007 ICS Constraint Networks 11 Algorithm directional path consistency (DPC)
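
A compact sketch of DPC under the same illustrative representation as above (domains as sets, recorded binary relations keyed by ordered index pairs). This is one way to realize the algorithm, not the slide's pseudocode.

```python
from itertools import product

# Sketch of directional path-consistency (DPC). dom[i] is a set of values;
# rel[(i, j)] with i < j is the set of allowed (value_i, value_j) pairs for a
# recorded constraint; variable pairs with no entry are unconstrained.

def dpc(dom, rel):
    n = len(dom)
    for j in range(n - 1, 0, -1):
        earlier = [i for i in range(j) if (i, j) in rel]
        # arc-consistency step: tighten the domains of x_j's earlier neighbors
        for i in earlier:
            dom[i] = {a for a in dom[i]
                      if any((a, b) in rel[(i, j)] for b in dom[j])}
        # path-consistency step: tighten (or record) a constraint between every
        # pair of earlier neighbors of x_j, connecting them in the graph
        for s, i in enumerate(earlier):
            for k in earlier[s + 1:]:
                old = rel.get((i, k), set(product(dom[i], dom[k])))
                rel[(i, k)] = {(a, c) for (a, c) in old
                               if any((a, b) in rel[(i, j)] and (c, b) in rel[(k, j)]
                                      for b in dom[j])}
    return dom, rel
```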

Spring 2007 ICS Constraint Networks 12 Example of DPC

Spring 2007 ICS Constraint Networks 13 Induced-width

Spring 2007 ICS Constraint Networks 14 Directional i-consistency

Spring 2007 ICS Constraint Networks 15 Example

Spring 2007 ICS Constraint Networks 16 Algorithm directional i-consistency

Spring 2007 ICS Constraint Networks 17 The induced width. DPC recursively connects the parents of each node in the ordered graph, yielding the induced graph. Width along ordering d, w(d): the maximum number of parents (earlier neighbors) of any node. Induced width w*(d): the width of the ordered induced graph. Induced width w*: the smallest induced width over all orderings. Finding w* is NP-complete (Arnborg, 1985), but greedy heuristics (e.g., min-fill) work well in practice.

Spring 2007 ICS Constraint Networks 18 Induced-width

Spring 2007 ICS Constraint Networks 19 Examples. Three orderings: d1 = (F,E,D,C,B,A), its reverse d2 = (A,B,C,D,E,F), and d3 = (F,D,C,B,A,E). Orderings are listed from bottom to top. The parents of A along d1 are {B,C,E}. The width of A along d1 is 3, of C along d1 is 1, and of A along d3 is 2. w(d1) = 3, w(d2) = 2, and w(d3) = 2. The width of graph G is 2.
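
The width and induced-width calculations used in these examples can be computed directly from an ordering. The following sketch mirrors the definitions above; the edge-list and adjacency-set representation is assumed for illustration.

```python
# Sketch: width and induced width of a graph along a given ordering.
# edges is an iterable of 2-element tuples; order lists variables first to last.

def width_and_induced_width(edges, order):
    pos = {v: i for i, v in enumerate(order)}
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    # width: the maximum number of earlier neighbors (parents) of any node
    width = max(sum(1 for u in adj[v] if pos[u] < pos[v]) for v in order)
    # induced graph: process nodes from last to first, connecting their parents
    induced = {v: set(nbrs) for v, nbrs in adj.items()}
    induced_width = 0
    for v in reversed(order):
        parents = [u for u in induced[v] if pos[u] < pos[v]]
        induced_width = max(induced_width, len(parents))
        for a in parents:
            for b in parents:
                if a != b:
                    induced[a].add(b)
                    induced[b].add(a)
    return width, induced_width
```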

Spring 2007 ICS Constraint Networks 20 Induced width and DPC. The induced graph of (G,d) is denoted (G*,d). The induced graph (G*,d) contains the graph generated by DPC along d, as well as the graph generated by directional i-consistency along d.

Spring 2007 ICS Constraint Networks 21 Refined complexity using induced width. Consequently we want an ordering with minimal induced width. Induced width equals tree-width (to be defined later). Finding a minimum induced-width ordering is NP-complete.

Spring 2007 ICS Constraint Networks 22 Greedy algorithms for induced width: min-width ordering, max-cardinality ordering, min-fill ordering, chordal graphs.

Spring 2007 ICS Constraint Networks 23 Min-width ordering

Spring 2007 ICS Constraint Networks 24 Min-induced-width

Spring 2007 ICS Constraint Networks 25 Min-fill algorithm: prefer the node that adds the smallest number of fill-in arcs. Empirically, min-fill is the best among the greedy ordering algorithms (MW, MIW, MF, MC).
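
As an illustration (not the book's pseudocode), a greedy min-fill ordering might be sketched as follows; the adjacency-dict graph representation is assumed.

```python
# Sketch of the greedy min-fill ordering heuristic: repeatedly pick the node
# whose elimination adds the fewest fill-in edges among its remaining neighbors.
# adj maps each variable to the set of its neighbors.

def min_fill_ordering(adj):
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    elimination_order = []
    while adj:
        def fill_cost(v):
            nbrs = list(adj[v])
            return sum(1 for x in range(len(nbrs)) for y in range(x + 1, len(nbrs))
                       if nbrs[y] not in adj[nbrs[x]])
        v = min(adj, key=fill_cost)
        for a in adj[v]:                               # connect v's neighbors
            for b in adj[v]:
                if a != b:
                    adj[a].add(b)
        for a in adj[v]:                               # remove v from the graph
            adj[a].discard(v)
        del adj[v]
        elimination_order.append(v)
    # the node eliminated first is placed last in the processing ordering d
    return list(reversed(elimination_order))
```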

Spring 2007 ICS Constraint Networks 26 Chordal graphs and max-cardinality ordering. A graph is chordal if every cycle of length at least 4 has a chord. Finding w* over a chordal graph is easy using the max-cardinality ordering. If G* is an induced graph, it is chordal. k-trees are special chordal graphs. Finding the max clique in a chordal graph is easy (just enumerate the cliques induced by a max-cardinality ordering).

Spring 2007 ICS Constraint Networks 27 Example We see again that G in Figure 4.1(a) is not chordal since the parents of A are not connected in the max-cardinality ordering in Figure 4.1(d). If we connect B and C, the resulting induced graph is chordal.

Spring 2007 ICS Constraint Networks 28 Max-cardinality ordering. Figure 4.5: the max-cardinality (MC) ordering procedure.
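
A small sketch of the MC procedure and the chordality check it supports (an illustrative rendering, not Figure 4.5 verbatim; the adjacency-dict representation is assumed).

```python
# Sketch of the max-cardinality (MC) ordering: order nodes from first to last,
# always picking the node connected to the most already-ordered nodes.
# adj maps each variable to the set of its neighbors.

def max_cardinality_ordering(adj):
    order, ordered = [], set()
    remaining = set(adj)
    while remaining:
        v = max(remaining, key=lambda u: len(adj[u] & ordered))
        order.append(v)
        ordered.add(v)
        remaining.discard(v)
    return order

def is_chordal(adj, order):
    """A graph is chordal iff, along an MC ordering, every node's earlier
    neighbors (parents) form a clique, i.e., no fill-in edges are needed."""
    pos = {v: i for i, v in enumerate(order)}
    for v in order:
        parents = [u for u in adj[v] if pos[u] < pos[v]]
        for i in range(len(parents)):
            for j in range(i + 1, len(parents)):
                if parents[j] not in adj[parents[i]]:
                    return False
    return True
```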

Spring 2007 ICS Constraint Networks 29 Width vs local consistency: solving trees

Spring 2007 ICS Constraint Networks 30 Tree-solving

Spring 2007 ICS Constraint Networks 31 Width-2 and DPC

Spring 2007 ICS Constraint Networks 32 Width vs directional consistency (Freuder 82)

Spring 2007 ICS Constraint Networks 33 Width vs i-consistency. DAC and width-1, DPC and width-2, DIC_i and width-(i-1) → backtrack-free representation. If a problem has width 2, will DPC make it backtrack-free? Adaptive consistency: applies i-consistency where i is adapted to the number of parents.

Spring 2007 ICS Constraint Networks 34 Special case of induced-width 2

Spring 2007 ICS Constraint Networks 35 Adaptive-consistency

Spring 2007 ICS Constraint Networks 36 Bucket Elimination / Adaptive Consistency (Dechter & Pearl, 1987). Example (coloring with inequality constraints): Bucket E: E ≠ D, E ≠ C; Bucket D: D ≠ A; Bucket C: C ≠ B; Bucket B: B ≠ A; Bucket A. Processing the buckets from last to first records derived constraints (e.g., D = C, B = A) in lower buckets until a contradiction is detected in bucket A.

Spring 2007 ICS Constraint Networks 37 Bucket Elimination / Adaptive Consistency (Dechter & Pearl, 1987). (Figure: the ordered constraint graph over A, B, C, D, E and its buckets; processing each bucket records derived relations such as R_CBE, R_DCB, R_ACB, R_AB, and R_A in lower buckets.)

Spring 2007 ICS Constraint Networks 38 The idea of elimination. (Figure: eliminating E joins the constraints mentioning E and records the derived relation R_DBC, a constraint over the three remaining variables D, B, C.)

Spring 2007 ICS Constraint Networks 39 Adaptive-consistency, bucket-elimination

Spring 2007 ICS Constraint Networks 40 Properties of bucket elimination (adaptive consistency). Adaptive consistency generates a constraint network that is backtrack-free (can be solved without dead-ends). The time and space complexity of adaptive consistency along ordering d is O(n·k^(w*(d)+1)), or O(r·k^(w*+1)) where r is the number of constraints. Therefore, problems with bounded induced width are tractable (solved in polynomial time). Special cases: trees (w* = 1), series-parallel networks (w* = 2), and in general k-trees (w* = k).

Spring 2007 ICS Constraint Networks 41 Back to induced width. Finding a minimum-w* ordering is NP-complete (Arnborg, 1985). Greedy ordering heuristics: min-width, min-degree, max-cardinality (Bertele and Brioschi, 1972; Freuder, 1982), min-fill.

Spring 2007 ICS Constraint Networks 42 Solving Trees (Mackworth and Freuder, 1985) Adaptive consistency is linear for trees and equivalent to enforcing directional arc-consistency (recording only unary constraints)

Spring 2007 ICS Constraint Networks 43 Summary: directional i-consistency. (Figure: the same ordered graph over A, B, C, D, E processed by adaptive consistency, directional arc-consistency (d-arc), and directional path-consistency (d-path).)

Spring 2007 ICS Constraint Networks 44 Variable Elimination Eliminate variables one by one: “constraint propagation” Solution generation after elimination is backtrack-free

Spring 2007 ICS Constraint Networks 45 Relational consistency (Chapter 8): relational arc-consistency, relational path-consistency, relational m-consistency. Relational consistency for Boolean and linear constraints: unit resolution is relational arc-consistency; pairwise resolution is relational path-consistency.

Spring 2007 ICS Constraint Networks 46 Sudoku's propagation: what kind of propagation do we do?

Spring 2007 ICS Constraint Networks 47 Complexity of adaptive consistency as a function of the hypergraph. Processing a bucket is also exponential in its number of constraints. The number of constraints in a bucket is bounded by r, the total number of functions. Can we have a better bound? Theorem: if we combine buckets into super-buckets so that the variables in each super-bucket are covered by the scopes of original constraints, then super-bucket elimination is exponential in the maximum number of constraints in a super-bucket. Hyper-induced-width: the number of constraints in a super-bucket.

Spring 2007 ICS Constraint Networks 48 Example of a CSP: find possible genotypes. A pedigree typed at the ABO and AK1 loci. Possible haplotypes: h1 = (A, A1); h2 = (A, A2); h3 = (O, A1); h4 = (O, A2). Possible genotypes (ordered): gij = hi/hj (father/mother). Ind #1: g11, g13, g31. Ind #2: g44. Ind #3: g12, g21, g14, g41, g23, g32. Ind #4: g22, g24, g42. Ind #5: g34, …. (Pedigree figure: the typed data are A with A1/A1, O with A2/A2, A with A1/A2, A with A2/A2, and O with A1/A2 for individuals 1-5.) Constraints: R13 = {…, …}, and similarly for R23, R35, and R45.

Spring 2007 ICS Constraint Networks 49 Bucket elimination for a CSP. Given an order X1, …, Xn: distribute the constraints into buckets according to the highest variable in each constraint's scope. Process the buckets one by one, starting from Xn: join the constraints in the processed bucket (on equal values of the eliminated variable), remembering the values of the eliminated variable, and put the new constraint in the bucket of the highest variable in its scope.
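
A runnable sketch of this procedure on relational constraints. The representation (scopes as tuples of variable names, constraints as sets of value tuples) is assumed for illustration, and the backtrack-free forward pass that generates a solution is omitted.

```python
from itertools import product

# Sketch of bucket elimination for a CSP. A constraint is a pair
# (scope, tuples): scope is a tuple of variable names, tuples a set of allowed
# value combinations. order lists the variables from first to last.

def join_and_project_out(bucket, var, domains):
    """Join all constraints in a bucket and project out `var`."""
    scope = tuple(sorted({v for s, _ in bucket for v in s}))
    keep = tuple(v for v in scope if v != var)
    allowed = set()
    for values in product(*(domains[v] for v in scope)):
        asg = dict(zip(scope, values))
        if all(tuple(asg[v] for v in s) in tups for s, tups in bucket):
            allowed.add(tuple(asg[v] for v in keep))
    return keep, allowed

def bucket_elimination(constraints, domains, order):
    pos = {v: i for i, v in enumerate(order)}
    buckets = {v: [] for v in order}
    for scope, tups in constraints:                 # distribute by highest variable
        buckets[max(scope, key=pos.get)].append((scope, tups))
    for var in reversed(order):                     # process from X_n down to X_1
        if not buckets[var]:
            continue
        new_scope, new_tups = join_and_project_out(buckets[var], var, domains)
        if not new_tups:
            return None                             # empty constraint: inconsistent
        if new_scope:                               # record the derived constraint
            buckets[max(new_scope, key=pos.get)].append((new_scope, new_tups))
    return buckets                                  # directionally consistent network
```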

Spring 2007 ICS Constraint Networks 50 Example of a CSP: find possible genotypes (cont. 1). Constraints: R_AC = {…, …}; R_BC = {…}; R_CE = {<g_{14,41,23,32}, g_{34,43}>}; R_DE = {…}. Constraint graph over the variables A, B, C, D, E.

Spring 2007 ICS Constraint Networks 51 Find possible genotypes (cont. 2). Suppose the order A, B, C, D, E. The distribution into buckets B_A, B_B, B_C, B_D, B_E is as follows. Initially: R_AC = {…, …} and R_BC = {…} in B_C; R_CE = {…} and R_DE = {…} in B_E.

Spring 2007 ICS Constraint Networks 52 Find possible genotypes (cont. 3). Initially: R_AC = {…, …}, R_BC = {…}, R_CE = {…}, R_DE = {…}. When processing B_E: compute R_CD = {…}, put R_CD into bucket B_D, and remember the value g34 for E.

Spring 2007 ICS Constraint Networks 53 Find possible genotypes (cont. 4). After processing B_E: R_AC = {…, …}, R_BC = {…}, R_CD = {…}. When processing B_D: compute R_C = {<g_{23,32,14,41}>}, put R_C into bucket B_C, and remember the values g24, g42 for D.

Spring 2007 ICS Constraint Networks 54 Find possible genotypes (cont. 5). After processing B_D: R_AC = {…, …}, R_BC = {…}, R_C = {<g_{23,32,14,41}>}. When processing B_C: compute R_AB = {…}, put R_AB into bucket B_B, and remember the value g14 for C.

Spring 2007 ICS Constraint Networks 55 Find possible genotypes (cont. 6). After processing B_C: R_AB = {…}. When processing B_B: compute R_A = {…} and remember the value g44 for B.

Spring 2007 ICS Constraint Networks 56 Find possible genotypes (cont. 7). By backtracking we get the possible values: Ind #1 (A): g11, g13, g31; Ind #2 (B): g44; Ind #3 (C): g14; Ind #4 (D): g24, g42; Ind #5 (E): g34. (Pedigree figure: the corresponding genotypes are A|A, A|O or O|A with A1|A1 for individual 1; O|O with A2|A2 for individual 2; A|O with A1|A2 for individual 3; A|O or O|A with A2|A2 for individual 4; and O|O with A1|A2 for individual 5.)

Spring 2007 ICS Constraint Networks 57 Probability of the data at one locus given an inheritance vector. Model for locus 2 (variables S_23m, S_23f, L_21m, L_21f, L_22m, L_22f, L_23m, L_23f, X_21, X_22, X_23): P(x_21, x_22, x_23 | s_23m, s_23f) = Σ over l_21m, l_21f, l_22m, l_22f, l_23m, l_23f of P(l_21m) P(l_21f) P(l_22m) P(l_22f) P(x_21 | l_21m, l_21f) P(x_22 | l_22m, l_22f) P(x_23 | l_23m, l_23f) P(l_23m | l_21m, l_21f, s_23m) P(l_23f | l_22m, l_22f, s_23f). The five last terms are always zero or one, namely, indicator functions.

Spring 2007 ICS Constraint Networks 58 Efficient computation. Model for locus 2. Assume only individual 3 is genotyped (X_23 = {A1,A2}). For the inheritance vector (0,1), the founder alleles L_21m and L_22f are not restricted by the data, while (L_21f, L_22m) have only two possible joint assignments, (A1,A2) or (A2,A1): p(x_21, x_22, x_23 | s_23m = 1, s_23f = 0) = p(A1)p(A2) + p(A2)p(A1). In general, every inheritance vector defines a subgraph of the Bayesian network above, from which we build a founder graph.

Spring 2007 ICS Constraint Networks 59 Efficient computation. Model for locus 2. In general, every inheritance vector defines a subgraph (indicated by the black lines in the figure). Construct a founder graph whose vertices are the founder variables (L_21m, L_21f, L_22m, L_22f) and where there is an edge between two vertices if they have a common typed descendant; the label of an edge is the constraint dictated by the common typed descendant (here {A1,A2}). Now find all consistent assignments for every connected component.

Spring 2007 ICS Constraint Networks 60 A larger example. (Figure: a descent graph with typed genotypes a,b; a,c; b,d and the resulting founder graph, an example of a constraint satisfaction graph, with edge labels {a,b}, {a,c}, and {b,d}.) Connect two nodes if they have a common typed descendant. The constraint {a,b} means the relation {(a,b),(b,a)}.

Spring 2007 ICS Constraint Networks 61 The constraint satisfaction problem. (Founder graph with edge labels {a,b}, {b,d}, {a,c}.) The number of possible consistent alleles per non-isolated node is 0, 1, or 2. For example, node 2 retains all its possible alleles, node 6 can only be b because its value must lie in both {a,b} and {b,d}, and node 3 can be assigned either a or b; in each case the candidates are the intersection of the labels of the node's incident edges. For each non-singleton connected component: start with an arbitrary node and pick one of its values; this dictates all other values in the component; repeat with the other value if there is one. So each non-singleton component yields at most two solutions. What is the special constraint problem here?

Spring 2007 ICS Constraint Networks 62 Solution of the CSP. Since each non-singleton component yields at most two solutions, the likelihood is simply a product of sums, each sum having at most two terms; each component contributes one factor, and singleton components contribute the factor 1. In our example: 1 * [p(a)p(b)p(a) + p(b)p(a)p(b)] * p(d)p(b)p(a)p(c). Complexity: building the founder graph is O(f^2 + n). While solving general CSPs is NP-hard, this is essentially graph coloring where the domains are often of size 2.
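
A sketch of this component-by-component computation. The founder-graph representation (edge labels as one- or two-allele sets, allele frequencies in a dict) is an illustrative assumption, not the slides' notation.

```python
# Sketch: likelihood from a founder graph, using the fact that each
# non-singleton connected component admits at most two consistent assignments.
# nodes: set of founder-allele nodes; edges: dict mapping frozenset({u, v}) to
# its label (a set of one or two alleles); freq: allele -> population frequency.

def founder_graph_likelihood(nodes, edges, freq):
    adj = {v: [] for v in nodes}
    for e, label in edges.items():
        u, v = tuple(e)
        adj[u].append((v, label))
        adj[v].append((u, label))

    likelihood, seen = 1.0, set()
    for start in nodes:
        if start in seen:
            continue
        comp, stack = {start}, [start]                 # collect the component
        while stack:
            x = stack.pop()
            for y, _ in adj[x]:
                if y not in comp:
                    comp.add(y)
                    stack.append(y)
        seen |= comp
        if len(comp) == 1:
            continue                                   # untyped singleton contributes 1
        component_sum = 0.0
        for a in adj[start][0][1]:                     # at most two seed values
            assignment, ok, stack = {start: a}, True, [start]
            while stack and ok:
                x = stack.pop()
                for y, label in adj[x]:
                    if assignment[x] not in label:
                        ok = False
                        break
                    others = label - {assignment[x]} or {assignment[x]}
                    forced = next(iter(others))        # the value dictated for y
                    if y in assignment:
                        ok = ok and assignment[y] == forced
                    else:
                        assignment[y] = forced
                        stack.append(y)
            if ok:                                     # one consistent assignment
                term = 1.0
                for allele in assignment.values():
                    term *= freq[allele]
                component_sum += term
        likelihood *= component_sum
    return likelihood
```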