Soft constraint processing
Thomas Schiex, INRA – Toulouse, France
Javier Larrosa, UPC – Barcelona, Spain
Thanks to the school organizers: Francesca, Michela, Kryzstof…
These slides are provided as a teaching support for the community. They can be freely modified and used as long as the contribution of the original authors (T. Schiex and J. Larrosa) is clearly mentioned and visible, and any modification is acknowledged by its author.

September 2005, First international summer school on constraint processing

Overview
- Introduction and definitions: why soft constraints and dedicated algorithms? Generic and specific models
- Handling soft constraint problems: fundamental operations on soft CNs; solving by search (systematic and local); complete and incomplete inference; hybrid (search + inference); polynomial classes
- Resources

Why soft constraints?
- The CSP framework is for decision problems, but many problems are overconstrained or optimization problems
- Economics (combinatorial auctions): given a set G of goods and a set B of bids, each bid (Bi, Vi) with requested goods Bi and value Vi, find the best subset of compatible bids
- Best = maximize revenue (sum)

Why soft constraints?
- Satellite scheduling: Spot 5 is an earth observation satellite with 3 on-board cameras
- Given a set of requested pictures (of different importance) and resources (data-bus bandwidth, setup times, orbiting), select a subset of compatible pictures with maximum importance (sum)

Why soft constraints?
- Probabilistic inference (Bayesian nets): given a probability distribution defined by a DAG of conditional probability tables, and some evidence, find the most probable explanation for the evidence (product)

Why soft constraints?
- Resource allocation (frequency assignment): given a telecommunication network, find the best frequency for each communication link while avoiding interferences
- Best can be: minimize the maximum frequency (max), or minimize the global interference (sum)

Why soft constraints?
- Even in decision problems: the problem may be infeasible, and users may have preferences among solutions
- This happens in most real problems. Experiment: give users a few solutions and they will find reasons to prefer some of them.

Notations and definitions
- X = {x1, …, xn}: n variables
- D = {D1, …, Dn}: finite domains (maximum size d)
- For Y ⊆ X, ℓ(Y) = ∏ xi∊Y Di; a tuple t ∊ ℓ(Y) is noted tY
- For Z ⊆ Y, tY[Z] is the projection of tY on Z; tY[-xi] = tY[Y−{xi}]
- A relation R ⊆ ℓ(Y) with scope Y is denoted RY
- Projection is extended to relations
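The tuple notation above can be made concrete with a minimal sketch: a tuple tY over a scope Y is represented as a dict mapping variable names to values (all names here are illustrative, not from the slides).

```python
# A tuple t_Y over scope Y, represented as {variable: value}.

def projection(t, Z):
    """t_Y[Z]: restrict the assignment t to the variables in Z."""
    return {x: v for x, v in t.items() if x in Z}

def drop(t, xi):
    """t_Y[-x_i]: projection on Y - {x_i}."""
    return {x: v for x, v in t.items() if x != xi}

t = {"x1": "b", "x2": "g", "x3": "r"}   # a tuple over Y = {x1, x2, x3}
print(projection(t, {"x1", "x3"}))      # x2 is projected away
print(drop(t, "x2"))
```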

Generic and specific models: valued CN, semiring CN, Soft as Hard

Combined local preferences
- Constraints are local cost functions
- Costs combine with a dedicated operator ⊕: max (priorities), + (additive costs), × (factorized probabilities)…
- Goal: find an assignment with the "best" combined cost
- Examples: fuzzy/possibilistic CN, weighted CN, probabilistic CN, BN…

Soft constraint network (CN): (X, D, C)
- X = {x1, …, xn}: variables
- D = {D1, …, Dn}: finite domains
- C = {f, …}: e cost functions fS, fij, fi, f∅ with scopes S, {xi, xj}, {xi}, ∅
- fS(t): ℓ(S) → E, with E ordered by ≼, minimum ⊥ and maximum ⊤
- ⊕ combines costs: commutative, associative, monotonic; ⊥ is its identity, ⊤ its annihilator
- Objective function: F(X) = ⊕ fS(X[S])
- Solution: t such that F(t) ≺ ⊤
- Soft CN: find a solution of minimal cost

Specific frameworks (others: lexicographic CN, probabilistic CN…)

Instance       | E     | ⊕   | ≼
Classic CN     | {t,f} | and | t ≼ f
Possibilistic  | [0,1] | max | 0 ≼ 1
Fuzzy CN       | [0,1] | min | 1 ≼ 0
Weighted CN    | [0,k] | +   | 0 ≼ k
Bayes net      | [0,1] | ×   | 1 ≼ 0

Weighted CSP example (⊕ = +)
- Graph coloring over x1…x5; F(X) = number of non-blue vertices
- For each vertex, a unary cost function f(xi): b→0, g→1, r→1
- For each edge, a binary cost function f(xi, xj): bb→⊤, bg→0, br→0, gb→0, gg→⊤, gr→0, rb→0, rg→0, rr→⊤
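A runnable sketch of this weighted CSP: minimize the number of non-blue vertices in a proper coloring. The edge set of the slide's graph is not given in the transcript, so the `edges` list below is an assumption for illustration.

```python
# Weighted CSP example, bounded sum of costs (illustrative encoding).
TOP = 6  # maximum cost; any combined cost >= TOP counts as a violation

edges = [("x1", "x2"), ("x1", "x3"), ("x2", "x3"), ("x3", "x4"), ("x4", "x5")]  # assumed graph

def unary(value):
    """f(xi): non-blue vertices cost 1."""
    return 0 if value == "b" else 1

def binary(v, w):
    """f(xi, xj): equal colors on an edge are forbidden."""
    return TOP if v == w else 0

def F(assignment):
    """Combined cost F(X): bounded sum of all cost functions."""
    cost = sum(unary(v) for v in assignment.values())
    cost += sum(binary(assignment[i], assignment[j]) for i, j in edges)
    return min(cost, TOP)

print(F({"x1": "b", "x2": "g", "x3": "r", "x4": "b", "x5": "g"}))  # 3 non-blue vertices
```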

Possibilistic constraint network (⊕ = max)
- Same graph; F(X) = highest color used (b < g < r < y)
- For each vertex, a unary cost function f(xi): b→0.2, g→0.4, r→0.6, y→0.8
- Connected vertices must have different colors

Some important details
- ⊤ = maximum violation. It can be set to a bounded maximum violation k; a solution is then a tuple t with F(t) < k = Top.
- The empty-scope soft constraint f∅ (a constant) gives an obvious lower bound on the optimum. If you do not like it, set f∅ = ⊥. It brings additional expressive power.

Weighted CSP example (⊕ = +), continued
- Same cost functions: c(xi): b→0, g→1, r→1; c(xi, xj): ⊤ on equal colors, 0 otherwise
- With ⊤ = 6 and f∅ = 0: plain optimization
- With ⊤ = 3: optimal coloring with fewer than 3 non-blue vertices

Microstructure graph (binary network, figure): values b, g, r for x1…x5; ⊤ = 6, f∅ = 0.

Valued CSP and semiring CSP
- Valued CN: total order
- Semiring CN: order specified by the ACI operator + of the c-semiring (a + b = b means b preferred); lattice order

Examples of partially ordered structures
- Multiple criteria E1, E2: (v1, v2) ≼ (w1, w2) iff v1 ≼ w1 and v2 ≼ w2. If E1, E2 are c-semirings then E1 × E2 is too. If each Ei is totally ordered, then multiple criteria = multiple valued CNs.
- Set lattice order: the criterion is the set of satisfied constraints, ⊕ = ∩, + = ∪ (order = ⊇); c-semiring only.

Soft constraints as hard constraints
- One extra variable xS per cost function fS, all with domain E
- fS becomes a hard constraint cS∪{xS} allowing (t, fS(t)) for all t ∊ ℓ(S)
- One criterion variable xC = ⊕ xS (a global constraint)

Soft as Hard (SaH)
- The criterion is represented as a variable; multiple criteria = multiple variables
- Constraints can be stated on and between criteria
- Can also be used inside a soft CN
- But: extra variables (and domains), increased arities
- SaH constraints give weak propagation

A (more) general picture (diagram): semiring CNs (lattice orders, set criteria) contain valued CNs (total orders), which cover weighted, probabilistic and lexicographic CNs; with idempotent ⊕ (a ⊕ a = a): fuzzy, possibilistic and classic CNs; related frameworks include HCLP, Partial CSP, general fuzzy CNs, Bayes nets and Soft as Hard encodings.

Cost function implication
- fP is implied by gQ (fP ≼ gQ, gQ tighter than fP) iff for all t ∊ ℓ(P∪Q), fP(t[P]) ≼ gQ(t[Q])
- Example (⊤ = 6): g(xi, xj) with bb→2, bg→3, br→2, gb→4, gg→5, gr→4, rb→⊤, rg→⊤, rr→⊤ implies f(xi): b→1, g→2, r→3; the projection g[xi]: b→2, g→4, r→⊤ is the strongest implied unary function (strong implication)
- ⊤ = 1 (classic case): implied = redundant

Idempotent soft CN: a ⊕ a = a (for any a)
- For any fS implied by the CN (X, D, C): (X, D, C) ≡ (X, D, C∪{fS}), where ≡ means ≼ and ≽
- Idempotent ⊕: implied = redundant (good)
- But the only idempotent ⊕ on a total order is max (bad)

Handling soft CNs, fundamental operations: assignment, combination/join, projection

Assignment (conditioning)
- f[xi = b] restricts a binary f(xi, xj) to a unary function on xj: with f: bb→⊤=6, bg→0, br→0, gb→0, …, f[xi = b] gives g(xj): b→⊤=6, g→0, r→0
- Assigning again, g[xj = r] gives the constant 0

Combination (join with ⊕, here +)
- f(xi, xj) ⊕ g(xj, xk) is the cost function h(xi, xj, xk) = f(xi, xj) + g(xj, xk)
- Example: with f: bb→6, bg→0, gb→0, gg→6 and the same table for g: h(bbb)=12, h(bbg)=6, h(bgb)=0, h(bgg)=6, h(gbb)=6, h(gbg)=0, h(ggb)=6, h(ggg)=12

Projection (elimination)
- f[xi] projects f(xi, xj) onto xi by minimizing over xj: with f: bb→4, bg→6, br→0, gb→2, gg→6, gr→3, rb→1, rg→0, rr→6, f[xi] gives g(xi): b→0, g→2, r→0
- Projecting further, g[∅] = 0 (min)
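The three fundamental operations can be sketched on cost functions stored as `{tuple_of_values: cost}` tables (a minimal sketch with illustrative names; ⊤ handling is omitted for brevity):

```python
def assign(f, pos, value):
    """Conditioning: f[x = value], dropping the argument at position pos."""
    return {t[:pos] + t[pos + 1:]: c for t, c in f.items() if t[pos] == value}

def combine(f, g):
    """Join with + of a binary f(xi, xj) and a binary g(xj, xk)."""
    return {(i, j, k): f[(i, j)] + g[(j2, k)]
            for (i, j) in f for (j2, k) in g if j2 == j}

def project(f, pos):
    """Eliminate the argument at position pos by minimization."""
    out = {}
    for t, c in f.items():
        key = t[:pos] + t[pos + 1:]
        out[key] = min(out.get(key, float("inf")), c)
    return out

f = {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6}
h = combine(f, f)
print(h[("b", "b", "b")])   # 6 + 6, as in the join example of the slide
print(project(f, 1))        # minimize over the second argument
```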

Pure systematic search: branch and bound(s). A pause?

Systematic search
- Each node is a soft constraint subproblem
- LB (lower bound): an underestimation of the best solution in the subtree, e.g. f∅
- UB (upper bound): the best solution found so far (initially ⊤)
- If LB ≥ UB then prune

Assigning x3 (figure): each value of x3 yields a subproblem over the remaining variables x1, x2, x4, x5.

Assign g to x0 (figure): ⊤ = 6; after the assignment, values whose cost reaches ⊤ are pruned and search backtracks.

Depth-First Search (DFS)

BT(X, D, C)
  if X = ∅ then Top := f∅
  else
    xj := selectVar(X)                          // variable heuristics
    forall a ∊ Dj do                            // value heuristics: good UB ASAP
      forall fS ∊ C s.t. xj ∊ S: fS := fS[xj = a]
      f∅ := ⊕ { gS ∊ C s.t. S = ∅ }             // improves the LB
      if LB < Top then BT(X − {xj}, D − {Dj}, C)
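The BT scheme above can be sketched as a runnable branch and bound, specialized to the non-blue-minimizing coloring example (the path graph below is an assumption for illustration; the LB used is just the cost of the fully assigned part, as in the trivial bound of the slide):

```python
TOP = 6
EDGES = [("x1", "x2"), ("x2", "x3"), ("x3", "x4")]   # assumed graph
VARS = ["x1", "x2", "x3", "x4"]
COLORS = ["b", "g", "r"]

def cost(assignment):
    """Cost of all completely assigned cost functions (the current f_empty)."""
    c = sum(1 for v in assignment.values() if v != "b")
    for i, j in EDGES:
        if i in assignment and j in assignment and assignment[i] == assignment[j]:
            return TOP
    return c

def bt(unassigned, assignment, top):
    if not unassigned:
        return cost(assignment)            # leaf: a new upper bound
    xj, rest = unassigned[0], unassigned[1:]
    for a in COLORS:
        assignment[xj] = a
        if cost(assignment) < top:         # LB < Top: explore the subtree
            top = min(top, bt(rest, assignment, top))
        del assignment[xj]
    return top

print(bt(VARS, {}, TOP))   # minimum number of non-blue vertices on this path
```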

The three queens (figure)

Improving the LB
At the current node (X, D, C), split the constraints into:
- unassigned constraints: ignored
- assigned constraints: arity 0 contributes to f∅; arity one gives unary cost functions fi
This yields a stronger LB: lbfc = f∅ ⊕ (⊕ xi∊X min a∊Di fi(a))
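The lbfc bound is cheap to compute: f∅ plus, for each unassigned variable, the minimum of its accumulated unary costs. A minimal sketch (all names illustrative):

```python
def lb_fc(f_empty, unary, domains):
    """lb_fc = f_empty + sum over unassigned x_i of min_a f_i(a)."""
    lb = f_empty
    for xi, dom in domains.items():
        lb += min(unary[xi][a] for a in dom)
    return lb

unary = {"x1": {"b": 0, "g": 2}, "x2": {"b": 1, "g": 3}}
print(lb_fc(4, unary, {"x1": ["b", "g"], "x2": ["b", "g"]}))  # 4 + 0 + 1
```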

Three queens (figure): unary assigned constraints give pruning, guidance (value ordering) and value deletion.

Still improving the LB
- Split the constraints into assigned constraints (fi, f∅) and the original unassigned constraints
- Solving the subproblem of original constraints gives an optimal cost that combines with lbfc: a stronger LB
- It can be solved beforehand

Russian Doll Search
- Static variable ordering
- Solves increasingly large subproblems (figure: nested subproblems over x1…x5)
- Uses an improved version of the previous LB, recursively
- May speed up search by several orders of magnitude
- LightRDS: a nice CP implementation for unary costs

Other approaches
Taking unassigned constraints into account:
- PFC-DAC (Wallace et al. 1994)
- PFC-MRDAC (Larrosa et al. 1999…)
- Integration into the SaH schema (Régin et al. 2001)

Local search: nothing really specific

Local search
Based on perturbation of solutions in a local neighborhood:
- simulated annealing, tabu search, variable neighborhood search
- greedy randomized adaptive search (GRASP)
- evolutionary computation (GA), ant colony optimization…
See Blum & Roli, ACM Computing Surveys, 35(3), 2003.

Boosting systematic search with local search
- Run a time-limited local search on (X, D, C) prior to systematic search; it returns a sub-optimal solution
- Use the best cost found as the initial ⊤
- If it was optimal, systematic search just proves optimality; in all cases pruning may improve

Boosting systematic search with local search
Example: frequency assignment problem, instance CELAR6-sub4 (#var: 22, #val: 44, optimum: 3230); solver: toolbar with default options.
- UB initialized to ⊤: about 3 hours
- UB initialized to 3230: about 1 hour
- Optimized local search can find the optimum in a few minutes
Other combination: use systematic search inside large neighborhoods.

Complete inference: what it is; variable (bucket) elimination; graph structural parameters

Inference…
- Automated production of implied constraints (cost function implication)
- Complete inference is able to produce the strongest implied constraint on the empty scope
- This gives the optimal cost (classic case: detects infeasibility)

September 2005First international summer school on constraint processing46 Variable elimination (aka bucket elimination)  Solves the problem by a sequence of problem transformations  Each step yields a problem with one less variable and the same optimum.  When all variables have been eliminated, the problem is solved  Optimal solutions of the original problem can be recomputed

Variable/bucket elimination
- Select a variable xi
- Compute the set Ki of constraints that involve this variable
- Add (⊕ Ki)[−xi] to the network
- Remove the variable and Ki
- Time: Θ(exp(degi + 1)); space: Θ(exp(degi))
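One elimination step can be sketched on table-based cost functions, with scopes kept as tuples of variable names (a minimal sketch; all names illustrative):

```python
def eliminate(functions, xi, domains):
    """Replace all functions mentioning xi by their join projected on the rest."""
    bucket = [(s, f) for s, f in functions if xi in s]      # the bucket K_i
    rest = [(s, f) for s, f in functions if xi not in s]
    scope = sorted({v for s, _ in bucket for v in s if v != xi})

    def all_tuples(vars_):
        if not vars_:
            yield {}
            return
        for v in domains[vars_[0]]:
            for t in all_tuples(vars_[1:]):
                yield {vars_[0]: v, **t}

    new = {}
    for t in all_tuples(scope):          # for every assignment of the other vars
        best = min(sum(f[tuple({**t, xi: a}[v] for v in s)] for s, f in bucket)
                   for a in domains[xi])  # minimize over xi
        new[tuple(t[v] for v in scope)] = best
    return rest + [(tuple(scope), new)]

funcs = [(("x", "y"), {("b", "b"): 6, ("b", "g"): 0, ("g", "b"): 0, ("g", "g"): 6})]
doms = {"x": ["b", "g"], "y": ["b", "g"]}
print(eliminate(funcs, "y", doms))   # one unary function on x remains
```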

Three queens… combine (figure)

Three queens… eliminate, add (figure)

Three queens (figure)

Elimination order influence (figures): order G, D, F, B, C, A vs. order G, B, C, D, F, A.

Induced width
- For G = (V, E) and a given elimination (vertex) ordering, the largest degree encountered during elimination is the induced width of the ordered graph
- Minimizing induced width (dimension) is NP-hard
- Heuristics, better criteria, hypergraphs

What does it look like?
- A graph has induced width k (k-tree number, max-clique number…) iff it is a partial k-tree
- A k-tree is either a k-clique or a k-tree with an additional vertex connected to all vertices of a k-clique in it

History / terminology
- Non-serial dynamic programming (Bertelé & Brioschi 1972) for cost functions (VE, block)
- Acyclic DBs (Beeri et al. 1983)
- Bayesian nets (Pearl 1988; Lauritzen & Spiegelhalter 1988)
- Constraint nets (Dechter & Pearl 1988)
- Semirings (Shenoy & Shafer 1990)
- C-semirings (Bistarelli et al. 1995)…

Real example: CELAR SCEN06 (figure)

Incomplete inference: mini-buckets; local consistency (idempotent ⊕ or …)

Incomplete inference
Trades completeness for space/time:
- produces only specific classes of implied constraints
- often in order to produce a lower bound on the optimum
- and usually in polynomial time/space
Two options:
1. Keep these constraints outside the network: mini-buckets
2. Add them to the network: local consistency
   a. directly (idempotent ⊕: implied = redundant)
   b. or try to accommodate non-idempotency

Mini-buckets (Dechter 97)
- Eliminate x vs. "mini-eliminate" x: split the bucket of x into mini-buckets and eliminate x from each separately
- Repeating this produces a lower bound (and an upper bound)
- Complexity/strength is controlled by the number of variables in each mini-bucket
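Why mini-buckets give a lower bound can be seen on a tiny example: minimizing each group of cost functions independently can only under-estimate the minimum of their sum (a sketch; functions and values are illustrative):

```python
def exact_min(fs, domain):
    """True elimination: minimize the sum of all functions jointly."""
    return min(sum(f(a) for f in fs) for a in domain)

def mini_bucket_lb(groups, domain):
    """Mini-elimination: minimize each mini-bucket separately, then sum."""
    return sum(min(sum(f(a) for f in g) for a in domain) for g in groups)

f1 = lambda a: {"b": 0, "g": 3}[a]
f2 = lambda a: {"b": 2, "g": 0}[a]

print(exact_min([f1, f2], ["b", "g"]))           # joint minimum
print(mini_bucket_lb([[f1], [f2]], ["b", "g"]))  # lower bound, never larger
```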

Local consistency
- Very useful in classical CSPs: node consistency, arc consistency…
- Produces an equivalent, more explicit problem: same optimum (consistency), gives a lb, may detect infeasibility
- Compositional: can be combined synergistically and "transparently" with other techniques

Classical arc consistency (binary CSP)
- For any xi and cij, ci = (cij ⋈ cj)[xi] must bring no new information on xi
- Example: with cij: vv→0, vw→0, wv→⊤, ww→⊤, the projection (cij ⋈ cj)[xi] gives c(xi): v→0, w→⊤, so value w of xi is pruned

Arc consistency and soft constraints
- For any xi and fij, f = (fij ⊕ fj)[xi] must bring no new information on xi
- Example: fij with vv→0, vw→0, wv→1, ww→0.5 projects to f(xi): v→0, w→0.5
- Adding this projection always yields an equivalent problem iff ⊕ is idempotent

Idempotent soft CN (Bistarelli et al. 97)
- The previous operational extension works on any idempotent semiring CN
- Repeated application of the local enforcing process until quiescence terminates and yields an equivalent problem
- Extends to non-binary constraints and to higher levels of consistency (k-consistency)
- On a total order, the only idempotent ⊕ is max

Non-idempotent: weighted CN
- For any xi and fij, f = (fij ⊕ fj)[xi] brings no new information on xi, but simply adding it changes the optimum
- Example: fij with vv→0, vw→0, wv→2, ww→1 projects to f(xi): v→0, w→1; adding f counts this cost twice: EQUIVALENCE LOST

Extraction of implied constraints
- Combination + extraction is an equivalence-preserving transformation: project f = (fij ⊕ fj)[xi] (v→0, w→1), then subtract it from fij, leaving fij: vv→0, vw→0, wv→1, ww→0
- Requires the existence of a pseudo-difference ⊖ with (a ⊖ b) ⊕ b = a: fair valued CN
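The project-then-subtract step can be sketched on weighted tables: cost moves from a binary function to a unary one, and the total combined cost is preserved (a minimal sketch; ⊤ is modeled as infinity and left untouched by subtraction):

```python
TOP = float("inf")

def project_and_extract(fij, dom_i, dom_j):
    """Return (unary f_i, residual f_ij) with the same combined cost."""
    fi = {a: min(fij[(a, b)] for b in dom_j) for a in dom_i}
    residual = {(a, b): c if c == TOP else c - fi[a]
                for (a, b), c in fij.items()}
    return fi, residual

fij = {("v", "v"): 0, ("v", "w"): 0, ("w", "v"): 2, ("w", "w"): 1}
fi, res = project_and_extract(fij, ["v", "w"], ["v", "w"])
print(fi)    # unary: v -> 0, w -> 1
print(res)   # residual binary: wv -> 1, ww -> 0, others unchanged
```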

Fair and unfair
- Classic CN: a ⊖ b = or; fuzzy CN (⊕ = max): a ⊖ b = a; weighted CN: a ⊖ b = a − b if a ≠ ⊤, else ⊤; Bayes nets: a ⊖ b = a / b…
- An unfair structure (ℕ∪{ω, ⊤}, ≽): n = n years of prison, ω = life imprisonment, ⊤ = death penalty; m ⊕ n = m + n, ω ⊕ n = ω, ω ⊕ ω = ⊤
- The set of differences of ω and ω is ℕ: there is no maximum difference

(K, Y) equivalence-preserving inference
For a set K of constraints and a scope Y:
- replace K by ⊕K
- add (⊕K)[Y] to the CN (it is implied by K)
- extract (⊕K)[Y] from ⊕K
This yields an equivalent network in which all the implicit information of K on Y is explicit. Repeat for a class of (K, Y) pairs until a fixpoint is reached.

Node consistency (NC*): (fi, ∅) EPI
For any variable xi:
- ∀a, f∅ + fi(a) < ⊤ (otherwise the value is deleted)
- ∃a, fi(a) = 0
Complexity: O(nd)
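NC* enforcement can be sketched on unary cost tables: project the minimum of each fi into f∅, then prune values a with f∅ + fi(a) ≥ ⊤ (names and values illustrative):

```python
TOP = 4

def enforce_nc(unary, f_empty):
    """Make every variable node-consistent; return updated tables and f_empty."""
    for xi, table in unary.items():
        alpha = min(table.values())       # project the min unary cost into f_empty
        f_empty += alpha
        for a in table:
            table[a] -= alpha             # now some value has cost 0
    # prune values a with f_empty + f_i(a) >= TOP
    pruned = {xi: {a: c for a, c in t.items() if f_empty + c < TOP}
              for xi, t in unary.items()}
    return pruned, f_empty

unary = {"x": {"v": 1, "w": 3}, "y": {"v": 0, "w": 2}}
tables, f0 = enforce_nc(unary, 1)
print(f0)       # lower bound after projection
print(tables)   # inconsistent values removed
```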

Full AC (FAC*): ({fij, fj}, xi) EPI + NC*
- For all fij: ∀a ∃b, fij(a, b) + fj(b) = 0 (b is a full support)
- That's our starting point! But there is no termination!

Arc consistency (AC*): ({fij}, xi) EPI + NC*
- For all fij: ∀a ∃b, fij(a, b) = 0 (b is a support)
- Complexity: O(n²d³)

Confluence is lost: finding an AC closure that maximizes the lb is an NP-hard problem (Cooper & Schiex 2004).

Directional AC (DAC*): ({fij, fj}, xi) for i < j, EPI + NC*
- For all fij (i < j): ∀a ∃b, fij(a, b) + fj(b) = 0 (b is a full support)
- Uses a variable order (here x < y < z)
- Complexity: O(ed²)

Full DAC (FDAC*): DAC + AC + NC*
- For all fij (i < j): ∀a ∃b, fij(a, b) + fj(b) = 0 (full support)
- For all fij (i > j): ∀a ∃b, fij(a, b) = 0 (support)
- Complexity: O(end³)

Implementation example for AC: data structures
- A queue Q of variables (with pruned domains)
- S(i, a, j): current support for a ∊ Di on fij (AC)
- S(i): current support for f∅ on xi (NC)

AC-2001-based version (code figure): supports for node consistency and for arc consistency.

Value deletion (cost back-propagation) (code figure)

The fixpoint loop (code figure)

Space complexity
- Space can be reduced to O(ed), and O(erd) for arity r (not only for table representations)

Hierarchy
- NC* O(nd) is weaker than AC* O(n²d³) and DAC* O(ed²), both weaker than FDAC* O(end³), itself weaker than EDAC (IJCAI 2005)
- Special case CSP (⊤ = 1): the classic NC, AC, DAC

Other results
- AC defined for arbitrary fair structures and arbitrary arities (Schiex 2000, Cooper & Schiex 2004)
- k-consistency: from all constraints with a scope included in a set of n variables to a subset of n−1 variables (Cooper 2005)
- Cyclic consistency (Cooper 2004), existential AC (de Givry et al. 2005): not easily described as EPI

Global soft constraints: using the SaH schema; comparison with soft AC

Global soft constraints
- Existing global soft constraints are "Soft as Hard" (Petit et al. 2000)
- Three questions: what is the semantics of the global constraint? what level of consistency is enforced? and how?

The soft all-diff (Petit et al. 2001, van Hoeve 2004)
- Semantics: (a) number of variables whose value needs to be changed to satisfy the constraint (at most n); (b) primal-graph violations: number of pairs of variables with the same value (at most n(n−1)/2)
- Level of consistency: hyper-arc (GAC)
- Algorithm: (minimum-cost) maximum-flow algorithms on the value graph (with costs)

September 2005First international summer school on constraint processing83 Other soft global constraints  Soft global cardinality (van Hoeve et al. 2004)  Soft regular (van Hoeve et al. 2004)  Other possible semantics for soft constraints (some with NP-hard consistency checks) (Beldiceanu & Petit 2004)

GAC on SaH vs. soft AC? (figure): soft AC obtains f∅ = 1 on a chain x1, x2, x3, while the SaH encoding uses cost variables x12, x23 and xc with xc = x12 + x23.

So… is it possible…
- that soft AC is always at least as good as SaH GAC? (my conjecture: yes)
- to offer cost variables and nicely integrate them into soft CNs (local consistency, unary constraints for heuristics…)? (my conjecture: yes)
- to improve current global soft constraint handling so that it reaches (or exceeds) soft AC, DAC, FDAC, EDAC…?

Hybrid search (search + inference): + variable elimination, + mini-buckets, + local consistency

Cycle cutset
- Tree-structured problems are easy
- Assign variables until all cycles are cut (in all possible ways), then solve the remaining tree using VE
- Time exponential in the cutset size (NP-hard to minimize); space economical

Boosting search with variable elimination: BB-VE(k)
At each node:
- select an unassigned variable xi
- if degi ≤ k then eliminate xi, else branch on the values of xi
Properties: BB-VE(−1) is BB; BB-VE(w*) is VE; BB-VE(1) is similar to cycle cutset.

Example: BB-VE(2) (figure)

Example: BB-VE(2), continued (figure)

Boosting search with variable elimination
Example: still-life (academic problem), instance n = 14 (#var: 196, #val: 2).
- ILOG Solver: about 5 days
- Variable elimination: about 1 day
- BB-VE(18): about 2 seconds
Other combination: backtracking with tree decomposition (BTD, Jégou & Terrioux 2003).

Using incomplete inference: lower bound inside BB

Using mini-buckets (Kask & Dechter 99)
- Assume a DFS tree search using a static variable order x1…xn
- Compute mini-buckets beforehand (eliminating xn…x1); mini-eliminating xi generates a family of cost functions fxi,j = (⊕ Kxi,j)[−xi]
- When (x1…xp) is assigned, any fxi,j whose scope is included in (x1…xp), with i > p (implied by not-completely-assigned constraints), has a known value that is not redundant with the lb

Boosting search with LC

BT(X, D, C)
  if X = ∅ then Top := f∅
  else
    xj := selectVar(X)
    forall a ∊ Dj do
      forall fS ∊ C s.t. xj ∊ S: fS := fS[xj = a]
      if LC then BT(X − {xj}, D − {Dj}, C)

Maintaining consistency during search: BT, MNC, MAC/MDAC, MFDAC, MEDAC (figure)

Experiments: evaluating bounds/algorithms; why is WCSP so hard? (informally)

Experimenting with soft CNs
A good lb is a compromise between its strength and its computational cost. Benchmarks:
- academic problems (from n-queens to graph coloring)
- real problems (frequency allocation, pedigree analysis…)
- random problems (Max-CSP, Max-SAT, weighted Max-SAT); phase transitions?

Classic CSP (plot)

Slide 99: Optimization vs. satisfaction

Slide 100: Why is it so hard?
- Problem P(α): is there an assignment of cost lower than α?
- Proof of inconsistency (simplest).
- Proof of optimality (hardest): harder than finding an optimum.

Slide 101: Random Max-CSP (plot: CPU time vs. number of variables)

Slide 102: Random Max-2SAT. PBS, OPBDP and CPLEX use a natural SaH formulation.

Slide 103: Frequency assignment
- n communication links, a set of available frequencies.
- Close links must use sufficiently different frequencies (|x_i - x_j| > d_ij).
- Minimize the weighted sum of violated constraints.
- CELAR/GRAPH instances: still open problems.
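A toy brute-force version of this model makes the objective concrete. The frequencies, link data and weights below are made up for illustration; each constraint (i, j, d, w) asks |x_i - x_j| > d and costs w when violated.

```python
from itertools import product

freqs = [1, 2, 3, 4]                               # hypothetical frequency set
links = [(0, 1, 1, 5), (1, 2, 1, 3), (0, 2, 2, 1)] # (i, j, d_ij, weight)

def violation_cost(assignment):
    """Weighted sum of violated distance constraints."""
    return sum(w for i, j, d, w in links
               if abs(assignment[i] - assignment[j]) <= d)

# enumerate all assignments of the 3 links and keep the cheapest
best = min(product(freqs, repeat=3), key=violation_cost)
```

On this toy instance no assignment satisfies all three constraints, so the optimum violates the cheapest (weight-1) one. Real CELAR instances have hundreds of variables, so brute force is hopeless there; this is only to fix the semantics.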

Slide 104: Boosting systematic search with local consistency
Frequency assignment problem CELAR6-sub4 (#var: 22, #val: 44, #constraints: 477):
- toolbar MNC*: about 1 year; toolbar MFDAC*: about 1 hour
- Unweighted PFC-MRDAC: 154.5 s
- toolbar MFDAC*: 36.2 s
- toolbar MEDAC*: 4.4 s
- REES AOMFDAC: 2574 s (SVO)

Slide 105: Uncapacitated warehouse location
- We consider n stores and m possible warehouse locations.
- Opening warehouse e incurs a maintenance cost c_e; supplying store m from warehouse e costs c_em.
- Each store must be supplied by exactly one warehouse.
- Which warehouses should be opened? An NP-hard problem.

Slide 106: Model
- n integer variables (stores), domain size m;
- m Boolean variables (candidate warehouses);
- m + n unary cost functions (costs);
- n·m hard binary constraints relating stores and warehouses.
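For a tiny instance the whole model collapses to one objective over the store-to-warehouse assignment: a warehouse pays its maintenance cost iff some store picks it. The costs below are made up for illustration.

```python
from itertools import product

open_cost = [3, 2]                  # c_e: maintenance cost of warehouse e
supply = [[4, 1], [2, 3], [5, 1]]   # supply[m][e]: cost of serving store m from e

def total_cost(a):
    """a[m] = warehouse chosen for store m; opened warehouses = set(a)."""
    return (sum(open_cost[e] for e in set(a))
            + sum(supply[m][e] for m, e in enumerate(a)))

# brute force over all 2^3 assignments of the 3 stores
best = min(product(range(len(open_cost)), repeat=len(supply)), key=total_cost)
```

Here the optimum opens only warehouse 1 and serves all three stores from it. The WCSP model of the slide expresses the same objective through unary cost functions plus hard channeling constraints between store and warehouse variables.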

Slide 107: Results (table; the CPU figures did not survive extraction)
- CSPLIB instances: cap71 (16 x 50), cap101 (25 x 50), cap131 (50 x 50); solvers compared: MFDAC, MEDAC, CPLEX.
- Kratica et al. 2001 instances: MO1, MO2, MO3, MP1 (sizes garbled in extraction: 100 x … x200); same solvers.

Complexity and polynomial classes: trees (induced width 1); idempotent ⊕ or not…

Slide 109: Idempotent VCSP (⊕ = max)
- A new operation on soft constraints: the α-cut of a soft constraint f is the hard constraint c_α that forbids a tuple t iff f(t) > α.
- It can be applied to a whole soft CN.
- Example, Cut(f, 0.6):

  x_i  x_j | f(x_i, x_j) | c(x_i, x_j)
  b    b   | 0.4         | 0 (allowed)
  b    g   | 0.7         | T (forbidden)
  g    b   | 0.9         | T (forbidden)
  g    g   | 0.5         | 0 (allowed)
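With cost functions stored as dicts, the α-cut is a one-liner; the example reuses the b/g table from the slide (representation is my own).

```python
def alpha_cut(f, alpha):
    """Hard constraint from soft f: keep exactly the tuples t with f(t) <= alpha."""
    return {t for t, cost in f.items() if cost <= alpha}

f = {("b", "b"): 0.4, ("b", "g"): 0.7, ("g", "b"): 0.9, ("g", "g"): 0.5}
alpha_cut(f, 0.6)   # {("b", "b"), ("g", "g")}
```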

Slide 110: Solving a min/max CN by slicing
- Let α be the minimum value such that the α-cut network is consistent.
- The set of solutions of the (classical) α-cut = the set of all optimal solutions of the min/max CN.
- A possibilistic/fuzzy CN can therefore be solved with O(log(#distinct costs)) CSP calls, by binary search on α.
- This can also be used to lift polynomial classes.
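A minimal sketch of the slicing scheme: binary search over the sorted distinct costs, testing each α-cut network with a CSP check (here a brute-force one, standing in for any complete CSP solver). Data layout and names are my own.

```python
from itertools import product

def consistent(cuts, domains, scopes):
    """Brute-force CSP check: some full assignment satisfies every cut."""
    vars_ = sorted(domains)
    for vals in product(*(domains[v] for v in vars_)):
        a = dict(zip(vars_, vals))
        if all(tuple(a[v] for v in s) in allow
               for s, allow in zip(scopes, cuts)):
            return True
    return False

def solve_minmax(funcs, domains):
    """funcs: list of (scope, table) with max combination.
    Binary search for the least alpha whose alpha-cut network is consistent;
    that alpha is the optimal (min over assignments of max) cost."""
    costs = sorted({c for _, t in funcs for c in t.values()})
    scopes = [s for s, _ in funcs]
    lo, hi = 0, len(costs) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        cuts = [{t for t, c in tab.items() if c <= costs[mid]}
                for _, tab in funcs]
        if consistent(cuts, domains, scopes):
            hi = mid          # a solution of cost <= costs[mid] exists
        else:
            lo = mid + 1      # every assignment exceeds costs[mid] somewhere
    return costs[lo]
```

This relies on ⊕ = max being idempotent: cutting each constraint independently is exactly cutting the combined cost, which fails for additive costs.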

Slide 111: Polynomial classes. Idempotent VCSP: min-max CNs
- α-cuts can be used to lift CSP classes; a sufficient condition is that the polynomial class is "conserved" by α-cuts.
- Simple TCSPs are TCSPs where every constraint uses a single interval: x_i - x_j ∈ [a_ij, b_ij].
- Fuzzy STCN: every slice of a cost function is an interval (semi-convex functions).

Slide 112: Polynomial classes. Non-idempotent case (weighted CNs)
- Max-SAT is MAXSNP-complete (no PTAS unless P = NP).
- Weighted Max-SAT is FP^NP-complete; Max-SAT is FP^NP[O(log n)]-complete: weights matter!
- Tractable Max-SAT languages are fully characterized (Creignou 2001).
- For the Max-CSP language, f_eq(x, y) = (x = y ? 0 : 1) alone is NP-hard.

Slide 113: Generalized intervals…
- Assume domains are ordered.
- f_c^[a,b](x, y) = (x ∈ [a,b]) ? c : 0 is a generalized interval (GI) function: tractable.
- If f(u,v) + f(x,y) ≤ f(u,y) + f(x,v) for all u ≤ x, v ≤ y, then f is submodular and decomposes into GI functions: tractable, O(n^3 d^3).
- Examples: ax + by + c, sqrt(x^2 + y^2), |x - y|^r (r ≥ 1)…
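The submodularity condition is easy to check exhaustively on a small ordered domain; a quick sketch (my own helper, with a small tolerance for float costs):

```python
def is_submodular(f, dom):
    """Check f(u,v) + f(x,y) <= f(u,y) + f(x,v) for all u <= x, v <= y."""
    return all(f(u, v) + f(x, y) <= f(u, y) + f(x, v) + 1e-9
               for u in dom for x in dom if u <= x
               for v in dom for y in dom if v <= y)

dom = range(5)
is_submodular(lambda x, y: abs(x - y), dom)   # True: in the tractable class
is_submodular(lambda x, y: (x != y), dom)     # False: the NP-hard f_eq language
```

This matches the slide: |x - y| is submodular, hence tractable, whereas the equality cost function from slide 112 is not.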

Resources: solvers, benchmarks…

Slide 115: Resources (solvers, benchmarks…)
- Con'Flex: fuzzy CSP system with integer, symbolic and numerical constraints.
- clp(FD,S): semiring CLP (pauillac.inria.fr/.georget/clp_fds/clp_fds.html).
- LVCSP: a Common Lisp library for valued CSPs, with an emphasis on strictly monotonic operators (ftp.cert.fr/pub/lemaitre/LVCSP).
- Choco: originally a Claire library for CSP; layers built above Choco implement some WCSP algorithms (choco.sourceforge.net).
- REES: a C++ library by Radu Marinescu (R. Dechter's lab).
- Incop: incomplete algorithms ("go with the winners").

Slide 116: Toolbar and the soft constraints wiki
An open-source library:
- accessible from the soft constraints wiki site;
- implements MNC, MAC, MDAC, MFDAC, MEDAC;
- reads Max-CSP/Max-SAT (weighted or not) and ERGO Bayes net file formats;
- comes with thousands of benchmarks in a standardized format;
- hosted on SourceForge.

Slide 117: Conclusion
- A large part of the classic CN body of knowledge has been extended and adapted to soft CNs, and efficient solving tools exist.
- Much remains to be done:
  - Extensions: problems other than optimization (counting, quantification…).
  - Techniques: symmetries, learning, knowledge compilation…
  - Algorithmics: still better lower bounds, other local consistencies or dominance rules; exploiting problem structure.
  - Implementation: better integration with classic CN solvers (Choco, Solver).
  - Applications: problem modelling, solving, heuristic guidance, partial solving (see Tools (de Givry et al., 2004)).