Soft Constraint Processing
Thomas Schiex, INRA – Toulouse, France
Javier Larrosa, UPC – Barcelona, Spain
Thanks to the school organizers: Francesca, Michela, Krzysztof…
These slides are provided as a teaching support for the community. They can be freely modified and used as long as the original authors' (T. Schiex and J. Larrosa) contribution is clearly mentioned and visible, and any modification is acknowledged by its author.
September 2005, First International Summer School on Constraint Processing

Overview
Introduction and definitions: why soft constraints and dedicated algorithms? Generic and specific models
Handling soft constraint problems: fundamental operations on soft CNs
Solving by search (systematic and local)
Complete and incomplete inference
Hybrid (search + inference)
Polynomial classes
Resources
Why soft constraints?
The CSP framework targets decision problems, but many problems are overconstrained or are optimization problems.
Economics (combinatorial auctions): given a set G of goods and a set B of bids, where bid (B_i, V_i) requests the goods B_i and has value V_i, find the best subset of compatible bids. Best = maximize revenue (sum).
Why soft constraints?
Satellite scheduling: SPOT 5 is an Earth observation satellite with 3 on-board cameras. Given a set of requested pictures (of different importance) and constraints on resources, data-bus bandwidth, setup times and orbiting, select a subset of compatible pictures with maximum importance (sum).
Why soft constraints?
Probabilistic inference (Bayesian nets): given a probability distribution defined by a DAG of conditional probability tables, and some evidence, find the most probable explanation for the evidence (product).
Why soft constraints?
Resource allocation (frequency assignment): given a telecommunication network, find the best frequency for each communication link while avoiding interferences. Best can be: minimize the maximum frequency (max), or minimize the global interference (sum).
Why soft constraints?
Even in decision problems: the problem may be infeasible, or users may have preferences among solutions. This happens in most real problems. Experiment: give users a few solutions and they will find reasons to prefer some of them.
Notations and definitions
X = {x_1, …, x_n}: variables (n variables)
D = {D_1, …, D_n}: finite domains (maximum size d)
For Y ⊆ X, ℓ(Y) = ∏_{x_i ∈ Y} D_i is the set of tuples over Y; t ∈ ℓ(Y) is written t_Y.
For Z ⊆ Y, t_Y[Z] is the projection of t_Y on Z; t_Y[-x_i] = t_Y[Y - {x_i}].
A relation R ⊆ ℓ(Y) with scope Y is denoted R_Y. Projection extends to relations.
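As a minimal sketch of these notations, a tuple over a scope can be represented as a dict from variables to values, and projection restricts it to a subscope (the variable names here are illustrative, not from the slides):

```python
# A tuple t_Y over scope Y is a dict {variable: value};
# t_Y[Z] keeps only the variables of Z.
def project_tuple(t, Z):
    """Projection t[Z]: restriction of assignment t to the variables in Z."""
    return {x: v for x, v in t.items() if x in Z}

t = {"x1": "b", "x2": "g", "x3": "r"}
assert project_tuple(t, {"x1", "x3"}) == {"x1": "b", "x3": "r"}
assert project_tuple(t, set(t) - {"x2"}) == project_tuple(t, {"x1", "x3"})
```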
Generic and specific models: valued CN, semiring CN, Soft as Hard
Combined local preferences
Constraints are local cost functions. Costs combine with a dedicated operator ⊕:
max: priorities (fuzzy/possibilistic CN)
+: additive costs (weighted CN)
×: factorized probabilities (probabilistic CN, Bayesian nets)
Goal: find an assignment with the "best" combined cost.
Soft constraint network (CN)
A soft CN is a triple (X, D, C):
X = {x_1, …, x_n}: variables
D = {D_1, …, D_n}: finite domains
C = {f_S, …}: e cost functions; f_S, f_ij, f_i, f_∅ have scopes S, {x_i, x_j}, {x_i}, ∅
f_S(t) ∈ E, a set ordered by ≼, with minimum ⊥ and maximum ⊤
Objective function: F(X) = ⊕_{f_S ∈ C} f_S(X[S])
Solution: t such that F(t) ≺ ⊤. Soft CN problem: find a minimal solution.
⊕ is commutative, associative and monotonic; ⊥ is its identity and ⊤ its annihilator.
Specific frameworks (also lexicographic CN, probabilistic CN…)

Instance        E       ⊕     ≼_⊤
Classic CN      {t,f}   and   t ≼ f
Possibilistic   [0,1]   max   0 ≼ 1
Fuzzy CN        [0,1]   min   1 ≼ 0 (usual)
Weighted CN     [0,k]   +     0 ≼ k
Bayes net       [0,1]   ×     1 ≼ 0
Weighted CSP example (⊕ = +)
Graph coloring on variables x_1, …, x_5; F(X) = number of non-blue vertices.
For each vertex x_i:     For each edge (x_i, x_j):
x_i  f(x_i)              x_i x_j  f(x_i, x_j)
b    0                   b   b    ⊤
g    1                   b   g    0
r    1                   b   r    0
                         g   b    0
                         g   g    ⊤
                         g   r    0
                         r   b    0
                         r   g    0
                         r   r    ⊤
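The objective F(X) of this weighted CSP can be evaluated directly; a minimal sketch, assuming a bounded ⊤ = 6 and an illustrative edge set (the slide's exact graph is not recoverable from the transcript):

```python
TOP = 6  # bounded maximum violation; TOP is absorbing for +

def f_vertex(c):
    """Unary cost: non-blue vertices cost 1."""
    return 0 if c == "b" else 1

def f_edge(ci, cj):
    """Binary cost: equal colors on an edge cost TOP (hard violation)."""
    return TOP if ci == cj else 0

edges = [("x1", "x2"), ("x1", "x3"), ("x2", "x3"), ("x3", "x4"), ("x4", "x5")]

def F(assign):
    """F(X) = sum of all cost functions, capped at TOP."""
    cost = sum(f_vertex(c) for c in assign.values())
    cost += sum(f_edge(assign[i], assign[j]) for i, j in edges)
    return min(cost, TOP)

# Three non-blue vertices, no monochromatic edge:
assert F({"x1": "b", "x2": "g", "x3": "r", "x4": "b", "x5": "g"}) == 3
```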
Possibilistic constraint network (⊕ = max)
Same graph; F(X) = highest color used (b < g < r < y). Connected vertices must have different colors.
For each vertex x_i:
x_i  f(x_i)
b    0.2
g    0.4
r    0.6
y    0.8
Some important details
⊤ = maximum violation. It can be set to a bounded maximum violation k; a solution then satisfies F(t) < k = ⊤.
The empty-scope soft constraint f_∅ (a constant) gives an obvious lower bound on the optimum. If you do not like it: f_∅ = ⊥. It brings additional expressive power.
Weighted CSP example (⊕ = +)
Same coloring problem (unary costs c(x_i), edge costs c(x_i, x_j) as before) with ⊤ = 6 and f_∅ = 0. Setting ⊤ = 3 asks for an optimal coloration with fewer than 3 non-blue vertices.
Microstructural graph (binary network)
(Figure: values b, g, r for x_1, …, x_5, with ⊤ = 6 and f_∅ = 0; vertices carry unary costs, edges carry binary costs.)
Valued CSP and semiring CSP
Valued CN: total order. Semiring CN: order specified by an ACI operator +; in a c-semiring, a + b = b means b is preferred (lattice order).
Examples of partially ordered structures
Multiple criteria: E_1 × E_2 with (v_1, v_2) ≼ (w_1, w_2) iff v_1 ≼ w_1 and v_2 ≼ w_2. If E_1 and E_2 are c-semirings then E_1 × E_2 is too. If each E_i is totally ordered, multiple criteria = multiple valued CNs.
Set lattice order: criteria = set of satisfied constraints; ⊕ = ∩, + = ∪ (order = ⊇). c-semiring only.
Soft constraints as hard constraints
Add one extra variable x_S per cost function f_S, all with domain E. Replace f_S by a hard constraint c_{S ∪ {x_S}} allowing (t, f_S(t)) for all t ∈ ℓ(S), and one criterion variable x_C = ⊕ x_S (global constraint).
Soft as Hard (SaH)
The criterion is represented as a variable; multiple criteria = multiple variables, with constraints on/between criteria. Can also be used in soft CNs. Drawbacks: extra variables (and domains), increased arities, and SaH constraints give weak propagation.
A (more) general picture
(Diagram: valued and semiring CNs cover weighted, probabilistic, lexicographic, lattice, HCLP, partial CSP, general fuzzy CN, Soft as Hard, Bayes nets; fuzzy, possibilistic and classic CNs are the idempotent case (a ⊕ a = a); set criteria are c-semiring only.)
Cost function implication
f_P is implied by g_Q (f_P ≼ g_Q, g_Q tighter than f_P) iff for all t ∈ ℓ(P ∪ Q), f_P(t[P]) ≼ g_Q(t[Q]).
Example (⊤ = 6): g(x_i, x_j): bb 2, bg 3, br 2, gb 4, gg 5, gr 4, rb ⊤, rg ⊤, rr ⊤ implies f(x_i): b 1, g 2, r 3, or (strong implication) g[x_i]: b 2, g 4, r ⊤.
Classic case (⊤ = 1): implied = redundant.
Idempotent soft CN
⊕ is idempotent iff a ⊕ a = a for any a. Then for any f_S implied by the CN (X, D, C), (X, D, C) ≡ (X, D, C ∪ {f_S}), where ≡ means ≼ and ≽.
Idempotent: implied = redundant (good); but with a total order, idempotent forces ⊕ = max (bad).
Handling soft CNs: fundamental operations — assignment, combination (join), projection
Assignment (conditioning)
Example (⊤ = 6): f(x_i, x_j): bb ⊤, bg 0, br 0, gb 0, gg ⊤, gr 0, rb 0, rg 0, rr ⊤. Assigning, f[x_i = b] yields g(x_j): b ⊤, g 0, r 0; assigning again, g[x_j = r] yields the constant 0.
Combination (join with ⊕, + here)
f(x_i, x_j): bb 6, bg 0, gb 0, gg 6; g(x_j, x_k): bb 6, bg 0, gb 0, gg 6.
h = f ⊕ g over (x_i, x_j, x_k): bbb 12, bbg 6, bgb 0, bgg 6, gbb 6, gbg 0, ggb 6, ggg 12.
Projection (elimination)
f(x_i, x_j): bb 4, bg 6, br 0, gb 2, gg 6, gr 3, rb 1, rg 0, rr 6. Projecting with min, f[x_i] gives g(x_i): b 0, g 2, r 0; projecting again, g[∅] = 0.
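The three fundamental operations can be sketched on table-based cost functions; a minimal sketch with an unbounded ⊤ (the slides use a bounded one) and the slides' own example tables:

```python
from itertools import product
from math import inf

# A cost function is a pair (scope, table): scope is a tuple of variable
# names, table maps value tuples (in scope order) to costs.

def assign(scope, table, var, val):
    """Conditioning f[var = val]: fix var and drop it from the scope."""
    i = scope.index(var)
    return scope[:i] + scope[i+1:], \
           {t[:i] + t[i+1:]: c for t, c in table.items() if t[i] == val}

def combine(s1, t1, s2, t2, dom):
    """Join with +: h(t) = f(t[s1]) + g(t[s2]) over the union of scopes."""
    scope = s1 + tuple(x for x in s2 if x not in s1)
    table = {}
    for t in product(*(dom[x] for x in scope)):
        a = dict(zip(scope, t))
        table[t] = t1[tuple(a[x] for x in s1)] + t2[tuple(a[x] for x in s2)]
    return scope, table

def project_out(scope, table, var):
    """Elimination f[-var]: minimize over the values of var."""
    i = scope.index(var)
    new_table = {}
    for t, c in table.items():
        r = t[:i] + t[i+1:]
        new_table[r] = min(new_table.get(r, inf), c)
    return scope[:i] + scope[i+1:], new_table

dom = {"xi": "bg", "xj": "bg", "xk": "bg"}
f = {("b","b"): 6, ("b","g"): 0, ("g","b"): 0, ("g","g"): 6}
g = dict(f)

_, fb = assign(("xi","xj"), f, "xi", "b")
assert fb == {("b",): 6, ("g",): 0}

sh, h = combine(("xi","xj"), f, ("xj","xk"), g, dom)
assert h[("g","g","g")] == 12 and h[("b","g","b")] == 0

_, e = project_out(sh, h, "xk")
assert e[("b","g")] == 0
```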
Pure systematic search: branch and bound(s)
Systematic search
LB = lower bound: an underestimation of the best solution in the subtree. UB = upper bound: the best solution found so far. If LB ≥ UB then prune. Each node is a soft constraint subproblem; LB = f_∅, initial UB = ⊤.
Assigning x_3
(Figure: the problem on x_1, …, x_5 splits into one subproblem on x_1, x_2, x_4, x_5 per value of x_3.)
Assigning a value (figure)
Assigning g to x_3 with ⊤ = 6 and f_∅ = 0 updates the costs of neighboring values; a value whose cost reaches ⊤ is pruned, and when a domain empties, backtrack.
Depth First Search (DFS)

BT(X, D, C):
  if X = ∅ then Top := f_∅
  else
    x_j := selectVar(X)
    for all a ∈ D_j do
      for all f_S ∈ C s.t. x_j ∈ S: f_S := f_S[x_j = a]
      f_∅ := ⊕ { g_S ∈ C s.t. S = ∅ }
      if LB < Top then BT(X - {x_j}, D - {D_j}, C)

Variable heuristics improve the LB; value heuristics find a good UB as soon as possible.
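The pseudocode above can be sketched as a runnable depth-first branch and bound; a minimal version on an assumed toy instance (the coloring costs of the running example, with the assigned cost as the only lower bound):

```python
TOP = 6  # initial upper bound (Top)

# Assumed toy WCSP: minimize non-blue vertices, adjacent vertices differ.
doms = {"x1": "bgr", "x2": "bgr", "x3": "bgr"}
unary = {x: {"b": 0, "g": 1, "r": 1} for x in doms}
edges = [("x1", "x2"), ("x2", "x3")]

def cost(assign):
    """Cost of the assigned part of the problem, capped at TOP."""
    c = sum(unary[x][v] for x, v in assign.items())
    c += sum(TOP for i, j in edges
             if i in assign and j in assign and assign[i] == assign[j])
    return min(c, TOP)

def bb(assign, unassigned, ub):
    """DFS branch and bound; returns the best cost found below ub."""
    lb = cost(assign)            # here LB is just the assigned cost
    if lb >= ub:                 # prune: cannot improve on Top
        return ub
    if not unassigned:           # full assignment: new Top
        return lb
    x = unassigned[0]            # static variable heuristic
    for v in doms[x]:            # static value heuristic
        ub = bb({**assign, x: v}, unassigned[1:], ub)
    return ub

assert bb({}, list(doms), TOP) == 1  # e.g. x1=b, x2=g, x3=b
```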
The three queens (example)
Improving the LB
At the current node (X, D, C), constraints are either assigned (arity 0: f_∅; arity one: ignored so far) or unassigned. Taking the minimum unary costs into account gives a stronger LB:
lb_fc = f_∅ ⊕ (⊕_{x_i ∈ X} min_{a ∈ D_i} f_i(a))
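The bound lb_fc is cheap to compute; a minimal sketch (the node data below is hypothetical):

```python
TOP = 6

def lb_fc(f_empty, unary_costs):
    """lb_fc = f_empty + sum, over future variables, of the minimum unary cost."""
    lb = f_empty
    for costs in unary_costs.values():
        lb += min(costs.values())
    return min(lb, TOP)

# Hypothetical node: f_empty = 1 and two future variables.
future = {"x4": {"b": 0, "g": 1}, "x5": {"g": 2, "r": 3}}
assert lb_fc(1, future) == 3   # 1 + 0 + 2
```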
Three queens
Unary assigned constraints allow pruning, guidance (value ordering) and value deletion.
Still improving the LB
Take the assigned constraints (f_i, f_∅) together with the original constraints among unassigned variables and solve this subproblem: its optimal cost gives a LB stronger than lb_fc. The subproblems can be solved beforehand.
Russian Doll Search
Uses a static variable ordering and solves increasingly large subproblems (x_1 … x_5), using an improved version of the previous LB recursively. May speed up search by several orders of magnitude. LightRDS: a nice CP implementation for unary costs.
Other approaches
Taking into account unassigned constraints: PFC-DAC (Wallace et al. 1994), PFC-MRDAC (Larrosa et al. 1999…), integration into the "SaH" schema (Régin et al. 2001).
Local search: nothing really specific
Local search
Based on perturbation of solutions in a local neighborhood: simulated annealing, tabu search, variable neighborhood search, greedy randomized adaptive search (GRASP), evolutionary computation (GA), ant colony optimization… See Blum & Roli, ACM Computing Surveys, 35(3), 2003.
Boosting systematic search with local search
Run local search on (X, D, C) with a time limit; it returns a sub-optimal solution. Use the best cost found as the initial ⊤. If it is optimal, systematic search just proves optimality; in all cases, pruning may improve.
Boosting systematic search with local search
Example: frequency assignment problem, instance CELAR6-sub4 (#var: 22, #val: 44, optimum: 3230), solver: toolbar with default options. UB initialized to 100000: 3 hours; UB initialized to 3230: 1 hour. An optimized local search can find the optimum in a few minutes. Other combination: use systematic search inside large neighborhoods.
Complete inference: what it is; variable (bucket) elimination; graph structural parameters
Inference…
Automated production of implied constraints (cost function implication). Complete inference is able to produce the strongest implied constraint on the empty scope, which gives the optimal cost. Classic case: detects infeasibility.
Variable elimination (aka bucket elimination)
Solves the problem by a sequence of problem transformations. Each step yields a problem with one less variable and the same optimum. When all variables have been eliminated, the problem is solved, and optimal solutions of the original problem can be recomputed.
Variable/bucket elimination
Select a variable x_i; compute the set K_i of constraints that involve it; add (⊕ K_i)[-x_i]; remove x_i and K_i. Time: O(exp(deg_i + 1)); space: O(exp(deg_i)).
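The elimination step above can be sketched on table-based cost functions; a minimal version that computes the optimum of a small assumed weighted CN (combine the bucket of x, min out x, keep the rest):

```python
from itertools import product

def eliminate(doms, funcs, x):
    """One bucket-elimination step: replace the bucket of x by (sum K_x)[-x]."""
    bucket = [f for f in funcs if x in f[0]]
    rest = [f for f in funcs if x not in f[0]]
    scope = tuple(sorted({v for s, _ in bucket for v in s if v != x}))
    table = {}
    for t in product(*(doms[v] for v in scope)):
        a = dict(zip(scope, t))
        # min over the values of x of the combined bucket costs
        table[t] = min(
            sum(f[1][tuple({**a, x: val}[v] for v in f[0])] for f in bucket)
            for val in doms[x]
        )
        # note: with an empty bucket this correctly yields the constant 0
    return rest + [(scope, table)]

def optimum(doms, funcs):
    """Eliminate every variable; only constants remain, whose sum is the optimum."""
    for x in list(doms):
        funcs = eliminate(doms, funcs, x)
    return sum(f[1][()] for f in funcs)

# Assumed instance: two variables, non-blue costs, a hard difference constraint.
doms = {"x": "bg", "y": "bg"}
funcs = [(("x",), {("b",): 0, ("g",): 1}),
         (("x", "y"), {("b","b"): 6, ("b","g"): 0, ("g","b"): 0, ("g","g"): 6}),
         (("y",), {("b",): 0, ("g",): 1})]
assert optimum(doms, funcs) == 1
```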
Three queens: combine the constraints of the bucket (figure)
Three queens: eliminate the variable, add the result (figure)
Three queens: result (figure)
Elimination order influence
(Figure: order G, D, F, B, C, A vs. order G, B, C, D, F, A — the induced graphs differ.)
Induced width
For G = (V, E) and a given elimination (vertex) ordering, the largest degree encountered during elimination is the induced width of the ordered graph. Minimizing induced width (dimension) is NP-hard; heuristics, better criteria and hypergraph generalizations exist.
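The induced width of an ordering can be computed by simulating the elimination; a minimal sketch (eliminating a vertex connects its remaining neighbours, the fill-in edges):

```python
def induced_width(adj, order):
    """Induced width of an ordered graph: max degree met while eliminating."""
    adj = {v: set(ns) for v, ns in adj.items()}  # local mutable copy
    width = 0
    for v in order:
        ns = adj.pop(v)
        width = max(width, len(ns))
        for u in ns:                 # remove v, add fill-in edges among ns
            adj[u].discard(v)
            adj[u] |= ns - {u}
    return width

# A 4-cycle has induced width 2 for any ordering; a path has width 1.
cycle = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "a"]}
assert induced_width(cycle, ["a", "b", "c", "d"]) == 2
path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
assert induced_width(path, ["a", "c", "b"]) == 1
```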
What does it look like?
A graph has induced width k (k-tree number, max-clique number…) iff it is a partial k-tree. A k-tree is either a k-clique or a k-tree with an additional vertex connected to all vertices of one of its k-cliques.
History / terminology
Non-serial dynamic programming (Bertelé & Brioschi, 1972) for cost functions (VE, block); acyclic databases (Beeri et al. 1983); Bayesian nets (Pearl 1988; Lauritzen & Spiegelhalter 1988); constraint nets (Dechter & Pearl 1988); semirings (Shenoy & Shafer 1990); c-semirings (Bistarelli et al. 1995)…
Real example (CELAR SCEN06) (figure)
Incomplete inference: mini-buckets; local consistency (idempotent or not)
Incomplete inference
Trades completeness for space/time: produces only specific classes of implied constraints, often in order to produce a lower bound on the optimum, and usually in polynomial time/space.
1. Keep these constraints outside the network: mini-buckets.
2. Add them to the network: local consistency —
   a. directly (idempotent: implied = redundant);
   b. try to accommodate non-idempotency.
Mini-buckets (Dechter 97)
Eliminate x vs. "mini-eliminate" x: split the bucket of x into mini-buckets and eliminate x from each separately. Repeating this produces a lower bound (and an upper bound). Complexity/strength is controlled by the number of variables in each mini-bucket.
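The lower-bounding step rests on a simple inequality: eliminating x separately from each mini-bucket can only under-estimate the joint elimination, since min_x (f(x) + g(x)) ≥ min_x f(x) + min_x g(x). A minimal numeric sketch:

```python
# Two unary cost functions on the same variable x (illustrative values).
f = {"a": 3, "b": 1}
g = {"a": 0, "b": 4}

exact = min(f[v] + g[v] for v in f)       # eliminating x from {f, g} jointly
mini = min(f.values()) + min(g.values())  # eliminating x per mini-bucket

assert mini <= exact                      # mini-buckets give a lower bound
assert (mini, exact) == (1, 3)
```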
Local consistency
Very useful in classical CSP (node consistency, arc consistency…). Produces an equivalent, more explicit problem: same optimum (consistency gives a lb and may detect infeasibility). Compositional: can be synergistically combined with other techniques "transparently".
Classical arc consistency (binary CSP)
For any x_i and c_ij, c = (c_ij ⋈ c_j)[x_i] brings no new information on x_i.
Example: c_ij: vv 0, vw 0, wv ⊤, ww ⊤ gives (c_ij ⋈ c_j)[x_i] = c(x_i): v 0, w ⊤.
Arc consistency and soft constraints
For any x_i and f_ij, f = (f_ij ⊕ f_j)[x_i] brings no new information on x_i — and adding it always preserves equivalence iff ⊕ is idempotent.
Example: f_ij: vv 0, vw 0, wv 1, ww 0.5 gives (f_ij ⊕ f_j)[x_i] = f(x_i): v 0, w 0.5.
Idempotent soft CN (Bistarelli et al. 97)
The previous operational extension works on any idempotent semiring CN. Repeated application of the local enforcing process until quiescence terminates and yields an equivalent problem. It extends to non-binary constraints and to higher levels of consistency (k-consistency). With a total order: idempotent iff ⊕ = max.
Non-idempotent: weighted CN
For any x_i and f_ij, adding f = (f_ij ⊕ f_j)[x_i] counts costs twice: EQUIVALENCE IS LOST.
Example: f_ij: vv 0, vw 0, wv 2, ww 1 gives f(x_i): v 0, w 1 — but simply adding it changes the optimum.
Combination + extraction: equivalence-preserving transformation
Requires the existence of a pseudo-difference ⊖ with (a ⊕ b) ⊖ b = a: a fair valued CN. Add the implied constraint, then extract it from the source.
Example: f_ij: vv 0, vw 0, wv 2, ww 1 projects to f(x_i): v 0, w 1, and f_ij ⊖ f_j[x_i] becomes vv 0, vw 0, wv 1, ww 0.
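This project-and-subtract step can be sketched directly on the slide's weighted example; a minimal version with ⊤-absorbing subtraction (a ⊖ b = a − b when a ≠ ⊤, else ⊤):

```python
TOP = 6

def project_and_subtract(fij, dom_i, dom_j):
    """Move costs from a binary function onto x_i, preserving equivalence:
    fij(a, b) = fi(a) + new_fij(a, b) for all (a, b)."""
    fi = {a: min(fij[a, b] for b in dom_j) for a in dom_i}
    new_fij = {(a, b): (TOP if fij[a, b] == TOP else fij[a, b] - fi[a])
               for a in dom_i for b in dom_j}
    return fi, new_fij

# The slide's weighted example.
fij = {("v","v"): 0, ("v","w"): 0, ("w","v"): 2, ("w","w"): 1}
fi, new_fij = project_and_subtract(fij, "vw", "vw")
assert fi == {"v": 0, "w": 1}
assert new_fij == {("v","v"): 0, ("v","w"): 0, ("w","v"): 1, ("w","w"): 0}
```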
Fair and unfair
Classic CN, fuzzy CN (a ⊖ b = a, by idempotency of max), weighted CN (a ⊖ b = a − b if a ≠ ⊤, else ⊤) and Bayes nets (a ⊖ b = a / b) are fair.
Unfair example: E = ℕ ∪ {life, ⊤}, ordered by ≽ — n years of prison, life imprisonment, death penalty (⊤); m ⊕ n = m + n, life ⊕ n = life, life ⊕ life = ⊤. The set of differences of life and n is ℕ: there is no maximum difference, so no pseudo-difference exists.
(K, Y) equivalence-preserving inference
For a set K of constraints and a scope Y: replace K by ⊕K, add (⊕K)[Y] to the CN (it is implied by K), then extract (⊕K)[Y] from ⊕K. This yields an equivalent network in which all the implicit information on Y contained in K is explicit. Repeat for a class of (K, Y) pairs until fixpoint.
Node consistency (NC*): (f_i, ∅) EPI
For any variable x_i: for all a, f_∅ + f_i(a) < ⊤, and there exists a value a with f_i(a) = 0. Complexity: O(nd).
Full AC (FAC*): ({f_ij, f_j}, x_i) EPI + NC*
For all f_ij and all a, there exists b such that f_ij(a, b) + f_j(b) = 0 (full support). That's our starting point — but there is no termination!
Arc consistency (AC*): ({f_ij}, x_i) EPI + NC*
For all f_ij and all a, there exists b such that f_ij(a, b) = 0 (b is a support). Complexity: O(n²d³).
Confluence is lost
Finding an AC closure that maximizes the lb is an NP-hard problem (Cooper & Schiex 2004).
Directional AC (DAC*): ({f_ij, f_j}, x_i), i < j, EPI + NC*
For all f_ij with i < j and all a, there exists b such that f_ij(a, b) + f_j(b) = 0 (b is a full support). Complexity: O(ed²).
Full DAC (FDAC*): DAC + AC + NC*
For all f_ij (i < j) and all a, there exists b such that f_ij(a, b) + f_j(b) = 0 (full support); for all f_ij (i > j) and all a, there exists b such that f_ij(a, b) = 0 (support). Complexity: O(end³).
Implementation example for AC
Data structures: a queue Q of variables (pruned domains); S(i, a, j): current support for a ∈ D_i on f_ij (AC); S(i): current support for f_∅ on x_i (NC).
AC-2001-based version
(Pseudocode figure: support maintenance for node consistency and for arc consistency.)
Value deletion (cost back-propagation) (pseudocode figure)
The fixpoint loop (pseudocode figure)
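Since the original pseudocode was lost in extraction, here is a minimal fixpoint sketch of a simplified AC*-style enforcement (binary costs projected onto unary ones, unary ones onto f_∅, then NC* pruning; no support data structures, and it assumes no domain wipeout):

```python
TOP = 4

def enforce_ac_star(doms, unary, binary, f0=0):
    """Simplified AC* fixpoint; mutates the network, returns the lb f_empty."""
    changed = True
    while changed:
        changed = False
        for (i, j), f in binary.items():
            for a in list(doms[i]):           # project rows onto f_i
                alpha = min(f[a, b] for b in doms[j])
                if alpha > 0:
                    for b in doms[j]:
                        f[a, b] = TOP if f[a, b] == TOP else f[a, b] - alpha
                    unary[i][a] = min(unary[i][a] + alpha, TOP)
                    changed = True
            for b in list(doms[j]):           # project columns onto f_j
                alpha = min(f[a, b] for a in doms[i])
                if alpha > 0:
                    for a in doms[i]:
                        f[a, b] = TOP if f[a, b] == TOP else f[a, b] - alpha
                    unary[j][b] = min(unary[j][b] + alpha, TOP)
                    changed = True
        for i in doms:                        # NC*: project f_i onto f_empty
            alpha = min(unary[i][a] for a in doms[i])
            if alpha > 0:
                for a in doms[i]:
                    if unary[i][a] < TOP:
                        unary[i][a] -= alpha
                f0 = min(f0 + alpha, TOP)
                changed = True
            pruned = [a for a in doms[i] if f0 + unary[i][a] >= TOP]
            if pruned:                        # value deletion
                doms[i] = [a for a in doms[i] if a not in pruned]
                changed = True
    return f0

doms = {"x": ["v", "w"], "y": ["v", "w"]}
unary = {"x": {"v": 0, "w": 0}, "y": {"v": 0, "w": 0}}
binary = {("x", "y"): {("v","v"): 1, ("v","w"): 2, ("w","v"): 1, ("w","w"): 3}}
assert enforce_ac_star(doms, unary, binary) == 1   # lb equals the optimum here
```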
Space complexity
Space can be reduced to O(ed); O(erd) for arity r (not only tables).
Hierarchy
NC* O(nd) ⊂ AC* O(n²d³), DAC* O(ed²) ⊂ FDAC* O(end³) ⊂ EDAC* (IJCAI 2005).
Special case, classic CSP (⊤ = 1): NC, AC, DAC.
Other results
AC defined for arbitrary fair structures and arbitrary arities (Schiex 2000, Cooper & Schiex 2004). k-consistency: from all constraints with scope included in a set of n variables to a subset of n−1 variables (Cooper 2005). Cyclic consistency (Cooper 2004) and existential AC (de Givry et al. 2005): not easily described as EPI.
Global soft constraints: using the SaH schema; comparison with soft AC
Global soft constraints
Existing global soft constraints are "Soft as Hard" (Petit et al. 2000). Three questions: what is the semantics of the global constraint, what level of consistency is enforced, and how?
The soft all-diff (Petit et al. 2001, van Hoeve 2004)
Semantics: (a) the number of variables whose value needs to be changed to satisfy the constraint (max n); (b) primal-graph violations: the number of pairs of variables with the same value (max n(n−1)/2). Level of consistency: hyper-arc (GAC). Algorithm: (minimum-cost) maximum-flow algorithms on the value graph (with costs).
Other soft global constraints
Soft global cardinality (van Hoeve et al. 2004); soft regular (van Hoeve et al. 2004); other possible semantics for soft constraints, some with NP-hard consistency checks (Beldiceanu & Petit 2004).
GAC on SaH vs. soft AC?
(Example figure: three variables x_1, x_2, x_3 with reified cost variables x_12, x_23 and a criterion variable x_c = x_12 + x_23, compared with soft AC, which raises f_∅ to 1.)
So… Is it possible…
That soft AC is always at least as good as SaH GAC? (my conjecture: yes.)
To offer cost variables and nicely integrate them in soft CNs (local consistency, unary constraints for heuristics…)? (my conjecture: yes.)
To improve the current handling of global soft constraints so that they reach (or exceed) soft AC, DAC, FDAC, EDAC…?
Hybrid search (search + inference): + variable elimination; + mini-buckets; + local consistency
Cycle cutset
Tree-structured problems are easy. Assign variables until all cycles are cut (in all possible ways); solve the remaining tree using VE. Time is exponential in the cutset size (NP-hard to minimize); space is economical.
Boosting search with variable elimination: BB-VE(k)
At each node: select an unassigned variable x_i; if deg_i ≤ k then eliminate x_i, else branch on the values of x_i.
Properties: BB-VE(−1) is BB; BB-VE(w*) is VE; BB-VE(1) is similar to cycle-cutset.
Example: BB-VE(2) (figure)
Example: BB-VE(2) (figure, continued)
Boosting search with variable elimination
Example: still-life (academic problem), instance n = 14 (#var: 196, #val: 2). ILOG Solver: 5 days; variable elimination: 1 day; BB-VE(18): 2 seconds. Other combination: backtracking with tree decomposition (BTD, Jégou and Terrioux 2003).
Using incomplete inference: lower bounds inside BB
Using mini-buckets (Kask & Dechter 99)
Assume a DFS tree search using a static variable order x_1 … x_n. Compute mini-buckets (eliminating x_n … x_1): mini-eliminating x_i generates a family of cost functions f_i^j = (⊕ K_i^j)[-x_i]. When (x_1 … x_p) is assigned, any f_i^j whose scope is included in (x_1 … x_p) and such that i > p (implied by not-completely-assigned constraints) has a known value that is non-redundant with the lb.
Boosting search with LC

BT(X, D, C):
  if X = ∅ then Top := f_∅
  else
    x_j := selectVar(X)
    for all a ∈ D_j do
      for all f_S ∈ C s.t. x_j ∈ S: f_S := f_S[x_j = a]
      if LC then BT(X - {x_j}, D - {D_j}, C)
Maintaining local consistency during search: BT, MNC, MAC/MDAC, MFDAC, MEDAC
Experiments: evaluating bounds/algorithms; why is WCSP so hard? (informally)
Experimenting with soft CNs
A good lb is a compromise between its strength and its computational cost. Benchmarks: academic problems (from n-queens to graph coloring), real problems (frequency assignment, pedigree analysis…), random problems (Max-CSP, Max-SAT, weighted Max-SAT). Phase transitions?
Classic CSP (phase-transition plot)
Optimization vs. satisfaction (plot)
Why is it so hard?
Problem P(α): is there an assignment of cost lower than α? Proof of inconsistency is the simplest; proof of optimality is the hardest — harder than finding an optimum.
Random Max-CSP (plot: CPU time vs. number of variables)
Random Max-2SAT
PBS, OPBDP and CPLEX use a natural SaH formulation (plot).
Frequency assignment
n communication links; a set of available frequencies; close links must use sufficiently different frequencies (|x_i − x_j| > d_ij). Minimize the weighted sum of violated constraints. CELAR/GRAPH instances: still open problems.
Boosting systematic search with local consistency
Frequency assignment problem CELAR6-sub4 (#var: 22, #val: 44, #constraints: 477).
toolbar: MNC* ≈ 1 year, MFDAC* ≈ 1 hour.
Unweighted PFC-MRDAC: 154.5 s; MFDAC* (toolbar): 36.2 s; MEDAC* (toolbar): 4.4 s; AOMFDAC (REES): 2574 s (static variable ordering).
Uncapacitated warehouse location
We consider n stores and m possible warehouse locations. Each warehouse has a maintenance cost c_e and a cost c_em for supplying each store. Each store must be supplied by exactly one warehouse. Which warehouses should be opened? An NP-hard problem.
Model
n integer variables (stores) with domain size m; m booleans for candidate warehouses; m + n unary cost functions (costs); n·m hard binary constraints relating stores and warehouses.
Results (CPU time, seconds)

CSPLIB    cap71    cap101   cap131
Size      16 x 50  25 x 50  50 x 50
MFDAC     0.01     0.03     65.02
MEDAC     0.00     0.02     0.11
CPLEX     0.02     0.05

Kratica et al. 2001   MO1       MO2        MO3       MP1
Size                  100 x 100 100 x 100  100 x 100 200 x 200
MFDAC                 -         11775.61   5123.7    -
MEDAC                 216.57    44.45      74.60     3403.37
CPLEX                 202.48    36.67      199.84    2296.65
Complexity and polynomial classes: tree = induced width 1; idempotent or not…
Idempotent VCSP (⊕ = max)
A new operation on soft constraints: the α-cut of a soft constraint f is the hard constraint c that forbids t iff f(t) > α. It can be applied to a whole soft CN.
Example, Cut(f, 0.6): f(x_i, x_j): bb 0.4, bg 0.7, gb 0.9, gg 0.5 cuts to c(x_i, x_j): bb 0, bg ⊤, gb ⊤, gg 0.
Solving a min/max CN by slicing
Let α be the minimum value such that the α-cut is consistent. The set of solutions of the (classical) α-cut = the set of all optimal solutions of the min/max CN. A possibilistic/fuzzy CN can thus be solved by binary search with O(log(#different costs)) CSP tests. This can also be used to lift polynomial classes.
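The binary search over cost levels can be sketched as follows; `consistent` stands for an assumed classic-CSP test on the α-cut (here stubbed by a threshold, since cut consistency is monotone in α):

```python
def alpha_cut_solve(costs, consistent):
    """Binary search for the least alpha whose alpha-cut CSP is consistent.

    costs: sorted list of the distinct costs appearing in the network;
    consistent(alpha): classic CSP consistency test on the alpha-cut.
    Uses O(log(len(costs))) calls to `consistent` (monotone in alpha).
    """
    lo, hi = 0, len(costs) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if consistent(costs[mid]):
            hi = mid          # a consistent cut: try a smaller alpha
        else:
            lo = mid + 1      # inconsistent: alpha must be larger
    return costs[lo]

# Toy check: the cut becomes consistent once alpha >= 0.6.
assert alpha_cut_solve([0.2, 0.4, 0.6, 0.8], lambda a: a >= 0.6) == 0.6
```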
Polynomial classes
Idempotent VCSP = min-max CN: α-cuts can lift CSP classes. Sufficient condition: the polynomial class is "conserved" by α-cuts. Simple TCSPs are TCSPs where every constraint uses one interval: x_i − x_j ∈ [a_ij, b_ij]. Fuzzy STCN: every slice of a cost function is an interval (semi-convex functions).
Polynomial classes
Non-idempotent (weighted CN): Max-SAT is MAXSNP-complete (no PTAS); weighted Max-SAT is FP^NP-complete; Max-SAT is FP^NP[O(log n)]-complete: weights matter! Max-SAT tractable languages are fully characterized (Creignou 2001). The Max-CSP language containing f_eq(x, y) = ((x = y) ? 0 : 1) is NP-hard.
Generalized intervals…
Assume domains are ordered. f_c^[a,b](x, y) = ((x ∈ [a, b]) ? c : 0) is a generalized interval function: tractable. If for all u ≤ x, v ≤ y, f(u, v) + f(x, y) ≤ f(u, y) + f(x, v), then f is a submodular function and decomposes into generalized intervals: tractable in O(n³d³). Examples: ax + by + c, sqrt(x² + y²), |x − y|^r (r ≥ 1)…
Resources: solvers, benchmarks…
Resources (solvers, benchmarks…)
Con'Flex: fuzzy CSP system with integer, symbolic and numerical constraints (www.inra.fr/bia/T/conflex).
clp(FD,S): semiring CLP (pauillac.inria.fr/.georget/clp_fds/clp_fds.html).
LVCSP: Common Lisp library for valued CSP, with an emphasis on strictly monotonic operators (ftp.cert.fr/pub/lemaitre/LVCSP).
Choco: was a Claire library for CSP; existing layers above Choco implement some WCSP algorithms (choco.sourceforge.net).
REES: a C++ library by Radu Marinescu (R. Dechter's lab).
INCOP: incomplete algorithms (go with the winners).
Toolbar and the soft constraints wiki
Open-source library, accessible from the soft constraints wiki: http://carlit.toulouse.inra.fr/cgi-bin/awki.cgi/SoftCSP. Implements MNC, MAC, MDAC, MFDAC, MEDAC. Reads Max-CSP/SAT (weighted or not) and ERGO Bayes net file formats. Comes with thousands of benchmarks in a standardized format. SourceForge at http://mulcyber.toulouse.inra.fr/projects/toolbar/.
Conclusion
A large part of the classic CN body of knowledge has been extended and adapted to soft CNs, and efficient solving tools exist. Much remains to be done:
Extension: to problems other than optimization (counting, quantification…).
Techniques: symmetries, learning, knowledge compilation…
Algorithmics: still better lbs, other local consistencies or dominance; exploiting problem structure.
Implementation: better integration with classic CN solvers (Choco, Solver).
Applications: problem modelling, solving, heuristic guidance, partial solving (see Tools, de Givry et al. 2004).