
1 Constraint Programming
Pedro Meseguer, IIIA-CSIC, Bellaterra, Spain
Speaker notes: Thanks for inviting me. I assume you are familiar with "classical" CSP; I will introduce some DisCSP concepts.

2 Overview
Definitions
Tree search: backtracking
Arc consistency
Hybrids (arc consistency + tree search): FC, MAC
Modelling
Global constraints
Soft constraints
Branch and bound

3 Some Definitions Constraint Network (CN): (X, D, C)
X = {x1, x2, …, xn} variables
D = {d1, d2, …, dn} finite domains
C = {c1, c2, …, cr} constraints
For each c ∈ C: var(c) = {xi, xj, …, xk} is its scope; rel(c) ⊆ di × dj × … × dk is the set of permitted tuples; arity(c) = |var(c)| (unary, binary, ternary, …)
Constraint Satisfaction Problem (CSP): solving a CN, i.e. finding an assignment that satisfies every constraint (an NP-complete task)

4 Example: n-queens
GOAL: locate n queens on an n × n chessboard such that they do not attack each other.
Formulation (4-queens):
Variables: one queen per row
Domains: available columns
Constraints: different columns and different diagonals, i.e. xi ≠ xj and |xi - xj| ≠ |i - j|
(The slide shows the 4-queens board x1..x4 and the constraint graph.)
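As an illustration (not part of the original slides), this formulation maps directly to code; a minimal Python sketch with illustrative names:

```python
# Minimal sketch of the n-queens CSP of this slide: one variable per row,
# value = column. Names are illustrative, not taken from the slides.

def queens_csp(n):
    variables = list(range(n))                        # x_i = queen of row i
    domains = {i: set(range(n)) for i in variables}   # available columns

    def no_attack(i, a, j, b):
        # different columns and different diagonals
        return a != b and abs(a - b) != abs(i - j)

    return variables, domains, no_attack

# Example: columns (1, 3, 0, 2) (0-based) are a solution of 4-queens.
vars4, doms4, ok = queens_csp(4)
assignment = {0: 1, 1: 3, 2: 0, 3: 2}
print(all(ok(i, assignment[i], j, assignment[j])
          for i in vars4 for j in vars4 if i < j))    # True
```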

5 Search Tree
State space: explored as a tree.
Tree: the root is the empty assignment; one variable per level; the successors of a node are one successor per value of that level's variable (meaning: variable ← value).
Each branch defines an assignment; the tree has depth n (number of variables) and branching factor d (domain size).

6 Search tree for 4-queens
(Figure: the full search tree for 4-queens, one level per variable x1..x4, with leaves from (1,1,1,1) to (4,4,4,4).)

7 Backtracking Algorithm
Depth-first tree traversal (DFS).
At each node: check every completely assigned constraint; if consistent, continue DFS; otherwise, prune the current branch and backtrack.
Complexity: O(d^n)
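A minimal sketch of this backtracking scheme (illustrative code, assuming a CSP given as variables, domains and a binary consistency predicate such as the n-queens sketch above):

```python
# Chronological backtracking: depth-first, one variable per level, checking
# every completely assigned (binary) constraint before going deeper.

def backtracking(variables, domains, consistent, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment                      # all variables assigned: solution
    var = variables[len(assignment)]           # static ordering: next level
    for value in domains[var]:
        # check constraints between `var` and the already assigned variables
        if all(consistent(var, value, v, assignment[v]) for v in assignment):
            result = backtracking(variables, domains, consistent,
                                  {**assignment, var: value})
            if result is not None:
                return result
        # otherwise this branch is pruned and the next value is tried
    return None                                # no value works: backtrack
```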

8 Backtracking on 4-queens
(Figure: backtracking trace on the 4-queens boards; a solution is found after visiting 25 nodes.)

9 Inference
Inference P → P': legal operations on variables, domains, and constraints.
P' is equivalent to P: Sol(P) = Sol(P').
P' is presumably easier to solve than P: smaller search space, constraints are more explicit.
Inference can be:
complete: produces the solution (e.g. adaptive consistency)
incomplete: requires further search (e.g. arc consistency)

10 Incomplete Inference: Local Consistency
P is a constraint network of n variables: is P solvable?
Simpler question: consider subnetworks of P with 1, 2, 3, … variables. Are they solvable?
YES: but there may be values that appear in no solution of a subnetwork ⇒ they can be removed from P.
NO: ⇒ P has no solution.
Subnetworks of 1, 2, 3 variables correspond to node, arc, and path consistency respectively.
Empty domain: P has no solution!

11 Binary Arc Consistency (AC)
Constraint Cij is directional arc-consistent (i → j) iff for all a ∈ Di there is b ∈ Dj such that (a, b) ∈ Rij.
Constraint Cij is AC iff it is directional AC in both directions.
A problem is AC iff every constraint is AC.

12 AC: Example
(Figure: the microstructure of a small network on Xi, Xj, Xk, with permitted tuples drawn between values a, b, c; some constraints are AC and others are ¬AC.)

13 Filtering by Arc Consistency
If for a Di there not exists b Dj such that (a, b)Rij , a can be removed from Di (a will not be in any sol) Domain filtering: Remove arc-inconsistent values Until no changes blue,red, green blue, green blue blue, red X2 X3 X4 X1 Example:

14 Example: 3-queens
c12 is not arc-consistent because of value 2 of d1; c21 is not arc-consistent because of value 2 of d2; c32 is not arc-consistent because of value 2 of d3.
(Figure: the 3-queens boards illustrating each of these removals.)

15 Constraint Propagation
AC(c): procedure to make constraint c arc-consistent.
To make P arc-consistent, is it enough to apply AC once to each constraint c1, c2, …, cr? No: AC(c) may render other, already processed constraints arc-inconsistent.
To make P arc-consistent, iterate: apply AC on {c1, c2, …, cr} until no change in the domains (fix point).
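A sketch of the propagation loop (AC-3 style), reusing the `revise` sketch above; illustrative code, not from the slides:

```python
# Re-apply revise until a fix point, re-examining arcs whose target domain
# changed. `neighbours[i]` lists the variables constrained with i.

from collections import deque

def ac3(variables, domains, allowed, neighbours):
    queue = deque((i, j) for i in variables for j in neighbours[i])
    while queue:
        i, j = queue.popleft()
        if revise(domains, i, j, allowed):     # Di changed
            if not domains[i]:
                return False                   # empty domain: no solution
            for k in neighbours[i]:
                if k != j:
                    queue.append((k, i))       # constraints on i must be rechecked
    return True                                # fix point reached: P is arc-consistent
```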

16 Example: 3-queens
Value 2 of d3 was removed (to make c23 arc-consistent); this makes c13 arc-inconsistent.
c13 is not arc-consistent because of value 1 of d1, and again because of value 3 of d1.
Domain d1 becomes empty ⇒ no solution!
(Figure: the sequence of 3-queens boards illustrating this propagation.)

17 Example: Equations on Integers
x + y = 9, 2x + 4y = 24
(Figure: the two equations are propagated alternately over the domains of x and y, narrowing them until a fix point is reached.)

18 Generalized Arc Consistency
c is arc-consistent iff every possible value of every variable in var(c) appears in some permitted tuple of rel(c).
Domain filtering: if c is not arc-consistent because of a ∈ Dx, then a will not be in any solution and can be removed: Dx ← Dx - {a}.
Inference: if Dx becomes empty, P has no solution.
P is arc-consistent iff every constraint is arc-consistent.
Incomplete inference: P being arc-consistent does not imply that P has a solution.
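A sketch of GAC filtering for a single n-ary constraint c, given its scope var(c) and permitted tuples rel(c); illustrative code, not from the slides:

```python
# Remove any value of any variable in the scope that appears in no permitted
# tuple compatible with the current domains.

def gac_revise(domains, scope, rel):
    changed = False
    for pos, x in enumerate(scope):
        for a in list(domains[x]):
            supported = any(t[pos] == a and
                            all(t[q] in domains[y] for q, y in enumerate(scope))
                            for t in rel)
            if not supported:
                domains[x].discard(a)          # Dx <- Dx - {a}
                changed = True
    return changed
```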

19 Forward Checking
FC is a combination of:
Search: backtracking
Inference: at each node, AC on the constraints between assigned and unassigned variables
When a domain becomes empty: no solution follows the current branch; prune it and backtrack.
Caution: values removed by AC at level i have to be restored when backtracking to level i or above.
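A sketch of the FC inference step and the value restoration it requires (illustrative, assuming the same CSP interface as the earlier sketches):

```python
# After assigning `var`, filter the domains of the unassigned variables against
# that single assignment; removals are recorded so they can be restored when
# backtracking (the "caution" above).

def forward_check(var, value, unassigned, domains, consistent):
    pruned = []                                # (variable, value) pairs removed here
    for y in unassigned:
        for b in list(domains[y]):
            if not consistent(var, value, y, b):
                domains[y].discard(b)
                pruned.append((y, b))
        if not domains[y]:
            return None, pruned                # empty domain: this branch fails
    return domains, pruned

def restore(domains, pruned):
    for y, b in pruned:                        # undo removals when backtracking
        domains[y].add(b)
```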

20 Example: FC on 4-queens
(Figure: FC trace on the 4-queens boards; a solution is found after visiting 8 nodes.)

21 Maintaining Arc Consistency
MAC is a combination of:
Search: backtracking
Inference: at each node, AC on all constraints
Preprocessing: the initial problem is made AC, so every visited subproblem is AC.
When a domain becomes empty: no solution follows the current branch; prune it and backtrack.
Caution: values removed by AC at level i have to be restored when backtracking to level i or above.
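A sketch of how MAC's per-node inference differs from FC's, reusing the `ac3` sketch above (illustrative; a real implementation would copy domains per node so that removals can be undone on backtracking):

```python
# MAC re-establishes arc consistency on all constraints at every node, not only
# on those touching the last assigned variable: fix the assigned variables'
# domains to their values and propagate over the whole network.

def mac_propagate(variables, domains, allowed, neighbours, assignment):
    for v, a in assignment.items():
        domains[v] = {a}                       # assigned variables become singletons
    return ac3(variables, domains, allowed, neighbours)   # False => empty domain
```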

22 MAC vs FC: AC on future variables
(Figure: FC versus MAC after the first assignments on 4-queens; MAC additionally enforces AC on constraints among future variables, here AC(2,3), which detects an empty domain that FC does not detect at this node.)

23 Example: MAC on 4-queens
(Figure: MAC trace on the 4-queens boards; a solution is found after visiting 5 nodes.)

24 Constraint Programming
CP provides a platform for solving CSPs, proven useful in many real applications.
Platform: a set of common structures to reuse the best known algorithms for propagation and solving.
Two stages: modelling and solving.

25 CP: Modelling
Modelling decisions (selecting among alternatives):
the choice of the variables and domains determines the search space size
how we state the constraints determines the space reduction
Example (map colouring): are the variables the regions or the colours?
Any CSP can be modelled in different ways, and the efficiency of algorithms can vary dramatically; no strong general results are known. Formulating an effective model is not easy and requires considerable modelling skill.

26 N-queens: Model 1
Variables: n², one per cell of an n × n matrix B.
Domains: {0,1}; B[a,b] = 0 means no queen, B[a,b] = 1 means queen.
Constraints: if B[a,b] = 1, then the other cells in the same row and column are 0 (B[a,_] = 0, B[_,b] = 0) and the cells on the same diagonals are 0 (B[a+d,b+d] = 0, B[a-d,b-d] = 0, B[a-d,b+d] = 0, B[a+d,b-d] = 0).

27 N-queens: Model 2
Variables: n, one per row.
Domains: {0, 1, …, n-1}, the queen's column.
Constraints: different columns (xi ≠ xj) and different diagonals (|xi - xj| ≠ |i - j|).
The different-row constraint is built into the formulation!

28 N-queens: Model 3
Variables: n, one per row.
Domains: {0, 1, …, n-1}, the queen's column.
Constraints: different columns via all-different(x1, x2, …, xn) and different diagonals (|xi - xj| ≠ |i - j|).
The different-row constraint is built into the formulation!

29 N-queens Models
Search space size (d^#vars):
Model 1: 2^(n²) — 65,536 for n = 4, 1.27E30 for n = 10, overflow ("ERROR") for n = 20
Models 2 and 3: n^n — 256 for n = 4, 1.00E10 for n = 10, 1.05E26 for n = 20
Constraints (number and pruning):
Model 1: n rows, n columns, 2(n-1) diagonals
Model 2: pruning equal to model 1
Model 3: 1 all-different; prunes more than model 2

30 Constraint Formulations
Binary (arity  2) : conceptually simple, easy to implement may generate weak formulations Non-binary (arity > 2) : more complex constraints GAC: stronger (filter more) than AC on equivalent binary decomposition Equivalence: any non-binary CSP can be reformulated as a binary one

31 Global Constraints Real-life constraints: often complex, non-binary
c is global iff: arity(c) > 2; c is logically equivalent to a set {c1, c2, …, ck} of binary constraints; and AC(c) prunes more than AC(c1, c2, …, ck).
Propagation: specialized algorithms exploit the constraint's semantics and decrease AC complexity.

32 Example: all-different
Var: F, N, S; Val: the same two values for all three variables (shown graphically on the slide); Ctrs: N ≠ S, S ≠ F, F ≠ N.
As 3 binary constraints: they are AC, no pruning.
As 1 logically equivalent ternary all-different constraint: it is not GAC; GAC pruning yields an empty domain ⇒ no solution!
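To see the pruning gap concretely, here is an illustrative brute-force GAC filter for all-different on tiny instances; the domain values {1, 2} are made-up stand-ins for the values elided on the slide:

```python
# Keep only values that take part in some all-different assignment; on the
# slide's instance every domain comes out empty, while binary AC removes nothing.

from itertools import product

def alldiff_gac_filter(domains):
    names = list(domains)
    new_domains = {x: set() for x in names}
    for combo in product(*(domains[x] for x in names)):
        if len(set(combo)) == len(combo):      # all values pairwise different
            for x, a in zip(names, combo):
                new_domains[x].add(a)
    return new_domains

print(alldiff_gac_filter({"F": {1, 2}, "N": {1, 2}, "S": {1, 2}}))
# {'F': set(), 'N': set(), 'S': set()}  -> empty domains, no solution
```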

33 Example: all-different
Enforcing arc consistency, with n variables and d values:
n(n-1)/2 binary constraints: O(n² d²)
1 n-ary constraint: general-purpose algorithm O(d^n); specialized algorithm O(n² d²)

34 CP Solving: Some Guidelines
Easy/hard problems: hybrid search with dynamic variable ordering (min domain / degree); easy: FC, hard: MAC.
One solution / all solutions: one solution: hybrid search; all solutions: hybrid search or complete inference.
For specific problems (scheduling, routing, …) check: formulation, global constraints, heuristics, reported experience.

35 CP: Declarative Programming
Declarative programming: you declare variables, domains, and constraints, and ask the SOLVER to find a solution!
The SOLVER offers:
implementation for variables / domains / constraints
a hybrid algorithm: backtracking + incomplete inference
global constraints + optimized AC propagation
empty-domain detection
embedded heuristics

36 Hard vs Soft Constraints [90’s]
Hard constraint: must always be satisfied (physical constraints); if violated, the solution is invalid.
Soft constraint: satisfied if possible (preferences); if violated, the solution is still valid; violated constraints are aggregated into a cost.
Classical CSP + soft constraints: optimization problems that narrow the set of solutions towards the user's intention.

37 Soft CSP (Weighted Model = COP)
Soft CSP: (X, D, C), where X is a set of n variables, D their domains, and C a set of r soft constraints.
Each c ∈ C is a cost function c : ∏_{i ∈ var(c)} di → R+ ∪ {+∞}:
c(t) = 0 if t completely satisfies c
c(t) ∈ (0, +∞) if t partially satisfies / violates c
c(t) = +∞ if t completely violates c
c(t) is the cost associated with the violation of c by t.
Goal: min_t ∑_{c ∈ C} c(t) (NP-hard).
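A sketch of the weighted model as code: each soft constraint is a cost function over its scope and the objective is the sum of costs (illustrative names; the 3-queens example uses one unit of cost per attacking pair):

```python
# Weighted model: soft constraints are cost functions, +inf encodes complete
# violation, and the objective is the sum of costs.

INF = float("inf")

def total_cost(constraints, assignment):
    """constraints: list of (scope, cost_fn); cost_fn maps a value tuple to R+ or +inf."""
    total = 0.0
    for scope, cost_fn in constraints:
        total += cost_fn(tuple(assignment[x] for x in scope))
        if total == INF:
            return INF                         # some constraint completely violated
    return total

# Example: 3-queens as soft constraints, cost 1 per attacking pair of rows i, j
def attack_cost(i, j):
    return lambda t: 0 if (t[0] != t[1] and abs(t[0] - t[1]) != abs(i - j)) else 1

soft = [((i, j), attack_cost(i, j)) for i in range(3) for j in range(3) if i < j]
print(total_cost(soft, {0: 0, 1: 2, 2: 1}))    # 1.0: one attacking pair
```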

38 Example: 3-queens
GOAL: locate 3 queens on a 3 × 3 chessboard such that the number of attacks is minimum.
Formulation:
Variables: one queen per row
Domains: available columns
Constraints: different columns and different diagonals, xi ≠ xj and |xi - xj| ≠ |i - j|

39 Constraints -> Cost Functions
(Figure: for 3-queens, each CSP constraint on {x1, x2}, {x1, x3}, {x2, x3} is turned into the corresponding COP cost function.)

40 COP Search: Branch and Bound
Depth-first tree search: an internal node is a partial assignment, a leaf is a total assignment.
At each node:
Upper bound (UB): minimum cost of the visited leaves, i.e. the cost of the current best solution.
Lower bound (LB): an underestimation of the minimum cost among the leaves below the current node.
Pruning condition: UB ≤ LB (no leaf below the current node can improve the current best solution).
Simplest LB: LB_search = cost(t) = ∑_{c ∈ C^P} c(t), the sum of the costs of the constraints completely assigned by the current partial assignment t.
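A sketch of depth-first branch and bound with the simplest lower bound LB_search described above (illustrative code, reusing the soft-constraint representation from the earlier sketch):

```python
# Depth-first branch and bound for minimization: UB is the cost of the best
# leaf found so far; LB at a node is the cost of the constraints whose scope is
# already completely assigned; prune when LB >= UB.

def branch_and_bound(variables, domains, constraints):
    best = {"cost": float("inf"), "solution": None}      # current upper bound UB

    def lb(assignment):
        return sum(cost_fn(tuple(assignment[x] for x in scope))
                   for scope, cost_fn in constraints
                   if all(x in assignment for x in scope))

    def dfs(assignment):
        bound = lb(assignment)
        if bound >= best["cost"]:
            return                                       # pruning: UB <= LB
        if len(assignment) == len(variables):
            best["cost"], best["solution"] = bound, dict(assignment)   # new UB
            return
        var = variables[len(assignment)]
        for value in domains[var]:
            dfs({**assignment, var: value})

    dfs({})
    return best["solution"], best["cost"]

# e.g. branch_and_bound(list(range(3)), {i: [0, 1, 2] for i in range(3)}, soft)
# returns an assignment of cost 1: with 3 queens on a 3x3 board, one attack is unavoidable.
```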

41 Search tree for 3-queens
(Figure: the search tree for 3-queens, one level per variable x1, x2, x3, with leaves from (1,1,1) to (3,3,3).)

42 BnB for 3-queens
(Figure: BnB trace on the 3-queens tree, showing the LB and UB values at each node; 24 nodes are visited.)

43 BnB + soft AC* for 3-queens
x1 C = 1 C = 1 C = 1 x2 C = 2 C = 2 C = 1 x3 9 NODES

