1 CGRASS A System for Transforming Constraint Satisfaction Problems Alan Frisch, Ian Miguel AI Group University of York Toby Walsh 4C

2 Overview Motivation Proof Planning CGRASS The Golomb Ruler Results Conclusion/Future Work

3 Motivation Transforming a model can greatly reduce the effort of solving a problem. So, experts work hard to identify useful transformations, e.g. –Adding implied constraints. –Breaking symmetry. –Removing redundant constraints. –Replacing constraints with their logical equivalents.

4 CGRASS Constraint GeneRation And Symmetry-breaking. Designed to automate the task of finding useful model transformations. Based on, and extends, Bundy’s proof planning. –Modelling expertise captured in PP methods. –Methods selectively applied to transform the problem.

5 Challenges Want to make only useful transformations. –Not always easy to tell what will help. How much work should we do before we resort to search?

6 Proof Planning Used to guide search for proof in ATP. Patterns in proofs captured in methods. –Strong preconditions limit applicability. –Prevent combinatorially explosive search. Easily adaptable to problem transformation. –Methods now encapsulate common patterns in hand-transformation.

7 Proof Planning - Advantages Strong preconditions restrict us to useful model transformations. Methods act at a high level. Search control separate from inference steps.

8 Proof Planning - Operation Given a goal to prove: –Select a method. –Check pre-conditions. –Execute post-conditions. –Construct output goal(s). Associated tactic constructs actual proof. CGRASS makes transformations rather than constructing sub-goals to prove.

9 An Example – Strengthen Inequality Preconditions 1. There exist two expressions, x ≤ y and x ≠ y. Post-conditions 1. Add a new constraint of the form x < y. 2. Delete x ≤ y and x ≠ y. Example: x ≤ y and x ≠ y are transformed to x < y.
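The transformation this method performs can be sketched in a few lines of Python (an illustration only, not CGRASS's implementation; the tuple encoding of constraints is an assumption):

```python
# Sketch of the Strengthen-Inequality method: constraints are modelled as
# ("op", lhs, rhs) tuples, assumed already normalised so that matching
# pairs share the same operand order.

def strengthen_inequality(constraints):
    """If both x <= y and x != y are present, replace them with x < y."""
    cs = set(constraints)
    for (op, x, y) in list(cs):
        if op == "<=" and ("!=", x, y) in cs:
            # Preconditions hold: delete both inputs, add the strict form.
            cs.discard(("<=", x, y))
            cs.discard(("!=", x, y))
            cs.add(("<", x, y))
    return cs
```

Note the add and delete lists of the method are mirrored here by the `add` and `discard` calls.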

10 Non-monotonicity Sometimes methods add a new constraint. At other times they might: –Replace one constraint by a tighter one. –Eliminate a redundant constraint. The set of constraints may therefore increase or decrease. We replace the single output slot of a method with add and delete lists. –Cf. classical planning.

11 Looping and History Unless a method deletes at least one of its input constraints, its preconditions continue to hold. –Hence the method can fire repeatedly. The History mechanism prevents this. –A list of all constraints ever added (including the initial set). Intuition: constraints are removed when redundant or transformed into a more useful form. –Restoring a previously removed constraint is a retrograde step.

12 Pattern Matching Other proof planners use first order unification. CGRASS uses a richer pattern matching language. A method’s input slot can specify: –Any single algebraic constraint (irrespective of type). No individual methods looking for equality, inequations, inequalities… –Subsets of constraints (e.g. all inequations).

13 Other Extensions Constraint Utility: –Constraint arity, complexity and tightness. –Remains difficult. Termination: –At some point, must stop inferring and start searching. –Need an executive to terminate when future rewards look poor. Explanation: –Tactics write out text explaining method application.

14 Syntax Simple input: problem variables and domains, a list of constraints. –In future: OPL. Internally simplified further: –Inequalities always re-arranged to x < y or x ≤ y. –Subtraction replaced by a sum and a –1 coefficient. –Promotes efficiency. –Reduces the number of methods required.

15 Normalisation Normal form inspired by that used in HartMath: –www.hartmath.org Necessary to deal with associative and commutative operators (e.g. +, *). Can replace the test for semantic equivalence with a much simpler syntactic comparison (to an extent). A combination of lexicographic ordering and simplification.

16 Lexicographic Ordering We define an order over CGRASS’ types: –Constants < variables < fractions < … < equality < … Objects of different type are compared using this order. Objects of the same type are compared recursively. –Each has an associated method of self-comparison. –Base case: two constants or variables compared. Sums/Products/Conjunctions/Disjunctions are in flattened form: –Simply sort to normalise.

17 Lexicographic Ordering Example Given: x8 + x7 ≠ x6 + x5 and x4*2 + x3 = 2*x1 + x2. In normal form: x2 + 2*x1 = x3 + 2*x4 and x5 + x6 ≠ x7 + x8. Equality is higher than an inequation. The sums are ordered internally and recursively. The lhs of an equation is lexicographically least.
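The sort-to-normalise idea for flattened commutative operators can be sketched as follows (a minimal illustration; the two-level type order and the `(kind, value)` term representation are assumptions, not CGRASS's actual data structures):

```python
# Normalising a flattened sum by sorting its arguments under a fragment of
# the type order (constants < variables), then within a type.

TYPE_RANK = {"const": 0, "var": 1}  # assumed fragment of the full order

def term_key(term):
    kind, value = term
    # Compare by type first; within a type, compare the values themselves
    # (the recursive self-comparison of the slide, collapsed to strings here).
    return (TYPE_RANK[kind], str(value))

def normalise_sum(terms):
    """Sort the argument list of a flattened, commutative sum."""
    return sorted(terms, key=term_key)
```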

18 Simplification Collection of like terms. Cancellation. Removal of common factors. Example: 2*6*x1 + 4*x2 = 6*x1 + x3*2*2 + 6*x1 –Normalise: 2*6*x1 + 4*x2 = 6*x1 + 6*x1 + 2*2*x3

19 Simplification Example Given: 2*6*x1 + 4*x2 = 6*x1 + 6*x1 + 2*2*x3 Collect constants and occurrences of x1: 4*x2 + 12*x1 = 4*x3 + 12*x1 Next we perform cancellation: 4*x2 = 4*x3 Remove common factor: x2 = x3
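The three steps above can be sketched on a coefficient-map representation of a linear equation (an assumed encoding, not CGRASS's: each side is a {variable: coefficient} dict, so collection of like terms is implicit):

```python
from math import gcd
from functools import reduce

def simplify_equation(lhs, rhs):
    """Cancel terms common to both sides, then remove the common factor."""
    # Cancellation: subtract the shared part of any coefficient that
    # appears on both sides.
    for v in set(lhs) & set(rhs):
        common = min(lhs[v], rhs[v])
        lhs[v] -= common
        rhs[v] -= common
    lhs = {v: c for v, c in lhs.items() if c}
    rhs = {v: c for v, c in rhs.items() if c}
    # Removal of common factors: divide through by the gcd of all
    # remaining coefficients.
    coeffs = list(lhs.values()) + list(rhs.values())
    if coeffs:
        g = reduce(gcd, (abs(c) for c in coeffs))
        lhs = {v: c // g for v, c in lhs.items()}
        rhs = {v: c // g for v, c in rhs.items()}
    return lhs, rhs
```

On the slide's example, 12*x1 + 4*x2 = 12*x1 + 4*x3 reduces to x2 = x3.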

20 Normalisation - Summary These simple steps are performed recursively until no further change. They reduce the workload of CGRASS substantially: –Provide a syntactic test for equality. –Avoid such simplification being written as explicit methods.

21 The Golomb Ruler A set of n ticks at integer points on a ruler of length m. –All inter-tick distances must be unique. Practical applications: astronomy, crystallography. Previously, useful implied constraints were found by hand [Smith et al 2000].

22 Golomb Ruler – A Concise Model Minimise: max_i(x_i). Constraints of the form: x_i – x_j ≠ x_k – x_l. Subject to: i ≠ j, k ≠ l, and i ≠ k or j ≠ l. Domains: [0, n²].

23 The Concise Model is Poor The constraints are quaternary: –Delayed by most solvers. Symmetry: x1 – x2 ≠ x3 – x4 –Is the same as: x3 – x4 ≠ x1 – x2. Serves to illustrate how CGRASS can automatically improve a basic model.

24 The 3-tick Golomb Ruler Basic model produces 30 constraints. Initial normalisation of the constraint set reduces this to just 12. –Constraints with reflection symmetry across an inequation are identical following normalisation. –Some cancellation also possible, e.g. x1 – x2 ≠ x1 – x3.

25 Initial State (After Normalisation) x1 ≠ x2, x1 ≠ x3, x2 ≠ x3, x1 - x2 ≠ x2 - x1, x1 - x2 ≠ x2 - x3, x1 - x2 ≠ x3 - x1, x1 - x3 ≠ x2 - x1, x1 - x3 ≠ x3 - x1, x1 - x3 ≠ x3 - x2, x2 - x1 ≠ x3 - x2, x2 - x3 ≠ x3 - x1, x2 - x3 ≠ x3 - x2

26 Symmetry-breaking Often useful implied constraints can be derived only when some/all symmetry is broken. CGRASS detects/breaks symmetry as a pre-processing step. –Symmetrical variables. –Symmetrical sub-terms.

27 Symmetrical Variables Have identical domains. If all occurrences of x1, x2 are exchanged, the re-normalised constraint set returns to its original state. Transitivity of symmetry reduces the number of comparisons: –x1 ≡ x2, x2 ≡ x3 → x1 ≡ x3. Pairwise comparison of normalised constraint sets also increases efficiency.
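The exchange test can be sketched as follows (an illustration under assumed encodings: binary constraints as tuples, and normalisation collapsed to sorting the operands of commutative relations):

```python
def normalise(constraint):
    op, a, b = constraint
    if op in ("=", "!="):       # commutative relations: order the operands
        a, b = sorted((a, b))
    return (op, a, b)

def swap(constraint, x, y):
    """Exchange all occurrences of x and y in one constraint."""
    rename = {x: y, y: x}
    op, a, b = constraint
    return (op, rename.get(a, a), rename.get(b, b))

def symmetrical(constraints, x, y):
    """x and y are symmetrical iff swapping them and re-normalising
    returns the constraint set to its original state."""
    base = frozenset(normalise(c) for c in constraints)
    swapped = frozenset(normalise(swap(c, x, y)) for c in constraints)
    return swapped == base
```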

28 Breaking Variable Symmetry Result of symmetry detection: a set of lists of symmetrical variables. Break by creating a partial order between adjacent pairs of variables in each list. –Bounds consistency maintains consistency on transitive closure.
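The partial-order construction over each list of symmetrical variables is tiny when sketched in code (the tuple encoding of the added constraints is an assumption):

```python
def break_symmetry(symmetric_vars):
    """Add x <= y between each adjacent pair in a list of symmetrical
    variables; transitivity makes the chain order the whole list."""
    return [("<=", a, b) for a, b in zip(symmetric_vars, symmetric_vars[1:])]
```

Ordering only adjacent pairs suffices because, as the slide notes, bounds consistency maintains consistency over the transitive closure.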

29 Golomb Ruler: Symmetrical Variables Symmetry-testing on the 3-tick version: –x1, x2, x3 are symmetrical. Hence we add: –x1 ≤ x2 –x2 ≤ x3

30 Symmetrical Sub-terms Potentially expensive. Heuristics based on structural equivalence. –Sub-terms are candidates when they become identical once explicit variable names are replaced by a common `marker’. Swap corresponding pairs throughout and re-normalise.

31 Firing `Strengthen Inequality` Preconditions 1. There exist two expressions, x ≤ y and x ≠ y. Post-conditions 1. Add a new constraint of the form x < y. 2. Delete x ≤ y and x ≠ y. Here, x1 ≤ x2, x1 ≠ x2, x2 ≤ x3, x2 ≠ x3 are transformed to x1 < x2 and x2 < x3.

32 Method Application Order `StrengthenInequality` is one example of a number of simple but useful methods: –CGRASS ascribes a high priority to these when making method selection. Other examples: `NodeConsistency`, `BoundsConsistency`. Cheap to fire, and they often reduce the size of the constraint set: –Promotes efficiency, leaving fewer constraints for more complicated methods to match against.

33 Firing `ArcConsistency` Given x1 < x2, x2 < x3, update domains: –x1 ∈ {0, 1, 2, 3, 4, 5, 6, 7} –x2 ∈ {1, 2, 3, 4, 5, 6, 7, 8} –x3 ∈ {2, 3, 4, 5, 6, 7, 8, 9} Remaining constraints: x1 < x2, x2 < x3, x1 ≠ x3, x1 - x2 ≠ x2 - x1, x1 - x2 ≠ x2 - x3, x1 - x2 ≠ x3 - x1, x1 - x3 ≠ x2 - x1, x1 - x3 ≠ x3 - x1, x1 - x3 ≠ x3 - x2, x2 - x1 ≠ x3 - x2, x2 - x3 ≠ x3 - x1, x2 - x3 ≠ x3 - x2
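The domain updates for a chain of strict inequalities can be reproduced with a toy fixpoint propagator (a sketch of bounds-style pruning, not CGRASS's propagation code):

```python
def prune_lt(dx, dy):
    """For x < y: drop values of x not below max(y), and of y not above min(x)."""
    dx2 = {v for v in dx if v < max(dy)}
    dy2 = {v for v in dy if v > min(dx)}
    return dx2, dy2

def propagate(chain_domains):
    """Domains for x1 < x2 < ... ; sweep adjacent pairs until a fixpoint."""
    doms = [set(d) for d in chain_domains]
    changed = True
    while changed:
        changed = False
        for i in range(len(doms) - 1):
            a, b = prune_lt(doms[i], doms[i + 1])
            if a != doms[i] or b != doms[i + 1]:
                doms[i], doms[i + 1] = a, b
                changed = True
    return doms
```

Starting each of x1, x2, x3 at {0, …, 9}, this yields exactly the three pruned domains shown on the slide.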

34 Some Redundancy Given x1 < x2 and x2 < x3, x1 ≠ x3 is redundant. Could add a simple method: –Input: a set of strict inequalities and an inequation. –Removes the redundant inequation.

35 Variable Introduction The model still contains 9 quaternary constraints. Can reduce arity by introducing new variables: –New variable bound to a sub-expression of the quaternary constraints. –E.g. z0 = x1 – x2. The `Eliminate` method substitutes the new variables into the quaternary constraints, reducing their arity.

36 The `Introduce` Method Preconditions 1. Exp has arity greater than 1 and occurs more than once in the constraint set. 2. someVariable = Exp is not already present. Post-conditions 1. Generate a new variable, x, with domain defined by Exp. 2. Add the constraint: x = Exp.

37 Application of `Introduce` Potentially explosive. Only applied when all simpler (reductive) methods are inapplicable. Golomb example: –Sub-term x1 – x2 meets the preconditions. –Hence: z0 = x1 – x2, z0 ∈ {-8, …, 6}. In order to make use of z0, the companion `Eliminate` method is necessary.
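A hypothetical sketch of `Introduce` (the nested-tuple expression encoding and the z-numbering scheme are assumptions made for illustration):

```python
from collections import Counter

def introduce(constraints, next_id=0):
    """Find a compound sub-expression occurring more than once and bind a
    fresh variable to it, returning (new constraints, variable, expression)."""
    # Precondition 1: compound (arity > 1) sub-terms occurring > once.
    subterms = Counter(t for c in constraints for t in c if isinstance(t, tuple))
    for term, n in subterms.items():
        if n > 1:
            z = f"z{next_id}"          # fresh variable name (assumed scheme)
            return constraints + [("=", z, term)], z, term
    return constraints, None, None
```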

38 An Instance of `Eliminate` Preconditions 1. Lhs = CommonExp, such that CommonExp is also present in another constraint c. 2. c_new is obtained by replacing all occurrences of CommonExp by Lhs in c. 3. The complexity of c_new is less than that of c. 4. c_new is not obviously redundant. Post-conditions 1. Add c_new, remove c.

39 Application of `Eliminate` Strong preconditions mean this method simplifies the constraint set. Hence, it must be exhausted before `Introduce` is applied. The combination of `Introduce` and `Eliminate` gives: x1 < x2, x2 < x3, x1 ≠ x3, z0 = x1 – x2, z1 = x2 – x3, z2 = x3 – x1, z0 ≠ z1, z0 ≠ z2, z1 ≠ z2

40 Other Instances of `Eliminate` Can also eliminate with inequalities. Given z0 = x1 – x2, eliminate x1 in favour of x2 using x1 < x2, giving z0 < 0. The `NodeConsistency` method immediately halves the domain of z0.

41 The `All-different` Method Generates an all-different constraint from a clique of inequations. Maximum clique identification is NP-hard: –CGRASS uses a greedy procedure. Input: inequations with single variables on both sides. Generates, for each variable, a list of the variables with which it is not equal. Looks at variables with equal-sized lists.
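The greedy procedure can be sketched as follows (an illustrative stand-in for CGRASS's heuristic: build the inequation graph, then grow a clique from the highest-degree variable):

```python
def greedy_clique(inequations):
    """Greedily grow a clique in the graph whose edges are x != y
    inequations; the clique's variables admit an all-different constraint."""
    graph = {}
    for x, y in inequations:
        graph.setdefault(x, set()).add(y)
        graph.setdefault(y, set()).add(x)
    clique = []
    for v in sorted(graph, key=lambda v: -len(graph[v])):
        # Add v only if it is not-equal to every variable already chosen.
        if all(u in graph[v] for u in clique):
            clique.append(v)
    return sorted(clique)
```

On the Golomb example, the three inequations over z0, z1, z2 form a triangle, so all three variables enter the clique.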

42 Final Problem State x1 < x2, x2 < x3, x1 ≠ x3, z0 = x1 – x2, z1 = x2 – x3, z2 = x3 – x1, All-different(z0, z1, z2)

43 Results

Model        Measure  3-tick  4-tick  5-tick  6-tick
Basic        Fails    11      141     3651    131438
             Nodes    13      144     3655    131444
             Time     0.06s   0.06s   0.5s    28.8s
Transformed  Fails    3       5       17      64
             Nodes    5       8       21      70
             Solve    0.05s   0.05s   0.05s   0.06s
CGRASS       Time     0.4s    5.1s    74s     782s

44 Results - Analysis The size of the input generated by the basic model increases dramatically with the number of ticks. –6 ticks: 870 constraints. –Hence the increase in effort for CGRASS. Once generated, the new models are far easier to solve. –The gap increases rapidly with the number of ticks.

45 Future Work Direct support for quantified expressions. –Reduces the size of the input to a single constraint in this case. –Allows reasoning about a class of problems. –Writing methods typically becomes more complicated, though some become easier. First steps: arrays.

