Chapter 7: Local Search, Part 2. Ryan Kinworthy, 2/26/2003. CSCE 990-06 Advanced Constraint Processing.

1 Chapter 7: Local Search, Part 2. Ryan Kinworthy, CSCE 990-06 Advanced Constraint Processing, 2/26/2003.

2 Outline
- Chapter introduction
- Greedy local search (SLS)
- Random walk
- Properties of local search
- Empirical evaluation
- Hybrids of local search and inference
  - Effects of constraint propagation on SLS
  - Local search on cycle-cutset
- Chapter summary

3 Example (2): Simulated Annealing
- Uses a noise model from statistical mechanics.
- At each step, the algorithm computes the change in the cost function (Δ) when the variable is changed to the picked value.
- If the change improves or doesn't affect the cost function, the change is made.
- Otherwise, the change is made with probability e^(-Δ/T), where T is the temperature.
- T can be held constant, or slowly reduced from a high temperature to a low one according to some schedule.
- The algorithm converges to an exact solution if the temperature T is reduced gradually.
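The acceptance rule on this slide can be sketched in a few lines of Python. This is a minimal illustration, not the book's code; the function name, parameters, and the graph-coloring cost function in the usage note are all ours.

```python
import math
import random

def anneal(variables, domains, cost, t0=2.0, cooling=0.999, steps=5000, seed=0):
    """Minimize `cost` over assignments by simulated annealing (sketch)."""
    rng = random.Random(seed)
    assign = {v: rng.choice(domains[v]) for v in variables}
    cur = cost(assign)
    best, best_cost = dict(assign), cur
    t = t0
    for _ in range(steps):
        v = rng.choice(variables)
        old = assign[v]
        assign[v] = rng.choice(domains[v])   # pick a new value for one variable
        delta = cost(assign) - cur           # change in the cost function
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            cur += delta                     # accept: improving, or uphill with prob e^(-delta/T)
            if cur < best_cost:
                best, best_cost = dict(assign), cur
        else:
            assign[v] = old                  # reject the move
        t *= cooling                         # schedule: slowly lower the temperature
    return best, best_cost
```

For example, with `cost(a) = sum(a[u] == a[v] for u, v in edges)` this finds a proper 3-coloring of a triangle (cost 0). A geometric schedule is used here; holding `cooling=1.0` keeps T constant, as the slide also allows.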

4 Empirical Evaluation
- Evaluating local search algorithms empirically
  - Use either benchmarks or randomly generated problems.
  - The recent trend is to generate hard random problems from the phase transition.
- Empirical evaluation of GSAT algorithms
  - As the number of variables and clauses increases, the running time of the algorithms increases.
  - SLS algorithms listed in order of efficiency (most to least):
    1. Simulated Annealing
    2. GSAT with Random Walk with Noise
    3. GSAT with Random Walk
    4. Basic GSAT

5 Hybrids of Local Search & Inference
- Inference with general search is very effective; how well will inference work with local search?
- Effects of constraint propagation on SLS
  - Certain classes of problems that are easy for BT are very hard for SLS.
  - Example: certain variations of 3SAT are extremely hard for SLS.
  - However, when inference methods are combined with SLS, these 3SAT problems become trivial.

6 Why Is Inference so Effective for Local Search?
- Not definitively explained; conjectures have been given:
  - Enforcing local consistency eliminates many "near" solutions and thus reduces the search space.
  - That is, assignments that satisfy almost all clauses, whose cost function would normally be near zero, become high cost due to consistency enforcement.
- This doesn't always hold: problems with uniform structure perform worse with inference.
- Application of combining inference and SLS: the cycle-cutset scheme.

7 What Is a Cycle-Cutset?
- First described in Chapter 5 (p. 146).
- Definition: given an undirected graph, a subset of its nodes is a cycle-cutset if its removal results in a graph with no cycles.
- The cycle-cutset scheme alternates between two algorithms:
  - BT search on the cutset portion
  - Tree inference on the rest
- Key benefits:
  - Once a variable is instantiated, it can be removed from the constraint graph.
  - If the set of instantiated variables forms a cycle-cutset, the remaining nodes form a tree, and we can use directional rather than full consistency algorithms to solve it.
- Applicable to local search/inference hybrids: if we can guarantee that the constraint graph is a tree, we can use directional arc-consistency as the inference method for solving it.
- Example on page 206.
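The definition can be checked mechanically: remove the cutset and test whether the remaining graph is a forest, e.g. with a union-find pass that flags any edge closing a cycle. This is our illustrative helper, not code from the chapter.

```python
def is_cycle_cutset(nodes, edges, cutset):
    """True iff removing `cutset` leaves the undirected graph acyclic."""
    remaining = set(nodes) - set(cutset)
    parent = {v: v for v in remaining}

    def find(v):
        # Find the set representative, with path halving.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for u, v in edges:
        if u in remaining and v in remaining:
            ru, rv = find(u), find(v)
            if ru == rv:
                return False      # this edge closes a cycle
            parent[ru] = rv       # union the two components
    return True
```

For a 4-cycle on nodes 0..3, any single node is a cycle-cutset, while the empty set is not.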

8 Local Search on Cycle-Cutset
- Instantiated variables cut the flow of information on any path they are on.
  - In other words, the network is equivalent to one in which the instantiated variable is deleted and the influence of its value is propagated to all neighboring nodes.
  - So when the group of instantiated variables removes all cycles in the graph, the remaining network can be viewed as a tree and consequently solved by a tree-inference algorithm (e.g., arc-consistency).
- Complexity
  - Can be bounded exponentially in the size of the cutset.
  - An analysis is not specifically given, but I assume it to be NP-hard in general.

9 Hybrid Local Search on Cycle-Cutset
- Where does SLS fit in? Since SLS approximates search, it replaces BT search.
- A mechanism for collaboration in hybrids: the tree algorithm works for networks with cycles, and any assignment it produces minimizes the number of violated constraints across all its subnetworks.

10 Tree Algorithm
- Input:
  - An arc-consistent network R
  - Variables X partitioned into cycle-cutset Y and tree variables Z, X = Z ∪ Y
  - An assignment Y = y
- Output:
  - An assignment Z = z that minimizes the number of violated constraints of the entire network when Y = y.

11 Tree Algorithm (cont.)
- Initialization: for any value y[i] of any cutset variable y_i, the cost C_{y_i}(y[i], y) is 0.
- Algorithm body:
  1. Going from leaves to root in the tree: for every variable z_i and any value a_i in D_{z_i}, compute the cost of each assignment.
  2. Going from root to leaves, compute new assignments for every tree variable z_i: let D_{z_i} be the values of z_i consistent with v_{p_i}, the value assigned to its parent p_i, and assign each variable a value based on the results of the previous calculation.
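The two passes can be written down concretely as dynamic programming on the tree. The sketch below is our simplification, assuming `unary[v][a]` encodes variable v's constraint costs against the instantiated cutset and `pair[(v, c)][a][b]` the parent-child constraint costs; none of the names come from the book.

```python
def tree_min_cost(children, root, domains, unary, pair):
    """Return (assignment, cost) minimizing the total violation cost on a tree."""
    # Pass 1: leaves to root -- best achievable cost of each subtree,
    # for every possible value of its root variable.
    cost = {}

    def up(v):
        cost[v] = {}
        for c in children.get(v, []):
            up(c)
        for a in domains[v]:
            total = unary[v][a]
            for c in children.get(v, []):
                total += min(pair[(v, c)][a][b] + cost[c][b]
                             for b in domains[c])
            cost[v][a] = total

    up(root)

    # Pass 2: root to leaves -- commit the minimizing value at each node,
    # given the value already chosen for its parent.
    assign = {}

    def down(v, a):
        assign[v] = a
        for c in children.get(v, []):
            b = min(domains[c], key=lambda b: pair[(v, c)][a][b] + cost[c][b])
            down(c, b)

    best_a = min(domains[root], key=lambda a: cost[root][a])
    down(root, best_a)
    return assign, cost[root][best_a]
```

On a three-node star with inequality constraints and unary costs pulling the leaves toward different values, one violation is unavoidable, and the algorithm finds it exactly.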

12 Tree Algorithm (AAAI'96)
- More clearly explained in Kask and Dechter's paper (today's handout): "A Graph-Based Method for Improving GSAT", which appeared in the AAAI'96 conference proceedings.
- The tree algorithm is a generalization of Mackworth & Freuder '85 that works with cyclic networks:
  - In acyclic (tree) networks, it functions the same as M&F.
  - In cyclic networks, it finds an assignment that minimizes the sum of unsatisfied constraints over all its tree subnetworks.

13 Tree Algorithm (AAAI'96), cont.
- Example on board.
- Still unclear on how the weight of a constraint is calculated.
- Discussion

14 SLS With Cycle-Cutset
- Benefit of using the tree algorithm: it minimizes the cost of the tree subnetworks given a cycle-cutset assignment.
- This means we can replace BT search with an SLS search.
- Combining the tree algorithm with SLS results in a concrete algorithm: SLS + CC.

15 Overview of SLS + CC
- The algorithm executes a certain number of tries. For each try:
  - Start from a random initialization.
  - Alternate between SLS and the tree algorithm (TA):
    - SLS chooses an initial assignment for the Y variables.
    - TA finds the minimum-cost assignment to the Z variables.
    - SLS fixes Z, chooses the best y, and fixes y.
    - TA finds the best assignment for Z and fixes z.
    - SLS on Y, TA on Z, and so on.
- Note: only adjacent tree variables affect the behavior of SLS. The algorithm must enforce this property; otherwise the performance of SLS + CC deteriorates by several orders of magnitude.
- The cycle-cutset idea can be generalized:
  - SLS + CC is specific to the case where w* = 1 (a tree).
  - In cases where w* > 1, SLS cannot be used and a general backtracking search must be used instead.

16 SLS + CC
- Input:
  - An arc-consistent network R
  - Variables X partitioned into cycle-cutset Y and tree variables Z, X = Z ∪ Y
- Output:
  - An assignment Z = z, Y = y that is a local minimum of the number of violated constraints C(z, y).

17 SLS + CC (cont.)
- Repeat MAX_TRIES times:
  1. Choose a random initial assignment for all variables.
  2. Alternate between these steps until the problem is solved, the TA doesn't change the values of the variables, or no progress is made:
     a. When the values of the cycle-cutset variables are fixed, run TA on the Z variables.
     b. When the values of the tree variables are fixed, run SLS on the Y variables.
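A minimal Python skeleton of this alternation might look as follows. Here `tree_solve` stands in for the TA and `cost` counts violated constraints; everything else (the names, the greedy best-flip used as the SLS step, the restart policy) is our sketch, not the book's pseudocode.

```python
import random

def sls_cc(cutset, domains, cost, tree_solve, max_tries=10, max_flips=50, seed=0):
    """Alternate between SLS on the cutset Y and the tree algorithm on Z."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(max_tries):
        y = {v: rng.choice(domains[v]) for v in cutset}   # random initialization
        for _ in range(max_flips):
            z = tree_solve(y)                  # TA: optimal z given fixed y
            cur = cost({**y, **z})
            if cur == 0:
                return {**y, **z}, 0           # problem solved
            # SLS step: with z fixed, take the best single-value flip on Y.
            flip, flip_cost = None, cur
            for v in cutset:
                for val in domains[v]:
                    if val != y[v]:
                        c = cost({**y, v: val, **z})
                        if c < flip_cost:
                            flip, flip_cost = (v, val), c
            if flip is None:
                break                          # no progress: end this try
            y[flip[0]] = flip[1]
        z = tree_solve(y)
        if cost({**y, **z}) < best_cost:
            best, best_cost = {**y, **z}, cost({**y, **z})
    return best, best_cost
```

On a triangle of inequality constraints with two values per variable (min 1 violation), cutset {0} and a brute-force `tree_solve` over the remaining two variables, the skeleton settles at the optimal cost of 1.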

18 SLS + CC Performance
- SLS + CC vs. SLS: empirical evaluation (pp. 210-211)
  - For problems where the cycle-cutset is < 30% of the variables, SLS + CC can solve 3-4 times more problems than SLS alone given equal CPU time.
  - When the cycle-cutset is about 30%, SLS + CC performs about the same as plain SLS.
  - For problems where the cycle-cutset is > 30% of the variables, SLS is better than SLS + CC.

19 Summary of Local Search
- The good
  - Significantly faster in some problem domains than BT
  - Can solve previously unsolvable problems
  - Does more with less CPU time
  - Hybrid algorithms are even more efficient
- The bad
  - Not complete (doesn't guarantee finding a solution)
  - Not applicable to all domains
  - Can get stuck in local minima
- The ugly
  - If applied to the wrong domain, can be a waste of time
  - Code carefully or performance will deteriorate rapidly!

20 Discussion
- Questions? Thoughts? Opinions?

