Local Search CPSC 322 – CSP 5 Textbook §4.8 February 7, 2011

P.S. on Discussion of feedback

"Sometimes we don't cover the full lecture":
- I vary the pace depending on class understanding, measured e.g. by answers to coloured-cards questions.
- I intentionally err on the side of preparing more material; if class understanding is unexpectedly good, we'll get through all slides.
- Typically, I don't expect to cover everything. Rather, I stop at a natural point and continue next class. This way you also know what's coming up next time.
- We are making good progress: we are one lecture ahead of last term's schedule.

Lecture Overview
- Domain splitting: recap, more details & pseudocode
- Local Search
- Time-permitting: Stochastic Local Search (start)

Searching by domain splitting

[Recursion diagram:] Start with the CSP and apply AC. If domains with multiple values remain, split on one domain, producing subproblems CSP_1, …, CSP_n, and apply AC to each. Recurse until no domains with multiple values remain.

Example: 3 variables A, B, C; domains all {1,2,3,4}; constraints A=B, B=C, A≠C. We traced arc consistency + domain splitting for this network last lecture ("Simple Problem 2" in AIspace).

Arc consistency + domain splitting: example

3 variables: A, B, C. Domains: all {1,2,3,4}. Constraints: A=B, B=C, A≠C.

({1,2,3,4}, {1,2,3,4}, {1,2,3,4}), apply AC (arc consistency): no values pruned.
- Split A ∈ {1,3}: ({1,3}, {1,2,3,4}, {1,2,3,4}), apply AC: ({1,3}, {1,3}, {1,3})
  - Split B ∈ {1}: ({1,3}, {1}, {1,3}), apply AC: ({}, {}, {})
  - Split B ∈ {3}: ({1,3}, {3}, {1,3}), apply AC: ({}, {}, {})
- Split A ∈ {2,4}: ({2,4}, {1,2,3,4}, {1,2,3,4}), apply AC: ({2,4}, {2,4}, {2,4})
  - Split B ∈ {2}: ({2,4}, {2}, {2,4}), apply AC: ({}, {}, {})
  - Split B ∈ {4}: ({2,4}, {4}, {2,4}), apply AC: ({}, {}, {})

All branches end in empty domains: no solution.

Arc consistency + domain splitting: another example

3 variables: A, B, C. Domains: all {1,2,3,4}. Constraints: A=B, B=C, A=C.

({1,2,3,4}, {1,2,3,4}, {1,2,3,4}), apply AC (arc consistency): no values pruned.
- Split A ∈ {1,3}: ({1,3}, {1,2,3,4}, {1,2,3,4}), apply AC: ({1,3}, {1,3}, {1,3})
  - Split B ∈ {1}: ({1,3}, {1}, {1,3}), apply AC: ({1}, {1}, {1}): solution
  - Split B ∈ {3}: ({1,3}, {3}, {1,3}), apply AC: ({3}, {3}, {3}): solution
- Split A ∈ {2,4}: ({2,4}, {1,2,3,4}, {1,2,3,4}), apply AC: ({2,4}, {2,4}, {2,4})
  - Split B ∈ {2}: ({2,4}, {2}, {2,4}), apply AC: ({2}, {2}, {2}): solution
  - Split B ∈ {4}: ({2,4}, {4}, {2,4}), apply AC: ({4}, {4}, {4}): solution

Each leaf with unary domains is a solution: A=B=C=1, 2, 3, or 4.

3rd formulation of CSP as search: arc consistency with domain splitting

- States: vectors (D(V_1), …, D(V_n)) of remaining domains, with D(V_i) ⊆ dom(V_i) for each V_i.
- Start state: the vector of original domains (dom(V_1), …, dom(V_n)).
- Successor function: reduce one of the domains, then run arc consistency.
- Goal state: a vector of unary domains that satisfies all constraints. That is, only one value is left for each variable; the assignment of each variable to its single value is a model.
- Solution: that assignment.

Arc consistency with domain splitting algorithm (pseudocode for your Sudoku solver)

Procedure AC_DS(Vs, dom, C, TDA)
  Inputs:
    Vs: a set of variables
    dom: a function such that dom(V) is the domain of variable V
    C: a set of constraints to be satisfied
    TDA: a set of possibly arc-inconsistent edges of the constraint network
  Output: the set of models of the CSP (empty if no model exists)

  dom ← GAC(Vs, dom, C, TDA)            // run arc consistency initialized with TDA
  if dom includes an empty domain then
    return {}                           // base case 1: no solution
  if dom(V_i) has a single value v_i for every V_i ∈ Vs then
    return {{V_1 = v_1, …, V_n = v_n}}  // base case 2: single model
  choose a variable V ∈ Vs              // recursive case: domain splitting
  [D_1, …, D_k] ← partition dom(V) into non-empty domains
  models ← {}
  for i = 1, …, k do
    dom_i ← dom with dom(V) replaced by D_i
    TDA_i ← {⟨Z, c⟩ | c ∈ C, V ∈ scope(c), Z ∈ scope(c), Z ≠ V}
            // arcs that could become inconsistent by reducing V's domain:
            // Z is some variable, c is some constraint involving both Z and V
    models ← models ∪ AC_DS(Vs, dom_i, C, TDA_i)
  return models
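For readers who want something executable, here is a minimal Python sketch of AC_DS. The representation (constraints as (scope, predicate) pairs, domains as a dict of sets), the split-into-two-halves strategy, and the GAC details are illustrative choices, not prescribed by the slide.

```python
from itertools import product

def gac(domains, constraints, tda):
    """Generalized arc consistency: prune values that have no support.
    domains: dict var -> set of values; constraints: list of
    (scope, predicate) pairs; tda: set of (var, constraint_index) arcs."""
    domains = {v: set(d) for v, d in domains.items()}
    while tda:
        x, ci = tda.pop()
        scope, pred = constraints[ci]
        others = [v for v in scope if v != x]
        supported = set()
        for val in domains[x]:
            # val survives if some combination of other values satisfies pred
            for combo in product(*(domains[v] for v in others)):
                assignment = dict(zip(others, combo))
                assignment[x] = val
                if pred(*(assignment[v] for v in scope)):
                    supported.add(val)
                    break
        if supported != domains[x]:
            domains[x] = supported
            # dom(x) shrank: recheck all other arcs involving x
            tda |= {(z, cj) for cj, (sc, _) in enumerate(constraints)
                    if x in sc and cj != ci for z in sc if z != x}
    return domains

def ac_ds(variables, domains, constraints, tda=None):
    """Arc consistency + domain splitting; returns all models of the CSP."""
    if tda is None:  # initially, every arc may be inconsistent
        tda = {(x, ci) for ci, (sc, _) in enumerate(constraints) for x in sc}
    domains = gac(domains, constraints, tda)
    if any(not d for d in domains.values()):
        return []                                              # base case 1
    if all(len(d) == 1 for d in domains.values()):
        return [{v: next(iter(d)) for v, d in domains.items()}]  # base case 2
    v = next(x for x in variables if len(domains[x]) > 1)      # recursive case
    vals = sorted(domains[v])
    models = []
    for half in (vals[:len(vals) // 2], vals[len(vals) // 2:]):
        doms_i = dict(domains)
        doms_i[v] = set(half)
        new_tda = {(z, ci) for ci, (sc, _) in enumerate(constraints)
                   if v in sc for z in sc if z != v}
        models += ac_ds(variables, doms_i, constraints, new_tda)
    return models

# The lecture's first example: A=B, B=C, A≠C over {1,2,3,4}.
doms = {v: {1, 2, 3, 4} for v in "ABC"}
cons = [(("A", "B"), lambda a, b: a == b),
        (("B", "C"), lambda b, c: b == c),
        (("A", "C"), lambda a, c: a != c)]
print(ac_ds("ABC", doms, cons))   # [] -- no solution, matching the trace
```

Running it on the example returns the empty list, matching the "no solution" trace above; swapping the last constraint to equality yields the four models of the second example.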

Learning Goals for arc consistency
- Define/read/write/trace/debug the arc consistency algorithm. Compute its complexity and assess its possible outcomes.
- Define/read/write/trace/debug domain splitting and its integration with arc consistency.

Lecture Overview
- Domain splitting: recap, more details & pseudocode
- Local Search
- Time-permitting: Stochastic Local Search (start)

Local Search: Motivation

Solving CSPs is NP-hard:
- The search space for many CSPs is huge, exponential in the number of variables.
- Even arc consistency with domain splitting is often not enough.

Alternative: local search.
- Often finds a solution quickly, but cannot prove that there is no solution.
- A useful method in practice: the best available method for many constraint satisfaction and constraint optimization problems.
- Extremely general: it works for problems other than CSPs (arc consistency, by contrast, only works for CSPs).

Some Successful Application Areas for Local Search
- Probabilistic reasoning
- Protein folding
- Propositional satisfiability (SAT)
- University timetabling
- RNA structure design
- Scheduling of the Hubble Space Telescope: from 1 week down to 10 seconds

Local Search

Idea:
- Consider the space of complete assignments of values to variables (all possible worlds).
- Neighbours of the current node are similar variable assignments.
- Move from one node to another according to a function that scores how good each assignment is.

Local Search Problem: Definition

Definition: a local search problem consists of:
- A CSP: a set of variables, domains for these variables, and constraints on their joint values. A node in the search space is a complete assignment to all of the variables.
- A neighbour relation: an edge in the search space exists when the neighbour relation holds between a pair of nodes.
- A scoring function h(n), which judges the cost of a node (to be minimized): e.g., the number of constraints violated in node n, or the cost of a state in an optimization context.

Example: Sudoku as a local search problem
- CSP: the usual Sudoku CSP. One variable per cell, with domain {1,…,9}; constraints: each number occurs once per row, per column, and per 3x3 box.
- Neighbour relation: the value of a single cell differs.
- Scoring function: the number of constraint violations.
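To make the scoring function concrete, here is a small Python sketch that counts constraint violations for a complete assignment; the 9x9 list-of-lists grid representation and the convention of counting one violation per duplicated pair are assumptions for illustration.

```python
from collections import Counter

def sudoku_violations(grid):
    """h(n) for the Sudoku formulation above: count violated pairwise
    not-equal constraints in rows, columns and 3x3 boxes.
    grid: 9x9 list of lists holding values 1..9 (a complete assignment)."""
    def duplicate_pairs(cells):
        # each group of c equal values contributes c*(c-1)/2 violated pairs
        return sum(c * (c - 1) // 2 for c in Counter(cells).values())

    score = 0
    for i in range(9):
        score += duplicate_pairs(grid[i])                         # row i
        score += duplicate_pairs([grid[r][i] for r in range(9)])  # column i
    for br in range(0, 9, 3):                                     # 3x3 boxes
        for bc in range(0, 9, 3):
            score += duplicate_pairs([grid[r][c]
                                      for r in range(br, br + 3)
                                      for c in range(bc, bc + 3)])
    return score   # 0 iff grid is a solution
```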

Search Space

[Figure: nodes are complete assignments such as V_1=v_1, V_2=v_1, …, V_n=v_1; edges connect assignments that differ in the value of exactly one variable, e.g. V_1=v_2, V_2=v_1, …, V_n=v_1.]

Only the current node is kept in memory at each step. Very different from the systematic tree search approaches we have seen so far! Local search does NOT backtrack!

Local search: only use local information

[Figure: only the current node and its neighbours are inspected.]

Iterative Best Improvement

How do we determine the neighbour node to be selected? Iterative best improvement: select the neighbour that optimizes some evaluation function.

Evaluation function h(n): the number of constraint violations in state n.

Which strategy would make sense? Select the neighbour with …
- minimal number of constraint violations
- a similar number of constraint violations as the current state
- maximal number of constraint violations
- no constraint violations

Greedy descent: evaluate h(n) for each neighbour, pick the neighbour n with minimal h(n) (i.e., the first option above). Hill climbing: the equivalent algorithm for maximization problems; minimizing h(n) is identical to maximizing −h(n).

Example: Greedy descent for Sudoku
- Assign random numbers between 1 and 9 to the blank fields.
- Repeat:
  - For each cell and each number, evaluate how many constraint violations changing that assignment would yield.
  - Choose the cell and number that lead to the fewest violated constraints; change it.
- Until solved.
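A sketch of this loop in Python, reusing sudoku_violations from the previous sketch; the `givens` set marking fixed clue cells and the step cap are additions not specified on the slide.

```python
import random

def greedy_descent_sudoku(grid, givens, max_steps=10_000):
    """Greedy descent as on the slide: random completion, then repeatedly
    take the single-cell change with the fewest resulting violations.
    givens: set of (row, col) clue cells that must not be changed."""
    for r in range(9):                       # random initialization
        for c in range(9):
            if (r, c) not in givens:
                grid[r][c] = random.randint(1, 9)
    for _ in range(max_steps):
        if sudoku_violations(grid) == 0:
            return grid                      # solved
        best = None                          # (score, row, col, value)
        for r in range(9):
            for c in range(9):
                if (r, c) in givens:
                    continue
                old = grid[r][c]
                for v in range(1, 10):
                    if v != old:
                        grid[r][c] = v
                        s = sudoku_violations(grid)
                        if best is None or s < best[0]:
                            best = (s, r, c, v)
                grid[r][c] = old             # undo the trial change
        if best is None:                     # nothing changeable
            return grid
        _, r, c, v = best
        grid[r][c] = v                       # commit the best single move
    return grid   # step cap hit; may be stuck in a local minimum
```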

Example: Greedy descent for Sudoku

[Figure: one local search step on a concrete grid.] The step shown reduces the number of constraint violations by 3, eliminating:
- two 1s in the first column
- two 1s in the first row
- two 1s in the top-left box

General Local Search Algorithm

[Figure: generic local search pseudocode, with callouts marking the random initialization and the local search step.]

General Local Search Algorithm

The local search step is based on local information only: e.g., for each neighbour, evaluate how many constraints are unsatisfied. Greedy descent: select the variable Y and value V that minimize the number of unsatisfied constraints at each step.
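The pseudocode figure on these two slides did not survive extraction; the following Python skeleton is a reconstruction of the generic loop, with the restart criterion left as a parameter so that the "extreme cases" discussed on later slides fall out by plugging in different stop functions.

```python
import random

def local_search(variables, domains, neighbours, h, stop, max_steps=100_000):
    """Generic local search skeleton (a reconstruction, not the textbook's
    exact pseudocode). neighbours(a) yields assignments near a; h(a) is
    the evaluation function to minimize; stop(a) decides when to restart."""
    best = None
    steps = 0
    while steps < max_steps:
        # random initialization (random restart re-enters here)
        a = {v: random.choice(list(domains[v])) for v in variables}
        while steps < max_steps:
            steps += 1
            if best is None or h(a) < h(best):
                best = dict(a)
            if h(a) == 0:
                return best                  # found a model
            if stop(a):
                break                        # restart from a fresh assignment
            a = min(neighbours(a), key=h)    # local search step (greedy here)
    return best

# Extreme case 1 (random sampling): stop = lambda a: True
# Extreme case 2 (greedy descent):  stop when no neighbour improves h(a)
```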

Another example: N-Queens

Put n queens on an n × n board such that no two queens are on the same row, column, or diagonal (i.e., no two queens attack each other).

[Figure: the positions a queen can attack.]

Example: N-queens

[Figure: two board configurations; the first has h = 5, and the h of the second is left as a question.]

Example: N-Queens

[Figure: a state with h = 17 and, 5 steps later, a state with h = 1. Each cell lists the h (i.e., the number of unsatisfied constraints) obtained by moving the queen of that column into that cell.]
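For concreteness, a tiny Python sketch of h for n-queens; the one-queen-per-column encoding matches the figure's move-within-a-column neighbourhood.

```python
def queens_h(cols):
    """Number of attacking queen pairs; cols[i] is the row of the queen
    in column i (one queen per column, as in the figure)."""
    n = len(cols)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if cols[i] == cols[j]                 # same row
               or abs(cols[i] - cols[j]) == j - i)   # same diagonal
```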

The problem of local minima

Which move should we pick in this situation?
- Current cost: h = 1.
- No single move can improve on this. In fact, every single move only makes things worse (h ≥ 2).

This is a locally optimal solution; since we are minimizing, a local minimum.

Local minima

Most research in local search concerns effective mechanisms for escaping from local minima. We want to quickly explore many local minima: the global minimum is a local minimum, too.

[Figure: evaluation-function curves over a one-variable state space, illustrating local and global minima.]

Different neighbourhoods

Local minima are defined with respect to a neighbourhood. Neighbourhood: the states resulting from some small incremental change to the current variable assignment.

1-exchange neighbourhood:
- One-stage selection: all assignments that differ in exactly one variable. How many of those are there for N variables and domain size d? O(dN): for each of the N variables, we need to check d−1 values.
- Two-stage selection: first choose a variable (e.g., the one in the most conflicts), then the best value. Lower computational complexity, O(N+d), but less progress per step.

2-exchange neighbourhood:
- All variable assignments that differ in exactly two variables: O(N²d²).
- More powerful: a local optimum for the 1-exchange neighbourhood might not be a local optimum for the 2-exchange neighbourhood.

Different neighbourhoods

How about an 8-exchange neighbourhood?
- All minima with respect to the 8-exchange neighbourhood are global minima. Why?
- How expensive is the 8-exchange neighbourhood? O(N⁸d⁸).

In general, the N-exchange neighbourhood, where N is the number of variables, includes all solutions, but it is exponentially large.

Lecture Overview
- Domain splitting: recap, more details & pseudocode
- Local Search
- Time-permitting: Stochastic Local Search (start)

Stochastic Local Search
- We will use greedy steps to find local minima: move to the neighbour with the best evaluation function value.
- We will use randomness to avoid getting trapped in local minima.

General Local Search Algorithm

Extreme case 1: random sampling. Restart at every step: the stopping criterion is "true". [Figure: the generic pseudocode with the random-restart line highlighted.]

General Local Search Algorithm

Extreme case 2: greedy descent. Only restart in local minima: the stopping criterion is "no more improvement in the evaluation function h".

Greedy descent vs. Random sampling
- Greedy descent is good for finding local minima, but bad for exploring new parts of the search space.
- Random sampling is good for exploring new parts of the search space, but bad for finding local minima.
- A mix of the two can work very well.

Greedy Descent + Randomness

Greedy steps: move to the neighbour with the best evaluation function value.

Next to greedy steps, we can allow for:
1. Random restart: reassign random values to all variables (i.e., start fresh).
2. Random steps: move to a random neighbour.

Which randomized method would work best in each of these two search spaces?

[Figure: two evaluation-function curves over a one-variable state space, labelled A and B.]

- Hill climbing with random steps best on A; hill climbing with random restart best on B
- Hill climbing with random steps best on B; hill climbing with random restart best on A
- Equivalent

Answer: hill climbing with random steps is best on B; hill climbing with random restart is best on A.

But these examples are simplified, extreme cases for illustration; in reality you don't know what your search space looks like. Usually, integrating both kinds of randomization works best.

Stochastic Local Search for CSPs
- Start node: a random assignment.
- Goal: an assignment with zero unsatisfied constraints.
- Heuristic function h: the number of unsatisfied constraints. Lower values of the function are better.

Stochastic local search is a mix of:
- Greedy descent: move to the neighbour with lowest h.
- Random walk: take some random steps.
- Random restart: reassign values to all variables.
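A compact Python sketch of this mix; the random-walk probability and the restart schedule are illustrative values, not from the lecture.

```python
import random

def sls_csp(variables, domains, neighbours, h,
            p_walk=0.2, restart_every=1_000, max_steps=100_000):
    """Stochastic local search: mostly greedy descent, with probability
    p_walk take a random-walk step, and restart every restart_every steps."""
    a = None
    for step in range(max_steps):
        if a is None or step % restart_every == 0:
            a = {v: random.choice(list(domains[v])) for v in variables}
        if h(a) == 0:
            return a                         # model found
        ns = list(neighbours(a))
        if random.random() < p_walk:
            a = random.choice(ns)            # random walk step
        else:
            a = min(ns, key=h)               # greedy step
    return None                              # no model found within budget
```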

Stochastic Local Search for CSPs: details

Examples of ways to add randomness to local search for a CSP:
- In one-stage selection of a variable-value pair: instead choose a random variable-value pair.
- In two-stage selection (first select a variable V, then a new value for V):
  - Selecting variables: sometimes choose the variable that participates in the largest number of conflicts; sometimes a random variable that participates in some conflict; sometimes a random variable.
  - Selecting values: sometimes choose the best value for the chosen variable; sometimes a random value.
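A sketch of one two-stage step that mixes the options above; the helper functions (conflicts, best_value) and the mixing probabilities are assumptions for illustration.

```python
import random

def two_stage_step(assignment, domains, conflicts, best_value,
                   p_any_var=0.1, p_rand_val=0.1):
    """One two-stage SLS step. conflicts(a) maps each variable to the
    number of constraints it violates under a; best_value(a, v) returns
    a value for v minimizing violations. Helpers and probabilities are
    assumptions, not from the slide."""
    conf = conflicts(assignment)
    in_conflict = [v for v, c in conf.items() if c > 0]
    if not in_conflict:
        return assignment                            # already a model
    # Stage 1: select a variable
    r = random.random()
    if r < p_any_var:
        var = random.choice(list(assignment))        # any variable
    elif r < 0.5:
        var = random.choice(in_conflict)             # random conflicted var
    else:
        var = max(in_conflict, key=conf.get)         # most-conflicted var
    # Stage 2: select a value for it
    a = dict(assignment)
    if random.random() < p_rand_val:
        a[var] = random.choice(list(domains[var]))   # random value
    else:
        a[var] = best_value(a, var)                  # best value
    return a
```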

Learning Goals for local search (started)
- Implement local search for a CSP:
  - implement different ways to generate neighbours;
  - implement scoring functions to solve a CSP by local search through either greedy descent or hill climbing.
- Implement SLS with:
  - random steps (1-stage and 2-stage versions);
  - random restart.

A local search practice exercise is on WebCT. Assignment #2 will be available on WebCT later today (due Wednesday, Feb 23rd). Coming up: more local search, Section 4.8.