Local Search Introduction to Artificial Intelligence COS302 Michael L. Littman Fall 2001

Administration
Questions?
Reading: Boyan thesis for more information on local search: 2.cs.cmu.edu/afs/cs.cmu.edu/user/jab/mosaic/thesis/ (chap3.ps.gz, appen2.ps.gz)

Search
Recall the search algorithms for constraint problems (CSP, SAT):
- Systematically check all states.
- Look for a satisfied state.
- In finite spaces, can determine if no state satisfies the constraints (complete).

Local Search
Alternative approach:
- Sometimes more interested in a positive answer than a negative one.
- Once we find it, we’re done.
- Guided random search approach.
- Can’t determine unsatisfiability.

Optimization Problems
Framework: states, neighbors (symmetric). Add a notion of score for states: V(s).
Goal: find the state with the highest (sometimes lowest) score.
Generalization of the constraint framework because…

SAT as Optimization States are complete assignments. Score is number of satisfied clauses. Know the maximum (if formula is truly satisfiable).

Local Search Idea In contrast to CSP/SAT search, work with complete states. Move from state to state remembering very little (in general).

Hill Climbing
The most basic approach:
- Pick a starting state, s.
- If V(s) ≥ max_{s' in N(s)} V(s'), stop.
- Else, let s = any s' such that V(s') = max_{s'' in N(s)} V(s'').
- Repeat.
Like DFS. Stops at a local maximum.
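A minimal Python sketch of this loop, assuming neighbors(s) yields the set N(s) and V scores states (both names are placeholders, not from the slides):

```python
def hill_climb(start, neighbors, V):
    """Greedy hill climbing: move to a best-scoring neighbor until
    no neighbor strictly improves on the current state."""
    s = start
    while True:
        candidates = list(neighbors(s))
        if not candidates:
            return s
        best = max(candidates, key=V)
        if V(best) <= V(s):
            return s  # local (possibly global) maximum
        s = best
```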

N-queens Optimization
Minimize violated constraints.
Neighbors: slide a queen within its column.
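As a sketch, assuming the state is a list cols where cols[i] is the row of the queen in column i (the representation and names are illustrative assumptions):

```python
def conflicts(cols):
    """Score to minimize: number of attacking queen pairs. Queens
    attack if they share a row or a diagonal; columns are distinct
    by construction (one queen per column)."""
    n = len(cols)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if cols[i] == cols[j] or abs(cols[i] - cols[j]) == j - i)

def slide_neighbors(cols):
    """Neighbors: all states reachable by sliding one queen to a
    different row within its own column."""
    return [cols[:i] + [r] + cols[i + 1:]
            for i in range(len(cols))
            for r in range(len(cols)) if r != cols[i]]
```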

N-queens Search Space
How far, at most, from the minimum configuration? Never more than n slides from any configuration.
How many steps, at most, before a local minimum is reached? The maximum score is n(n-1), and each improving step lowers the score by at least one, so at most n(n-1) steps.

Reaching a Local Min

Graph Partitioning
Separate the nodes into two groups to minimize the number of edges between them.
[Figure: example graph on nodes A, B, C, D]

Objective Function
Penalty for imbalance: V(x) = cutsize(x) + (L − R)²
State space? Neighbors?
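A sketch of this objective in Python, assuming a state is a 0/1 list x indexed by node and edges is a list of node pairs (both representations are assumptions):

```python
def partition_score(x, edges):
    """V(x) = cutsize(x) + (L - R)**2: edges crossing the cut plus
    a quadratic penalty for unequal group sizes. Lower is better."""
    cutsize = sum(1 for (u, v) in edges if x[u] != x[v])
    L = x.count(0)            # nodes assigned to the left group
    R = len(x) - L            # nodes assigned to the right group
    return cutsize + (L - R) ** 2
```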

State Space
[Table: V(x) for all 16 assignments 0000 through 1111; the fully unbalanced assignments incur the maximum (L − R)² = 16 penalty. Numeric entries lost in extraction.]

Bumpy Landscape

Simulated Annealing
Jitter out of local minima. Let T be non-negative. Loop:
- Pick a random neighbor s' in N(s).
- Let D = V(s) − V(s') (the improvement).
- If D > 0 (better), s = s'.
- Else, with probability e^(D/T), s = s'.
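The loop above as a hedged Python sketch (minimization; schedule(i) supplies the temperature at step i and is an assumed helper):

```python
import math
import random

def simulated_annealing(start, neighbors, V, schedule, steps):
    """Always accept improving moves; accept worsening moves with
    probability exp(D / T), where D = V(s) - V(s') <= 0."""
    s = start
    for i in range(steps):
        T = schedule(i)                       # current temperature
        sp = random.choice(list(neighbors(s)))
        D = V(s) - V(sp)                      # improvement (positive = better)
        if D > 0 or (T > 0 and random.random() < math.exp(D / T)):
            s = sp
    return s
```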

Temperature
- As T goes to zero: hill climbing.
- As T goes to infinity: random walk.
In general, start with a large T and decrease it slowly. If T is decreased infinitely slowly, the algorithm will find an optimal solution. The analogy is with metal cooling.

Aside: Analogies
Natural phenomena inspire algorithms:
- Metal cooling
- Evolution
- Thermodynamics
- Societal markets
- Ant colonies
- Immune systems
- Neural networks, …

Annealing Schedule
After each step, update T:
- Logarithmic: T_i = c / log(i + 2). Provable properties, but slow.
- Geometric: T_i = T_0 · α^⌊i/L⌋. Lots of parameters to tune.
- Adaptive: change T to hit a target accept rate.
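Sketches of the first two schedules in Python; the constant names c, T0, alpha, and L are reconstructions (the original symbols were lost in extraction):

```python
import math

def logarithmic_schedule(c):
    """T_i = c / log(i + 2): provable convergence, but very slow cooling."""
    return lambda i: c / math.log(i + 2)

def geometric_schedule(T0, alpha, L):
    """T_i = T0 * alpha**(i // L): multiply the temperature by alpha
    every L steps; alpha in (0, 1) and L are tuning parameters."""
    return lambda i: T0 * alpha ** (i // L)
```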

Lam Schedule
Ad hoc, but impressive (see Boyan, pp. 190-191). The parameter is the run time:
- Start the target accept rate at 100%.
- Decrease it exponentially to 44% at 15% of the way through the run.
- Continue the exponential decline to 0% after 65% of the run.

Locality Assumption Why does local search work? Why not jump around at random? We believe scores change slowly from neighbor to neighbor. Perhaps keep a bunch of states and search around the good ones…

Genetic Algorithms
Search : population genetics ::
- state : individual
- state set : population
- score : fitness
- neighbors : offspring / mutation
- ? : sexual reproduction, crossover

Algorithmic Options
- Fitness function
- Representation of individuals
- Who reproduces? *
- How to alter individuals (mutation)?
- How to combine individuals (crossover)? *

Representation
“Classic” approach: states are fixed-length bit sequences.
Can be more complex objects: genetic programs, for example.
Crossover: one point, two point, uniform.

GA Example
[Figure: one generation showing reproduction, crossover, and mutation]

Generation to Next
G is a set of N (even) states. Let p(s) = V(s) / Σ_{s'} V(s'), and G' = { }.
For k = 1 to N/2:
- choose x, y with probabilities p(x), p(y)
- randomly swap bits to get x', y' (crossover)
- rarely, randomly flip bits in x', y' (mutation)
- G' = G' ∪ {x', y'}
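One generation of this loop as a Python sketch over bit-string states; the uniform bit-swap crossover and rare mutation follow the slide, but the mutation rate value is an assumption:

```python
import random

def next_generation(G, V, mutation_rate=0.01):
    """Fitness-proportional selection, uniform crossover (random bit
    swaps), and rare bit-flip mutation over 0/1-list states."""
    weights = [V(s) for s in G]
    Gp = []
    for _ in range(len(G) // 2):
        x, y = random.choices(G, weights=weights, k=2)
        xp, yp = list(x), list(y)
        for i in range(len(xp)):              # crossover: swap bits at random
            if random.random() < 0.5:
                xp[i], yp[i] = yp[i], xp[i]
        for child in (xp, yp):                # mutation: rare random flips
            for i in range(len(child)):
                if random.random() < mutation_rate:
                    child[i] = 1 - child[i]
        Gp += [xp, yp]
    return Gp
```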

GSAT
Score: number of unsatisfied clauses.
- Start with a random assignment.
- While not satisfied, flip the variable value that satisfies the greatest number of clauses.
- Repeat (until done or bored).
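A sketch of GSAT, assuming DIMACS-style clauses (lists of signed variable indices, where literal v means variable |v| is True if v > 0); max_flips is an assumed cutoff standing in for “bored”:

```python
import random

def gsat(clauses, n_vars, max_flips=100000):
    """From a random assignment, repeatedly flip the variable whose
    flip satisfies the most clauses."""
    def num_sat(a):
        return sum(any(a[abs(v)] == (v > 0) for v in c) for c in clauses)

    a = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        if num_sat(a) == len(clauses):
            return a                           # satisfying assignment found
        def score_if_flipped(v):
            a[v] = not a[v]
            s = num_sat(a)
            a[v] = not a[v]
            return s
        best = max(a, key=score_if_flipped)    # greedy flip
        a[best] = not a[best]
    return None                                # bored: give up
```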

GSAT Analysis What kind of local search is this most like? Hill climbing? Simulated annealing? GA? Known to perform quite well for coloring problems.

WALKSAT On each step, with probability p, do a greedy GSAT move. With probability 1-p, “walk” (flip a variable in an unsatisfied clause). Even faster for many problems (planning, random CNF).
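One WALKSAT step as a sketch, reusing the clause representation above; greedy_flip stands in for the greedy GSAT move and is an assumed helper:

```python
import random

def walksat_step(a, clauses, p, greedy_flip):
    """With probability p take a greedy GSAT move; otherwise 'walk':
    flip a random variable from a random unsatisfied clause."""
    if random.random() < p:
        greedy_flip(a)
    else:
        unsat = [c for c in clauses
                 if not any(a[abs(v)] == (v > 0) for v in c)]
        if unsat:
            v = abs(random.choice(random.choice(unsat)))
            a[v] = not a[v]
```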

WALKSAT Analysis What kind of local search is this most like? Hill climbing? Simulated annealing? GA? Known to perform quite well for hard random CNF problems.

Random k-CNF
Variables: n. Clauses: m. Variables per clause: k.
Randomly generate all m clauses by choosing k of the n variables for each clause at random. Randomly negate each one.
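A sketch of the generator, using the same signed-literal clause representation as above:

```python
import random

def random_kcnf(n, m, k):
    """m clauses over variables 1..n: each clause picks k distinct
    variables and negates each with probability 1/2."""
    return [[v if random.random() < 0.5 else -v
             for v in random.sample(range(1, n + 1), k)]
            for _ in range(m)]
```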

Satisfiable Formulae

Explaining Results
Three sections:
- m/n < 4.2, underconstrained: nearly all satisfiable.
- m/n > 4.3, overconstrained: nearly all unsatisfiable.
- 4.2 < m/n < 4.3: the transition region.

Phase Transition Under-constrained problems are easy, just guess an assignment. Over-constrained problems aren’t too bad, just say “unsatisfiable” and try to verify via DPLL. Transition region sharpens as n increases.

Local Search Discussion Often the second best way to solve any problem. Relatively easy to implement. So, good first attempt. If good enough, stop.

What to Learn Definition of hill climbing, simulated annealing, genetic algorithm. Definition of GSAT, WALKSAT. Relative advantages and disadvantages of local search.

Programming Project 1 (due 10/22)
Solve Rush Hour problems. Implement BFS, A* with a simple blocking heuristic, and A* with a more advanced heuristic. Input and output examples are available on the course web site. Speed and accuracy count.

Homework 4 (due 10/17)
1. In graph partitioning, call a balanced state one in which both sets contain the same number of nodes.
(a) How can we choose an initial state at random that is balanced?
(b) How can we define the neighbor set N so that every neighbor of a balanced state is balanced?
(c) Show that standard GA crossover (uniform) between two balanced states need not be balanced.
(d) Suggest an alternative crossover operator that preserves balance.
2. More soon…