Computer Science CPSC 322, Lecture 16: CSP wrap-up; Planning: Intro (Ch 8.1). Slide 1

Announcements
- Practice short questions for the midterm are available in Vista. They are important for testing basic understanding of the material; some questions on the exam will be very similar/related to these.
- Assignment 2 is due at 11:59 pm tonight.
- Remember all the rules on plagiarism discussed during the first class and in the syllabus:
  - You are allowed to collaborate OFFICIALLY with one partner.
  - Any other form of using someone else's work is plagiarism, and will be dealt with as per UBC regulations.
  - Both those who copy and those who allow others to copy are considered at fault.
Slide 2

Lecture Overview
- Recap of Lecture 15
- SLS wrap-up: one more algorithm
- Planning: Intro
Slide 3

What are we going to look at in AIspace
You can play with these alternatives in the AIspace Hill applet:
- Random sampling
- Random walk
- Greedy Descent
- Greedy Descent Min-conflict
- Greedy Descent with random walk
- Greedy Descent with random restart
- …
One-stage selection of variable and value:
- Sometimes choose the best neighbour
- Sometimes choose a random variable-value pair
Two-stage selection (first select variable V, then a new value for V):
- Selecting variables:
  - Sometimes choose the variable that participates in the largest number of conflicts
  - Sometimes choose a random variable that participates in some conflict
  - Sometimes choose a random variable
- Selecting values:
  - Sometimes choose the best value for the chosen variable
  - Sometimes choose a random value for the chosen variable
(A sketch of one two-stage step is shown below.)
Slide 4
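As a concrete illustration of the two-stage selection above, here is a minimal Python sketch of one greedy-descent step with a random-walk component. This is not the AIspace applet's code; the constraint representation (a list of (scope, predicate) pairs) and the probabilities p_random_var / p_random_val are assumptions for illustration.

```python
import random

def violated(constraints, assignment):
    """Constraints violated by a complete assignment.
    Each constraint is assumed to be a (scope, predicate) pair."""
    return [(scope, pred) for scope, pred in constraints
            if not pred(*[assignment[v] for v in scope])]

def two_stage_step(assignment, domains, constraints,
                   p_random_var=0.2, p_random_val=0.2):
    """One two-stage greedy-descent step with random walk (sketch).
    Stage 1: with probability p_random_var pick a random conflicted variable,
    otherwise the variable that participates in the most conflicts.
    Stage 2: with probability p_random_val pick a random value for it,
    otherwise the value that leaves the fewest violated constraints."""
    broken = violated(constraints, assignment)
    conflicted = {v for scope, _ in broken for v in scope}
    if not conflicted:
        return assignment  # already a solution

    # Stage 1: choose the variable.
    if random.random() < p_random_var:
        var = random.choice(sorted(conflicted))
    else:
        var = max(conflicted,
                  key=lambda v: sum(1 for scope, _ in broken if v in scope))

    # Stage 2: choose the new value for that variable.
    if random.random() < p_random_val:
        val = random.choice(domains[var])
    else:
        val = min(domains[var],
                  key=lambda d: len(violated(constraints, dict(assignment, **{var: d}))))

    return dict(assignment, **{var: val})
```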

Comparing SLS algorithms
Use runtime distributions:
- Perform many runs (e.g., below: 1000 runs)
- Consider the empirical distribution of the runtimes
Slide 5

Comparing runtime distributions
- x axis: runtime (or number of steps); typically use a log scale on the x axis
- y axis: proportion (or number) of runs solved within that runtime, i.e., the fraction of solved runs, P(solved by this # of steps/time)
[Plot of three runtime distributions: one algorithm solves 28% of runs within 10 steps and then stagnates; another solves 57% within 80 steps and then stagnates; a third is slow but does not stagnate.]
- Crossover point: if we run longer than 80 steps, green is the best algorithm; if we run fewer than 10 steps, red is the best algorithm
You can see how this works in AIspace. (A sketch for collecting such a distribution is shown below.)
Slide 6
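A minimal sketch of how such an empirical runtime distribution could be collected. The `solve(max_steps)` interface, the number of runs, and the checkpoints are assumptions for illustration, not part of the lecture.

```python
def runtime_distribution(solve, n_runs=1000, max_steps=1000,
                         checkpoints=(1, 3, 10, 30, 100, 300, 1000)):
    """Empirical runtime distribution over many independent runs (sketch).
    `solve(max_steps)` is an assumed interface: one randomized SLS run that
    returns the number of steps it needed, or None if no solution was found
    within max_steps."""
    runtimes = [solve(max_steps) for _ in range(n_runs)]
    solved = [r for r in runtimes if r is not None]
    # Checkpoints are roughly log-spaced, matching the log-scaled x axis.
    for s in checkpoints:
        frac = sum(1 for r in solved if r <= s) / n_runs
        print(f"P(solved within {s:5d} steps) = {frac:.0%}")
```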

SLS limitations
- Typically no guarantee of finding a solution even if one exists
  - SLS algorithms can sometimes stagnate: get caught in one region of the search space and never terminate
  - Very hard to analyze theoretically
- Not able to show that no solution exists
  - SLS simply won't terminate
  - You don't know whether the problem is infeasible or the algorithm has stagnated
Slide 7

SLS advantage: anytime algorithms
When do you stop?
- When you know the solution found is optimal (e.g., no constraint violations)
- Or when you're out of time: you have to act NOW
Anytime algorithm:
- Maintain the node with the best h found so far (the "incumbent")
- Given more time, it can improve its incumbent
(A sketch of such a loop is shown below.)
Slide 8
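A minimal sketch of the anytime idea: keep the best assignment found so far (the incumbent) and return it whenever time runs out. The `neighbour`/`h` interfaces and the time budget are assumptions for illustration.

```python
import time

def anytime_sls(initial, neighbour, h, time_budget_s=1.0):
    """Anytime local search loop (sketch, minimizing h).
    `neighbour(a)` proposes the next assignment to move to and `h(a)`
    scores it; both are assumed interfaces. The incumbent only ever
    improves, so the best answer so far is always available when we
    must act."""
    current = incumbent = initial
    best_h = h(initial)
    deadline = time.monotonic() + time_budget_s
    while time.monotonic() < deadline and best_h > 0:
        current = neighbour(current)
        score = h(current)
        if score < best_h:
            incumbent, best_h = current, score
    return incumbent, best_h
```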

SLS pros: dynamically changing problems
- The problem may change over time; this is particularly important in scheduling
- E.g., a schedule for an airline:
  - Thousands of flights and thousands of personnel assignments
  - A storm can render the schedule infeasible
  - Goal: repair the schedule with a minimum number of changes
- This is often easy for SLS, starting from the current schedule
- Other techniques usually:
  - Require more time
  - Might find a solution requiring many more changes
Slide 9

Lecture Overview
- Recap of Lecture 15
- SLS wrap-up: one more algorithm
- Planning: Intro
Slide 10

Simulated Annealing
- Annealing: a metallurgical process in which metals are hardened by being slowly cooled
- Analogy: start with a high "temperature": a high tendency to take random steps
- Over time, cool down: become more likely to follow the scoring function
- The temperature is reduced over time, according to an annealing schedule
- Key idea: change the degree of randomness during execution
Slide 11

Simulated Annealing: algorithm
Here's how it works:
- You are at a node n. Pick a variable at random and a new value at random. This generates n'.
- If n' is an improvement, i.e., h(n) – h(n') > 0, adopt it.
- If it isn't an improvement, adopt it probabilistically, depending on the difference h(n) – h(n') and a temperature parameter T: move to n' with probability e^((h(n) – h(n'))/T).
Slide 12

Simulated Annealing: algorithm
If there is no improvement, move to n' with probability e^((h(n) – h(n'))/T), where h(n) – h(n') < 0:
- The higher T, the higher the probability of selecting a non-improving node, for the same h(n) – h(n')
- The higher |h(n) – h(n')|, the more negative h(n) – h(n') is, and the lower the probability of selecting a non-improving node, for the same T
(A Python sketch follows.)
Slide 13
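Putting the two slides above together, here is a minimal Python sketch of the acceptance rule and a simple geometric cooling schedule. The initial temperature, cooling rate, and step count are illustrative assumptions; as the next slide notes, finding a good schedule is problem dependent.

```python
import math
import random

def sa_accept(h_n, h_n_prime, T):
    """Simulated annealing acceptance rule (sketch, minimizing h).
    Improvements are always accepted; a non-improving move to n' is
    accepted with probability e^((h(n) - h(n')) / T)."""
    delta = h_n - h_n_prime          # > 0 means n' is an improvement
    if delta > 0:
        return True
    return random.random() < math.exp(delta / T)

def simulated_annealing(initial, neighbour, h, T0=10.0, cooling=0.995, steps=10_000):
    """Outer loop with a simple geometric annealing schedule (an assumption)."""
    current, T = initial, T0
    for _ in range(steps):
        candidate = neighbour(current)   # random variable + random value, as on Slide 12
        if sa_accept(h(current), h(candidate), T):
            current = candidate
        T *= cooling                     # temperature reduces over time
    return current
```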

Properties of simulated annealing search
- If T decreases slowly enough, then simulated annealing search will find a global optimum with probability approaching 1
- But "slowly enough" usually means longer than the time required by an exhaustive search of the assignment space
- Finding a good annealing schedule is "an art" and very much problem dependent
- Widely used in VLSI layout, airline scheduling, etc.
Slide 14

Tabu Search
- Mark partial assignments as tabu (taboo)
- Prevents repeatedly visiting the same (or similar) local minima
- Maintain a queue of k variable/value pairs that are taboo
  - E.g., when changing a variable V's value from 2 to 4, we cannot change it back to 2 for the next k steps
- k is a parameter that needs to be tuned empirically
(A sketch of one tabu step is shown below.)
Slide 15
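A minimal sketch of one tabu step, using a fixed-length queue of forbidden (variable, value) pairs. Evaluating every neighbour with a full h and omitting aspiration criteria are simplifying assumptions for illustration.

```python
from collections import deque

def tabu_step(assignment, domains, h, tabu):
    """One step of a simple tabu search (sketch, minimizing h).
    `tabu` is a deque (e.g. deque(maxlen=k)) of (variable, value) pairs
    that may not be re-assigned until they fall out of the queue; `h`
    scores a complete assignment."""
    best_move, best_h = None, float("inf")
    for var, domain in domains.items():
        for val in domain:
            if val == assignment[var] or (var, val) in tabu:
                continue
            score = h(dict(assignment, **{var: val}))
            if score < best_h:
                best_move, best_h = (var, val), score
    if best_move is None:
        return assignment                 # every move is currently tabu
    var, val = best_move
    tabu.append((var, assignment[var]))   # can't change var back to its old value for k steps
    return dict(assignment, **{var: val})

# Illustrative use: tabu = deque(maxlen=10); assignment = tabu_step(assignment, domains, h, tabu)
```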

Population-Based SLS
- Often we have more memory available than is required for maintaining just the current node (e.g., best so far + tabu list)
- Key idea: maintain a population of k individuals
  - At every stage, update your population
  - Whenever one individual is a solution, report it
- E.g., Stochastic Beam Search, Genetic Algorithms
- Not required for this course, but you can see how they work in Ch. 4.9 if interested
Slide 16

SLS for Constraint Optimization Problems
Constraint Satisfaction Problems:
- Hard constraints: need to satisfy all of them
- All models are equally good
Constraint Optimization Problems:
- Hard constraints: need to satisfy all of them
- Soft constraints: need to satisfy them as well as possible
- Can have weighted constraints
  - Minimize h(n) = sum of the weights of the constraints unsatisfied in n
  - Hard constraints have a very large weight
  - Some soft constraints can be more important than other soft constraints → larger weight
All local search methods discussed work just as well for constraint optimization; all they need is an evaluation function h. (A sketch of such a weighted h is shown below.)
Slide 17
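A minimal sketch of such a weighted evaluation function. The constraint representation (weight, scope, predicate), the "very large" hard-constraint weight, and the example constraints are assumptions for illustration, loosely echoing the exam-scheduling example on the next slide.

```python
def weighted_h(assignment, constraints):
    """h(n) = sum of the weights of the constraints unsatisfied in n (sketch).
    Each constraint is assumed to be a (weight, scope, predicate) triple;
    hard constraints get a very large weight, soft ones smaller weights."""
    return sum(weight
               for weight, scope, pred in constraints
               if not pred(*[assignment[v] for v in scope]))

# Illustrative use (weights and constraints are made up, not from the lecture):
HARD = 10**6
constraints = [
    (HARD, ("examA", "examB"), lambda a, b: a != b),          # hard: not in the same slot
    (10,   ("examA", "examB"), lambda a, b: abs(a - b) > 1),  # soft: not back to back
    (1,    ("examA", "examC"), lambda a, c: a != c),          # soft: spread exams out
]
print(weighted_h({"examA": 1, "examB": 2, "examC": 3}, constraints))  # -> 10
```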

Example constraint optimization problem: exam scheduling
Hard constraints:
- Cannot have an exam in too small a room
- Cannot have multiple exams in the same room in the same time slot
- …
Soft constraints:
- Students should not have to write two exams back to back (important)
- Students should not have multiple exams on the same day
- It would be nice if students had their exams spread out
- …
Slide 18

Successful application of SLS
Scheduling of the Hubble Space Telescope: the time to schedule 3 weeks of observations was reduced from one week to around 10 seconds.
Slide 19

Example: SLS for RNA secondary structure design
- An RNA strand is made up of four bases: cytosine (C), guanine (G), adenine (A), and uracil (U)
- Which 2D/3D structure an RNA strand folds into is important for its function
- Predicting the structure for a strand is "easy": O(n^3)
- But what if we want a strand that folds into a certain structure?
  - Local search over strands: search for one that folds into the right structure
  - Evaluation function for a strand: run the O(n^3) prediction algorithm and evaluate how different the result is from our target structure
[Figure: an RNA strand (e.g., GUCCCAUAGGAUGUCCCAUAGGA) and its secondary structure; predicting the structure from a strand is easy, designing a strand for a given structure is hard]
- One of the best algorithms to date: the local search algorithm RNA-SSD, developed at UBC [Andronescu, Fejes, Hutter, Condon, and Hoos, Journal of Molecular Biology, 2004]
Slide 20

Learning Goals for Local Search
- Implement local search for a CSP:
  - Implement different ways to generate neighbors
  - Implement scoring functions to solve a CSP by local search through either greedy descent or hill-climbing
- Implement SLS with:
  - random steps (1-step, 2-step versions)
  - random restart
- Compare SLS algorithms with runtime distributions
- Explain the functioning of Simulated Annealing
Slide 21