Dr. Abeer Mahmoud, ARTIFICIAL INTELLIGENCE (CS 461D), Computer Science Department, Princess Nora University, Faculty of Computer & Information Systems



(CHAPTER 4) BEYOND CLASSICAL SEARCH

LOCAL SEARCH STRATEGY
- Hill-Climbing Search
- Simulated Annealing Search
- Local Beam Search
- Genetic Algorithms

Classical Search versus Local Search

Classical search:
- Systematic exploration of the search space.
- Keeps one or more paths in memory.
- Records which alternatives have been explored at each point along the path.
- The path to the goal is a solution to the problem.

Local search:
- In many optimization problems, the path to the goal is irrelevant; the goal state itself is the solution.
- State space = set of "complete" configurations.
- Find a configuration satisfying the constraints, or find the best state according to some objective function h(s); e.g., in n-queens, h(s) = number of attacking queens.
- In such cases, we can use local search algorithms.

Local Search Algorithms
- Local search algorithms keep a single "current" state and move to neighboring states in order to try to improve it.
- The solution path need not be maintained; hence the search is "local".
- Local search is suitable for problems in which the path is not important; the goal state itself is the solution.
- It is an optimization search.

Example: n-queens
- Put n queens on an n × n board with no two queens on the same row, column, or diagonal.
- In the 8-queens problem, what matters is the final configuration of queens, not the order in which they are added.
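The objective h(s) from the previous slide can be sketched in Python for this example. The one-queen-per-row list representation and the function name are illustrative, not from the slides:

```python
def attacking_pairs(board):
    """Objective h(s): number of pairs of queens attacking each other.

    board[i] is the column of the queen in row i, so no two queens
    share a row by construction; only columns and diagonals are checked.
    """
    n = len(board)
    h = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_column = board[i] == board[j]
            same_diagonal = abs(board[i] - board[j]) == j - i
            if same_column or same_diagonal:
                h += 1
    return h

print(attacking_pairs([0, 1, 2, 3]))  # all on one diagonal -> 6 pairs
print(attacking_pairs([1, 3, 0, 2]))  # a 4-queens solution -> 0
```

Local search then tries to move between complete boards so that h decreases toward 0.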

Local Search: Key Idea
1. Select a (random) initial state (generate an initial guess).
2. Make local modifications to improve the current state (evaluate the current state and move to other states).
3. Repeat step 2 until a goal state is found (or time runs out).

Local Search: Advantages and Drawback
Advantages:
- Uses very little memory, usually a constant amount.
- Can often find reasonable solutions in large or infinite (e.g., continuous) state spaces, for which systematic search is unsuitable.
Drawback:
- Local search can get stuck in local maxima and fail to find the optimal solution.

State Space Landscape
A state-space landscape is a graph of states associated with their costs.

Hill-Climbing Search

Hill-Climbing Search
- Main idea: keep a single current node and move to a neighboring state that improves it.
- Uses a loop that continually moves in the direction of increasing value (uphill):
  - Choose the best successor; choose randomly if there is more than one.
  - Terminate when a peak is reached where no neighbor has a higher value.
- It is also called greedy local search, or steepest ascent/descent.
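The loop above can be sketched in a few lines of Python. This is a minimal steepest-ascent version; the toy one-dimensional landscape and the callback names are illustrative assumptions, not from the slides:

```python
def hill_climb(initial, neighbors, value):
    """Steepest-ascent hill climbing: repeatedly move to the best
    neighbor; stop when no neighbor improves on the current state."""
    current = initial
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current  # a peak (possibly only a local maximum)
        current = best

# Toy landscape: maximize f(x) = -(x - 3)**2 over the integers,
# moving one step left or right at a time.
f = lambda x: -(x - 3) ** 2
step = lambda x: [x - 1, x + 1]
print(hill_climb(10, step, f))  # climbs to the global peak at x = 3
```

On this single-peaked landscape the climb always reaches the summit; the slides that follow show why that is not true in general.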


[Figure: hill-climbing animation across slides 13-18, plotting cost against states; the current best solution descends step by step until it is trapped at a local minimum short of the global minimum.]

Drawback: depending on the initial state, hill climbing can get stuck in a local maximum/minimum or on a flat local maximum and never find the solution. Cure: random restarts.

Simulated Annealing Search

The Problem
- Most minimization strategies find the nearest local minimum.
- Standard strategy:
  - Generate a trial point based on current estimates.
  - Evaluate the function at the proposed location.
  - Accept the new value if it improves the solution.

The Solution
- We need a strategy to find other minima.
- This means we must sometimes accept new points that do not improve the solution.
- How?

Annealing
- One manner in which crystals are formed.
- Gradual cooling of a liquid:
  - At high temperatures, molecules move freely.
  - At low temperatures, molecules are "stuck".
- If cooling is slow, a low-energy, organized crystal lattice is formed.

Simulated Annealing Search
- Main idea: escape local maxima by allowing some "bad" moves, but gradually decrease their frequency.
- Instead of picking the best move, it picks a random move.

Simulated Annealing Search
- Let Δ be the change in the objective function caused by a move.
- If Δ is positive, move to that state.
- Otherwise, move to that state with probability e^(Δ/T), where T is the current temperature.
- Thus worse moves (with very large negative Δ) are executed less often.
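The acceptance rule above can be sketched as follows. This is a minimal sketch for maximization; the callback names (`neighbor`, `value`, `schedule`) and the toy geometric cooling schedule are assumptions for illustration, not from the slides:

```python
import math
import random

def simulated_annealing(initial, neighbor, value, schedule):
    """Minimal simulated-annealing sketch (maximization).

    neighbor(state) returns a random successor, value(state) is the
    objective, and schedule(t) gives the temperature at step t
    (a non-positive temperature means stop).
    """
    current = initial
    t = 0
    while True:
        T = schedule(t)
        if T <= 0:
            return current
        nxt = neighbor(current)
        delta = value(nxt) - value(current)
        # Always accept improvements; accept worse moves with
        # probability e^(delta/T), which shrinks as T cools.
        if delta > 0 or random.random() < math.exp(delta / T):
            current = nxt
        t += 1

# Toy run: maximize f(x) = -(x - 3)**2 with geometric cooling.
random.seed(0)
f = lambda x: -(x - 3) ** 2
result = simulated_annealing(
    initial=10,
    neighbor=lambda x: x + random.choice([-1, 1]),
    value=f,
    schedule=lambda t: 0 if t >= 2000 else 5.0 * 0.99 ** t,
)
print(result)
```

Early on, with T large, bad moves are frequently accepted; as T approaches zero the behavior degenerates into ordinary hill climbing.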

[Figure: simulated-annealing animation across slides 25-47, plotting cost against states and tracking the best state found so far.]

Simulated Annealing Search: Example
Let's say there are 3 moves available, with changes in the objective function of d1 = -0.1, d2 = 0.5, d3 = -5 (let T = 1). Pick a move at random:
- If d2 is picked, move there.
- If d1 or d3 is picked, the probability of moving is exp(d/T):
  - Move 1: prob1 = exp(-0.1) ≈ 0.9, i.e., about 90% of the time we will accept this move.
  - Move 3: prob3 = exp(-5) ≈ 0.007, i.e., about 0.7% of the time we will accept this move.

Properties of Simulated Annealing
Cooling schedule: determines the rate at which the temperature T is lowered.

Local Beam Search

Local Beam Search
- Main idea: keep track of k states rather than just one.
- Start with k randomly generated states.
- At each iteration, all the successors of all k states are generated.
- If any one is a goal state, stop; otherwise select the k best successors from the complete list and repeat.
- Drawback: the k states tend to regroup very quickly in the same region, causing a lack of diversity.
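The steps above can be sketched directly. The callback names (`random_state`, `successors`, `value`, `is_goal`) and the toy integer-line problem are illustrative assumptions, not from the slides:

```python
import heapq
import random

def local_beam_search(k, random_state, successors, value, is_goal,
                      max_iters=100):
    """Local beam search sketch: keep the k best states each round."""
    beam = [random_state() for _ in range(k)]
    for _ in range(max_iters):
        # Pool together the successors of ALL k states.
        candidates = [s for state in beam for s in successors(state)]
        if not candidates:
            break
        for s in candidates:
            if is_goal(s):
                return s
        # Keep only the k highest-valued successors across the pool.
        beam = heapq.nlargest(k, candidates, key=value)
    return max(beam, key=value)

# Toy problem: find x = 7 on the integer line, stepping by +/- 1.
random.seed(1)
best = local_beam_search(
    k=3,
    random_state=lambda: random.randint(-20, 20),
    successors=lambda x: [x - 1, x + 1],
    value=lambda x: -abs(x - 7),
    is_goal=lambda x: x == 7,
)
print(best)  # prints 7
```

Note that pruning ranks the pooled successors jointly, which is exactly why the k states tend to crowd into the same region of the landscape.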

[Figure: local beam search animation across slides 52-67, plotting cost against states for the k parallel states.]

Genetic Algorithm Search

Genetic Algorithms
Formally introduced in the US in the 1970s by John Holland. GAs emulate ideas from genetics and natural selection and can search potentially large spaces. Before we can apply a genetic algorithm to a problem, we need to answer:
- How is an individual represented?
- What is the fitness function?
- How are individuals selected?
- How do individuals reproduce?

Genetic Algorithms: Representation of States (Solutions)
Each state or individual is represented as a string over a finite alphabet. It is also called a chromosome, which contains genes.
[Figure: the solution 607 encoded as a binary-string chromosome, whose bits are the genes.]
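The encoding in the figure can be sketched for the slide's value 607; the helper names are illustrative:

```python
def encode(solution, n_bits):
    """Encode an integer solution as a fixed-length binary chromosome."""
    return format(solution, "0{}b".format(n_bits))

def decode(chromosome):
    """Decode a binary chromosome back to the integer it represents."""
    return int(chromosome, 2)

chromosome = encode(607, 10)
print(chromosome)          # 1001011111
print(decode(chromosome))  # 607
```

Each character of the string is one gene; crossover and mutation then operate on these strings directly.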

Genetic Algorithms: Fitness Function
Each state is rated by an evaluation function called the fitness function. The fitness function should return higher values for better states: Fitness(X) should be greater than Fitness(Y), so on a cost landscape one can take Fitness(x) = 1/Cost(x).
[Figure: cost plotted against states, with state X at a lower cost than state Y.]

GA Parent Selection: Roulette Wheel
Roulette wheel selection:
1. Sum the fitnesses of all the population members; call the total TF.
2. Generate a random number m between 0 and TF.
3. Return the first population member whose fitness, added to the fitnesses of the preceding population members, is greater than or equal to m.
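The three steps above translate almost line for line into Python; the function and variable names are illustrative:

```python
import random

def roulette_select(population, fitnesses):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its fitness."""
    tf = sum(fitnesses)          # step 1: total fitness TF
    m = random.uniform(0, tf)    # step 2: random number in [0, TF]
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit           # step 3: first prefix sum >= m wins
        if running >= m:
            return individual
    return population[-1]        # guard against floating-point drift

# Fitter individuals are chosen more often over many draws.
random.seed(2)
picks = [roulette_select(["a", "b", "c"], [1, 2, 7]) for _ in range(10000)]
print(picks.count("c") / len(picks))  # roughly 0.7, matching 7/(1+2+7)
```

The wheel never excludes weak individuals outright; it only makes them proportionally less likely to be drawn.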

Genetic Algorithms: Selection
How are individuals selected? By roulette wheel selection. In the slide's example the population's fitnesses sum to 18: Rnd[0..18] = 7 selects Chromosome4, and Rnd[0..18] = 12 selects Chromosome6.

Genetic Algorithms: Crossover and Mutation
How do individuals reproduce?

Genetic Algorithms: Crossover (Recombination)
Single-point crossover at a random position: Parent1 and Parent2 exchange their tails to produce Offspring1 and Offspring2. With some high probability (the crossover rate), apply crossover to the parents; typical values are 0.8 to 0.95.

Genetic Algorithms: Mutation
With some small probability (the mutation rate), flip each bit in the offspring; typical values are between 0.001 and 0.1. The original offspring becomes a mutated offspring.
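Both reproduction operators can be sketched on binary-string chromosomes; the function names and the all-ones/all-zeros parents are illustrative, not from the slides:

```python
import random

def single_point_crossover(parent1, parent2):
    """Cut both parents at one random point and swap their tails."""
    point = random.randint(1, len(parent1) - 1)
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def mutate(chromosome, rate=0.01):
    """Flip each bit independently with probability `rate`."""
    return "".join(
        ("1" if bit == "0" else "0") if random.random() < rate else bit
        for bit in chromosome
    )

random.seed(3)
c1, c2 = single_point_crossover("11111111", "00000000")
print(c1, c2)  # two complementary strings sharing one crossover point
print(mutate(c1, rate=0.1))
```

Crossover recombines large blocks from two parents, while mutation injects small random changes that keep the population from stagnating.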

Genetic Algorithms: Algorithm
1. Initialize the population with p individuals at random.
2. For each individual h, compute its fitness.
3. While max fitness < threshold, create a new generation Ps (by selection, crossover, and mutation) and recompute fitnesses.
4. Return the individual with the highest fitness.
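The four steps can be tied together end to end on the classic OneMax toy problem (maximize the number of 1 bits). This is a sketch under assumed parameter values; the elitism step and all names are illustrative additions, not from the slide:

```python
import random

def one_max_ga(n_bits=20, pop_size=30, crossover_rate=0.9,
               mutation_rate=0.02, generations=200):
    """Tiny end-to-end GA for OneMax: fitness = number of 1 bits."""
    fitness = lambda ind: sum(ind)
    # Step 1: initialize the population at random.
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def roulette(pop):
        # Fitness-proportionate (roulette-wheel) parent selection.
        total = sum(fitness(ind) for ind in pop)
        m = random.uniform(0, total)
        running = 0
        for ind in pop:
            running += fitness(ind)
            if running >= m:
                return ind
        return pop[-1]

    for _ in range(generations):
        best = max(pop, key=fitness)
        # Step 3's stopping test: max fitness reached the threshold.
        if fitness(best) == n_bits:
            return best
        # Create a new generation; carrying the current best forward
        # (elitism) is an extra design choice, not on the slide.
        new_pop = [best[:]]
        while len(new_pop) < pop_size:
            p1, p2 = roulette(pop), roulette(pop)
            if random.random() < crossover_rate:
                point = random.randint(1, n_bits - 1)
                p1, p2 = p1[:point] + p2[point:], p2[:point] + p1[point:]
            for child in (p1, p2):
                new_pop.append([b ^ 1 if random.random() < mutation_rate else b
                                for b in child])
        pop = new_pop[:pop_size]
    # Step 4: return the fittest individual found.
    return max(pop, key=fitness)

random.seed(4)
best = one_max_ga()
print(sum(best))
```

Even this toy run shows the division of labor: selection supplies the pressure toward fitter strings, while crossover and mutation supply the variation it acts on.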

Genetic Algorithms
Crossover has the effect of "jumping" to a completely different new part of the search space (quite non-local).

Genetic Programming Search

Genetic Programming
Genetic programming (GP): programming of computers by means of simulated evolution. How do you program a computer without explicitly telling it what to do? Genetic programming is genetic algorithms where the solutions are programs.

Genetic Programming
- When the chromosome encodes an entire program or function, this is called genetic programming (GP).
- To make this work, encoding is often done in the form of a tree representation.
- Crossover entails swapping subtrees between parents.

Genetic Programming
It is possible to evolve whole programs like this, but only small ones; large programs with complex functions present big problems.

Thank you. End of Chapter 4.