Computer Science CPSC 322 Lecture A* and Search Refinements (Ch 3.6.1, 3.7.1, 3.7.2) Slide 1.

Course Announcements. Assignment 1: posted on Connect on Monday. Due: Monday Feb. 2, 2PM. Remember to post questions on the assignment or course material on the Connect board; do not email them to me or the TAs. Also, remember that you should expect a response time of up to 24h from the assignment team: if you post a question less than 24h before the assignment is due, you might not get an answer from the teaching team. Slide 2

Lecture Overview Recap of Lecture 7 A* Properties Cycle Checking and Multiple Path Pruning (time permitting) Slide 3

How to Make Search More Informed? Def.: A search heuristic h(n) is an estimate of the cost of the optimal (cheapest) path from node n to a goal node. [Figure: three nodes n1, n2, n3, each labeled with its estimate h(n1), h(n2), h(n3).] h can be extended to paths: h(⟨n0, …, nk⟩) = h(nk). h(n) should leverage readily obtainable information (easy to compute) about a node. Slide 4

Best First Search (BestFS). Always choose the path on the frontier with the smallest h value: BestFS treats the frontier as a priority queue ordered by h. It can get to the goal pretty fast if it has a good h, but it is neither complete nor optimal, and it still has worst-case time and space complexity of O(b^m). Slide 5
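The frontier-as-priority-queue idea above can be sketched in a few lines. This is a minimal illustration, not the course or AIspace implementation; the function names and the tiny graph are made up for the example:

```python
import heapq

def best_first_search(start, is_goal, neighbors, h):
    """Greedy best-first search: expand the frontier path whose end node
    has the smallest h value, ignoring path costs entirely."""
    frontier = [(h(start), [start])]            # priority queue ordered by h
    while frontier:
        _, path = heapq.heappop(frontier)       # path with the smallest h value
        node = path[-1]
        if is_goal(node):
            return path
        for nbr, _cost in neighbors(node):      # arc costs are ignored by BestFS
            heapq.heappush(frontier, (h(nbr), path + [nbr]))
    return None                                 # frontier exhausted, no goal found

# Tiny acyclic example: h lures the search to dead end B first,
# then it recovers and reaches G through A.
graph = {'S': [('A', 1), ('B', 1)], 'A': [('G', 1)], 'B': [], 'G': []}
hval  = {'S': 2, 'A': 1, 'B': 0, 'G': 0}
print(best_first_search('S', lambda n: n == 'G',
                        lambda n: graph[n], lambda n: hval[n]))
# → ['S', 'A', 'G']
```

Note that nothing stops this sketch from re-entering a cycle in a cyclic graph, which is exactly why BestFS is not complete.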

A* Search. A* search takes into account both the cost of the path to a node, c(p), and the heuristic value of that path, h(p). Let f(p) = c(p) + h(p): f(p) is an estimate of the cost of a path from the start to a goal via p. A* always chooses the path on the frontier with the lowest estimated distance from the start to a goal node constrained to go via that path. Compare A* and LCFS on the Vancouver graph. Slide 6
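The only change from BestFS is the ordering of the priority queue: f(p) = c(p) + h(p) instead of h alone. A minimal sketch (illustrative names and toy graph, not the course code):

```python
import heapq

def a_star(start, is_goal, neighbors, h):
    """A*: expand the frontier path p with the smallest f(p) = c(p) + h(p)."""
    frontier = [(h(start), 0, [start])]         # (f value, cost c(p), path)
    while frontier:
        f, cost, path = heapq.heappop(frontier)
        node = path[-1]
        if is_goal(node):                       # goal test on expansion, not generation
            return path, cost
        for nbr, arc_cost in neighbors(node):
            g = cost + arc_cost
            heapq.heappush(frontier, (g + h(nbr), g, path + [nbr]))
    return None

# h is admissible here; A* returns the cheapest path S -> B -> G (cost 5),
# even though the first arc through A is cheaper.
graph = {'S': [('A', 1), ('B', 4)], 'A': [('G', 5)], 'B': [('G', 1)], 'G': []}
hval  = {'S': 4, 'A': 5, 'B': 1, 'G': 0}
print(a_star('S', lambda n: n == 'G', lambda n: graph[n], lambda n: hval[n]))
# → (['S', 'B', 'G'], 5)
```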

Computing f-values. F-value of ubc → kd → jb? 10. Slide 7

Optimality of A*. A* is complete (finds a solution, if one exists) and optimal (finds the optimal path to a goal) if: the branching factor is finite; arc costs are > 0; h(n) is admissible. Slide 8

Admissibility of a heuristic. Def.: Let c(n) denote the cost of the optimal path from node n to any goal node. A search heuristic h(n) is called admissible if h(n) ≤ c(n) for all nodes n, i.e. if for all nodes it is an underestimate of the cost to any goal. Example: is the straight-line distance admissible? YES: the shortest distance between two points is a straight line. Slide 9

Admissibility of a heuristic. Example: the goal is Urziceni (red box), but all we know is the straight-line distance to Bucharest (green box). Possible h(n) = sld(n, Bucharest) + cost(Bucharest, Urziceni). Admissible? NO: the cost of going from Vaslui to Urziceni is shorter than this estimate. Slide 10

Example 3: Eight Puzzle. Another possible h(n): sum of the number of moves between each tile's current position and its goal position (we can move over other tiles in the grid). Slide 11

Example 3: Eight Puzzle. Another possible h(n): sum of the number of moves between each tile's current position and its goal position. sum(…) = 18. Admissible? A. Yes B. No C. It depends. Slide 12

Example 3: Eight Puzzle Another possible h(n): Sum of number of moves between each tile's current position and its goal position sum = 18 Admissible? YES! Slide 13
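This "sum of moves" heuristic is the Manhattan distance. The slide's board is not reproduced in the transcript, so the sketch below uses the standard textbook example instance (an assumption), which also sums to 18:

```python
def manhattan(state, goal):
    """Sum over non-blank tiles of the row distance plus column distance
    between a tile's position in `state` and its position in `goal`.
    Boards are tuples of 9 entries read row by row; 0 is the blank."""
    return sum(abs(i // 3 - goal.index(t) // 3) + abs(i % 3 - goal.index(t) % 3)
               for i, t in enumerate(state) if t != 0)

goal  = (0, 1, 2, 3, 4, 5, 6, 7, 8)
start = (7, 2, 4, 5, 0, 6, 8, 3, 1)    # textbook example board (an assumption)
print(manhattan(start, goal))          # → 18, matching the sum on the slide
```

Each tile's contribution is the number of moves it would need if it could slide over the other tiles, so the total never overestimates the true number of moves.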

How to Construct an Admissible Heuristic. Identify a relaxed version of the problem, one where one or more constraints have been dropped, i.e. a problem with fewer restrictions on the actions. Grid world: the agent can move through walls. Driver: the agent can travel in a straight line, ignoring roads. 8-puzzle: for "number of misplaced tiles", tiles can move anywhere and occupy the same spot as others; for "sum of moves between current and goal position", tiles can occupy the same spot as others. Why does this lead to an admissible heuristic? The relaxed problem can only be easier: it is less costly to solve, so its optimal cost never overestimates the cost in the original problem. Slide 14

Back to A*. A* is complete (finds a solution, if one exists) and optimal (finds the optimal path to a goal) if: the branching factor is finite; arc costs are > 0; h(n) is admissible. Slide 15

Lecture Overview Recap of Lecture 7 A* Properties Cycle Checking and Multiple Path Pruning (time permitting) Slide 16

Proof. Let p be a subpath of an optimal path p*. We can show that if h is admissible, then f(p) ≤ f(p*). 1. f(p*) = c(p*) + h(p*) = c(p*). Because? Slide 17

Proof (cont.). Let p be a subpath of an optimal path p*. We can show that if h is admissible, then f(p) ≤ f(p*). 1. f(p*) = c(p*) + h(p*) = c(p*), because h is admissible and thus cannot be > 0 at the goal: it must be that h(p*) = 0. 2. f(p) = c(p) + h(p) ≤ c(p*) = f(p*). Slide 18

Proof (cont.). Let p be a subpath of an optimal path p*. We can show that if h is admissible, then f(p) ≤ f(p*). 1. f(p*) = c(p*) + h(p*) = c(p*), because h is admissible and thus cannot be > 0 at the goal: it must be that h(p*) = 0. 2. f(p) = c(p) + h(p) ≤ c(p*) = f(p*), because h(p) does not overestimate the cost of getting from p to the goal. Thus f(p) ≤ f(p*) for every subpath p of an optimal path p*. Slide 19
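The two steps of the proof can be combined into a single chain, writing cost(p, goal) for the cost of completing p along p* (notation assumed for this sketch):

```latex
f(p) = c(p) + h(p)
     \le c(p) + \text{cost}(p, \text{goal})   % h admissible: it never overestimates
     = c(p^*)
     = c(p^*) + h(p^*)                        % h(p^*) = 0 at a goal
     = f(p^*)
```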

Why is A* complete? It halts (does not get caught in cycles). Let f_min be the cost of the (an) optimal solution path s (unknown but finite if there exists a solution). Let c_min > 0 be the minimal cost of any arc. Each subpath p of s has f(p) ≤ f_min, due to admissibility (see previous slide). A* expands paths on the frontier with minimal f(n); there is always a subpath of s on the frontier, so A* only expands paths p with f(p) ≤ f_min and terminates when expanding s. This is because, with positive arc costs, the cost of any other path p on the frontier would eventually exceed f_min, at depth no greater than f_min / c_min. See how it works on the "misleading heuristic" problem in AIspace. Slide 20

Why is A* complete? A* does not get caught in the cycle because the f(n) of subpaths in the cycle eventually (at depth ≤ 55.4/6.9) exceeds the cost of the optimal solution, 55.4 (N0 → N6 → N7 → N8). Slide 21


Why A* is optimal. Let p* be the optimal solution path, with cost c*. Let p' be a suboptimal solution path, that is, c(p') > c*. We are going to show that any subpath p″ of p* on the frontier will be expanded before p'. Therefore, A* will find p* before p'. Slide 23

Why A* is optimal (cont.). Let p* be the optimal solution path, with cost c*. Let p' be a suboptimal solution path, that is, c(p') > c*. Let p″ be a subpath of p* on the frontier. What do we know about f at a goal node? And what are the relationships between f(p″) and f(p*), and between f(p″) and f(p')? Slide 24

Why A* is optimal (cont.). Let p* be the optimal solution path, with cost c*. Let p' be a suboptimal solution path, that is, c(p') > c*. Let p″ be a subpath of p* on the frontier. At a goal node f = c, because h(goal) = 0: so f(p') = c(p') > c* = f(p*), thus f(p*) < f(p'). And because h is admissible, f(p″) ≤ f(p*). Thus f(p″) < f(p'): any subpath of the optimal solution path will be expanded before p'. Slide 25
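The argument on this slide is, in one chain of (in)equalities:

```latex
f(p'') \le f(p^*) = c(p^*) = c^* < c(p') = f(p')
```

The first step holds by admissibility (the earlier proof), the equalities because h = 0 at a goal node, and the strict inequality because p' is suboptimal.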

Recap: A* is optimal and complete. f_min = cost of the (an) optimal solution path s (unknown but finite if there exists a solution). Any subpath p of s has f(p) ≤ f_min (due to admissibility, as shown in a previous slide), and there is always one such p on the frontier. These properties allowed us to prove that A* only expands paths p with f(p) ≤ f_min. Then we're basically done! There is only a finite number of such paths (⇒ completeness), and a solution with cost > f_min won't be expanded (⇒ optimality). Slide 26

Run A* on this example to see how A* starts off going down the suboptimal path (through N5) but then recovers and never expands it, because there are always subpaths of the optimal path through N2 on the frontier Slide 27 27


Analysis of A*. In fact, we can say something even stronger about A* (when h is admissible): A* is optimally efficient among the algorithms that extend the search path from the initial state. It finds the goal with the minimum number of expansions. Slide 29

Why A* is Optimally Efficient No other optimal algorithm is guaranteed to expand fewer nodes than A* (given the same heuristic function) This is because any algorithm that does not expand every node with f(n) < f* risks missing the optimal solution 30

Analysis of A* If the heuristic is completely uninformative and the edge costs are all the same, A* is equivalent to…. A.BFS B.LCFS C.DFS D.None of the Above 31

Analysis of A*. If the heuristic is completely uninformative and the edge costs are all the same, A* is equivalent to… BFS. (I will accept LCFS as correct as well, because when edge costs are all equal LCFS is equivalent to BFS.) Slide 32

Time and Space Complexity of A*. Time complexity is O(b^m): the heuristic could be completely uninformative and the edge costs could all be the same, meaning that A* does the same thing as BFS. Space complexity is O(b^m): like BFS, A* maintains a frontier which grows with the size of the tree. Slide 33

Effect of Search Heuristic. A search heuristic that is a better approximation of the actual cost reduces the number of nodes expanded by A*. Example, 8-puzzle: (1) tiles can move anywhere (h1: number of tiles that are out of place); (2) tiles can move to any adjacent square (h2: sum of the number of squares that separate each tile from its correct position). Average number of paths expanded (d = depth of the solution):
d = 12: IDS = 3,644,035 paths; A*(h1) = 227 paths; A*(h2) = 73 paths
d = 24: IDS = too many paths; A*(h1) = 39,135 paths; A*(h2) = 1,641 paths
Slide 34
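The effect can be reproduced in miniature by running A* with both heuristics on the same 8-puzzle instance and counting expansions. This is a sketch, not the course code: the board instance, the state encoding, and the use of multiple-path pruning (a later topic) are all assumptions made for the example.

```python
import heapq
from itertools import count

GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)     # blank = 0, board read row by row

def h1(state):
    """Misplaced tiles: relaxation where tiles can move anywhere."""
    return sum(1 for s, g in zip(state, GOAL) if s != 0 and s != g)

def h2(state):
    """Manhattan distance: tiles move to adjacent squares, may overlap."""
    return sum(abs(i // 3 - GOAL.index(t) // 3) + abs(i % 3 - GOAL.index(t) % 3)
               for i, t in enumerate(state) if t != 0)

def neighbors(state):
    """Slide one tile into the blank; every move costs 1."""
    b = state.index(0)
    r, c = divmod(b, 3)
    for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
        if 0 <= nr < 3 and 0 <= nc < 3:
            s = list(state)
            j = nr * 3 + nc
            s[b], s[j] = s[j], s[b]
            yield tuple(s)

def a_star(start, h):
    """A* with multiple-path pruning; returns (solution cost, paths expanded)."""
    tie = count()                       # tie-breaker so the heap never compares states
    frontier = [(h(start), 0, next(tie), start)]
    done = set()
    expanded = 0
    while frontier:
        _, g, _, state = heapq.heappop(frontier)
        if state in done:
            continue                    # a cheaper path to this state was expanded
        done.add(state)
        expanded += 1
        if state == GOAL:
            return g, expanded
        for nbr in neighbors(state):
            if nbr not in done:
                heapq.heappush(frontier, (g + 1 + h(nbr), g + 1, next(tie), nbr))

start = (0, 4, 2, 1, 3, 8, 6, 5, 7)    # a board scrambled 8 blank-moves from GOAL
cost1, n1 = a_star(start, h1)
cost2, n2 = a_star(start, h2)
print(cost1, n1, cost2, n2)            # both costs are 8; h2 expands fewer paths
```

Both heuristics are admissible, so both runs return the same optimal cost; because h2 dominates h1 (h1(n) ≤ h2(n) everywhere), the h2 run expands no more paths.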

Learning Goals. Apply basic properties of search algorithms: completeness, optimality, time and space complexity.
DFS: Complete N (Y if no cycles); Optimal N; Time O(b^m); Space O(mb)
BFS: Complete Y; Optimal Y; Time O(b^m); Space O(b^m)
IDS: Complete Y; Optimal Y; Time O(b^m); Space O(mb)
LCFS (when arc costs available): Complete Y (costs > 0); Optimal Y (costs ≥ 0); Time O(b^m); Space O(b^m)
Best First (when h available): Complete N; Optimal N; Time O(b^m); Space O(b^m)
A* (when arc costs > 0 and h admissible): Complete Y; Optimal Y; Time O(b^m); Space O(b^m)
Slide 35

TODO. More Sophisticated Search: Ch . Do practice exercise 3C in AIspace. HW 1 is posted on Connect (due Monday Feb. 2). Slide 36