Computer Science CPSC 322, Lecture 7


Heuristic Search. Computer Science CPSC 322, Lecture 7 (Textbook Chpt 3.6). May 23, 2017.

Course Announcements. Assignment 1: posted. If you are confused about the basic search algorithm or the different search strategies, check the learning goals at the end of the lectures, work on the practice exercises, and please do come to office hours:
Giuseppe: Fri 8:30-9:30, my office, CICSR 105
Johnson, David (davewj@cs.ubc.ca) Office hour: ICCS X141, Wed 1-2:30pm
Johnson, Jordon (jordon@cs.ubc.ca) Office hour: ICCS X141, Mon 11am-1pm
Kazemi, S. Mehran (smkazemi@cs.ubc.ca) Office hour: ICCS X141, Wed 2:30-4pm
Rahman, MD Abed (abed90@cs.ubc.ca) Office hour: ICCS X141, Fri 3-4:30pm
Wang, Wenyi (wenyi.wang@alumni.ubc.ca) Office hour: ICCS X141, Mon 1-2:30pm

Course Announcements: Inked Slides. At the end of each lecture I revise and clean up the slides, adding comments and improving the writing. Make sure you check them out.

Lecture Overview: Recap: Search with Costs; Summary: Uninformed Search; Heuristic Search.

Recap: Search with Costs. Sometimes there are costs associated with arcs. The cost of a path is the sum of the costs of its arcs. The optimal solution is not the one that minimizes the number of links, but the one that minimizes cost. Lowest-Cost-First Search expands paths from the frontier in order of their cost. In this setting we often don't want to find just any solution; instead, we usually want the solution that minimizes cost. We call a search algorithm that always finds such a solution optimal.
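To make the recap concrete, here is a minimal Python sketch of Lowest-Cost-First Search, assuming a neighbors(n) helper that returns (next_node, arc_cost) pairs and an is_goal test; these names and the tiny graph in the usage line are illustrative, not course code.

import heapq, itertools

def lowest_cost_first_search(neighbors, start, is_goal):
    # neighbors(n) -> iterable of (next_node, arc_cost) pairs
    counter = itertools.count()                  # tie-breaker so heapq never compares paths
    frontier = [(0, next(counter), [start])]     # priority queue of (path_cost, _, path)
    while frontier:
        cost, _, path = heapq.heappop(frontier)  # cheapest path on the frontier
        node = path[-1]
        if is_goal(node):
            return cost, path
        for nxt, arc_cost in neighbors(node):
            heapq.heappush(frontier, (cost + arc_cost, next(counter), path + [nxt]))
    return None

# e.g.: lowest_cost_first_search({'S': [('A', 2)], 'A': []}.get, 'S', lambda n: n == 'A')

Because the frontier is ordered by path cost, the first goal popped lies on a lowest-cost path (given positive arc costs), which is exactly the optimality property discussed above.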

Recap: Uninformed Search

              Complete           Optimal             Time      Space
DFS           N                  N                   O(b^m)    O(mb)
BFS           Y                  Y                   O(b^m)    O(b^m)
IDS           Y                  Y                   O(b^m)    O(mb)
LCFS          Y (costs > 0)      Y (costs >= 0)      O(b^m)    O(b^m)

(Assume b is finite; for BFS and IDS, optimality assumes arc costs are all identical.)
R&N: in general, iterative deepening is the preferred uninformed search strategy when the search space is large and the depth of the solution is not known.

Recap: Uninformed Search. Why are all these strategies called uninformed? Because they do not consider any information about the states (the nodes at the end of the paths) to decide which path to expand first on the frontier, e.g., (n0, n2, n3) → 12, (n0, n3) → 8, (n0, n1, n4) → 13. In other words, they are general: they do not take into account the specific nature of the problem. An uninformed search algorithm can therefore be implemented once and, thanks to abstraction, the same implementation can be used on a wide range of problems. The drawback is that most search spaces are extremely large, and an uninformed search (especially of a tree) will take a reasonable amount of time only for small examples. To speed up the process, sometimes only an informed search will do.

Lecture Overview: Recap: Search with Costs; Summary: Uninformed Search; Heuristic Search.

Beyond uninformed search… What information could we use to better select paths from the frontier?
A. An estimate of the distance from the last node on the path to the goal
B. An estimate of the distance from the start state to the goal
C. An estimate of the cost of the path
D. None of the above
(Here we are not looking inside the state at the end of the path; we are asking: how close is this to the goal?)

Heuristic Search. Uninformed (blind) search algorithms do not take the goal into account until they are at a goal node. Often there is extra knowledge that can be used to guide the search: an estimate of the distance from node n to a goal node. This is called a heuristic. Examples of h(n): the actual distance from a location to the goal; the number of atoms still to be resolved; in the eight puzzle, the number of tiles that are out of place, or the sum of the Manhattan distances of each tile from its goal location. (Note: in the assignment for this module the question came up of whether h should be > 0; the book says so, and it makes sense because it is an estimate of a cost.)

More formally. Definition (search heuristic): a search heuristic h(n) is an estimate of the cost of the shortest path from node n to a goal node. h can be extended to paths: h(⟨n0, …, nk⟩) = h(nk). For now, think of h(n) as using only readily obtainable information (that is easy to compute) about a node. (Lecture slides do NOT contain everything that is said in class, but you are still responsible for that content, so make sure you bring yourself up to date if you miss a class.)
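As a small illustration of extending h to paths, the per-node heuristic values below are made up:

def h_path(h, path):
    # h(<n0, ..., nk>) = h(nk): a path's heuristic value is that of its end node
    return h(path[-1])

# e.g., with an (invented) heuristic value per node:
h = {'A': 7, 'B': 3, 'G': 0}.get
print(h_path(h, ['A', 'B']))   # 3, the value of the last node on the path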

More formally (cont.). Definition (admissible heuristic): a search heuristic h(n) is admissible if it never overestimates the cost from n to a goal; that is, there is no path from n to a goal whose cost is less than h(n). Another way of saying this: h(n) is a lower bound on the cost of getting from n to the nearest goal.
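A tiny sketch of what admissibility means operationally: on an explicit graph we can compare h(n) against the true optimal cost-to-goal. The graph and heuristic values below are invented for illustration.

import heapq, itertools

def optimal_cost_to_goal(neighbors, node, is_goal):
    # exact cheapest cost from `node` to a goal (Dijkstra-style), used only for checking
    counter = itertools.count()
    frontier, done = [(0, next(counter), node)], set()
    while frontier:
        cost, _, n = heapq.heappop(frontier)
        if n in done:
            continue
        done.add(n)
        if is_goal(n):
            return cost
        for nxt, arc in neighbors(n):
            heapq.heappush(frontier, (cost + arc, next(counter), nxt))
    return float('inf')

graph = {'S': [('A', 2), ('B', 5)], 'A': [('G', 3)], 'B': [('G', 1)], 'G': []}
h     = {'S': 4, 'A': 3, 'B': 1, 'G': 0}       # candidate heuristic values
print(all(h[n] <= optimal_cost_to_goal(graph.get, n, lambda m: m == 'G')
          for n in graph))                     # True: h never overestimates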

Example Admissible Heuristic Functions. Search problem: a robot has to find a route from a start location to a goal location on a grid (a discrete space with obstacles). The final cost (the quality of the solution) is the number of steps. [Figure: grid with obstacles and goal location G]

Example Admissible Heuristic Functions. If there are no obstacles, the cost of the optimal solution is the "Manhattan distance" (L1 distance): the distance between two points is the sum of the (absolute) differences of their coordinates.

Example Admissible Heuristic Functions. If there are obstacles, the cost of the optimal solution without obstacles (the Manhattan distance) is an admissible heuristic. [Figure: grid with obstacles and goal location G]

Example Admissible Heuristic Functions. Similarly, if the nodes are points on a Euclidean plane and the cost is the distance, we can use the straight-line distance from n to the closest goal as the value of h(n). This also makes sense if there are obstacles or if, for other reasons, not all adjacent nodes share an arc.
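The two grid heuristics just discussed, as a small Python sketch; coordinates are assumed to be (x, y) tuples and the goal is passed in explicitly.

import math

def manhattan(node, goal):
    # admissible when moves are one-step horizontal/vertical with unit cost
    return abs(node[0] - goal[0]) + abs(node[1] - goal[1])

def euclidean(node, goal):
    # admissible when the cost of an arc is the straight-line distance
    return math.hypot(node[0] - goal[0], node[1] - goal[1])

print(manhattan((1, 2), (4, 6)))   # 3 + 4 = 7
print(euclidean((1, 2), (4, 6)))   # 5.0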

Admissible Heuristic Function for the 8-Puzzle. A reasonable admissible heuristic for the 8-puzzle is?
A. Number of misplaced tiles plus number of correctly placed tiles
B. Number of misplaced tiles
C. Number of correctly placed tiles
D. None of the above
(Again, we are not looking inside the state at the end of the path; we are asking: how close is this to the goal?)

Example Admissible Heuristic Functions. In the 8-puzzle, we can use the number of misplaced tiles. Which one is better? For the 8-puzzle: h1(n) = number of misplaced tiles; h2(n) = total Manhattan distance (i.e., the number of squares each tile is from its desired location). For the example start state S: h1(S) = 8, h2(S) = 3+1+2+2+2+3+3+2 = 18.

Example Admissible Heuristic Functions. Another one: we can use the number of moves between each tile's current position and its position in the solution (the Manhattan distance). Again, which one is better? h1(n) = number of misplaced tiles; h2(n) = total Manhattan distance. For the example start state S: h1(S) = 8, h2(S) = 3+1+2+2+2+3+3+2 = 18. [Figure: example start state and goal state of the 8-puzzle]
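A sketch of h1 and h2 for the 8-puzzle. States are assumed to be length-9 tuples read row by row with 0 for the blank, and the start/goal below are chosen to reproduce the 8 and 18 values on the slide; this encoding is an assumption, not the course's implementation.

def h1(state, goal):
    # number of misplaced tiles (the blank is not counted)
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

def h2(state, goal):
    # total Manhattan distance of each tile from its goal square
    total = 0
    for tile in range(1, 9):
        i, j = state.index(tile), goal.index(tile)
        total += abs(i // 3 - j // 3) + abs(i % 3 - j % 3)
    return total

goal  = (0, 1, 2, 3, 4, 5, 6, 7, 8)        # blank in the top-left corner
start = (7, 2, 4, 5, 0, 6, 8, 3, 1)        # classic textbook example state
print(h1(start, goal), h2(start, goal))    # 8 18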

How to Construct an Admissible Heuristic. Identify a relaxed version of the problem: one where one or more constraints have been dropped, i.e., a problem with fewer restrictions on the actions. Robot: the agent can move through walls. Driver: the agent can move in a straight line. 8-puzzle: (1) tiles can move anywhere; (2) tiles can move to any adjacent square. Result: the cost of an optimal solution to the relaxed problem is an admissible heuristic for the original problem, because it is always weakly less costly to solve a less constrained problem. Overall, a cost-minimizing search problem is a constrained optimization problem: e.g., find a path from A to B that minimizes distance traveled, subject to the constraint that the robot can't move through walls; the relaxation finds a path from A to B that minimizes distance traveled while allowing the agent to move through walls. Later: another trick for constructing heuristics is that if h1(n) and h2(n) are both admissible heuristics, then max(h1(n), h2(n)) is also admissible.

How to Construct an Admissible Heuristic (cont.). You should identify constraints which, when dropped, make the problem extremely easy to solve. This is important because heuristics are not useful if they are as hard to compute as the original problem is to solve. This was the case in our examples. Robot: allowing the agent to move through walls; the optimal solution to this relaxed problem is the Manhattan distance. Driver: allowing the agent to move in a straight line; the optimal solution to this relaxed problem is the straight-line distance. 8-puzzle: (1) tiles can move anywhere; the optimal solution to this relaxed problem is the number of misplaced tiles. (2) Tiles can move to any adjacent square; the optimal solution to this relaxed problem is the total Manhattan distance. Admissible heuristics can also be derived from the solution cost of a subproblem, e.g., put tiles 1, 2, 3, 4 in place (substantially more accurate than the Manhattan distance).

Another Approach to Constructing Heuristics: the solution cost of a subproblem. [Figure: the original 8-puzzle (current node and goal node) next to the subproblem in which only tiles 1-4 matter and the other tiles are replaced by '@'.] The subproblem's solution cost is substantially more accurate than the Manhattan distance in some cases, so the two can be combined as described before.
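One way to realize the subproblem idea is sketched below: abstract the state so that only tiles 1-4 and the blank are distinguished, then solve that easier puzzle exactly with breadth-first search. The encoding (length-9 tuples, 0 for the blank, '*' for don't-care tiles) is an assumption for illustration, not the course's implementation.

from collections import deque

PATTERN = {1, 2, 3, 4}

def abstract(state):
    # keep tiles 1-4 and the blank; every other tile becomes a don't-care '*'
    return tuple(t if t in PATTERN or t == 0 else '*' for t in state)

def subproblem_cost(state, goal):
    # exact number of moves needed to get tiles 1-4 onto their goal squares
    target = {t: goal.index(t) for t in PATTERN}
    def solved(s):
        return all(s[pos] == t for t, pos in target.items())
    start = abstract(state)
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        s, d = frontier.popleft()
        if solved(s):
            return d
        b = s.index(0)                         # blank position
        r, c = divmod(b, 3)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < 3 and 0 <= nc < 3:
                nb = 3 * nr + nc
                lst = list(s)
                lst[b], lst[nb] = lst[nb], lst[b]
                nxt = tuple(lst)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, d + 1))
    return None                                # only reached if the subgoal is unreachable

goal  = (0, 1, 2, 3, 4, 5, 6, 7, 8)
start = (7, 2, 4, 5, 0, 6, 8, 3, 1)
print(subproblem_cost(start, goal))   # exact cost of placing tiles 1-4 only

Because any solution to the full puzzle also puts tiles 1-4 in place, this exact subproblem cost can never overestimate the full solution cost, so it is admissible.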

Heuristics: Dominance. If h2(n) ≥ h1(n) for every state n (both admissible), then h2 dominates h1. Which one is better for search? A. h1  B. h2  C. It depends. (In the data on the next slide, d is the depth of the solution; the numbers are averaged over 100 instances of the 8-puzzle, for various solution depths.)

Heuristics: Dominance. 8-puzzle relaxations: (1) tiles can move anywhere (h1); (2) tiles can move to any adjacent square (h2). (Original problem: tiles can move to an adjacent square only if it is empty.) Search costs for the 8-puzzle (average number of paths expanded):
d = 12: IDS = 3,644,035 paths; A*(h1) = 227 paths; A*(h2) = 73 paths
d = 24: IDS = too many paths; A*(h1) = 39,135 paths; A*(h2) = 1,641 paths
Data are averaged over 100 instances of the 8-puzzle, for various solution depths.

Combining Admissible Heuristics. How do we combine heuristics when there is no dominance? If h1(n) is admissible and h2(n) is also admissible, then h(n) = max(h1(n), h2(n)) is also admissible, and it dominates both of its components.
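A short sketch of this combination; the component heuristics passed in are placeholders.

def combine_max(*heuristics):
    # pointwise max of admissible heuristics is admissible and dominates each component
    def h(n):
        return max(h_i(n) for h_i in heuristics)
    return h

# e.g., for the 8-puzzle one could combine the Manhattan distance and the
# subproblem cost sketched earlier:
# h = combine_max(lambda s: h2(s, goal), lambda s: subproblem_cost(s, goal))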

Combining Admissible Heuristics: Example. In the 8-puzzle, the solution cost for the 1,2,3,4 subproblem is substantially more accurate than the Manhattan distance in some cases, so the two can be combined as described before by taking their max.

Learning Goals for Today's Class: construct admissible heuristics for a given problem; verify heuristic dominance; combine admissible heuristics. From previous classes: define/read/write/trace/debug the different uninformed search algorithms, with and without costs.

Next Class: Best-First Search; combining LCFS and BFS: A* (finish 3.6); A* optimality.