“If I Only had a Brain” Search II

Presentation transcript:

“If I Only had a Brain” Search II
Lecture 3-2, January 21st, 1999
CS250: Intro to AI/Lisp

Monotonicity
- A monotonic (consistent) heuristic yields f-values that are nondecreasing along any path
- Why might this be an important feature?
- Non-monotonic heuristic? Use pathmax: given a node n and its child n', take f(n') = max(f(n), g(n') + h(n'))
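The pathmax rule is easy to see in a short sketch (illustrative Python, not from the lecture; the (g, h) values along the path are made up):

```python
def pathmax_f(f_parent, g_child, h_child):
    """Pathmax: a child's f-value never drops below its parent's,
    restoring nondecreasing f-values when h is not monotonic."""
    return max(f_parent, g_child + h_child)

# Along one path, f stays nondecreasing even though h fluctuates:
f_values = []
f = 0
for g, h in [(1, 6), (2, 3), (3, 5)]:   # h dips from 6 to 3 (non-monotonic)
    f = pathmax_f(f, g, h)
    f_values.append(f)
```

Without pathmax the second step would report f = 5, a spurious drop below the parent's f = 7.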

A* in Action
- Picture the state space contoured by f-cost
- A* starts at the initial node and always expands the leaf node of lowest f(n)
- The search fans out across contours of increasing f-cost

A* in Perspective
- If h(n) = 0 everywhere, A* is uniform-cost search
- If h(n) is an exact estimate of the remaining cost, A* runs in linear time!
- Different kinds of error in h lead to different performance
- A* is the best (in terms of expanded nodes) of the optimal best-first searches
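A minimal A* sketch makes the h(n) = 0 case concrete (illustrative Python; the toy graph, `neighbors`, and `h` here are my own, not from the slides):

```python
import heapq

def a_star(start, goal, neighbors, h):
    """Minimal A*: always expand the frontier node of lowest
    f(n) = g(n) + h(n).  With h(n) = 0 everywhere this behaves
    exactly like uniform-cost search."""
    frontier = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        for child, cost in neighbors(node):
            g2 = g + cost
            if g2 < best_g.get(child, float("inf")):
                best_g[child] = g2
                heapq.heappush(frontier, (g2 + h(child), g2, child, path + [child]))
    return None

# Toy graph: the direct edge A->C is costlier than going via B.
graph = {"A": [("B", 1), ("C", 3)], "B": [("C", 1)], "C": []}
cost, path = a_star("A", "C", lambda n: graph[n], lambda n: 0)
```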

Proof of A*’s Optimality
- Suppose G is an optimal goal state with path cost f*, and G2 is a suboptimal goal state, with g(G2) > f*
- Could A* ever select G2 from the queue before finding G?
- Consider a node n that is a leaf node on an optimal path to G
- Since h is admissible, f* >= f(n); and if G2 were chosen over n, then f(n) >= f(G2)
- Together these imply f* >= f(G2)
- But G2 is a goal, so h(G2) = 0 and f(G2) = g(G2)
- Therefore f* >= g(G2), contradicting g(G2) > f*: A* never selects a suboptimal goal

A*’s Complexity
- Depends on the error of h(n) (see Figure 4.8)
- h always 0: uniform-cost search (breadth-first when all step costs are equal)
- h exactly right: time O(n)
- Constant absolute error: still time O(n), but with a larger constant than the exact case
- Constant relative error: time O(n^k), space O(n^k)

Branching Factors
- Where f' is the smallest f-cost greater than f (the next contour)
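A standard way to quantify branching behavior is the effective branching factor b*: the branching factor a uniform tree of the same depth d would need to contain the same total number of nodes N, i.e. N + 1 = 1 + b* + b*^2 + … + b*^d. A sketch (the bisection solver and tolerance are my own choices, not from the slides):

```python
def effective_branching_factor(n_nodes, depth, tol=1e-6):
    """Solve N + 1 = 1 + b* + b*^2 + ... + b*^depth for b* by bisection."""
    def total(b):
        # Nodes in a uniform tree of this depth with branching factor b.
        return sum(b ** i for i in range(depth + 1))
    lo, hi = 1.0, float(n_nodes + 1)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if total(mid) < n_nodes + 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For example, a search that generates 6 nodes to reach depth 2 has b* = 2, since 1 + 2 + 4 = 7 = N + 1.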

Inventing Heuristics
- Dominance: bigger is better, as long as you never overestimate
- How do you create heuristics?
  - Relaxed problems
  - Statistical approaches
- Constraint satisfaction heuristics:
  - Most-constrained variable
  - Most-constraining variable
  - Least-constraining value
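Relaxed problems are the classic source of admissible heuristics. For the 8-puzzle, misplaced tiles comes from letting tiles teleport anywhere, and Manhattan distance from letting tiles move to any adjacent square; Manhattan dominates misplaced tiles, so it is the better choice. A sketch (the goal layout and test state are my own illustration):

```python
GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)   # 0 is the blank

def h_misplaced(state):
    """Relaxation: a tile may teleport to its goal square in one move."""
    return sum(1 for i, t in enumerate(state) if t != 0 and t != GOAL[i])

def h_manhattan(state):
    """Relaxation: a tile may slide onto any adjacent square, occupied or not."""
    total = 0
    for i, t in enumerate(state):
        if t == 0:
            continue
        gi = GOAL.index(t)
        total += abs(i // 3 - gi // 3) + abs(i % 3 - gi % 3)
    return total

state = (8, 2, 3, 4, 5, 6, 7, 1, 0)   # tiles 1 and 8 swapped
```

Here only two tiles are out of place, but each is three moves from home, so the dominant Manhattan heuristic gives the tighter (larger) admissible estimate.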

Improving on A*
- DFID gave us the best of both worlds for uninformed search
- Can we repeat the trick with A*?
- Successive iterations could increase:
  - Search depth (as with DFID)
  - Total path cost

Iterative Deepening A*
- Keeps the good stuff in A*
- Needs only limited (linear) memory
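IDA* can be sketched as cost-bounded depth-first search: each iteration raises the f-cost cutoff to the smallest value that exceeded it, so memory stays linear in the path length (illustrative Python; the toy graph is my own):

```python
def ida_star(start, goal, neighbors, h):
    """IDA* sketch: depth-first search bounded by an f-cost cutoff."""
    def dfs(node, g, bound, path):
        f = g + h(node)
        if f > bound:
            return f, None              # report the value that overflowed
        if node == goal:
            return f, path
        next_bound = float("inf")
        for child, cost in neighbors(node):
            if child in path:           # avoid cycles on the current path
                continue
            t, found = dfs(child, g + cost, bound, path + [child])
            if found is not None:
                return t, found
            next_bound = min(next_bound, t)
        return next_bound, None

    bound = h(start)
    while True:
        bound, found = dfs(start, 0, bound, [start])
        if found is not None:
            return found
        if bound == float("inf"):
            return None                 # no solution

# Same toy graph style as before:
graph = {"A": [("B", 1), ("C", 3)], "B": [("C", 1)], "C": []}
solution = ida_star("A", "C", lambda n: graph[n], lambda n: 0)
```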

Iterative Improvements
- Loop through, trying to “zero in” on the solution
- Hill climbing: always climb higher
- Problems? Local maxima, plateaus, and ridges
- Solution? Add a touch of randomness
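The “touch of randomness” is often realized as random restarts: each climb ends at some local maximum, but restarting from random states gives repeated chances to land in the basin of the global one. A sketch with a made-up one-dimensional landscape (all names here are my own):

```python
import random

def hill_climb(candidates, neighbors, value, restarts=20, seed=0):
    """Steepest-ascent hill climbing with random restarts: each run
    climbs until no neighbor improves, and the best peak found over
    all restarts is returned."""
    rng = random.Random(seed)
    peak = None
    for _ in range(restarts):
        current = rng.choice(candidates)
        while True:
            up = max(neighbors(current), key=value, default=current)
            if value(up) <= value(current):
                break                   # local maximum reached
            current = up
        if peak is None or value(current) > value(peak):
            peak = current
    return peak

# Toy landscape on the integers 0..9 with a single peak at x = 3.
candidates = list(range(10))
value = lambda x: -(x - 3) ** 2
neighbors = lambda x: [n for n in (x - 1, x + 1) if 0 <= n <= 9]
best = hill_climb(candidates, neighbors, value)
```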

Annealing an.neal vb [ME anelen to set on fire, fr. OE onaelan, fr. on + aelan to set on fire, burn, fr. al fire; akin to OE aeled fire, ON eldr] vt (1664) 1 a: to heat and then cool (as steel or glass) usu. for softening and making less brittle; also: to cool slowly usu. in a furnace b: to heat and then cool (nucleic acid) in order to separate strands and induce combination at lower temperature esp. with complementary strands of a different species 2: strengthen, toughen ~ vi: to be capable of combining with complementary nucleic acid by a process of heating and cooling

Simulated Annealing

(defun simulated-annealing-search (problem &optional (schedule (make-exp-schedule)))
  (let* ((current (create-start-node problem))
         (successors (expand current problem))
         (best current)
         next temp delta)
    (for time = 1 to infinity do
      (setf temp (funcall schedule time))
      (when (or (= temp 0) (null successors))
        (RETURN (values (goal-test problem best) best)))
      (when (< (node-h-cost current) (node-h-cost best))
        (setf best current))
      (setf next (random-element successors))
      (setf delta (- (node-h-cost next) (node-h-cost current)))
      (when (or (< delta 0.0) ; < because we are minimizing
                (< (random 1.0) (exp (/ (- delta) temp))))
        (setf current next
              successors (expand next problem))))))

Let*
let* binds sequentially, so successors can refer to the just-bound current:

(let* ((current (create-start-node problem))
       (successors (expand current problem))
       (best current)
       next temp delta)
  BODY)

The Body

(for time = 1 to infinity do
  (setf temp (funcall schedule time))
  (when (or (= temp 0) (null successors))
    (RETURN (values (goal-test problem best) best)))
  (when (< (node-h-cost current) (node-h-cost best))
    (setf best current))
  (setf next (random-element successors))
  (setf delta (- (node-h-cost next) (node-h-cost current)))
  (when (or (< delta 0.0) ; < because we are minimizing
            (< (random 1.0) (exp (/ (- delta) temp))))
    (setf current next
          successors (expand next problem))))
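For readers who don't use Lisp, the same loop can be transliterated into Python (a sketch only: `expand`, `h_cost`, the schedule, and the toy problem stand in for the AIMA Lisp helpers, and it returns just the best node rather than multiple values):

```python
import math
import random

def simulated_annealing(start, expand, h_cost, schedule, seed=0):
    """Sketch of the Lisp loop above: pick a random successor, always
    accept downhill moves (we are minimizing h), and accept uphill moves
    with probability exp(-delta / temperature)."""
    rng = random.Random(seed)
    current = best = start
    successors = expand(start)
    time = 0
    while True:
        time += 1
        temp = schedule(time)
        if temp == 0 or not successors:
            return best
        if h_cost(current) < h_cost(best):
            best = current
        nxt = rng.choice(successors)
        delta = h_cost(nxt) - h_cost(current)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current, successors = nxt, expand(nxt)

# Toy usage: minimize distance to 5 on the integers 0..10.
result = simulated_annealing(
    start=0,
    expand=lambda x: [n for n in (x - 1, x + 1) if 0 <= n <= 10],
    h_cost=lambda x: abs(x - 5),
    schedule=lambda t: max(0.0, 1.0 - t / 500),  # linear cooling, 0 at t = 500
)
```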