1/27 Informed search algorithms Chapter 4 Modified by Vali Derhami
2/27 Material Chapter 4
3/27 Outline
Best-first search
–Greedy best-first search
–A* search
Heuristics
Local search algorithms
–Hill-climbing search
–Simulated annealing search
–Local beam search
–Genetic algorithms
4/27 Review: Tree search A search strategy is defined by picking the order of node expansion
5/27 Best-first search
Idea: use an evaluation function f(n) for each node – an estimate of "desirability".
Expand the most desirable unexpanded node.
Implementation: order the nodes in the fringe in decreasing order of desirability.
Special cases:
–greedy best-first search
–A* search
6/27 Romania with step costs in km
7/27 Greedy best-first search
Evaluation function f(n) = h(n) (heuristic) = estimate of the cost from n to the goal
e.g., h_SLD(n) = straight-line distance from n to Bucharest
Greedy best-first search expands the node that appears to be closest to the goal.
8/27–11/27 Greedy best-first search example (four slides of figures showing successive node expansions on the Romania map, guided by h_SLD)
12/27 Properties of greedy best-first search
Complete? No – it can get stuck in loops, e.g., Iasi → Neamt → Iasi → Neamt
Time? O(b^m), but a good heuristic can give dramatic improvement
Space? O(b^m) – keeps all nodes in memory
Optimal? No
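As a sketch, greedy best-first search is a priority queue ordered by h(n) alone. The map below is an assumed small subset of the Romania road map with the standard h_SLD values; an explored set guards against the Iasi/Neamt-style loop mentioned above.

```python
import heapq

# Straight-line distances to Bucharest, h_SLD, in km (subset of the map).
H_SLD = {"Arad": 366, "Sibiu": 253, "Fagaras": 176, "Rimnicu Vilcea": 193,
         "Pitesti": 100, "Bucharest": 0, "Timisoara": 329, "Zerind": 374}

# Road links with step costs in km (subset of the Romania map).
GRAPH = {
    "Arad": {"Sibiu": 140, "Timisoara": 118, "Zerind": 75},
    "Sibiu": {"Arad": 140, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Timisoara": {"Arad": 118},
    "Zerind": {"Arad": 75},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}

def greedy_best_first(start, goal):
    """Always expand the node with the smallest h(n); f(n) = h(n)."""
    frontier = [(H_SLD[start], start, [start])]
    explored = set()                      # guards against loops
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in explored:
            continue
        explored.add(node)
        for nbr in GRAPH[node]:
            if nbr not in explored:
                heapq.heappush(frontier, (H_SLD[nbr], nbr, path + [nbr]))
    return None

print(greedy_best_first("Arad", "Bucharest"))
```

On this data it returns the Arad–Sibiu–Fagaras–Bucharest route (450 km), illustrating non-optimality: a cheaper 418 km route exists through Rimnicu Vilcea and Pitesti.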
13/27 A* search
Idea: avoid expanding paths that are already expensive.
Evaluation function f(n) = g(n) + h(n)
–g(n) = cost so far to reach n
–h(n) = estimated cost from n to the goal
–f(n) = estimated total cost of the path through n to the goal
14/27–19/27 A* search example (six slides of figures showing successive node expansions on the Romania map, each labeled with f = g + h)
20/27 Admissible heuristics
A heuristic h(n) is admissible if for every node n, h(n) ≤ h*(n), where h*(n) is the true cost to reach the goal state from n.
An admissible heuristic never overestimates the cost to reach the goal, i.e., it is optimistic.
Example: h_SLD(n) (never overestimates the actual road distance)
Theorem: If h(n) is admissible, A* using TREE-SEARCH is optimal.
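A minimal A* sketch, again over an assumed subset of the Romania map with the standard h_SLD values. Because h_SLD is admissible, A* recovers the optimal 418 km route, while ranking by h alone would settle for the 450 km route through Fagaras.

```python
import heapq

# Straight-line distances to Bucharest, h_SLD, in km (subset of the map).
H_SLD = {"Arad": 366, "Sibiu": 253, "Fagaras": 176, "Rimnicu Vilcea": 193,
         "Pitesti": 100, "Bucharest": 0, "Timisoara": 329, "Zerind": 374}

# Road links with step costs in km (subset of the Romania map).
GRAPH = {
    "Arad": {"Sibiu": 140, "Timisoara": 118, "Zerind": 75},
    "Sibiu": {"Arad": 140, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Timisoara": {"Arad": 118},
    "Zerind": {"Arad": 75},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}

def a_star(start, goal):
    """Expand the node with the smallest f(n) = g(n) + h(n)."""
    frontier = [(H_SLD[start], 0, start, [start])]   # (f, g, node, path)
    best_g = {start: 0}                              # cheapest g(n) found so far
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for nbr, step in GRAPH[node].items():
            g2 = g + step
            if g2 < best_g.get(nbr, float("inf")):   # keep only the cheapest path
                best_g[nbr] = g2
                heapq.heappush(frontier,
                               (g2 + H_SLD[nbr], g2, nbr, path + [nbr]))
    return None, float("inf")

print(a_star("Arad", "Bucharest"))
```

Discarding the costlier of two paths to the same state (the best_g check) is exactly the repair for graph search discussed on the next slide.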
21/27 Non-optimality in graph search
Note that A* is not optimal in GRAPH-SEARCH, because an optimal path to a repeated state may be discarded: recall that in that method, when a repeated state is detected, the newly discovered path is dropped.
Solution: discard whichever node has the costlier path, or arrange the search so that the optimal path to any repeated state is guaranteed to be the first one followed – as in uniform-cost search.
22/27 Consistent heuristics
A heuristic is consistent if for every node n and every successor n' of n generated by any action a,
h(n) ≤ c(n,a,n') + h(n')
i.e., the step cost to a successor plus the successor's heuristic value is always at least the parent's heuristic value.
Every consistent heuristic is also admissible.
If h is consistent, we have
f(n') = g(n') + h(n') = g(n) + c(n,a,n') + h(n') ≥ g(n) + h(n) = f(n)
i.e., f(n) is non-decreasing along any path.
Theorem: If h(n) is consistent, A* using GRAPH-SEARCH is optimal.
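On a finite graph the consistency condition can be checked mechanically, edge by edge. A small sketch, using an assumed subset of the Romania map on which h_SLD is consistent:

```python
# Straight-line distances to Bucharest, h_SLD, in km (subset of the map).
H_SLD = {"Arad": 366, "Sibiu": 253, "Fagaras": 176, "Rimnicu Vilcea": 193,
         "Pitesti": 100, "Bucharest": 0, "Timisoara": 329, "Zerind": 374}

# Road links with step costs in km (subset of the Romania map).
GRAPH = {
    "Arad": {"Sibiu": 140, "Timisoara": 118, "Zerind": 75},
    "Sibiu": {"Arad": 140, "Fagaras": 99, "Rimnicu Vilcea": 80},
    "Fagaras": {"Sibiu": 99, "Bucharest": 211},
    "Rimnicu Vilcea": {"Sibiu": 80, "Pitesti": 97},
    "Pitesti": {"Rimnicu Vilcea": 97, "Bucharest": 101},
    "Timisoara": {"Arad": 118},
    "Zerind": {"Arad": 75},
    "Bucharest": {"Fagaras": 211, "Pitesti": 101},
}

def is_consistent(graph, h):
    """True iff h(n) <= c(n,a,n') + h(n') holds for every edge (n, n')."""
    return all(h[n] <= cost + h[nbr]
               for n in graph
               for nbr, cost in graph[n].items())

print(is_consistent(GRAPH, H_SLD))
```

Inflating any single value breaks the condition, e.g. setting h(Pitesti) = 350 violates it on the Pitesti–Bucharest edge (350 > 101 + 0), so the check returns False.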
23/27 Properties of A*
Complete? Yes (unless there are infinitely many nodes with f ≤ f(G))
Time? Exponential
Space? Keeps all nodes in memory – the algorithm runs out of memory long before it runs out of time.
Optimal? Yes – among algorithms that extend search paths from the root, no other optimal algorithm is guaranteed to expand fewer nodes than A*.
24/27 Admissible heuristics
E.g., for the 8-puzzle:
h1(n) = number of misplaced tiles
h2(n) = total Manhattan distance (i.e., the sum of the distances of the tiles from their goal squares)
h1(S) = ?
h2(S) = ?
25/27 Admissible heuristics
E.g., for the 8-puzzle:
h1(n) = number of misplaced tiles
h2(n) = total Manhattan distance (i.e., the sum of the distances of the tiles from their goal squares)
h1(S) = 8
h2(S) = 3+1+2+2+2+3+3+2 = 18
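Both heuristics are a few lines of code. The start state S below is an assumption: it is the standard AIMA 8-puzzle figure (0 marks the blank, goal state has the blank in the top-left corner), which reproduces h1(S) = 8 and h2(S) = 18.

```python
# Assumed start state S (the AIMA 8-puzzle figure) and goal, 0 = blank:
#   7 2 4        goal:  0 1 2
#   5 0 6               3 4 5
#   8 3 1               6 7 8
START = (7, 2, 4, 5, 0, 6, 8, 3, 1)
GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)

def h1(state):
    """Number of misplaced tiles (the blank is not counted)."""
    return sum(1 for i, t in enumerate(state) if t != 0 and t != GOAL[i])

def h2(state):
    """Total Manhattan distance of the tiles from their goal squares."""
    total = 0
    for i, t in enumerate(state):
        if t == 0:
            continue
        gi = GOAL.index(t)  # goal position of tile t
        total += abs(i // 3 - gi // 3) + abs(i % 3 - gi % 3)
    return total

print(h1(START), h2(START))
```

Both functions ignore the blank; counting it would make the heuristics overestimate on some states and destroy admissibility.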
26/27 Dominance
Effective branching factor b*:
N + 1 = 1 + b* + (b*)^2 + … + (b*)^d
where N is the total number of nodes generated by A*, d is the solution depth, and b* is the branching factor that a uniform tree of depth d would have to have in order to contain N + 1 nodes.
If h2(n) ≥ h1(n) for all n (both admissible), then h2 dominates h1, and h2 is better for search.
Typical search costs (average number of nodes expanded):
d = 12: IDS = 3,644,035 nodes; A*(h1) = 227 nodes; A*(h2) = 73 nodes (b* = 1.24)
d = 24: IDS = too many nodes; A*(h1) = 39,135 nodes; A*(h2) = 1,641 nodes (b* = 1.26)
27/27 Relaxed problems
A problem with fewer restrictions on the actions is called a relaxed problem.
The cost of an optimal solution to a relaxed problem is an admissible heuristic for the original problem.
If the rules of the 8-puzzle are relaxed so that a tile can move anywhere, then h1(n) gives the shortest solution.
If the rules are relaxed so that a tile can move to any adjacent square, then h2(n) gives the shortest solution.
Combining heuristics: h(n) = max(h1(n), h2(n), …, hm(n))