1
Recent Progress in the Design and Analysis of Admissible Heuristic Functions
Richard E. Korf
Computer Science Department
University of California, Los Angeles
2
Thanks: Victoria Cortessis, Ariel Felner, Rob Holte, Kaoru Mulvihill, Nathan Sturtevant
3
Heuristics Come from Abstractions Admissible (lower bound) heuristic evaluation functions are often the cost of exact solutions to abstract problems. E.g., if we relax the Fifteen Puzzle to allow a tile to move to any adjacent position, the cost of an exact solution to this simplified problem is Manhattan distance.
4
Outline of Talk New methods for the design of more accurate heuristic evaluation functions. A new method to predict the running time of admissible heuristic search algorithms.
5
Fifteen Puzzle

    1  2  3
 4  5  6  7
 8  9 10 11
12 13 14 15

(The blank is in the upper-left corner.)
6
Fifteen Puzzle Invented by Sam Loyd in the 1870s. "...engaged the attention of nine out of ten persons of both sexes and of all ages and conditions of the community." $1000 prize to swap the positions of two tiles.
7
Swap Two Tiles
[Diagram: the goal state and the same configuration with two tiles swapped.]
(Johnson & Story, 1879) proved it's impossible.
8
Twenty-Four Puzzle

    1  2  3  4
 5  6  7  8  9
10 11 12 13 14
15 16 17 18 19
20 21 22 23 24

(The blank is in the upper-left corner.)
9
Rubik’s Cube
10
Invented in 1974 by Erno Rubik of Hungary. Over 100 million sold worldwide. The most famous combinatorial puzzle ever.
11
Finding Optimal Solutions Input: a random solvable initial state. Output: a shortest sequence of moves that maps the initial state to the goal state. The generalized sliding-tile puzzle is NP-complete (Ratner and Warmuth, 1986). People can't find optimal solutions. Progress is measured by the size of problems that can be solved optimally.
12
Sizes of Problem Spaces

Problem              Nodes    Brute-Force Search Time (10 million nodes/second)
8 Puzzle             10^5     .01 seconds
2x2x2 Rubik's Cube   10^6     .2 seconds
15 Puzzle            10^13    6 days
3x3x3 Rubik's Cube   10^19    68,000 years
24 Puzzle            10^25    12 billion years
13
A* Algorithm Hart, Nilsson, and Raphael, 1968. Best-first search with cost function f(n) = g(n) + h(n). If h(n) is admissible (a lower bound estimate of optimal cost), A* guarantees optimal solutions if they exist. A* stores all the nodes it generates, exhausting available memory in minutes.
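A minimal sketch of best-first A* on f(n) = g(n) + h(n), assuming hypothetical successors(state) and h(state) functions supplied by the caller (neither is part of the original slides); note that it keeps every generated node in memory, which is the limitation mentioned above.

```python
import heapq
import itertools

def a_star(start, goal_test, successors, h):
    """Best-first search on f(n) = g(n) + h(n); h must be admissible for optimality."""
    counter = itertools.count()          # tie-breaker so the heap never compares states
    open_list = [(h(start), next(counter), 0, start, [start])]
    best_g = {start: 0}
    while open_list:
        f, _, g, state, path = heapq.heappop(open_list)
        if goal_test(state):
            return path, g               # optimal path and its cost
        for child, cost in successors(state):   # successors yields (child, edge_cost) pairs
            new_g = g + cost
            if new_g < best_g.get(child, float("inf")):
                best_g[child] = new_g    # cheapest known path to child so far
                heapq.heappush(open_list,
                               (new_g + h(child), next(counter), new_g, child, path + [child]))
    return None, float("inf")            # no solution exists
```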
14
Iterative-Deepening-A* (IDA*) IDA* (Korf, 1985) is a linear-space version of A*, using the same cost function. Each iteration searches depth-first for solutions of a given length. IDA* is simpler, and often faster than A*, due to less overhead per node. The key to performance is an accurate heuristic evaluation function.
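For contrast, a minimal IDA* sketch under the same hypothetical successors/h interface: each iteration is a bounded depth-first search, and the bound grows to the smallest f-value that exceeded it.

```python
def ida_star(start, goal_test, successors, h):
    """Iterative-deepening A*: depth-first iterations bounded by f(n) = g(n) + h(n)."""
    def search(state, g, bound, path):
        f = g + h(state)
        if f > bound:
            return f                     # pruned: report the f-value that exceeded the bound
        if goal_test(state):
            return path                  # solution: the path from start to this goal
        minimum = float("inf")
        for child, cost in successors(state):
            if child in path:            # never revisit a state on the current path
                continue
            result = search(child, g + cost, bound, path + [child])
            if isinstance(result, list):
                return result
            minimum = min(minimum, result)
        return minimum                   # smallest f-value that exceeded the bound below here

    bound = h(start)
    while bound != float("inf"):         # each pass of this loop is one IDA* iteration
        result = search(start, 0, bound, [start])
        if isinstance(result, list):
            return result
        bound = result                   # next threshold: smallest pruned f-value
    return None                          # search space exhausted without finding a goal
```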
15
Manhattan Distance Heuristic
[Diagram: a scrambled Fifteen Puzzle state and the goal state.]
Manhattan distance of the scrambled state is 6+3=9 moves.
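A small sketch of the Manhattan distance heuristic, assuming a state is represented as a 16-element tuple giving the tile in each cell in row-major order, with 0 for the blank and the goal shown earlier (blank in the upper-left, tile t in cell t). The representation is an assumption for illustration.

```python
def manhattan_distance(state, width=4):
    """Sum over all tiles of horizontal plus vertical distance to the tile's goal cell.

    state[i] is the tile in cell i (row-major order); 0 denotes the blank, which is
    not counted.  With the goal labeling assumed here, tile t belongs in cell t.
    """
    total = 0
    for cell, tile in enumerate(state):
        if tile == 0:
            continue
        total += abs(cell // width - tile // width) + abs(cell % width - tile % width)
    return total

# Example: swapping tiles 14 and 15 in the goal gives a Manhattan distance of 1 + 1 = 2.
goal = tuple(range(16))
swapped = goal[:14] + (15, 14)
print(manhattan_distance(swapped))   # 2
```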
16
Performance on 15 Puzzle Random 15 Puzzle instances were first solved optimally using IDA* with the Manhattan distance heuristic (Korf, 1985). Optimal solution lengths average 53 moves. 400 million nodes are generated on average. Average solution time is about 50 seconds on current machines.
17
Limitation of Manhattan Distance To solve a 24-Puzzle instance, IDA* with Manhattan distance would take about 65,000 years on average. It assumes that each tile moves independently. In fact, tiles interfere with each other. Accounting for these interactions is the key to more accurate heuristic functions.
18
Example: Linear Conflict
[Diagram, animated over several slides: two tiles in their goal row but in reversed order; one must move out of the row to let the other pass.]
Manhattan distance is 2+2=4 moves, but the linear conflict adds 2 additional moves.
25
Linear Conflict Heuristic Hansson, Mayer, and Yung, 1991. Given two tiles in their goal row, but reversed in position, additional vertical moves can be added to Manhattan distance. Still not accurate enough to solve the 24-Puzzle. We can generalize this idea further.
26
More Complex Tile Interactions
[Diagrams: three configurations of tiles 3, 7, 11, 12, 13, 14, and 15, with all other tiles removed.]
Manhattan distance is 19 moves, but 31 moves are needed.
Manhattan distance is 20 moves, but 28 moves are needed.
Manhattan distance is 17 moves, but 27 moves are needed.
27
Pattern Database Heuristics Culberson and Schaeffer, 1996. A pattern database is a complete set of such positions, with the associated number of moves for each. E.g., a 7-tile pattern database for the Fifteen Puzzle contains 519 million entries.
28
Heuristics from Pattern Databases
[Diagram: a scrambled Fifteen Puzzle state with the pattern tiles highlighted, next to the goal state.]
31 moves is a lower bound on the total number of moves needed to solve this particular state.
29
Precomputing Pattern Databases Entire database is computed with one backward breadth-first search from goal. All non-pattern tiles are indistinguishable, but all tile moves are counted. The first time each state is encountered, the total number of moves made so far is stored. Once computed, the same table is used for all problems with the same goal state.
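A sketch of this backward breadth-first search for a hypothetical choice of pattern tiles on a small board; the function and variable names are illustrative, not the authors' code. Only the positions of the pattern tiles and the blank are tracked, every move is counted, and the number of moves made when an abstract state is first reached is stored as its value.

```python
from collections import deque

def build_pattern_db(pattern_goal_cells, blank_goal_cell, width=4, height=4):
    """Backward breadth-first search from the goal for one (hypothetical) set of pattern tiles.

    Non-pattern tiles are indistinguishable, so an abstract state is just the tuple of
    pattern-tile cells plus the blank's cell.  Every move is counted, and the number of
    moves made when a state is first reached is stored as its database value.
    """
    def neighbors(cell):
        row, col = divmod(cell, width)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            if 0 <= row + dr < height and 0 <= col + dc < width:
                yield (row + dr) * width + (col + dc)

    start = (tuple(pattern_goal_cells), blank_goal_cell)
    db = {start: 0}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        pattern, blank = state
        for cell in neighbors(blank):
            if cell in pattern:
                # A pattern tile slides into the blank: its recorded cell becomes `blank`.
                new_pattern = tuple(blank if p == cell else p for p in pattern)
            else:
                # An indistinguishable non-pattern tile moves; only the blank's cell changes.
                new_pattern = pattern
            child = (new_pattern, cell)
            if child not in db:
                db[child] = db[state] + 1
                queue.append(child)
    return db
```

In practice the table would be a packed array indexed by the pattern configuration rather than a dictionary, since a database on the scale of the 519-million-entry example above would not fit comfortably as a dict.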
30
What About the Non-Pattern Tiles? Given more memory, we can compute additional pattern databases from the remaining tiles. In fact, we can compute multiple pattern databases from overlapping sets of tiles. The only limit is the amount of memory available to store the pattern databases.
31
Combining Multiple Databases
[Diagram: a scrambled Fifteen Puzzle state with two sets of pattern tiles shown in red and blue.]
31 moves are needed to solve the red tiles; 22 moves are needed to solve the blue tiles.
The overall heuristic is the maximum of the two: 31 moves.
32
Applications of Pattern Databases On the 15 Puzzle, IDA* with pattern database heuristics is about 10 times faster than with Manhattan distance (Culberson and Schaeffer, 1996). Pattern databases can also be applied to Rubik's Cube.
33
Corner Cubie Pattern Database This database contains 88 million entries.
34
6-Edge Cubie Pattern Database This database contains 43 million values.
35
Remaining 6-Edge Cubie Database This database also contains 43 million values.
36
Rubik’s Cube Heuristic All three databases are precomputed in less than an hour, and use less than 100 Mbytes. During problem solving, for each state, the different sets of cubies are used to compute indices into their pattern databases. The overall heuristic value is the maximum of the three different database values.
37
Performance on Rubik's Cube IDA* with this heuristic found the first optimal solutions to randomly scrambled Rubik's Cubes (Korf, 1997). The median optimal solution length is 18 moves. Most problems can now be solved in about a day, using larger pattern databases.
38
Related Work Culberson and Schaeffer's 1996 work on pattern databases for the Fifteen Puzzle. Armand Prieditis' ABSOLVER program discovered a "center-corner" heuristic for Rubik's Cube in 1993. Herbert Kociemba independently developed a powerful Rubik's Cube program in the 1990s. Mike Reid has built faster Cube programs.
39
Memory Adds Speed More memory can hold larger databases. Larger databases provide more accurate heuristic values. More accurate heuristics speed up search. E.g., doubling the memory almost doubles the search speed. See (Holte and Hernadvolgyi, 1999, 2000) for more details.
40
Limitation of General Databases The only way to admissibly combine values from multiple general pattern databases is to take the maximum of their values. For more accuracy, we would like to add heuristic values from different databases. We can do this on the tile puzzles because each move only moves one tile at a time. Joint work with Ariel Felner
41
Additive Pattern Databases Culberson and Schaeffer counted all moves needed to correctly position the pattern tiles. In contrast, we count only moves of the pattern tiles, ignoring non-pattern moves. If no tile belongs to more than one pattern, then we can add their heuristic values. Manhattan distance is a special case of this, where each pattern contains a single tile.
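A schematic contrast of the two ways of combining databases, assuming hypothetical databases (lookup tables of stored move counts) and matching projections that map a full puzzle state to its pattern configuration; both names are illustrative.

```python
def max_of_databases(state, databases, projections):
    """Admissible for any pattern databases: take the maximum of the looked-up values."""
    return max(db[project(state)] for db, project in zip(databases, projections))

def sum_of_disjoint_databases(state, databases, projections):
    """Admissible only if the patterns are disjoint and each database counts moves of its
    own pattern tiles only (moves of other tiles are not charged)."""
    return sum(db[project(state)] for db, project in zip(databases, projections))
```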
42
Example Additive Databases
[Diagram: the Fifteen Puzzle tiles partitioned into a 7-tile pattern and an 8-tile pattern.]
The 7-tile database contains 58 million entries. The 8-tile database contains 519 million entries.
43
Computing the Heuristic
[Diagram: a scrambled Fifteen Puzzle state with the two disjoint patterns shown in red and blue.]
20 moves are needed to solve the red tiles; 25 moves are needed to solve the blue tiles.
The overall heuristic is the sum: 20+25=45 moves.
44
Performance on 15 Puzzle IDA* with a heuristic based on these additive pattern databases can optimally solve random 15 puzzle instances in less than 29 milliseconds on average. This is about 1700 times faster than with Manhattan distance on the same machine.
45
24 Puzzle Additive Databases
[Diagram: the Twenty-Four Puzzle grid (tiles 1-24, blank in the upper-left corner), partitioned into disjoint groups of tiles, with one pattern database per group.]
46
Performance on 24 Puzzle Each database contains 128 million entries. IDA* using these databases can optimally solve random 24-puzzle problems. Optimal solutions average 100 moves. Billions to trillions of nodes generated. Over 2 million nodes per second. Running times from 18 seconds to 10 days. Average is a day and a half.
47
[Plot: nodes generated as a function of optimal solution length.]
48
Moral of the Story (so far) Our implementation of disjoint additive pattern databases is tailored to tile puzzles. In general, most heuristics assume that subproblems are independent. Capturing some of the interactions between subproblems yields more accurate heuristics. We've also applied this to graph partitioning.
49
Time Complexity of Admissible Heuristic Search Algorithms Joint work with Michael Reid (Brown University)
50
Previous Work on This Problem Pohl 1977, Gaschnig 1979, Pearl 1984. Assumes an abstract analytic model of the problem space. Characterizes the heuristic by its accuracy as an estimate of optimal solution cost. Produces asymptotic results. Doesn't predict runtime on real problems.
51
How Long Does a Heuristic Search Algorithm Take to Run? Depends on: the branching factor of the problem space, the solution depth of the problem instance, and the heuristic evaluation function.
52
Branching Factor: the average number of children of a node. In the tile puzzles, a cell has 2, 3, or 4 neighbors. Eliminating the inverse of the last move gives a node branching factor of 1, 2, or 3. The exact asymptotic branching factor depends on the relative frequency of each type of node, in the limit of large depth. Joint work with Stefan Edelkamp.
53
One Wrong Way to Do It The 15 Puzzle has 4 center cells (b=3), 4 corner cells (b=1), and 8 side cells (b=2). Therefore, b = (4*3 + 4*1 + 8*2)/16 = 2. This assumes all blank positions are equally likely.
54
The Correct Answer The asymptotic branching factor of the Fifteen Puzzle is 2.13040. The derivation is left as an exercise.
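The derivation is not reproduced here, but the value can be checked numerically. In the brute-force search tree, branching depends only on the blank's current cell and the cell it came from, so counting non-backtracking walks of the blank on the 4x4 grid gives the node count at each depth, and the ratio of successive counts approaches the asymptotic branching factor. A sketch (illustrative code, not from the talk):

```python
def brute_force_branching_factor(width=4, height=4, depth=60):
    """Ratio of node counts at successive depths of the brute-force search tree.

    Branching depends only on the blank's cell and the cell it came from (the inverse
    of the last move is eliminated), so it suffices to count non-backtracking walks of
    the blank on the grid by dynamic programming.
    """
    def neighbors(cell):
        row, col = divmod(cell, width)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            if 0 <= row + dr < height and 0 <= col + dc < width:
                yield (row + dr) * width + (col + dc)

    # counts[(cell, came_from)] = number of depth-d nodes with the blank at `cell`,
    # having just arrived from `came_from`.  Start with the blank in corner cell 0.
    counts = {(n, 0): 1 for n in neighbors(0)}
    ratio = 0.0
    for _ in range(depth):
        new_counts = {}
        for (cell, came_from), count in counts.items():
            for n in neighbors(cell):
                if n != came_from:                  # never undo the previous move
                    key = (n, cell)
                    new_counts[key] = new_counts.get(key, 0) + count
        ratio = sum(new_counts.values()) / sum(counts.values())
        counts = new_counts
    return ratio

print(brute_force_branching_factor())   # approaches 2.13040 as the depth grows
```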
55
Derivation of Time Complexity First, consider brute-force search. Then, consider heuristic search complexity. Illustrate result with example search tree.
56
Brute-Force Search Tree
[Diagram, built up over several slides: a search tree grown one level at a time, with the number of nodes at each depth (b^0, b^1, ..., b^6) tabulated by depth g(n) and heuristic value h(n).]
62
Heuristic Distribution Function The distribution of heuristic values converges, independent of the initial state. Let P(x) be the proportion of states with heuristic value <= x, in the limit of large depth. For pattern database heuristics, P(x) can be computed exactly from the databases. For general heuristics, P(x) can be approximated by random sampling.
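A sketch of the sampling approximation, assuming a hypothetical random_state() generator, an integer-valued heuristic h, and a sample large enough to stand in for the equilibrium distribution:

```python
from collections import Counter

def heuristic_distribution(random_state, h, samples=1_000_000):
    """Approximate P(x), the fraction of states with heuristic value <= x, by sampling."""
    counts = Counter(h(random_state()) for _ in range(samples))
    P, cumulative = [], 0
    for x in range(max(counts) + 1):
        cumulative += counts[x]
        P.append(cumulative / samples)   # P[x] = fraction of sampled states with h <= x
    return P
```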
63
Nodes Expanded by A* or IDA* Running time is proportional to number of node expansions. If C* is the cost of an optimal solution, A* or IDA* will expand all nodes n for which f(n)=g(n)+h(n)<=C*, in the worst case.
64
IDA* Iteration with Depth Limit 6
[Diagram, built up over several slides: the same search tree, with the nodes expanded by an IDA* iteration with depth limit 6 highlighted; at depth i the expanded nodes number roughly b^i P(6-i), i.e. b^3 P(3), b^4 P(2), b^5 P(1), b^6 P(0) at the deeper levels.]
69
The Main Result
b is the branching factor, d is the optimal solution depth, and P(x) is the heuristic distribution function. The number of nodes expanded by the last iteration of IDA*, in the worst case, is:
sum from i=0 to d of b^i P(d-i) = b^0 P(d) + b^1 P(d-1) + ... + b^d P(0)
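A direct transcription of the formula as a sketch; P can be, for example, the list produced by the sampling sketch above, with heuristic values beyond its end treated as having cumulative probability 1:

```python
def predicted_expansions(b, d, P):
    """Worst-case node expansions of the last IDA* iteration: sum of b^i * P(d - i)."""
    def p(x):
        return 1.0 if x >= len(P) else P[x]   # values above the table are treated as 1
    return sum(b ** i * p(d - i) for i in range(d + 1))
```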
70
Assumptions of the Analysis h is consistent, meaning that in any move, h never decreases more than g increases. The graph is searched as a tree (e.g. IDA*). Goal nodes are ignored. The distribution of heuristic values at a given depth approximates the equilibrium heuristic distribution at large depth. We don't assume that costs are uniform.
71
Experiments on Rubik’s Cube Heuristic function is based on corner cubie and two edge cubie pattern databases. Heuristic distribution is computed exactly from databases, assuming database values are independent of each other. Theory predicts average node expansions by IDA* to within 1%.
72
Experiments on Fifteen Puzzle Heuristic is Manhattan distance. Heuristic distribution is approximated by 10 billion random samples of the problem space. Theory predicts average node expansions within 2.5% at typical solution depths. Accuracy of the theory increases with depth.
73
Experiments on Eight Puzzle Heuristic is Manhattan distance. Heuristic distribution is computed exactly by exhaustive search of the entire space. The average number of node expansions by IDA* is computed exactly by running every solvable initial state as deep as we need to. Theory predicts the experimental results exactly.
74
Practical Importance The ability to predict the performance of heuristic search from the heuristic distribution allows us to choose among alternative heuristics. This is more efficient than running large numbers of searches in the problem domain. We can predict performance even if we can't run any problem instances (e.g. the Twenty-Four Puzzle with Manhattan distance).
75
Complexity of Heuristic Search The complexity of brute-force search is O(b^d). Previous results predicted O((b-k)^d) for the complexity of heuristic search, reducing the effective branching factor. Our theory predicts O(b^(d-k)), reducing the effective depth of search by a constant. This is confirmed by our experiments. k is roughly the expected value of the heuristic.
76
Summary More powerful admissible heuristics can be automatically computed by capturing some of the interactions between subproblems. The time complexity of heuristic search algorithms can be accurately predicted from the branching factor, search depth, and heuristic distribution function.
77
Conclusions Recent progress in this area has come from more accurate heuristic functions. Admissible heuristics can also be used to speed up searches for sub-optimal solutions. New methods have emerged for constructing such heuristic functions. Applying these methods to new problems still requires work.