1
Selective Search in Games of Different Complexity Maarten Schadd
2
Playing Chess
3
Computer vs. Human
Human: Intuition – Feelings – Only few variations – Aggressive Pruning – Selective Search
Computer: Calculator – Fast – Examines most variations – Safe Pruning – Brute-Force Search
4
Problem Statement
How can we improve selective-search methods in such a way that programs increase their performance in domains of different complexity?
5
Domains of Different Complexity
One-Player Game – No Chance – Perfect Information
Two-Player Game – No Chance – Perfect Information
Two-Player Game – Chance or Imperfect Information
Multi-Player Game – No Chance – Perfect Information
6
One-Player Game – No Chance – Perfect Information
Research Question 1: How can we adapt Monte-Carlo Tree Search for a one-player game?
7
One-Player Games
No opponent! No uncertainty!
Why not use all the search time at the beginning?
Deviation of the move scores can be exploited
8
SameGame
9
Single-Player Monte-Carlo Tree Search (SP-MCTS)
Selection Strategy – modified UCT that also takes the deviation of the results into account
Expansion Strategy – same as in standard MCTS
Simulation Strategy – TabuColourRandom policy
Back-Propagation Strategy – average score, sum of squared results, and best result achieved so far
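As a concrete illustration (not the exact formula or constants from the thesis), an SP-MCTS-style selection value can be sketched as standard UCT plus a deviation term built from the sum of squared results stored during back-propagation. The node fields and the constants C and D below are hypothetical placeholders.

```python
import math

def sp_mcts_selection_value(child, parent_visits, C=0.5, D=10000):
    """SP-MCTS-style selection value (sketch).

    Compared with plain UCT, a third term estimates the spread of the
    simulation results: in a one-player game there is no opponent, so a
    child with a large deviation may still hide a very high score.
    """
    mean = child.total_score / child.visits
    # Standard UCT terms: exploitation + exploration.
    value = mean + C * math.sqrt(math.log(parent_visits) / child.visits)
    # Deviation term, computed from the sum of squared results that the
    # back-propagation step keeps per node (D avoids premature certainty).
    value += math.sqrt(
        (child.sum_squared_scores - child.visits * mean ** 2 + D) / child.visits
    )
    return value
```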
10
Experiments – Simulation Strategy 250 random positions 10 million nodes in memory
11
One search or several?
12
Parameter tuning
13
High Scores
DBS          72,816
SP-MCTS(1)   73,998
SP-MCTS(2)   76,352
MC-RWS       76,764
Nested MC    77,934
SP-MCTS(3)   78,012
Spurious AI  84,414
HGSTS        84,718
14
Position 1 – Move 0
15
Position 1 – Move 10
16
Position 1 – Move 20
17
Position 1 – Move 30
18
Position 1 – Move 40
19
Position 1 – Move 52
20
Position 1 – Move 53
21
Position 1 – Move 63
22
Two-Player Game – No Chance – Perfect Information
Research Question 2: How can we solve a two-player game by using Proof-Number Search in combination with endgame databases?
23
Two-Player Game – No Chance – Perfect Information
Proof-Number Search + Endgame Databases
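For readers unfamiliar with Proof-Number Search, its core is the way proof and disproof numbers propagate up the tree. The sketch below shows only these standard update rules, with hypothetical node fields, and notes where an endgame-database hit would initialise a leaf.

```python
INF = float("inf")

def update_proof_numbers(node):
    """Recompute the proof number (pn) and disproof number (dn) of an
    internal node from its children (sketch; the fields pn, dn,
    is_or_node and children are hypothetical names).

    pn = minimum number of leaves that still have to be proved to prove
    this node; dn = the same for disproving it.  A leaf resolved by the
    endgame database is set directly to (pn=0, dn=INF) when the database
    result proves the goal and to (pn=INF, dn=0) when it refutes it.
    """
    if node.is_or_node:   # the side trying to prove the goal is to move
        node.pn = min(child.pn for child in node.children)
        node.dn = sum(child.dn for child in node.children)
    else:                 # opponent to move: every child must be proved
        node.pn = sum(child.pn for child in node.children)
        node.dn = min(child.dn for child in node.children)
```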
24
Fanorona
25
Average Branching Factor
26
Average Number of Pieces
27
Endgame Database Statistics
28
Two-Player Game – No Chance – Perfect Information
Fanorona solved (130,820,097,938 nodes) – Draw!
29
Two-Player Game – Chance or Imperfect Information
Research Question 3: How can we perform forward pruning at chance nodes in the expectimax framework?
(diagram: a chance node with branch probabilities 0.9 and 0.1)
30
Two-Player Game – Chance or Imperfect Information
ChanceProbCut – predictions based on a shallow search
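Roughly, the idea mirrors ProbCut: a reduced-depth search predicts the value of a chance node, the prediction is corrected by a fitted linear model, and if it falls outside the current search window with enough confidence, the full-depth search of that node is skipped. The sketch below only illustrates this idea; the parameters and the two search callbacks are placeholders, not the thesis implementation.

```python
def chance_node_value(children_with_probs, depth, alpha, beta,
                      shallow_search, expectimax,
                      a=1.0, b=0.0, sigma=0.5, t=1.5, reduction=2):
    """Forward pruning at a chance node, in the spirit of ChanceProbCut
    (sketch only; a, b, sigma describe a fitted model deep ≈ a*shallow + b
    with error standard deviation sigma, t is a confidence threshold, and
    the two search callbacks are supplied by the caller).
    """
    # 1) Cheap prediction of the node value from a reduced-depth search.
    estimate = sum(p * (a * shallow_search(child, depth - reduction) + b)
                   for child, p in children_with_probs)
    # 2) If the prediction lies outside the (alpha, beta) window with enough
    #    confidence, skip the expensive full-depth search of this chance node.
    if estimate - t * sigma >= beta or estimate + t * sigma <= alpha:
        return estimate
    # 3) Otherwise search the chance node normally.
    return sum(p * expectimax(child, depth - 1, alpha, beta)
               for child, p in children_with_probs)
```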
31
ChanceProbCut
32
Stratego
33
Predicting Stratego
34
Node Reduction
35
Performance gain
36
Multi-Player Game – No Chance – Perfect Information
Research Question 4: How can we improve search for multi-player games?
37
What games do you play?
38
Coalitions
39
Multi-Player Game – No Chance – Perfect Information
Max^n
40
Multi-Player Game – No Chance – Perfect Information
Paranoid
41
Max^n example (diagram): a game tree in which every leaf holds a score tuple, e.g. (6,2,6,3); at each node the player to move picks the child that maximises its own component of the tuple.
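A minimal max^n sketch, assuming a hypothetical Game interface (is_terminal, score_tuple, current_player, legal_moves, play), not the thesis implementation:

```python
def maxn(state, depth):
    """Max^n (sketch): every player maximises its own component of the
    score tuple."""
    if depth == 0 or state.is_terminal():
        return state.score_tuple()              # one score per player
    player = state.current_player()             # index into the tuple
    best = None
    for move in state.legal_moves():
        value = maxn(state.play(move), depth - 1)
        if best is None or value[player] > best[player]:
            best = value
    return best
```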
42
Paranoid example (diagram): a score tuple such as (7,2,9,5) is reduced to a single value for the root player by subtracting the opponents' scores: 7 - (2+9+5) = -9.
43
Paranoid (diagram): with the reduced single values at the leaves, the game becomes a two-player minimax tree (the root maximises, the coalition of opponents minimises), so alpha-beta pruning applies.
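A paranoid-search sketch under the same assumed Game interface as the max^n example; the reduction to a single value makes ordinary alpha-beta pruning applicable:

```python
def paranoid(state, depth, alpha, beta, root_player):
    """Paranoid search (sketch): the root player maximises its score minus
    the opponents' scores; all opponents are assumed to minimise it."""
    if depth == 0 or state.is_terminal():
        scores = state.score_tuple()
        return scores[root_player] - sum(
            s for i, s in enumerate(scores) if i != root_player)
    maximizing = state.current_player() == root_player
    for move in state.legal_moves():
        value = paranoid(state.play(move), depth - 1, alpha, beta, root_player)
        if maximizing:
            alpha = max(alpha, value)
        else:
            beta = min(beta, value)
        if alpha >= beta:
            break                                # alpha-beta cut-off
    return alpha if maximizing else beta
```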
44
Multi-Player Game – No Chance – Perfect Information
Best-Reply Search
45
Only one opponent plays
Choose that opponent by its strongest counter-move
The other opponents have to pass
Long-term planning
Less paranoid
Pruning possible
46
Best-Reply Search (diagram): the root player's moves are followed by a single layer containing the best reply among all opponents (players 2, 3 and 4); the other opponents pass, so the tree stays shallow and alpha-beta pruning applies.
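A Best-Reply Search sketch under the same assumed interface, extended with two hypothetical helpers (legal_moves_of, play_as) so a single chosen opponent can move while the others pass; this is an illustration of the idea, not the thesis code:

```python
def brs(state, depth, alpha, beta, root_player):
    """Best-Reply Search (sketch): root moves alternate with a single
    layer holding the strongest counter-move among all opponents."""
    if depth == 0 or state.is_terminal():
        scores = state.score_tuple()
        return scores[root_player] - sum(
            s for i, s in enumerate(scores) if i != root_player)
    if state.current_player() == root_player:
        # MAX layer: the root player's own moves.
        for move in state.legal_moves():
            alpha = max(alpha, brs(state.play(move), depth - 1,
                                   alpha, beta, root_player))
            if alpha >= beta:
                break
        return alpha
    # MIN layer: try every move of every opponent, but only one opponent
    # actually moves (the rest pass); keep the strongest counter-move.
    for opponent in state.players():
        if opponent == root_player:
            continue
        for move in state.legal_moves_of(opponent):
            value = brs(state.play_as(opponent, move), depth - 1,
                        alpha, beta, root_player)
            beta = min(beta, value)
            if alpha >= beta:
                return beta
    return beta
```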
47
Chinese Checkers
48
Focus
49
Rolit
50
Experiments
3 players: 6 setups
4 players: 14 setups
6 players: 62 setups
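(Presumably these counts are all assignments of the two competing algorithms to the n seats, minus the two uniform assignments: 2^n - 2, i.e. 2^3 - 2 = 6, 2^4 - 2 = 14 and 2^6 - 2 = 62.)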
51
Validation
52
Average Depth
55
BRS vs. Max^n
58
BRS vs. Paranoid
61
BRS vs. Max^n vs. Paranoid
64
Multi-Player Game – No Chance – Perfect Information
New Search Algorithm: Best-Reply Search
Ignoring opponents
Long-term planning
Illegal positions (forced passes) do not disturb its performance
Generally stronger than Max^n and Paranoid
65
Conclusions
We have investigated four ways to improve selective-search methods:
– Single-Player Monte-Carlo Tree Search
– Proof-Number Search + endgame databases
– ChanceProbCut
– Best-Reply Search
66
Future Research
Testing selective-search methods in other domains:
– Other games at the same complexity level
– Games at other complexity levels
67
Thank you for your attention!