1 Abstract Reasoning for Multiagent Coordination and Planning Bradley J. Clement
2 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
3 Manufacturing Example Production, Inventory, and Facility Managers
4 Manufacturing Example Production Manager’s Plan
5 Manufacturing Example Inventory Manager’s Plan
6 Manufacturing Example Facility Manager’s Plan
7 Managers must coordinate or risk failure. Managers develop plans independently. Managers need sound and complete coordination algorithm. Managers may need to make coordination decisions quickly. Managers must reason about concurrent action to use resources efficiently. Managers may need plans that handle unexpected events. Problem Characteristics
8 Problem Find preferable elaborations or modifications to a group of agents' plans that achieve their goals while striking a balance among the following objectives: Coordination (or planning) should be sound & complete. Agents should not coordinate (reason about subgoals) where there are no conflicts. Agents should act as soon as possible. Agents should accomplish goals efficiently. –Agents should act concurrently. –Agents should maximize utility. Agents should be able to handle unexpected events.
9 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
10 Approach Reason about plans at abstract levels to reduce the information needed to make efficient coordination and planning decisions –concurrent hierarchical plan representation –summarize constraints of abstract tasks from those of tasks in their decompositions –use this summary information to reason about interactions of abstract plans Construct sound and complete coordination & planning algorithms Explore techniques and heuristics for decomposition search based on summary information Analyze complexity of abstract reasoning Evaluate in different domains
11 Approach Complete at a high level using summary information to gain flexibility in execution Better solutions may exist at lower levels Summary information aids in pruning subplans to resolve threats [Figure: tradeoff - higher coordination levels give lower coordination cost and more flexibility; lower levels give crisper solutions.]
12 How Approach Addresses Problem Coordination (or planning) decisions should be sound & complete. Formalize summary information and algorithms Agents should not coordinate (reason about subgoals) where there are no conflicts. Use decomposition techniques and heuristics to focus search Agents should act as soon as possible. Find solutions efficiently at multiple levels of abstraction Agents should accomplish goals efficiently. –Agents should act concurrently. Reason about concurrent interactions at abstract levels –Agents should maximize utility. Use decomposition techniques and heuristics to guide search to better solutions Agents should be able to handle unexpected events. Preserve decomposition choices by finding abstract solutions
13 Soundness and Completeness MSW(overlaps, p sum, q sum ) returns false if postconditions conflict (otherwise the algorithm would be unsound), and false if there are conflicts among p in and q pre, q in, or p post and q in, where –p in is the set of must, always inconditions of p not achieved by inconditions of q –p post is the set of postconditions of p –q pre is the set of must preconditions of q not achieved by in- or postconditions of p –q in is the set of must, always inconditions of q not achieved by inconditions of p [Figure: agents A and B crossing a grid under the overlaps (O) relation, with the summarized pre-, in-, and postconditions (at(A,...), at(B,...)) of each agent's move plan.]
14 Approach - Limitations Coordination (or planning) decisions should be sound & complete. Do not discuss how to coordinate plan when not all goals are achievable Agents should not coordinate (reason about subgoals) where there are no conflicts. Do not offer heuristics for actively guiding search to optimize plans Agents should act as soon as possible. Do not specifically investigate how to efficiently interleave coordination, planning, and execution Agents should accomplish goals efficiently. –Agents should act concurrently. Do not explain how to use hard durations or deadlines in temporal model –Agents should maximize utility. Agents should be able to handle unexpected events. Do not offer new methods for reactive execution Do not fully incorporate language elements of common execution systems (e.g. loops, execution monitoring)
15 Approach - Limitations Do not offer algorithms/protocols that determine optimal balancing of problem objectives –do give mechanisms that enable tradeoffs Do not investigate alternative coordination/negotiation protocols –instead, identify who needs to coordinate, what needs to be coordinated, and alternative settlements Planning language –only grounded, propositional states formalized –do mention how uninstantiated variables are implemented –metric resource usage is instantaneous
16 Variables and Summary Information Reasoning about predicates with variables is done with unification: at(A, $loc) unifies with at($part, bin1). Unification results in a may relationship; if the predicates are identical, a must relationship. must: at(A, bin1), at(A, bin1) must: at(A, $loc1), at(A, $loc1) may: at(A, $loc1), at(A, $loc2)
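The must/may distinction above can be sketched with a toy unifier; this is an illustrative sketch, not the thesis's implementation, and the tuple encoding and `$`-variable convention are assumptions:

```python
# Toy sketch: predicates are tuples like ("at", "A", "$loc"),
# where "$"-prefixed terms are variables.
def is_var(t):
    return isinstance(t, str) and t.startswith("$")

def relationship(p, q):
    """Return 'must' if p and q are identical, 'may' if they unify
    (variables can be bound consistently), else None."""
    if p == q:
        return "must"
    if len(p) != len(q) or p[0] != q[0]:
        return None
    bindings = {}
    for a, b in zip(p[1:], q[1:]):
        if is_var(a) or is_var(b):
            var, val = (a, b) if is_var(a) else (b, a)
            if bindings.setdefault(var, val) != val:
                return None   # inconsistent binding
        elif a != b:
            return None       # constants clash
    return "may"
```

For the slide's examples, identical predicates such as at(A, $loc1) and at(A, $loc1) give "must", while at(A, $loc1) and at(A, $loc2) merely unify and give "may".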
17 Approach - Limitations Do not offer methods for balancing tradeoffs of problem objectives –do give mechanisms that support tradeoffs Do not investigate alternative coordination/negotiation protocols –instead, identify who needs to coordinate, what needs to be coordinated, and alternative settlements Do not study synergistic opportunities in coordination –hierarchical representations allow for it Do not offer methods for evaluating uncertainty/risk of coordination decisions Only represent grounded, propositional states –do mention how uninstantiated variables are represented Metric resource usage is instantaneous
18 Contributions Algorithms for deriving and reasoning about summary information Sound and complete concurrent hierarchical coordination & planning algorithms Integration of summary information in a local search planner Search techniques and heuristics that efficiently guide decomposition and prune the search space Complexity analyses and experiments that show where abstract reasoning exponentially reduces cost of computation and communication
19 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
20 Related Approaches
21 Related Approaches Plan merging (Georgeff '83, Ephrati & Rosenschein '94, Tsamardinos et al. '00) –Agents cannot act until plans are merged at the primitive level. –Agents cannot handle unexpected events. HTN planning –Agents cannot act until conflicts are resolved at the primitive level. –Agents cannot act concurrently. Hierarchical behavior-space search (Durfee & Montgomery '91) –My approach builds on this one. –Conflicts are over naturally abstractable resources. –Justified only intuitively with limited analyses/experiments
22 Related Approaches OICR: Online Iterative Constraint Relaxation (Pappachan, ’01) –Interleaved coordination and execution –Derives legal temporal interactions for abstract tasks –Sacrificing completeness avoids backtracking –Temporal constraint network must be computed for each coordination episode –Not clear how metric resource constraints can be integrated –Complementary to summary information approach Temporal Constraint Network O S F ? ?
23 Related Approaches TÆMS –Explicitly represents quantitative relationships between tasks (enables, hinders, facilitates, etc.) –Enables agents to reason about "progress" and partial goal achievement –Does not exploit abstraction fully GPGP –Builds coordination mechanisms on TÆMS –Evaluates combinations of mechanisms on different domains Distributed NOAH (Corkill '79) –Aimed at distributed problem solving. –Algorithm is not complete. –Conflicts must be resolved at the primitive level. Social laws –Laws may not be complete. –Agents may try to coordinate where there are no conflicts. –Laws do not actively optimize the execution of agents' plans. Temporal reasoning & planning (Allen et al. '91) –Agents cannot act until conflicts are resolved involving all details.
24 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
25 Concurrent Hierarchical Plans (CHiPs) pre, in, & postconditions - sets of literals for a set of propositions type - and, or, primitive subplans - execute all for and, one for or; empty for primitive order - conjunction of point or interval relations (B - before) [Figure: a plan hierarchy with before (B) relations among subplans.]
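The CHiP structure above can be sketched as a small data type; the class and field names below are illustrative renderings of the slide's elements, not the thesis's formal notation:

```python
from dataclasses import dataclass

# Hypothetical sketch of a CHiP (concurrent hierarchical plan) node.
@dataclass
class CHiP:
    name: str
    pre: frozenset = frozenset()      # preconditions (literals)
    inconds: frozenset = frozenset()  # inconditions
    post: frozenset = frozenset()     # postconditions
    type: str = "primitive"           # "and", "or", or "primitive"
    subplans: tuple = ()              # all executed for "and", one chosen for "or"
    order: tuple = ()                 # point/interval relations, e.g. ("before", 0, 1)

# An "or" plan chooses exactly one decomposition, e.g. one of three paths:
move = CHiP("move(A,B)", type="or",
            subplans=(CHiP("low_path"), CHiP("mid_path"), CHiP("high_path")))
```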
26 CHiP Executions, Histories, and Runs Plan execution e = (p, t s, t f, d) –preconds hold at t s, postconds at t f, and inconds within (t s, t f ) –decomposition d is a set of subplan executions constrained by order History h = (E, s I ) –E - set of plan executions –s I - initial state before any execution Run r : H × T → S –r(h, 0) = s I –postconds asserted at t f –inconds asserted just after t s
27 Asserting, Clobbering, Achieving, Undoing e asserts in/postcondition l e achieves precondition l of e' e clobbers pre/in/postcondition l of e' e undoes postcondition l of e' must - holds for all executions in all histories may - holds for some execution in some history [Figure: timelines illustrating each relation between executions e and e' for a literal l.]
28 Metric Resource Usage Depletable resource –usage carries over after end of task –gas = gas - 5 Non-depletable –usage is only local –zero after end of task –machines = machines - 2 Replenishing a resource –negative usage –gas = gas + 10 –can be depletable or non-depletable interval of task
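The depletable/non-depletable distinction can be sketched as a small profile computation; this is an assumed reading of the slide's semantics (discrete time steps, usage active over a half-open task interval), with hypothetical names:

```python
# Sketch: a depletable resource's usage carries over after the task ends
# (gas = gas - 5); a non-depletable usage is only local and returns to
# zero afterward (machines = machines - 2). Replenishing is negative usage.
def resource_profile(initial, tasks, horizon):
    """tasks: list of (start, end, usage, depletable).
    Returns the resource level at each integer time in [0, horizon)."""
    levels = []
    for t in range(horizon):
        level = initial
        for start, end, usage, depletable in tasks:
            if start <= t < end:
                level -= usage        # usage active during the task
            elif depletable and t >= end:
                level -= usage        # depletion carries over afterward
        levels.append(level)
    return levels
```

For example, a depletable usage of 5 over [1, 3) leaves the level reduced after the task, while a non-depletable usage of 2 over the same interval restores the original level at time 3.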
29 Mars Rover morning activities: move(A, B) and sunbathe. Sunbathe: soak rays (use -4W, 20 min), soak rays (use -5W, 20 min), soak rays (use -6W, 20 min). Low path: go(A,1) (use 6W, 10 min), go(1,2) (use 6W, 10 min), go(2,B) (use 12W, 20 min). Middle path: go(A,B) (use 8W, 50 min). High path: go(A,3) (use 8W, 15 min), go(3,B) (use 12W, 25 min). [Map: waypoints A-F and 1-3 connecting the three paths from A to B.]
30 Metric Resource Usage Battery power usage for three possible decompositions. [Chart: combined usage profiles for soak rays with the low, middle, and high paths.]
31 Summary Conditions existence: must, may timing: always, sometimes, first, last external preconditions external postconditions
32 Summary Conditions existence: must, may timing: always, sometimes, first, last external preconditions external postconditions [Figure: subplans with preconditions available(A), available(M2); available(A), available(M1); and available(A), available(M1), available(M2), summarized into the parent's external precondition available(A).]
33 Summary Conditions existence: must, may timing: always, sometimes, first, last external preconditions external postconditions [Figure: chained subplans with pre: available(G) and post: available(G), illustrating the external postcondition available(G).]
34 Summary Conditions existence: must, may timing: always, sometimes, first, last external preconditions external postconditions [Figure: in: available(transp1) under a meets (M) relation.]
35 Summary Conditions existence: must, may timing: always, sometimes, first, last external preconditions external postconditions [Figure: in: available(transp1) under a before (B) relation.]
36 Deriving Summary Conditions Can be run offline for a domain Recursive algorithm bottoming out at primitives Derived from those of immediate subplans O(n^2 c^2) for n non-primitive plans in the hierarchy and c conditions in each set of pre, in, and postconditions Properties of summary conditions are proven based on the procedure Proven procedures for determining must/may achieve/undo/clobber
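One piece of the derivation can be sketched for the simplest case, a serial "and" plan: a parent's external (summary) preconditions are the subplan preconditions not achieved by an earlier subplan's postconditions. This is a highly simplified illustration, not the thesis's full algorithm, which also tracks must/may existence and always/sometimes/first/last timing:

```python
# Simplified sketch for a serial "and" plan.
def summarize_pre(subplans):
    """subplans: ordered list of (pre, post) sets of literals.
    Returns the parent's external preconditions."""
    external, achieved = set(), set()
    for pre, post in subplans:
        external |= (pre - achieved)  # not yet achieved internally -> external
        achieved |= post              # later subplans can rely on these
    return external
```

With subplans requiring available(A), available(M1) and then available(G), available(M2), where the first produces available(G), the summarized preconditions are available(A), available(M1), available(M2), matching the slide's example.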
37 Metric Resource Usage Depletable resource –usage carries over after end of task –gas = gas - 5 Non-depletable –usage is only local –zero after end of task –machines = machines - 2 Replenishing a resource –negative usage –gas = gas + 10 –can be depletable or non-depletable interval of task
38 Mars Rover morning activities: move(A, B) and sunbathe. Sunbathe: soak rays (use -4W, 20 min), soak rays (use -5W, 20 min), soak rays (use -6W, 20 min). Low path: go(A,1) (use 6W, 10 min), go(1,2) (use 6W, 10 min), go(2,B) (use 12W, 20 min). Middle path: go(A,B) (use 8W, 50 min). High path: go(A,3) (use 8W, 15 min), go(3,B) (use 12W, 25 min). [Map: waypoints A-F and 1-3 connecting the three paths from A to B.]
39 Summarizing Resource Usage Battery power usage for three possible decompositions. [Chart: combined usage profiles for soak rays with the low, middle, and high paths.]
40 Summarizing Resource Usage Summarized resource usage captures the uncertainty of decomposition choices and the temporal uncertainty of partially ordered actions Can be used to determine if a resource usage may, must, or must not cause a conflict [Chart: summarized usage envelope bounding the three decompositions' profiles.]
41 Resource Summarization Algorithm Can be run offline for a domain model Run separately for each resource Recursive from the leaves up the hierarchy Summarizes a parent from the summarizations of its immediate children Considers all legal orderings of children Considers all subintervals where upper and lower bounds of children's resource usage may be reached Exponential in the number of immediate children, which is bounded by a constant for a fixed domain, so summarization is effectively constant for one resource and O(r) for r resources
42 Resource Summarization Algorithm OR
43 Resource Summarization Algorithm Serial AND
44 Resource Summarization Algorithm Parallel AND
45 Resource Summarization Algorithm for each consistent ordering of endpoints –for each subtask/subinterval summary usage assignment: use Parallel-AND to combine subtask usages within each subinterval, then use Serial-AND over the chain of subintervals Each iteration generates a profile; use the OR computation to combine the profiles
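The three combinations can be sketched over interval bounds; this is a deliberately simplified illustration in which a summarized usage is just a (lo, hi) range over one subinterval, ignoring the persistent/depleted components the full algorithm tracks:

```python
# Sketch: a summarized usage is an interval (lo, hi) bounding possible usage.
def or_combine(a, b):
    """Alternative decompositions: either profile may occur."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def parallel_and(a, b):
    """Concurrent children: usages add within a subinterval."""
    return (a[0] + b[0], a[1] + b[1])

def serial_and(chain):
    """Sequential subintervals: the envelope over the whole chain."""
    return (min(lo for lo, _ in chain), max(hi for _, hi in chain))
```

Combining two concurrent tasks bounded by (2, 5) and (1, 3) gives (3, 8) in parallel, while treating them as alternatives gives (1, 5).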
46 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
47 Determining Temporal Relations CanAnyWay({relations}, {p sum, q sum }) - relations can hold for any way p and q can be executed MightSomeWay({relations}, {p sum, q sum }) - relations might hold for some way p and q can be executed Examples: CanAnyWay({before}, {produce_H, maintenance}); CanAnyWay({overlaps}, {produce_H, maintenance}); MightSomeWay({overlaps}, {produce_H, maintenance}) (B - before, O - overlaps) [Figure: timelines for produce_H and maintenance.]
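The flavor of these two queries can be sketched on a single capacity-limited resource; this toy model (summarized usages as (lo, hi) ranges against a capacity) is an assumption for illustration, not the thesis's condition-based definitions:

```python
# Toy sketch: "overlaps" combines the two usages, "before" never does.
# CanAnyWay holds iff even the worst case fits; MightSomeWay iff the
# best case fits.
def can_any_way(rel, p, q, capacity):
    if rel == "before":
        return max(p[1], q[1]) <= capacity
    if rel == "overlaps":
        return p[1] + q[1] <= capacity
    raise ValueError(rel)

def might_some_way(rel, p, q, capacity):
    if rel == "before":
        return max(p[0], q[0]) <= capacity
    if rel == "overlaps":
        return p[0] + q[0] <= capacity
    raise ValueError(rel)
```

With usages (4, 6) and (3, 5) against capacity 10, "before" CanAnyWay, "overlaps" cannot (worst case 11), yet "overlaps" MightSomeWay (best case 7), mirroring the slide's produce_H/maintenance pattern.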
48 Concurrent Hierarchical Plan Coordination Agents individually derive summary information for their plan hierarchies Coordinator requests summary information for expansions of agents’ hierarchies from the top down After each expansion, try to resolve threats by adding ordering constraints Algorithm shown to be sound and complete
49 Search for Coordinated Plan search state –set of expanded plans –set of blocked subplans –set of temporal constraints search operators –expand –block –constrain
50 Improving Performance by Reasoning at Abstract Levels
51 Easier to Coordinate at Higher Levels Number of summary conditions per plan grows exponentially up the hierarchy O(b^(d-i) c) (b - branching factor, i - level, d - depth, c - conditions per plan)
52 Easier to Coordinate at Higher Levels Number of summary conditions per plan grows exponentially up the hierarchy O(b^(d-i) c) Number of plans per level grows exponentially down the hierarchy O(b^i) (b - branching factor, i - level, d - depth, c - conditions per plan)
53 Easier to Coordinate at Higher Levels Complexity of identifying threats among plans is O(n^2 c'^2) for n plan steps and c' summary conditions per step, or O(b^(2d) c^2) (b - branching factor, i - level, d - depth, c - conditions per plan)
54 Easier to Coordinate at Higher Levels The number of orderings to test grows doubly exponentially down the hierarchy O((b^i)!) (b - branching factor, i - level, d - depth, c - conditions per plan)
55 Easier to Coordinate at Higher Levels Resolving threats for a partial-order plan is NP-complete (reduced from Hamiltonian Path) (b - branching factor, i - level, d - depth, c - conditions per plan)
56 Easier to Coordinate at Higher Levels Number of plan steps per level grows exponentially down the hierarchy O(b^i) In the worst case, summary information for each plan grows exponentially up the hierarchy O(b^(d-i) c) Number of orderings of plans grows exponentially down the hierarchy O((b^i)!) Resolving threats is NP-complete (reduced from Hamiltonian Path) In the worst case, the search space is reduced by O(k^(b^d - b^i)); in the best case, by O(k^(b^d - b^i) b^(2(d-i))). (b - branching factor, i - level, d - depth, c - conditions per plan)
57 Easier to Coordinate at Higher Levels
level | #plans | #conds/plan | #ops to derive summ. info | solution space | #test ops/candidate
d     | b^d    | 3c'         | O(1)                      | O((b^d)!)      | O(b^(2d) c'^2)
d-1   | b^(d-1)| 3c'+b(3c') = O(bc') | O(b^(d-1) b^2 c'^2) = O(b^(d+1) c'^2) | O((b^(d-1))!) | O(b^(2(d-1)) (bc')^2) = O(b^(2d) c'^2)
d-2   | b^(d-2)| O(b^2 c')   | O(b^(d-2) b^2 (bc')^2) = O(b^(d+2) c'^2) | O((b^(d-2))!) | O(b^(2(d-2)) (b^2 c')^2) = O(b^(2d) c'^2)
...   | ...    | ...         | ...                       | ...            | ...
i     | b^i    | O(b^(d-i) c') | O(b^(2d-i) c'^2)        | O((b^i)!)      | O(b^(2d) c'^2)
...   | ...    | ...         | ...                       | ...            | ...
2     | b^2    | O(b^(d-2) c') | O(b^2 b^2 (b^(d-3) c')^2) = O(b^(2d-2) c'^2) | O((b^2)!) | O(b^4 (b^(d-2) c')^2) = O(b^(2d) c'^2)
1     | b      | O(b^(d-1) c') | O(b b^2 (b^(d-2) c')^2) = O(b^(2d-1) c'^2) | O(b!) | O(b^2 (b^(d-1) c')^2) = O(b^(2d) c'^2)
0     | 1      | O(b^d c')   | O(b^2 (b^(d-1) c')^2) = O(b^(2d) c'^2) | 1 | O(1)
Summary information for each plan grows exponentially up the hierarchy. The number of plan steps per level grows exponentially down the hierarchy. The complexity of testing an ordering of plans is constant throughout the hierarchy. The number of orderings of plans grows exponentially down the hierarchy. Resolving threats is NP-complete (reduced from Hamiltonian Path).
58 Improving Performance by Reasoning at Abstract Levels
59 Search Techniques Prune inconsistent global plans Branch & bound - abstract solutions help prune space where cost is higher “Expand most threats first” (EMTF) –expand subplan involved in most threats –focuses search on driving down to source of conflict “Fewest threats first” (FTF) –search plan states with fewest threats first –or subplans involved in most threats are blocked first
60 Evacuation Domain Experiments Compare different strategies of ordering search states and ordering expansions –FAF-FAF –DFS-ExCon –FTF-EMTF –FTF-ExCon 4 - 12 locations 2 - 4 transports no, partial, & complete overlap in locations visited
61 Evacuation Domain Experiments [Figure: evacuation plan hierarchy - evacuate decomposes into no-switch and one-switch variants (clockwise and counterclockwise), which decompose into go somewhere, switch & go to farthest, and go to safe location tasks over locations 1-8, bottoming out in primitive mov actions.]
62 Evacuation Domain Experiments EMTF beats random by more than an order of magnitude in some cases
63 Evacuation Domain Experiments EMTF and ExCon perform similarly, but EMTF more regularly finds solutions
64 Evacuation Domain Experiments FTF outperforms DFS by orders of magnitude
65 Evacuation Domain Experiments Decomposition techniques outperform FAF heuristics by orders of magnitude
66 Evacuation Domain Experiments Summary information decomposition techniques outperform the previous state-of-the-art by orders of magnitude
67 Evacuation Domain Experiments Decomposition techniques using summary information dominate previous heuristics in finding optimal solutions –FTF especially effective compared to random, DFS, and FAF –EMTF not especially more effective than ExCon but finds solutions more regularly –Overall performance differs by orders of magnitude
68 Communication in Manufacturing Domain Centralized coordinator Measure delay with varying bandwidth and latency: (n-2)l + s/b n = number of messages s = total size of messages l = latency b = bandwidth
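The delay model is a direct transcription of the slide's formula; the example parameter values below are illustrative, not taken from the experiments:

```python
# Delay for n messages of total size s bytes over a link with latency l
# seconds and bandwidth b bytes/s: delay = (n - 2)*l + s/b.
def comm_delay(n, s, l, b):
    return (n - 2) * l + s / b

# e.g. 10 messages totalling 2000 bytes over a 100 byte/s, 1 s latency link:
assert comm_delay(10, 2000, 1, 100) == 28.0
```

Sending fewer, larger messages trades the latency term for the bandwidth term, which is the tradeoff the following slides vary.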
69 Communication in Manufacturing Domain [Chart: delay; bandwidth = 100 bytes/s.]
70 Communication in Manufacturing Domain [Chart: delay; latency = 100 s.]
71 Communication in Manufacturing Domain [Chart.]
72 Communication in Manufacturing Domain [Chart: d = 10, i = 5, bandwidth = 100 Kbytes/s.]
73 Communication in Manufacturing Domain [Chart: d = 10, i = 5, latency = 1 s.]
74 Communication in Manufacturing Domain Agents can minimize communication by sending summary information at intermediate levels with a particular granularity. Sending all plan information at once can be exponentially more expensive: O(b^(d-i)). Sending summary information one task at a time can cause exponentially greater latency: O(b^i). However, if summary information does not collapse up the hierarchy and coordination must occur at primitive levels, sending all at once is best. A domain modeler can perform a similar experiment to determine the appropriate granularity at which to send summary information.
75 Multi-Level Coordination Agent (MCA) Centrally coordinates plans of requesting agents in episodes Requests summary information as needed or summarizes given hierarchies Displays discovered solutions that are “better” or Pareto-optimal Sends synchronization and decomposition choice constraints to agents upon selection of a solution
76 [Map: Binni scenario - Gao forces and Agadez forces (with false Gao and false Agadez forces) separated by Firestorm, around Lake Caca and the Laki Safari Park, with surrounding rivers and capes.]
77 Multi-Level Coordination of Military Coalitions
78 Multi-Level Coordination of Military Coalitions MCA recommends coordinated plans that are increasingly detailed and more parallel
79 Multi-Level Coordination of Military Coalitions
80 CoAX 18-month Demo - Agent Domains [Diagram: agent domains over a Grid/agent-enabled infrastructure with admin tools - Coalition/JTFHQ (MCA, observers, D'Agents, process panel, shared intel databases), JFAC HQ (AODB, air/land plans, Master Battle Planner), CAOC/Combat Ops (MBP ops, event panel, CODB), US and UK national HQs (AODB, EMAA, ALDB, CAMPS, intel databases), other national HQs, Gao observers, NOMADS-guarded observers, a UN panel (UNSGSR), and cyberspace services (weather, Ariadne e-gents, MBNLI).]
81 Master Battle Planner
82 Multi-Level Coordination of Military Coalitions
83 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
84 Concurrent Hierarchical Refinement Planner Simple modification to the coordination algorithm –discover whether potential internal conflicts exist during summarization –must expand any task with potential internal conflicts Derive summary information for the hierarchy expanded to the primitive level (iteratively expand for infinite recursion of methods) Expand the hierarchy from the top down, selecting or blocking "or" decomposition choices After each expansion, try to resolve threats –add ordering constraints –check CanAnyWay and MightSomeWay Sound and complete Same complexity benefits as the coordination algorithm
85 Concurrent Hierarchical Refinement Planner search state –set of expanded plans –set of temporal constraints –set of blocked subplans –set of variable bindings search operators –expand, block, constrain, bind
86 Summary Information in Local Search Planners Local plan-space search involves modifying (e.g. deleting, moving, adding) tasks in an existing plan. The hierarchy is used to pass parameters, specify temporal constraints, and explore alternative decompositions for subtasks. Planners like ASPEN fix the start times and durations of activities and track states and resources within a time horizon. Algorithms for reasoning about summary states and resources are used to track uncertain states/resources for abstract tasks. Using summary information results in more efficient planning and scheduling. [Figure: tradeoff - higher planning levels give lower planning cost and more flexibility; lower levels give crisper solutions.]
87 Hierarchical Scheduling [Figure: n activity hierarchies in a schedule, each with branching factor b and depth levels 0 through d, with c constraints per hierarchy over v variables.]
88 Complexity Analyses Iterative repair planners (such as ASPEN) heuristically pick conflicts and resolve them by moving activities and choosing alternative decompositions of abstract activities. Moving an activity hierarchy to resolve a conflict is O(vnc^2) for v state or resource variables, n hierarchies in the schedule, and c constraints per hierarchy per variable. Summarization can collapse the constraints per variable, making c smaller. In the worst case, where no constraints are collapsed because they are over different variables, the complexity of moving activity hierarchies is the same at all levels of expansion.
89 Complexity Analyses In the other extreme, where constraints over the same variable are always collapsed, the number of constraints c equals the number of activities and grows as b^i for b children per activity at depth level i. Thus the complexity of scheduling operations grows as O(vnb^(2i)). Along another dimension, the number of temporal constraints that can cause conflicts during scheduling grows exponentially, O(b^i), with the number of activities as hierarchies are expanded. In addition, using summary information to prune decomposition choices with greater numbers of conflicts avoids exponential computation. Thus, reasoning at abstract levels can resolve conflicts exponentially faster.
90 Complexity Analyses: Local Search Moving an activity hierarchy is a factor of O(b^(2(d-i))) more complex at level d than at level i if summary information fully collapses up the hierarchy. If no information collapses, moving a hierarchy has the same complexity at all levels, O(vnb^(2d)). The number of potential temporal constraint conflicts is a factor of O(b^(d-i)) greater at level d than at level i. Thus, reasoning at abstract levels can resolve conflicts exponentially faster.
91 Decomposition Strategies Level expansion –repair conflicts at current level of abstraction until conflicts cannot be further resolved –then decompose all activities to next level and begin repairing again Expand most threats first (EMTF) –instead of moving activity to resolve conflict, decompose with some probability (decomposition rate) –expands activities involved in greater numbers of conflicts (threats) FTF (fewest-threats-first) heuristic tests each decomposition choice and picks those with fewer conflicts with greater probability.
92 Decomposition Strategies Level expansion –repair conflicts at the current level of abstraction until conflicts cannot be further resolved –then decompose all activities to the next level and begin repairing again Expand most threats first (EMTF) –instead of moving an activity to resolve a conflict, decompose with some probability (decomposition rate) –expands activities involved in greater numbers of conflicts (threats) Relative performance of the two techniques depends on the decomposition rate selected for EMTF
93 Decomposition Strategies FTF (fewest-threats-first) heuristic tests each decomposition choice and picks those with fewer conflicts with greater probability. [Figure: rover_move with decomposition choices path1 (10 conflicts), path2 (20 conflicts), path3 (15 conflicts).]
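The FTF bias can be sketched as a weighted random choice; the 1/(1 + conflicts) weighting below is an illustrative assumption, not the thesis's actual probability model:

```python
import random

# Sketch of an FTF-style biased pick: decomposition choices with fewer
# conflicts get higher weight, e.g. weight 1/(1 + conflicts).
def ftf_choose(choices, rng=random):
    """choices: dict mapping choice name -> conflict count."""
    weights = {name: 1.0 / (1 + k) for name, k in choices.items()}
    total = sum(weights.values())
    r = rng.random() * total          # roulette-wheel selection
    for name, w in weights.items():
        r -= w
        if r <= 0:
            return name
    return name                       # guard against float rounding
```

For the slide's rover_move example (path1: 10, path2: 20, path3: 15 conflicts), path1 gets the largest weight and path2 the smallest.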
94 Multi-Rover Domain 2 to 5 rovers Triangulated field of 9 to 105 waypoints 6 to 30 science locations assigned according to a multiple travelling salesman algorithm Rovers' plans contain 3 shortest path choices to reach the next science location Paths between waypoints have capacities for a certain number of rovers Rovers cannot be at the same location at the same time Rovers cannot cross a path in opposite directions at the same time Rovers communicate with the lander over a shared channel for telemetry; different paths require more bandwidth than others
95 Experiments using ASPEN for a Multi-Rover Domain Performance improves greatly when activities share a common resource. [Charts: rarely shared resources (only path variables); a mix of rarely shared (paths) and often shared (channel) resources; often shared (channel) resource only.]
96 Experiments using ASPEN for a Multi-Rover Domain CPU time required increases dramatically for solutions found at increasing depth levels.
97 Experiments in ASPEN for a Multi-Rover Domain Picking branches that result in fewer conflicts (FTF) greatly improves performance. Expanding activities involved in greater numbers of conflicts is better than level-by-level expansion when a proper rate of decomposition is chosen.
98 Comparing Abstract Reasoning in Refinement and Local Search Planning Refinement planning/coordination sees exponential speedup based on fewer plans at abstract levels. Local search planning is exponentially faster when summarization compresses the number of constraints. But the number of potential temporal constraint violations grows exponentially with the number of plans in local search. Experiments in refinement coordination do not explore the collapse of summary information, but analysis predicts an additional O(b^(2(d-i))) speedup. Both can take advantage of FTF and EMTF, but only refinement can prune space because it backtracks.
99 Overview Problem description Summary of approach Related work Representations and supporting algorithms –CHiPs –Metric resources –Summary information Coordination algorithm –Complexity analyses –Decomposition search techniques –Applications and experiments Planning –Concurrent hierarchical refinement and local search planners –Scheduling complexity –Mars rovers experiments Conclusion
100 Contributions Algorithms for deriving and reasoning about summary information for propositional state and metric resources –must/may assert, achieve, clobber, undo –CAW & MSW to determine whether abstract plans are conflict free or unresolvable –toolbox of sound and complete algorithms for constructing efficient coordination and planning algorithms
101 Contributions Coordination and planning algorithms –sound, complete concurrent hierarchical coordination –sound, complete concurrent hierarchical planner –iterative repair planner employing abstract reasoning with summary information –evaluated in manufacturing, evacuation, military operations, and Mars rovers domains
102 Contributions Complexity analyses and experiments –Finding solutions at abstract levels is exponentially less complex, O(k^(b^d - b^i)) in the number of tasks, for both refinement and local search. –Finding abstract solutions is exponentially less complex when summarization collapses constraints, O(b^(2(d-i))), for both refinement and local search. –Experiments support the analyses in the evacuation and Mars rovers domains. –Communication delay can be reduced exponentially by gradually sending summary information, O(b^(d-i)), and by sending it at an appropriate granularity, O(b^i). –Extension of work by Korf '87 and Knoblock '91 showing how hierarchical coordination/planning can obtain exponential speedups when subgoals interact
103 Contributions Decomposition search techniques –EMTF, FTF (for refinement and local search) –Pruning of inconsistent and costlier search space –Evaluation against prior heuristics showing stronger ability to find optimal solutions at lower abstraction levels
104 Future Directions Applying summary information to other classes of coordination/planning –state-based search –complex resources –more expressive temporal models Summarizing other information –constraint hierarchies (in addition to task hierarchies) –reasoning about uncertainty and risk Coordination protocols based on summary information –organization and scaling of agent groups –BDI-based multiagent mental models Coordinating continuously Interfacing deliberative and reactive coordination Exploiting synergy while coordinating Case-based coordination