Fast Hamiltonicity Checking via Bases of Perfect Matchings
Marek Cygan, Stefan Kratsch, Jesper Nederlof. STOC 2013.
[Title slide shows a word cloud of talk keywords, created using wordle.net.]
Hamiltonicity (aka Hamiltonian cycle); Traveling Salesman
Known algorithms:
Held & Karp ('61), Bellman ('62): O*(2^n) time and space (Dynamic Programming).
Gurevich & Shelah ('87): O*(4^n n^{log n}) time, poly space (Divide & Conquer).
Kohn et al. ('77), Karp ('82): O*(2^n) time, poly space (Inclusion-Exclusion).
Björklund ('10): O*(1.657^n) time, poly space (Determinant approach).

The Dynamic Programming approach decomposes an HC into subpaths and keeps track of the set of visited vertices to make sure the subpaths combine.

This talk: split the input graph into two subgraphs and decompose an HC into its intersection with the subgraphs. Keep track of.. what exactly?
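The Dynamic Programming approach above can be sketched concretely. A minimal bitmask implementation of the Held-Karp recurrence (the adjacency-matrix input format and function name are illustrative):

```python
def hamiltonian_cycle_held_karp(adj):
    """Held-Karp DP: dp[S][v] = True iff there is a path that starts at
    vertex 0, visits exactly the vertex set S (a bitmask containing 0),
    and ends at v. O*(2^n) time and space."""
    n = len(adj)
    if n == 1:
        return True
    dp = [[False] * n for _ in range(1 << n)]
    dp[1][0] = True  # the trivial path {0} ending at 0
    for S in range(1 << n):
        if not (S & 1):
            continue  # every tracked path starts at vertex 0
        for v in range(n):
            if not dp[S][v]:
                continue
            for w in range(1, n):  # extend the path by an unvisited neighbor
                if adj[v][w] and not (S >> w) & 1:
                    dp[S | (1 << w)][w] = True
    full = (1 << n) - 1
    # close the cycle: a Hamiltonian path ending at v plus the edge v-0
    return any(dp[full][v] and adj[v][0] for v in range(1, n))
```

The 2^n factor in both time and space comes from the visited-vertex sets S; the rest of the talk is about replacing these sets by much less information.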
Directed Bipartite graphs
Graph with separator S
9
Graph with separator 2 S
10
Graph with separator 2 2 2 S
11
Graph with separator 2 2 2 S Seems to require storing information for every matching. ->
12
Our contribution
We determine the required amount of stored information by studying the rank of the 'Matchings Connectivity matrix' H_t: for an even integer t, and perfect matchings M_1, M_2 of the complete graph on t vertices,
H_t[M_1, M_2] = 1 if M_1 ∪ M_2 is a Hamiltonian cycle, and 0 otherwise.
Our contribution (continued)
We show that, over GF(2), H_t has rank at most 2^{t/2-1}, by giving an explicit basis X_t and a corresponding factorization:
changing representation between the bases X_t^σ for different orders σ can be done quickly;
H_t restricted to X_t × X_t induces a permutation matrix;
and use it to:
solve Hamiltonicity on Directed Bipartite graphs in O*(1.888^n) time and space;
solve Hamiltonicity in O*((2+√2)^pw) time and space if a path decomposition of width pw is given;
exclude O*((2+√2-ε)^pw) time under SETH (the Strong Exponential Time Hypothesis: k-CNF-SAT on n vars cannot be solved in O*((2-ε)^n) time when k tends to infinity).
Base cases: t=4, t=6
[These slides display the matrices H_4 and H_6 entry by entry.]
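The base cases, and the rank claim for small t, can be checked by brute force: build H_t by enumerating all perfect matchings of the complete graph K_t, then compute its rank over GF(2). A minimal sketch (function names are illustrative):

```python
def perfect_matchings(vertices):
    """Yield all perfect matchings of the complete graph on `vertices`."""
    if not vertices:
        yield frozenset()
        return
    v, rest = vertices[0], vertices[1:]
    for i, u in enumerate(rest):
        for m in perfect_matchings(rest[:i] + rest[i + 1:]):
            yield m | {frozenset((v, u))}

def union_is_hc(m1, m2, t):
    """True iff the union of two perfect matchings is one Hamiltonian cycle."""
    if m1 & m2:          # a shared edge would close a 2-cycle
        return False
    nxt1, nxt2 = {}, {}
    for e in m1:
        a, b = tuple(e)
        nxt1[a], nxt1[b] = b, a
    for e in m2:
        a, b = tuple(e)
        nxt2[a], nxt2[b] = b, a
    v, steps, use1 = 0, 0, True
    while True:          # alternate m1/m2 edges until we return to vertex 0
        v = nxt1[v] if use1 else nxt2[v]
        use1, steps = not use1, steps + 1
        if v == 0:
            return steps == t

def gf2_rank(rows):
    """Rank over GF(2) of integer-encoded rows (Gaussian elimination)."""
    pivots = {}
    for r in rows:
        while r:
            lead = r.bit_length() - 1
            if lead not in pivots:
                pivots[lead] = r
                break
            r ^= pivots[lead]
    return len(pivots)

def matchings_rank(t):
    """GF(2) rank of the matchings connectivity matrix H_t."""
    pms = list(perfect_matchings(list(range(t))))
    rows = [sum(1 << j for j, m2 in enumerate(pms) if union_is_hc(m1, m2, t))
            for m1 in pms]
    return gf2_rank(rows)
```

For t = 4, 6, 8 this yields ranks 2, 4, 8, matching the claimed bound 2^{t/2-1} (for t = 4 the three perfect matchings give H_4 = J - I, of rank 2 over GF(2)).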
Basis of H_t
We define a set X_t of pm's as follows. Divide the base set {1, ..., t} into t/2 groups of two consecutive elements: {1, 2} | {3, 4} | ... | {t-1, t}. X_t consists of all matchings using only edges between consecutive groups; these are in 1-1 correspondence with bitstrings of length t/2-1.
For M_1, M_2 ∈ X_t: M_1 ∪ M_2 is an HC iff their bitstrings are complementary.
Let X_t^σ be obtained by using the order σ rather than the identity order 1, ..., t.
Base cases t=6
Proof idea of basis
Observation: every matching belongs to X_t^σ for some order σ.
Let σ' be obtained from σ by swapping two consecutive elements. We show that, for M ∈ X_t^{σ'}, the row of M is the sum of at most 3 rows indexed by X_t^σ. This relies on the base cases: all rows involved share all but at most three edges, which can therefore be contracted; we are left with pm's on at most six vertices, i.e. the base cases.
HC in directed bipartite graphs
Focus on computing the parity of the number of Hamiltonian cycles; adding the Isolation Lemma gives a Monte Carlo algorithm.
[A worked example runs over several build slides, displaying 0/1 vectors indexed by matchings.]
HC in directed bipartite graphs
Computing the representation amounts to, for every basis matching M, computing the parity of the number of given pm's that form an HC together with M. We compute this representation using DP over all subsets of matchings in X_t. Since H_t restricted to X_t × X_t induces a permutation matrix, we can easily compute the parity of the number of 1-pairs from the two representation vectors in time linear in their length.
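The final pairing step can be sketched abstractly. Assuming, as on the basis slides, that two basis matchings combine into an HC exactly when their bitstrings are complementary, the permutation matrix is the bit-complement map, and the parity is a sparse inner product (the function name and dict encoding are illustrative):

```python
def pair_parity(a, b, k):
    """Parity of the number of combined solutions under a bit-complement
    pairing: index s on one side pairs only with index ~s on the other.

    a, b: dicts mapping k-bit indices (ints) to GF(2) coefficients of the
    two representation vectors. Runs in time linear in the vector length 2^k.
    """
    full = (1 << k) - 1
    return sum(a.get(s, 0) & b.get(s ^ full, 0) for s in range(1 << k)) & 1
```

For example, with k = 2 the vector {00, 01} pairs with {11} in exactly one complementary pair (00 with 11), giving parity 1; pairing {00, 11} with itself gives two complementary pairs and hence parity 0.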
Graphs with small pathwidth
Again, we count the number of solutions modulo 2. Use dynamic programming: for every partition of the bag into degree-0, degree-1, and degree-2 vertices, store a representation of the matchings that can be established by partial solutions. The non-trivial step is when a new edge e is inserted: change the representation to X_t^σ for some order σ that puts e at the end.
Further results
The lower bound under the Strong ETH is a rather technical reduction, using some generic gadgets and the permutation matrix property from the basis.
Recently, a superset of the authors showed that rank upper bounds such as the one presented can also be used to give deterministic algorithms, and extend to weighted problems.
Conclusions
Rank upper bounds on the partial-solutions versus partial-solutions matrix are useful for designing algorithms.
Can Hamiltonicity be solved in O*((2-ε)^n) time?
Can our tools be used to obtain faster deterministic algorithms, and algorithms for TSP?
Thanks for attending!