Combinatorial optimization and the mean field model. Johan Wästlund, Chalmers University of Technology, Sweden.


Random instances of optimization problems

Typical distance between nearby points is of order n^(-1/2)

Random instances of optimization problems A tour consists of n links, each of typical length n^(-1/2), so we expect the total length of the minimum tour to scale like n^(1/2). Beardwood-Halton-Hammersley (1959): L_n / n^(1/2) converges to a constant β for uniform random points in the unit square.

Mean field model of distance Distances X_ij chosen as i.i.d. variables. Given n and the distribution of distances, study the random variable L_n. If the distribution models distances in d dimensions, we expect L_n to scale like n^(1-1/d). In particular, pseudo-dimension 1 means L_n is asymptotically independent of n.

Mean field model of distance The edges of a complete graph on n vertices are given i.i.d. nonnegative costs, e.g. with the Exponential(1) distribution.

Mean field model of distance We are interested in the cost of the minimum matching, the minimum traveling salesman tour, etc., for large n.

Mean field model of distance Convergence in probability to a constant?
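As a sanity check of this question, here is a minimal simulation sketch, assuming NumPy and NetworkX's min_weight_matching (available in NetworkX 2.6 and later); the values of n and the number of repetitions are arbitrary illustration choices.

```python
import networkx as nx
import numpy as np

def mean_field_matching_cost(n, rng):
    """Minimum-cost perfect matching on K_n with i.i.d. Exp(1) edge costs."""
    G = nx.complete_graph(n)
    for u, v in G.edges():
        G[u][v]["weight"] = rng.exponential(1.0)
    M = nx.min_weight_matching(G, weight="weight")  # perfect matching since n is even
    return sum(G[u][v]["weight"] for u, v in M)

rng = np.random.default_rng(0)
for n in [10, 40, 100]:  # even n, so a perfect matching exists
    costs = [mean_field_matching_cost(n, rng) for _ in range(10)]
    print(n, np.mean(costs))  # the averages drift toward pi^2/12 ≈ 0.822
```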

Matching: a set of edges that pairs up all points (a perfect matching).

Statistical Physics / Combinatorial Optimization
Spin configuration ↔ Feasible solution
Hamiltonian ↔ Cost of solution
Ground state energy ↔ Cost of minimal solution
Temperature ↔ Artificial parameter T
Gibbs measure ↔ Gibbs measure
Thermodynamic limit ↔ n → ∞

Statistical physics The replica-cavity method of statistical mechanics has given spectacular predictions for random optimization problems (M. Mézard, G. Parisi, 1980's). Limit of π²/12 for minimum matching on the complete graph (Aldous 2000). A corresponding limit for the TSP (Wästlund 2006).

A. Frieze (2004): “Up to now there has been almost no progress analysing this random model of the travelling salesman problem.” N. J. Cerf et al (1997): “Researchers outside physics remain largely unaware of the analytical progress made on the random link TSP.”

Non-rigorous derivation of the π²/12 limit Matching problem on K_n for large n. In principle, this requires even n, but we shall consider a relaxation. Let the edges be exponential of mean n, so that the sequence of ordered edge costs from a given vertex is approximately a Poisson process of rate 1.
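A quick numerical check of this Poisson approximation, assuming NumPy (the sample sizes are arbitrary): with Exp(mean n) costs, the k-th smallest cost at a fixed vertex should have mean close to k, as for a rate-1 Poisson process.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 2000, 2000
# the n-1 edge costs at a fixed vertex, exponential with mean n, sorted
samples = np.sort(rng.exponential(scale=n, size=(reps, n - 1)), axis=1)
for k in [1, 2, 5, 10]:
    # for a rate-1 Poisson process, the k-th point has mean k
    print(k, samples[:, k - 1].mean())
```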

Non-rigorous derivation of the π²/12 limit The total cost of the minimum matching is of order n. Introduce a punishment c > 0 for not using a particular vertex. This makes the problem well-defined also for odd n. For fixed c, let n tend to infinity. As c tends to infinity, we expect to recover the behavior of the original problem.

Non-rigorous derivation of the π²/12 limit For large n, suppose that the problem behaves in the same way for n-1 vertices. Choose an arbitrary vertex to be the root. What does the graph look like locally around the root? When only edges of cost < 2c are considered, the graph becomes locally tree-like.

Non-rigorous derivation of the π²/12 limit Non-rigorous replica-cavity method. Aldous derived equivalent equations with the Poisson-Weighted Infinite Tree (PWIT).

Non-rigorous derivation of the π²/12 limit Let X be the difference in cost between the original problem and the problem with the root removed. If the root is not matched, then X = c. Otherwise X = min_i (ξ_i − X_i), where X_i is distributed like X, and ξ_i is the cost of the i-th edge from the root. The X_i's are assumed to be independent.
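This recursion can be explored numerically by population dynamics: keep a pool of samples of X and repeatedly resample every entry from the right-hand side. A rough sketch assuming NumPy; the cap c = 5, the pool size and the number of sweeps are arbitrary choices, and the cutoff 2c for the Poisson points uses the fact that X_i ≤ c, so a point ξ_i > 2c can never beat the cap.

```python
import numpy as np

def population_dynamics(c=5.0, pool_size=5000, sweeps=20, seed=2):
    """Iterate the cavity recursion X = min(c, min_i(xi_i - X_i)) on a pool of samples."""
    rng = np.random.default_rng(seed)
    pool = np.full(pool_size, c)           # start every pool member at the cap c
    for _ in range(sweeps):
        new = np.empty(pool_size)
        for j in range(pool_size):
            k = rng.poisson(2 * c)         # number of relevant Poisson points on [0, 2c]
            xi = rng.uniform(0, 2 * c, k)  # points beyond 2c give xi - X_i > c anyway
            Xi = rng.choice(pool, size=k)  # independent copies of X
            new[j] = min(c, (xi - Xi).min()) if k > 0 else c
        pool = new
    return pool

c = 5.0
pool = population_dynamics(c=c)
print("P(X = c) ≈", np.mean(pool == c))    # fraction of vertices left unmatched
```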

Non-rigorous derivation of the π²/12 limit It remains to do some calculations. We have, for u < c, P(X > u) = P(ξ_i − X_i > u for all i) = exp(−∫_{−u}^{∞} P(X > s) ds), where X_i is distributed like X.

Non-rigorous derivation of the π²/12 limit Let f(u) = ∫_{−u}^{∞} P(X > s) ds, so that P(X > u) = e^{−f(u)} for u < c.

Non-rigorous derivation of the π²/12 limit Then if u > −c, f′(u) = P(X > −u) = e^{−f(−u)}.

Non-rigorous derivation of the π²/12 limit Hence e^{−f(u)} + e^{−f(−u)} is constant: its derivative is −f′(u)e^{−f(u)} + f′(−u)e^{−f(−u)} = 0.

Non-rigorous derivation of the π²/12 limit The constant depends on c, and the relation holds when −c < u < c. [Figure: f(u) plotted against f(−u).]

Non-rigorous derivation of the   /12 limit From definition, exp(-f(c)) = P(X=c) = proportion of vertices that are not matched, and exp(-f(-c)) = exp(0) = 1 e -f(u) + e -f(-u) = 2 – proportion of vertices that are matched = 1 when c = infinity.

Non-rigorous derivation of the π²/12 limit

What about the cost of the minimum matching?

Non-rigorous derivation of the π²/12 limit

Hence J = the area under the curve when f(u) is plotted against f(−u)! Expected cost = n/2 times this area. In the original setting (edge costs of mean 1), this is ½ times the area = π²/12.

The equation e^{−f(u)} + e^{−f(−u)} = 1 has the explicit solution f(u) = log(1 + e^u). This gives the cost π²/12.
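A short numerical check of both claims, assuming NumPy and SciPy (the truncation of the integration range is an arbitrary cutoff): the explicit solution satisfies the equation, and the area under the curve (f(−u), f(u)) is π²/6, half of which is the matching limit π²/12.

```python
import numpy as np
from scipy.integrate import quad

f = lambda u: np.log1p(np.exp(u))      # f(u) = log(1 + e^u)

# conservation law: e^{-f(u)} + e^{-f(-u)} = 1 for all u
u = np.linspace(-5, 5, 11)
print(np.exp(-f(u)) + np.exp(-f(-u)))  # prints an array of ones

# area under the curve (f(-u), f(u)): integral of f(u) * f'(-u), with f'(-u) = 1/(1+e^u)
area, _ = quad(lambda u: f(u) / (1.0 + np.exp(u)), -40, 40)
print(area, np.pi**2 / 6)              # full area
print(area / 2, np.pi**2 / 12)         # half of it: the matching constant
```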

The exponential bipartite assignment problem: complete bipartite graph with n + n vertices and i.i.d. Exp(1) edge costs.

Exact formula conjectured by Parisi (1998): expected cost of the minimum assignment = 1 + 1/4 + 1/9 + … + 1/n². Suggests proof by induction. Researchers in discrete math, combinatorics and graph theory became interested. Generalizations…
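The formula is easy to test empirically; a small Monte Carlo sketch using SciPy's linear_sum_assignment (the matrix size and repetition count are arbitrary choices).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(3)
n, reps = 30, 2000
costs = []
for _ in range(reps):
    C = rng.exponential(1.0, size=(n, n))   # i.i.d. Exp(1) cost matrix
    r, c = linear_sum_assignment(C)         # optimal assignment
    costs.append(C[r, c].sum())

print(np.mean(costs))                               # empirical expected cost
print(sum(1.0 / k**2 for k in range(1, n + 1)))     # Parisi's formula
# the two numbers should agree to about two decimal places
```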

Generalizations by Coppersmith & Sorkin to incomplete matchings. Remarkable paper by M. Buck, C. Chan & D. Robbins (2000): introduces weighted vertices; extremely close to proving Parisi's conjecture!

Incomplete matchings: bipartite graph with m vertices on one side and n on the other.

Weighted assignment problems Weights λ_1, …, λ_m and μ_1, …, μ_n on the vertices. Edge cost (i, j) exponential of rate λ_i μ_j. Conjectured formula for the expected cost of the minimum assignment. Formula for the probability that a vertex participates in the solution (trivial for the less general setting!).

The Buck-Chan-Robbins urn process Balls are drawn with probabilities proportional to their weights λ_1, λ_2, λ_3, …
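A minimal sketch of the drawing procedure in Python (the weights 1, 2, 3 are just illustrative values).

```python
import numpy as np

def urn_draws(weights, rng):
    """Draw all balls one by one, each time with probability proportional to its weight."""
    weights = np.asarray(weights, dtype=float)
    remaining = list(range(len(weights)))
    order = []
    while remaining:
        w = weights[remaining]
        i = rng.choice(len(remaining), p=w / w.sum())  # weighted draw among remaining balls
        order.append(remaining.pop(i))                 # the drawn ball leaves the urn
    return order

rng = np.random.default_rng(4)
print(urn_draws([1.0, 2.0, 3.0], rng))  # heavier balls tend to be drawn earlier
```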

Proofs of the conjectures Two independent proofs of the Parisi and Coppersmith-Sorkin conjectures in 2003 (Nair, Prabhakar & Sharma; Linusson & Wästlund).

Rigorous method Relax by introducing an extra vertex. Let the weight of the extra vertex go to zero. Example: assignment problem with λ_1 = … = λ_m = 1, μ_1 = … = μ_n = 1, and λ_{m+1} = ε. p = P(extra vertex participates); p/n = P(edge (m+1, n) participates).

Rigorous method p/n = P(edge (m+1, n) participates). When ε → 0, this is … Hence … By the Buck-Chan-Robbins urn theorem, …

Rigorous method Hence … Inductively, this establishes the Coppersmith-Sorkin formula.

Rigorous results Much simpler proofs of the Parisi, Coppersmith-Sorkin, and Buck-Chan-Robbins formulas. Exact results for higher moments. Exact results and limits for optimization problems on the complete graph.

The 2-dimensional urn process 2-dimensional time until k balls have been drawn

Limit shape as n→∞ Matching: e^{−x} + e^{−y} = 1. TSP/2-factor: (1 + x/2)e^{−x} + (1 + y/2)e^{−y} = 1.

Mean field TSP If the edge costs are i.i.d. and satisfy P(ℓ < t)/t → 1 as t → 0 (pseudo-dimension 1), then as n → ∞, L_n(TSP) converges in probability to a constant: half the area under the TSP/2-factor curve above, numerically ≈ 2.04.
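A numerical sketch with SciPy that recovers this constant as half the area under the limit-shape curve stated above (the integration range and root-bracketing bounds are arbitrary cutoffs).

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

g = lambda t: (1.0 + t / 2.0) * np.exp(-t)   # decreasing from 1 to 0 on [0, inf)

def y_of_x(x):
    """Solve (1 + x/2)e^{-x} + (1 + y/2)e^{-y} = 1 for y >= 0."""
    target = 1.0 - g(x)
    if target <= 0.0:
        return 0.0   # only at x = 0, which quad never evaluates exactly
    return brentq(lambda y: g(y) - target, 0.0, 100.0)

area, _ = quad(y_of_x, 0.0, 60.0, limit=200)
print(area / 2.0)    # close to the mean field TSP constant, about 2.04
```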

For the TSP, the replica-cavity approach gives f′(u) = (1 + f(−u)) e^{−f(−u)}.

It follows that (1 + f(u)/2) e^{−f(u)} + (1 + f(−u)/2) e^{−f(−u)} is constant, and = 1 by the boundary conditions. The replica-cavity prediction agrees with the rigorous result (Parisi 2006).

Further exact formulas

LP-relaxation of matching in the complete graph K_n

Future work Explain why the cavity method gives the same equation as the limit shape in the urn process. Reprove results of one method with the other. Find the variance with the replica method. Find rigorously the distribution of edge costs participating in the solution (there is an exact conjecture).