Classical mathematics and new challenges László Lovász Microsoft Research One Microsoft Way, Redmond, WA 98052 Theorems and Algorithms.



Algorithmic vs. structural mathematics. Ancient and classical algorithms: geometric constructions, the Euclidean algorithm, Newton's method, Gaussian elimination.

An example: diophantine approximation and continued fractions. Given a real number α, find a good rational approximation p/q (small error with small denominator q) via the continued fraction expansion of α.
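The slide's formulas were lost in transcription. As a hedged illustration of the method it names, here is a minimal Python sketch (not from the talk) of the continued fraction expansion and its convergents, which give rational approximations p/q with error below 1/q²:

```python
from fractions import Fraction
from math import floor

def continued_fraction(alpha, n_terms):
    """Partial quotients a0, a1, ... of alpha (float input, so
    only the first several terms are reliable)."""
    terms = []
    x = alpha
    for _ in range(n_terms):
        a = floor(x)
        terms.append(a)
        frac = x - a
        if frac == 0:
            break
        x = 1 / frac
    return terms

def convergents(terms):
    """Convergents p/q from partial quotients via the standard
    recurrence p_k = a_k p_{k-1} + p_{k-2} (same for q)."""
    p0, q0, p1, q1 = 1, 0, terms[0], 1
    out = [Fraction(p1, q1)]
    for a in terms[1:]:
        p0, q0, p1, q1 = p1, q1, a * p1 + p0, a * q1 + q0
        out.append(Fraction(p1, q1))
    return out
```

For π the first four partial quotients are [3, 7, 15, 1], giving the convergents 3, 22/7, 333/106, 355/113; the last approximates π to within 1/113².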

A mini-history of algorithms. 30's: the mathematical notion of an algorithm. Church, Turing, Post: recursive functions, λ-calculus, Turing machines. Church, Gödel: algorithmic and logical undecidability.

50's, 60's: Computers. The significance of running time: simple and complex problems. Simple: sorting, searching, arithmetic, … Complex: Travelling Salesman, matching, network flows, factoring, …

late 60's-80's: Complexity theory. P = NP? Time, space, information complexity. Polynomial hierarchy. Nondeterminism, good characterization, completeness. Randomization, parallelism. Classification of many real-life problems into P vs. NP-complete.

90's: Increasing sophistication. Upper and lower bounds on complexity (algorithms vs. negative results): factoring, volume computation, semidefinite optimization, topology, algebraic geometry, coding theory.

Highlights of the 90's. Approximation algorithms: positive and negative results. Probabilistic algorithms: Markov chains, high concentration, nibble methods, phase transitions. Pseudorandom number generators: from art to science (theory and constructions).

Approximation algorithms: the Max Cut Problem. Partition the nodes of a graph into two classes so as to maximize the number of edges between the classes. NP-hard. Approximations?

Easy with 50% error: Erdős ~'65. Polynomial with 12% error: Goemans-Williamson '93 (semidefinite optimization). NP-hard with 6% error: Håstad, building on Arora-Lund-Motwani-Sudan-Szegedy '92 (interactive proof systems, PCP).
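The 50% guarantee comes from a one-line idea: put each node on a random side, so every edge is cut with probability 1/2. A minimal Python sketch of this, together with a deterministic local-search variant that achieves the same bound (an assumed illustration; the talk gives no code, and the graph representation here is a choice):

```python
import random

def random_cut(nodes, edges, rng=None):
    """Put each node on a random side; every edge is cut with
    probability 1/2, so the expected cut has |edges|/2 edges."""
    rng = rng or random.Random(0)
    side = {v: rng.random() < 0.5 for v in nodes}
    return sum(1 for u, w in edges if side[u] != side[w])

def local_search_cut(nodes, edges):
    """Deterministic version: flip any node with more uncut than cut
    incident edges.  Each flip strictly increases the cut, so this
    terminates; at a local optimum at least half of all edges are
    cut, matching the 50%-error guarantee."""
    side = {v: 0 for v in nodes}
    improved = True
    while improved:
        improved = False
        for v in nodes:
            inc = [e for e in edges if v in e]
            cut = sum(1 for u, w in inc if side[u] != side[w])
            if 2 * cut < len(inc):  # flipping v increases the cut
                side[v] ^= 1
                improved = True
    return sum(1 for u, w in edges if side[u] != side[w])
```

On the complete graph K4 the local search finds a cut of size 4 out of 6 edges, which is optimal there; the 12% and 6% thresholds above are about how much further one can (and cannot) go in polynomial time.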

Algorithms and probability. Randomized algorithms (making coin flips): difficult to analyze; important applications (primality testing, integration, optimization, volume computation, simulation). Algorithms with stochastic input: even more difficult to analyze; even more important applications. Difficulty: after a few iterations, complicated functions of the original random variables arise.
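The slide lists primality testing among the applications of coin-flipping algorithms. As a hedged illustration (not part of the talk), a minimal Miller-Rabin sketch in Python, a standard Monte Carlo primality test:

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin randomized primality test.  A composite n is
    wrongly declared prime with probability at most 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 = 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # random coin flips
        x = pow(a, d, n)                 # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness of compositeness
    return True
```

The analysis is exactly the difficulty the slide names: after a few squarings, the test works with complicated functions of the original random choice a.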

New methods in probability: strong concentration (Talagrand). Laws of Large Numbers: a sum of independent random variables is strongly concentrated. General strong concentration: very general "smooth" functions of independent random variables are strongly concentrated. Nibble, martingales, rapidly mixing Markov chains, …
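A small numerical demonstration of the first point (an assumed illustration, not from the talk): for sums of n independent fair coin flips, deviations from the mean grow only like √n, so the relative deviation shrinks as n grows.

```python
import random

def max_deviation(n, trials, rng):
    """Largest |#heads - n/2| observed over repeated runs of
    n fair coin flips."""
    return max(abs(sum(rng.random() < 0.5 for _ in range(n)) - n / 2)
               for _ in range(trials))

rng = random.Random(42)
# Deviations scale like sqrt(n), not n, so dividing by n the
# worst observed relative deviation shrinks with n.
small = max_deviation(100, 200, rng) / 100
large = max_deviation(10000, 200, rng) / 10000
```

Typical values are a few percent for n = 10000 versus well over ten percent for n = 100, in line with the √n scaling.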

Example. Want: a set of vectors over a projective plane of order q such that any 3 are linearly independent, and every vector is a linear combination of 2 of them. How few vectors suffice: O(√q)? (Open for 30 years.) Kim-Vu: every finite projective plane of order q has a complete arc of size √q · polylog(q).

First idea: use an algebraic construction (conics, …); this gives only about q. Second idea: choose at random. ????? Solution: Rödl nibble + strong concentration results.

Driving forces for the next decade: new areas of applications; the study of very large structures; more tools from classical areas of mathematics.

New areas of application: interaction between discrete and continuous. Biology: genetic code, population dynamics, protein folding. Physics: elementary particles, quarks, etc. (Feynman graphs); statistical mechanics (graph theory, discrete probability). Economics: indivisibilities (integer programming, game theory). Computing: algorithms, complexity, databases, networks, VLSI, ...

Very large structures: internet, VLSI, databases; genetic code, brain, animal, ecosystem, economy, society. How to model them? Non-constant but stable; partly random.

Very large structures: how to model them? Graph minors (Robertson, Seymour, Thomas): if a graph does not contain a given minor, then it is essentially a 1-dimensional structure (a tree-decomposition) of essentially 2-dimensional pieces (each embeddable in a fixed surface), up to a bounded number of additional nodes and except for "fringes" of bounded depth.

Very large structures: how to model them? Regularity Lemma (Szemerédi '74): given ε > 0 and k > 1, the nodes of a graph can be partitioned into a bounded number of essentially equal parts (the number of parts is between k and f(k, ε); the sizes differ by at most 1) so that almost all bipartite graphs between 2 parts are essentially random, with different densities: with at most εk² exceptional pairs of parts, for subsets X, Y of two parts the number of edges between X and Y is p|X||Y| ± εn².
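In standard notation (a sketch, with ε, k, the parts V_i, and the density p as on the slide), the regularity condition on a pair of parts can be written as:

```latex
\[
\bigl|\, e(X,Y) - p\,|X|\,|Y| \,\bigr| \;\le\; \varepsilon n^2
\qquad \text{for all } X \subseteq V_i,\ Y \subseteq V_j,
\]
```

where $e(X,Y)$ is the number of edges between $X$ and $Y$, $p$ is the edge density between $V_i$ and $V_j$, and at most $\varepsilon k^2$ of the pairs $(V_i, V_j)$ may fail this condition.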

Very large structures (internet, VLSI, databases; genetic code, brain, animal, ecosystem, economy, society): how to model them? How to handle them algorithmically? Heuristics / approximation algorithms; linear time algorithms; sublinear time algorithms (sampling). A complexity theory of linear time?

More and more tools from classical math. Example: volume computation. Given a convex body K in R^n by a membership oracle, compute its volume with relative error ε. Not possible in deterministic polynomial time, even with relative error as large as n^{cn} (Elekes; Bárány-Füredi). Possible in randomized polynomial time, for arbitrarily small ε (Dyer, Frieze, Kannan).

Complexity: for self-reducible problems, counting reduces to sampling (Jerrum-Valiant-Vazirani); enough to sample from convex bodies (a direct Monte Carlo estimate would need a number of points exponential in n). Algorithmic results: use rapidly mixing Markov chains (Broder; Jerrum-Sinclair); enough to estimate the mixing rate of the random walk on a lattice in K. Graph theory (expanders): use conductance to estimate the eigenvalue gap (Alon; Jerrum-Sinclair). Probability: use the eigenvalue gap; enough to prove an isoperimetric inequality for subsets of K. Differential geometry: isoperimetric inequality (Dyer-Frieze-Kannan 1989).
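A hedged toy sketch of the walk underlying these samplers (assumed, not from the talk): from the current point, propose a small random step and accept it only if the membership oracle says the new point is still inside the body; otherwise stay put. Here the "body" is the unit ball in R^3, standing in for a general convex K.

```python
import random

def ball_walk(in_body, x, delta, steps, rng):
    """Ball walk: from x, propose a uniform point in the ball of
    radius delta around x; reject moves that leave the body.
    in_body is the membership oracle for K."""
    dim = len(x)
    for _ in range(steps):
        # uniform direction (normalized Gaussian) times a radius
        # distributed as for a uniform point in a delta-ball
        g = [rng.gauss(0, 1) for _ in range(dim)]
        norm = sum(v * v for v in g) ** 0.5
        r = delta * rng.random() ** (1 / dim)
        y = [xi + r * gi / norm for xi, gi in zip(x, g)]
        if in_body(y):          # one membership oracle call
            x = y
    return x

# membership oracle for the unit ball in R^3
inside = lambda p: sum(v * v for v in p) <= 1.0
rng = random.Random(0)
x = ball_walk(inside, [0.0, 0.0, 0.0], 0.3, 1000, rng)
```

The mathematical content of the slides above is exactly the question this sketch sidesteps: how many steps are needed before the walk's distribution is close to uniform on K, which is controlled by conductance, the eigenvalue gap, and isoperimetry.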

Differential equations: bounds on the Poincaré constant (Payne-Weinberger); bisection method, improved isoperimetric inequality (LL-Simonovits 1990). Log-concave functions: reduction to integration (Applegate-Kannan 1992). Convex geometry: ball walk (LL 1992). Statistics: better error handling (Dyer-Frieze 1993). Optimization: better preprocessing (LL-Simonovits 1995). Functional analysis: isotropic position of convex bodies; achieving isotropic position (Kannan-LL-Simonovits 1998).

Geometry: projective (Hilbert) distance, an affine invariant isoperimetric inequality; analysis of the hit-and-run walk (LL 1999). Differential equations: log-Sobolev inequality, eliminating the "start penalty" for the lattice walk (Frieze-Kannan 1999); log-Cheeger inequality, eliminating the "start penalty" for the ball walk (Kannan-LL 1999). Scientific computing: non-reversible chains mix better; lifting (Diaconis-Holmes-Neal; Feng-LL-Pak); walk with inertia (Aspnes-Kannan-LL).

More and more tools from classical math. Linear algebra: eigenvalues, semidefinite optimization, higher incidence matrices, homology theory. Geometry: geometric representations, convexity. Analysis: generating functions, Fourier analysis, quantum computing. Number theory: cryptography. Topology, group theory, algebraic geometry, special functions, differential equations, …