1 Biased card shuffling and the asymmetric exclusion process Elchanan Mossel (Microsoft Research). Joint work with Itai Benjamini (Microsoft Research), Noam Berger (U.C. Berkeley), and Chris Hoffman (University of Washington).

2 Card Shuffling Consider the following Markov chain on the space of permutations on N elements: Choose uniformly at random two adjacent cards. With probability p order them in increasing order. With probability q = 1-p order them in decreasing order.
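The shuffle step described above can be sketched in Python. This is an illustrative simulation, not code from the talk; the function name `shuffle_step` is mine:

```python
import random

def shuffle_step(perm, p, rng=random):
    """One step of the biased adjacent-transposition shuffle.

    Pick a uniformly random pair of adjacent cards; with probability p
    put them in increasing order, with probability q = 1 - p in
    decreasing order. Mutates perm in place and returns it."""
    n = len(perm)
    i = rng.randrange(n - 1)            # adjacent positions i, i+1
    lo, hi = sorted((perm[i], perm[i + 1]))
    if rng.random() < p:
        perm[i], perm[i + 1] = lo, hi   # increasing order
    else:
        perm[i], perm[i + 1] = hi, lo   # decreasing order
    return perm
```

With p = 1 every step is a bubble-sort comparison-exchange at a random position, which is the sense in which the chain is a "robustness analysis of bubble-sort" (slide 4).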

4 Terminology If p = q = 0.5, we call the card shuffling unbiased; otherwise we say that the system is biased, and in this case we assume w.l.o.g. that p > q. Motivation Analytic methods don't give the mixing time (more later). Do biased systems mix faster than unbiased ones? Robustness analysis of bubble-sort.

5 Mixing times The "total-variation" distance between μ and ν is d_TV(μ, ν) = max_A |μ(A) − ν(A)| = ½ Σ_σ |μ(σ) − ν(σ)|. Let μ_t^σ be the distribution on the permutations after t steps when starting at the permutation σ. The "mixing time" of the dynamics is τ_mix = min{ t : max_σ d_TV(μ_t^σ, π) ≤ 1/(2e) }, where π is the stationary distribution.
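The total-variation distance is easy to compute for small state spaces; here is a minimal helper (my own illustration, with distributions given as dicts from states to probabilities):

```python
def tv_distance(mu, nu):
    """Total-variation distance between two probability distributions:
    d_TV(mu, nu) = (1/2) * sum over states of |mu(x) - nu(x)|."""
    states = set(mu) | set(nu)
    return 0.5 * sum(abs(mu.get(s, 0.0) - nu.get(s, 0.0)) for s in states)
```

For example, two point masses on different states are at distance 1, and a distribution is at distance 0 from itself.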

6 Our Main Result We prove the following conjecture of Diaconis and Ram (2000): for all p > ½, the mixing time for the biased card shuffling is Θ(N²).

7 Related Card Shuffling Results The mixing time for the unbiased card shuffling is Θ(N³ log N) (Wilson) – sharp results, using height functions and approximate eigenfunctions. The mixing time for the "deterministic biased" card shuffling is Θ(N²) (Diaconis, Ram) – uses representation theory.

8 Methods for bounding Mixing Time: Coupling; Spectral gap; Log Sobolev constant; Representation theory.

9 Spectral gap and mixing time The card shuffling defines a stochastic matrix with spectrum 1 > γ₁ ≥ γ₂ ≥ … The "spectral gap" of the dynamics is 1 − γ₁. In general, τ_mix = O( (1 − γ₁)⁻¹ · log(1/min_σ π(σ)) ). Problem: for the biased card shuffling, 1 − γ₁ = O(1/n) and log(1/min_σ π(σ)) = Ω(n²), so we only get a bound of order n³.
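How the spectral gap controls convergence can be seen on the smallest possible example, a two-state chain, where everything is available in closed form. This is a hedged sketch of my own, not from the talk:

```python
# Two-state chain with transition matrix P = [[1-a, a], [b, 1-b]]:
# eigenvalues are 1 and 1-a-b, so the spectral gap is a+b, and the
# stationary distribution is pi = (b/(a+b), a/(a+b)).
def two_state_gap(a, b):
    return a + b

def tv_to_stationary(a, b, t):
    """Exact TV distance to stationarity after t steps, started at state 0,
    via the closed form P^t(0,0) = pi_0 + (1 - pi_0) * (1 - a - b)**t."""
    pi0 = b / (a + b)
    p00 = pi0 + (1 - pi0) * (1 - a - b) ** t
    return abs(p00 - pi0)  # for a two-state chain this equals the TV distance

a, b = 0.3, 0.2
# TV decays geometrically at rate (1 - gap), matching the general bound:
assert abs(tv_to_stationary(a, b, 5)
           - (1 - b / (a + b)) * (1 - two_state_gap(a, b)) ** 5) < 1e-12
```

The general bound on slide 9 combines this geometric decay rate with a log(1/π_min) burn-in term, which is exactly the term that costs the extra factor of n² for the biased shuffle.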

10 Log Sobolev and Mixing Time The log-Sobolev constant α (which we won't define) gives a bound on the mixing time, roughly τ_mix = O( α⁻¹ · log log(1/min_σ π(σ)) ). Problem: for the biased card shuffling, 1/α = Ω(n³).

11 Our proof – coupling Let x and y be permutations. We choose simultaneously the location and the direction for updating x and y. This defines a coupling of the two chains.

12 The Exclusion Process The state space for the exclusion process is {0,1}^N, where ones represent particles and zeroes represent their absence.

13 Dynamics of the Exclusion Process First we pick a pair of adjacent positions. If the pair contains zero or two particles, we do nothing.

14 If the pair contains exactly one particle, then with probability p we put the particle on the left, and with probability 1 − p we put it on the right.
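The update rule of slides 13–14 can be sketched as follows (an illustrative helper of my own; configurations are 0/1 lists):

```python
import random

def exclusion_step(config, p, rng=random):
    """One step of the biased exclusion process on {0,1}^N.

    Pick a uniformly random adjacent pair of sites. If it holds exactly
    one particle, put the particle on the left with probability p and on
    the right with probability 1 - p; pairs holding zero or two particles
    are left unchanged. Mutates config in place and returns it."""
    n = len(config)
    i = rng.randrange(n - 1)
    if config[i] + config[i + 1] == 1:      # exactly one particle
        if rng.random() < p:
            config[i], config[i + 1] = 1, 0  # particle to the left
        else:
            config[i], config[i + 1] = 0, 1  # particle to the right
    return config
```

Note that the step conserves the number of particles, which is why the process splits into sectors with a fixed particle count J.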

15 Projections For any J < N, consider the height functions h_J : S_N → {0,1}^N that mark with a particle each position holding one of the J smallest cards. The transition probabilities of the biased card shuffling project to the transition probabilities of the exclusion process. (Used by Wilson for the unbiased case.)
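A minimal sketch of the projection, assuming the convention that the cards are 0, …, N−1 and the J smallest cards become particles (the convention and the function name are my own; the talk does not spell them out):

```python
def height_projection(perm, J):
    """h_J: a particle at position i iff the card there is among the J
    smallest (cards assumed to be labeled 0..N-1)."""
    return [1 if card < J else 0 for card in perm]

# Example: in the permutation [2, 0, 3, 1], the two smallest cards
# (0 and 1) sit at positions 1 and 3.
projection = height_projection([2, 0, 3, 1], 2)
```

Under this convention, sorting an adjacent pair increasingly moves the particle (the smaller card) to the left, which is exactly the probability-p move of the exclusion process.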

16 The coupling on the card shuffling generates a coupling on the exclusion process with J particles. The projections jointly determine the permutation. Thus, once all the projected exclusion processes have coupled, the two permutation chains have coupled.

17 A Partial Order We define a partial order on states of the exclusion process: for x and y with Σ yᵢ = Σ xᵢ, we write y ≼ x if, for all i, the i-th particle of y is (weakly) to the left of the i-th particle of x. NOTE: The coupling preserves the partial ordering.
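The partial order is simple to state in code (my own illustration; `precedes(y, x)` encodes y ≼ x):

```python
def particle_positions(config):
    """Positions of the particles, left to right."""
    return [i for i, x in enumerate(config) if x == 1]

def precedes(y, x):
    """y is below x in the partial order: same particle count, and the
    i-th particle of y sits weakly to the left of the i-th particle of x."""
    py, px = particle_positions(y), particle_positions(x)
    return len(py) == len(px) and all(a <= b for a, b in zip(py, px))
```

The minimal configuration in a sector with J particles is 1^J 0^(N−J) (all particles piled at the left) and the maximal one is 0^(N−J) 1^J, which is what the hitting-time argument on the next slides exploits.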

18 The Partial Order and Coupling For any N and J < N, let H_{J,N} be the hitting time of the minimal configuration 1^J 0^{N−J} (all particles at the far left) starting at the maximal configuration 0^{N−J} 1^J (all particles at the far right). Since the coupling preserves the order, by that time every pair of coupled exclusion processes with J particles has coupled.

19 The Partial Order and Coupling If there exists C such that, for all N and J < N, H_{J,N} ≤ CN² with sufficiently high probability, then the mixing time of the card shuffling is O(N²).

20 Reduction: It is sufficient to prove that there exists a constant C such that, for all N, the discrete-time exclusion process starting at 0^{N−J} 1^J will hit 1^J 0^{N−J} before time CN² with probability at least 1 − 1/(eN).

21 Equivalent Formulation: There exists a constant C such that the continuous-time exclusion process starting at 0^{N−J} 1^J will hit 1^J 0^{N−J} before time CN with probability at least 1 − 1/(eN).

22 To infinite systems We can couple with the following process on ℤ. Starting from the configuration with the block of J particles at the right of the N − J holes, how much time will it take until we hit the configuration with the J particles piled up at the left?

23 The motionless process A suitable product measure is a stationary measure. It is not ergodic, so we take the ergodic component. By Poincaré recurrence, the ground configuration is recurrent. We prove that its hitting time from the stationary measure has tail exp(−Ω(n^½)) (not easy).

24 Kipnis's results for product measures Kipnis proved that, starting from an i.i.d. (product) measure on ℤ with density ρ, the location x(t) of a tagged particle satisfies a law of large numbers, with speed (p − q)(1 − ρ), and a central limit theorem.

25 Half-line results We need a similar result: starting with all particles on the left half-line and a product measure on the right half-line, the particles pile up with linear speed. By duality and reflection, it suffices to prove that Kipnis's result still holds for the one-sided process. Note that here the tagged particle moves more slowly than in the two-sided process.

26 Second class particles In order to prove the result, we couple the one-sided process, the two-sided process, and a third process with "second-class" particles, where a second-class particle jumps like a regular particle but is displaced by first-class particles arriving at its site.

27 Second class particles Consider the following coupling of the three systems. Let x₁(t) be the location of the tagged particle in system 1, and similarly define x₂(t), x₃(t) and y₃(t). Then for all t, 0 ≤ x₁(t) − x₂(t) = x₃(t) − x₂(t) ≤ max{0, x₃(t) − y₃(t)}.

28 Second class particles In order to analyze x₃(t) − y₃(t), we note that the distance between consecutive particles is geometric. By deleting all unoccupied sites, we obtain the motionless process, in which the distance between the tagged particles has an exponential tail; therefore the distance x₃(t) − y₃(t) has an exponential tail, as needed. The actual argument goes via a coupling of system 3 with a stationary system of two particles which projects to the stationary motionless measure.

29 Main steps of the main result We couple the following three processes: the finite exclusion process with J particles and N − J holes; a process that is dominated by a process with geometric gaps; and a process that behaves similarly to a process with geometric gaps and an infinite number of particles to the right.