
MEETING 26

Markov Chains and Random Walks

Fundamental Theorem of Markov Chains
If M_g is an irreducible, aperiodic Markov chain:
1. All states are positive recurrent.
2. P^k converges to a matrix W in which every row is the same vector (call it π).
3. π is the unique vector satisfying πP = π.
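The convergence of P^k can be observed numerically; the 3-state chain below is a made-up illustrative example, not taken from the slides:

```python
import numpy as np

# A small irreducible, aperiodic chain (hypothetical 3-state example;
# all entries positive, so both conditions hold trivially).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Raise P to a large power: every row converges to the same vector pi.
W = np.linalg.matrix_power(P, 50)
pi = W[0]

# pi is a fixed point of the chain: pi P = pi.
print(np.round(pi, 4))
```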

Fundamental Theorem of Markov Chains
Let M_g be a Markov chain with states S_0 … S_n. The Fundamental Theorem tells us that after a sufficiently large number of time steps, the probability of being in state S_i at time t+1 is the same as the probability of being in state S_i at time t. This steady-state condition is known as a stationary distribution. The rate at which a Markov chain converges to its stationary distribution is called the mixing rate.

Random Walks
A random walk on a connected, undirected, non-bipartite graph G can be modeled as a Markov chain M_g, where the vertices of the graph, V(G), are the states of the Markov chain and the transition matrix is:
P_uv = 1/d(u) if (u,v) ∈ E, and P_uv = 0 otherwise
(d(u) denotes the degree of u.)
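Concretely, this transition matrix can be built from the adjacency matrix; the 4-vertex graph below (a triangle with one pendant edge, connected and non-bipartite) is an illustrative assumption, not from the slides:

```python
import numpy as np

# Undirected, connected, non-bipartite graph: triangle {0,1,2} plus edge (2,3).
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0

d = A.sum(axis=1)        # d(u), the degree of u
P = A / d[:, None]       # P[u, v] = 1/d(u) if (u, v) in E, else 0
```

Each row of P sums to 1, so P is a valid stochastic matrix for the walk.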

Random Walks
M_g is irreducible because G is connected.
M_g is aperiodic:
- The period is the GCD of the lengths of all closed walks on G.
- Since G is undirected, there exist closed walks of length 2 (for any (u,v) ∈ E, the walk u-v-u).
- Since G is non-bipartite, it contains odd cycles.
- Therefore the GCD of all closed-walk lengths is 1, so M_g is aperiodic.
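The gcd step can be spelled out: once the walk has a closed walk of length 2 and a closed walk of some odd length k, its period divides gcd(2, k) = 1.

```python
from math import gcd

# A closed walk u-v-u has length 2; a non-bipartite graph also contains an
# odd cycle, i.e. a closed walk of odd length k. gcd(2, k) = 1 for every
# odd k, so the period of the walk is 1 and the chain is aperiodic.
assert all(gcd(2, k) == 1 for k in range(3, 1000, 2))
```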

Random Walks
Given that M_g is aperiodic and irreducible, we can apply the Fundamental Theorem of Markov Chains and deduce that M_g converges to a stationary distribution π.
Lemma: For all v ∈ V, π_v = d(v) / 2|E|, where d(v) is the degree of v.
Proof sketch: Let π_v denote the component corresponding to vertex v in the probability vector π. For the candidate vector π_v = d(v)/2|E|, each component of πP is Σ_{u:(u,v)∈E} (d(u)/2|E|) · (1/d(u)) = d(v)/2|E| = π_v, so πP = π; uniqueness follows from the Fundamental Theorem.
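The lemma can be checked numerically on any concrete graph; the small triangle-plus-pendant graph below is an illustrative assumption:

```python
import numpy as np

# Same illustrative graph: triangle {0,1,2} plus edge (2,3).
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
d = A.sum(axis=1)
P = A / d[:, None]

# Claimed stationary distribution: pi_v = d(v) / 2|E|.
pi = d / (2 * len(edges))
```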

Hitting time (h_uv): the expected number of steps in a random walk that starts at u and ends upon its first visit to v.
Commute time (c_uv): the expected number of steps in a random walk that starts at u, visits v once, and returns to u (c_uv = h_uv + h_vu).
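Hitting times can be computed exactly by solving a linear system: h_vv = 0 and h_wv = 1 + Σ_z P[w,z]·h_zv for w ≠ v. The sketch below checks this on a 3-vertex path, where the end-to-end hitting time is known to be (n−1)² = 4:

```python
import numpy as np

def hitting_times(P, target):
    """Expected number of steps to first reach `target` from every vertex,
    solving h_w = 1 + sum_z P[w,z] h_z for w != target, with h_target = 0."""
    n = P.shape[0]
    idx = [w for w in range(n) if w != target]
    Q = P[np.ix_(idx, idx)]  # transitions among the non-target states
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    full = np.zeros(n)
    full[idx] = h
    return full

# Random walk on the path 0 - 1 - 2.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

h_02 = hitting_times(P, target=2)[0]  # h_uv with u = 0, v = 2
h_20 = hitting_times(P, target=0)[2]  # h_vu
c_02 = h_02 + h_20                    # commute time c_uv = h_uv + h_vu
```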

The Lollipop Graph
The lollipop graph consists of n vertices: a clique on n/2 vertices joined to a path on n/2 vertices. Let u, v ∈ V, where u is in the clique and v is at the far end of the path. Surprisingly, h_uv ≠ h_vu: h_uv is Θ(n³) while h_vu is Θ(n²).
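The asymmetry already shows up on a tiny lollipop graph (the clique and path sizes below are chosen arbitrarily for illustration), again by solving the hitting-time linear system:

```python
import numpy as np

def hitting_times(P, target):
    # h_target = 0; for w != target: h_w = 1 + sum_z P[w,z] h_z
    n = P.shape[0]
    idx = [w for w in range(n) if w != target]
    Q = P[np.ix_(idx, idx)]
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    full = np.zeros(n)
    full[idx] = h
    return full

# Small lollipop: clique K4 on {0,1,2,3} with the path 3-4-5-6 attached.
n = 7
edges = [(a, b) for a in range(4) for b in range(a + 1, 4)]  # the clique
edges += [(3, 4), (4, 5), (5, 6)]                            # the path
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
P = A / A.sum(axis=1)[:, None]

h_uv = hitting_times(P, target=6)[0]  # clique vertex u=0 to path end v=6
h_vu = hitting_times(P, target=0)[6]  # path end back to the clique vertex
```

Walking from the clique out to the end of the path takes markedly longer than the reverse trip, matching the Θ(n³) vs. Θ(n²) asymptotics.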

Markov Chains: an Application
Link Prediction and Path Analysis using Markov Chains
Use Markov chains to perform probabilistic analysis and modeling of web-link sequences; i.e., if a user requests page n, what will be her most likely next choice?
Possible applications:
- Web server request prediction
- Adaptive web navigation
- Tour generation
- Personalized hubs
The model can be used in adaptive mode: the transition matrix can be updated as new data (for example, web server requests) arrives.
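A minimal sketch of the prediction idea, assuming made-up page names and click sessions (the paper's actual data and estimation details are not reproduced here):

```python
from collections import Counter, defaultdict

# Hypothetical click sessions; the page names are invented for illustration.
sessions = [
    ["home", "products", "cart"],
    ["home", "products", "products", "checkout"],
    ["home", "about", "home", "products"],
]

# First-order Markov model: count observed (current page -> next page) pairs.
counts = defaultdict(Counter)
for s in sessions:
    for cur, nxt in zip(s, s[1:]):
        counts[cur][nxt] += 1

def predict_next(page):
    """Most likely next request given the current page. Adaptive mode is
    just: keep incrementing counts as new requests arrive."""
    return counts[page].most_common(1)[0][0]

print(predict_next("home"))
```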

Markov Chains: an Application Link Prediction and Path Analysis using Markov Chains System Overview

Markov Chains: an Application
Experimental Results: HTTP Server Request Prediction
- 6,572 URIs (including HTML documents, directories, GIFs, and CGI requests)
- 40,000 requests
- Over 50% of the web server requests can be correctly predicted as the state with the highest probability.

References
L. Lovász, "Random Walks on Graphs: A Survey," Combinatorics: Paul Erdős is Eighty (vol. 2), 1996.
R. Sarukkai, "Link Prediction and Path Analysis Using Markov Chains," 9th World Wide Web Conference, May 2000.
E. Behrends, Introduction to Markov Chains: With Special Emphasis on Rapid Mixing, Vieweg & Sohn, 2000.