COMPSCI 230 Discrete Mathematics for Computer Science
Probability Refresher
What's a random variable? A random variable is a real-valued function on a sample space S.
Linearity of expectation: E[X + Y] = E[X] + E[Y].
Conditional Expectation
What does E[X | A] mean?
Is this true: E[X] = E[X | A] Pr[A] + E[X | ¬A] Pr[¬A]? Yes!
Similarly: Pr[A] = Pr[A | B] Pr[B] + Pr[A | ¬B] Pr[¬B],
and E[X | A] = ∑_k k · Pr[X = k | A].
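The law of total expectation above can be checked exactly on a small example. The sketch below (a fair die roll X, with A = "the roll is even"; the setup is illustrative, not from the slides) verifies E[X] = E[X | A]·Pr[A] + E[X | ¬A]·Pr[¬A]:

```python
from fractions import Fraction

# One fair die roll X; event A = "roll is even".
outcomes = range(1, 7)
p = Fraction(1, 6)

ex = sum(k * p for k in outcomes)            # E[X] = 7/2

even = [k for k in outcomes if k % 2 == 0]
odd = [k for k in outcomes if k % 2 == 1]
pr_a = Fraction(len(even), 6)                # Pr[A] = 1/2
e_given_a = Fraction(sum(even), len(even))   # E[X | A] = 4
e_given_not_a = Fraction(sum(odd), len(odd)) # E[X | ¬A] = 3

# Law of total expectation: the two sides agree exactly.
total = e_given_a * pr_a + e_given_not_a * (1 - pr_a)
print(ex, total)
```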
Random Walks

How to walk home drunk

Abstraction of Student Life
[Diagram: a probabilistic state machine with states such as "No new ideas", "Solve HW problem", "Eat", "Wait", and "Hungry", and transitions labeled "Work" and "Eat" carrying probabilities 0.3, 0.4, 0.99, and 0.01.]
Like finite automata, but instead of a deterministic or non-deterministic action, we have a probabilistic action.
Example question: "What is the probability of reaching the goal on the string Work, Eat, Work?"
Simpler: Random Walks on Graphs
At any node, go to one of the neighbors of the node with equal probability.
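A minimal sketch of such a walk (the graph and function names below are made up for illustration):

```python
import random

# A small undirected graph given as an adjacency dict (an arbitrary example).
graph = {
    0: [1, 3],
    1: [0, 2, 3],
    2: [1, 3],
    3: [0, 1, 2],
}

def random_walk(graph, start, steps, rng):
    """At each node, move to a uniformly random neighbor."""
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(graph[path[-1]]))
    return path

rng = random.Random(0)   # fixed seed so the walk is reproducible
path = random_walk(graph, start=0, steps=10, rng=rng)
print(path)
```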
Random Walk on a Line
You go into a casino with $k, and at each time step you bet $1 on a fair game. You leave when you are broke or have $n. (Picture: a line with positions 0 … n; you start at position k.)
Question 1: what is your expected amount of money at time t?
Let X_t be a random variable for the amount of money at time t.
Random Walk on a Line
You go into a casino with $k, and at each time step you bet $1 on a fair game. You leave when you are broke or have $n.
X_t = k + δ_1 + δ_2 + … + δ_t, where δ_i is a random variable for the change in your money at step i.
Since E[δ_i] = 0, we get E[X_t] = k.
Random Walk on a Line
You go into a casino with $k, and at each time step you bet $1 on a fair game. You leave when you are broke or have $n.
Question 2: what is the probability that you leave with $n?
Random Walk on a Line
Question 2: what is the probability that you leave with $n?
Allow play to continue to infinity, but set δ_i = 0 after reaching 0 or n. Then for every t,
k = E[X_t] = E[X_t | X_t = 0] · Pr(X_t = 0) + E[X_t | X_t = n] · Pr(X_t = n) + E[X_t | X_t is neither] · Pr(X_t is neither)
= n · Pr(X_t = n) + (something_t) · Pr(neither), where 0 ≤ something_t < n.
As t → ∞, Pr(neither) → 0, hence Pr(X_t = n) → E[X_t]/n = k/n.
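The k/n answer can be sanity-checked by simulation. The sketch below (parameters k, n, and the trial count are arbitrary choices) estimates Pr[leave with $n] with a fixed seed:

```python
import random

def play(k, n, rng):
    """One casino session: bet $1 on a fair game until broke or at $n."""
    money = k
    while 0 < money < n:
        money += rng.choice((-1, 1))
    return money == n

rng = random.Random(42)
k, n, trials = 3, 10, 20000
wins = sum(play(k, n, rng) for _ in range(trials))
estimate = wins / trials
print(estimate)   # should be close to k/n = 0.3
```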
Another Way To Look At It
You go into a casino with $k, and at each time step you bet $1 on a fair game. You leave when you are broke or have $n. (Picture: a line with positions 0 … n, red at 0 and green at n; you start at k.)
Question 2: what is the probability that you leave with $n?
= the probability that I hit green before I hit red.
Random Walks and Electrical Networks
What is the chance I reach green before red? It is the same as the voltage at my node if the edges are identical resistors and we put a 1-volt battery between green and red.
Random Walks and Electrical Networks
The same equations as for voltage, if all edges have the same resistance!
Let p_x = Pr(reach green before red, starting from x).
Then p_green = 1, p_red = 0, and for every other node x,
p_x = average over the neighbors y of x of p_y.
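These averaging equations can be solved numerically by simple iteration. As a sketch (the path graph and iteration count are arbitrary choices), fixing the boundary values p_0 = 0 (red) and p_n = 1 (green) on a path 0–1–…–n should recover p_x = x/n, matching the casino walk:

```python
# Iterate p_x = average of neighbors' values to a fixed point on a path graph.
n = 10
p = [0.0] * (n + 1)
p[n] = 1.0                       # boundary conditions: p[0] = 0, p[n] = 1
for _ in range(20000):           # enough sweeps to converge for this small n
    for x in range(1, n):
        p[x] = 0.5 * (p[x - 1] + p[x + 1])
print(p)                         # expect p[x] ≈ x/n
```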
Another Way To Look At It
You go into a casino with $k, and at each time step you bet $1 on a fair game. You leave when you are broke or have $n.
Question 2: what is the probability that you leave with $n?
voltage(k) = k/n = Pr[hitting n before 0, starting at k]!!!
Getting Back Home
Lost in a city, you want to get back to your hotel. How should you do this?
Depth-first search! But it requires a good memory and a piece of chalk.
Getting Back Home
How about walking randomly?
Will this work? When will I get home?
Is Pr[reach home] = 1? What is E[time to reach home]?
Pr[will reach home] = 1
We Will Eventually Get Home
Look at the first n steps. There is a non-zero chance p_1 that we get home; in fact p_1 ≥ (1/n)^n.
Suppose we fail. Then, wherever we are, there is a chance p_2 ≥ (1/n)^n that we hit home in the next n steps from there.
The probability of failing to reach home by time kn is at most (1 − p_1)(1 − p_2) ⋯ (1 − p_k) → 0 as k → ∞.
By the geometric argument above, E[time to reach home] is finite.
Furthermore (no proof provided): if the graph has n nodes and m edges, then E[time to visit all nodes] ≤ 2m × (n − 1).
Cover Times
Cover time from u: C_u = E[time to visit all vertices | start at u].
Cover time of the graph (the worst-case expected time to see all vertices): C(G) = max_u { C_u }.
Cover Time Theorem
If the graph G has n nodes and m edges, then the cover time of G is C(G) ≤ 2m(n − 1).
Any graph on n vertices has fewer than n²/2 edges, hence C(G) < n³ for all graphs G.
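As an illustrative check (not a proof), simulated cover times on a cycle graph, which has m = n edges, stay comfortably under the 2m(n − 1) bound. The graph choice, seed, and trial count below are arbitrary:

```python
import random

def cover_time(n, rng):
    """Steps for a random walk on an n-cycle to visit every node."""
    pos, seen, steps = 0, {0}, 0
    while len(seen) < n:
        pos = (pos + rng.choice((-1, 1))) % n
        seen.add(pos)
        steps += 1
    return steps

rng = random.Random(1)
n = 20
m = n                       # a cycle has as many edges as nodes
trials = [cover_time(n, rng) for _ in range(500)]
avg = sum(trials) / len(trials)
print(avg, 2 * m * (n - 1))  # empirical mean vs. the 2m(n-1) bound
```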
Actually, we get home pretty fast…
The chance that we don't hit home within (2k) · 2m(n − 1) steps is at most (1/2)^k.
A Simple Calculation
True or false: if the average income of people is $100, then more than 50% of the people can be earning more than $200 each.
False! Otherwise the average would be higher.
Markov's Inequality (Andrei A. Markov)
If X is a non-negative random variable with mean E[X], then
Pr[X > 2 E[X]] < 1/2, and more generally Pr[X > k E[X]] < 1/k.
Markov's Inequality
Let the non-negative random variable X have expectation A = E[X]. Then
A = E[X] = E[X | X > 2A] Pr[X > 2A] + E[X | X ≤ 2A] Pr[X ≤ 2A]
≥ E[X | X > 2A] Pr[X > 2A]   (since X is non-negative)
> 2A × Pr[X > 2A]   (since E[X | X > 2A] > 2A).
Hence Pr[X > 2A] < 1/2, and in general Pr[X > k × expectation] < 1/k.
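Markov's inequality can be verified exactly on a toy distribution (the pmf below is made up for illustration):

```python
from fractions import Fraction

# A small non-negative distribution, chosen arbitrarily.
pmf = {0: Fraction(1, 2), 1: Fraction(1, 4), 4: Fraction(1, 8), 8: Fraction(1, 8)}
mean = sum(x * p for x, p in pmf.items())   # E[X] = 7/4

def tail(threshold):
    """Exact Pr[X > threshold] under pmf."""
    return sum(p for x, p in pmf.items() if x > threshold)

# Markov: Pr[X > k*E[X]] < 1/k for every k.
for k in (2, 3, 4):
    print(k, tail(k * mean), Fraction(1, k))
```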
Actually, we get home pretty fast…
The chance that we don't hit home within (2k) · 2m(n − 1) steps is at most (1/2)^k.
An Averaging Argument
Suppose I start at u. Then E[time to hit all vertices | start at u] ≤ C(G).
Hence, by Markov's inequality: Pr[time to hit all vertices > 2C(G) | start at u] ≤ 1/2.
So Let's Walk Some More!
Pr[time to hit all vertices > 2C(G) | start at u] ≤ 1/2.
Suppose at time 2C(G) I'm at some node v, with some vertices still to visit. Then
Pr[haven't hit all vertices in 2C(G) more time | start at v] ≤ 1/2.
The chance that we failed both times is ≤ 1/4 = (1/2)².
Hence, Pr[haven't hit every vertex in time k × 2C(G)] ≤ (1/2)^k.
Hence, since the expected cover time satisfies C(G) ≤ 2m(n − 1), we get
Pr[home by time 4km(n − 1)] ≥ 1 − (1/2)^k.
Random walks on infinite graphs
'To steal a joke from Kakutani (U.C.L.A. colloquium talk): "A drunk man will eventually find his way home but a drunk bird may get lost forever."'
— Shizuo Kakutani, quoted in R. Durrett, Probability: Theory and Examples, fourth edition, Cambridge University Press, New York, NY, 2010, p. 191.
Random Walk On a Line
Flip an unbiased coin and go left/right. Let X_t be the position at time t. Then
Pr[X_t = i] = Pr[#heads − #tails = i] = Pr[#heads − (t − #heads) = i] = C(t, (t + i)/2) / 2^t.
Random Walk On a Line
Pr[X_2t = 0] = C(2t, t) / 2^{2t} = Θ(1/√t), by Stirling's approximation.
Let Y_2t be the indicator for (X_2t = 0), so E[Y_2t] = Θ(1/√t).
Let Z_2n be the number of visits to the origin in 2n steps. Then
E[Z_2n] = E[∑_{t=1…n} Y_2t] = Θ(1/√1 + 1/√2 + … + 1/√n) = Θ(√n).
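The Θ(√n) growth can be checked by computing E[Z_2n] exactly from the binomial formula above; the ratio to √n should stabilize (the sample values of n are arbitrary):

```python
import math

def expected_visits_1d(n):
    """Exact E[Z_2n] = sum over t of C(2t, t) / 4^t."""
    return sum(math.comb(2 * t, t) / 4 ** t for t in range(1, n + 1))

for n in (100, 400, 1600):
    print(n, expected_visits_1d(n) / math.sqrt(n))  # ratio settles near a constant
```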
In n steps, you expect to return to the origin Θ(√n) times!
How About a 2-d Grid?
Let us simplify our 2-d random walk: at each step, move one unit in the x-direction and one unit in the y-direction, each direction chosen independently by a fair coin.
In The 2-d Walk
Returning to the origin in the grid means that both "line" random walks return to their origins. So
Pr[visit origin at time t] = Θ(1/√t) × Θ(1/√t) = Θ(1/t), and
E[# of visits to origin by time n] = Θ(1/1 + 1/2 + 1/3 + … + 1/n) = Θ(log n).
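The Θ(log n) claim can likewise be checked exactly by summing the squared 1-d return probabilities; the ratio to log n should hover around a constant (the sample values of n are arbitrary):

```python
import math

def expected_visits_2d(n):
    """Exact expected visits to the origin by time 2n in the simplified 2-d walk:
    Pr[at origin at time 2t] = (C(2t, t) / 4^t)^2, summed over t."""
    return sum((math.comb(2 * t, t) / 4 ** t) ** 2 for t in range(1, n + 1))

for n in (100, 2000):
    print(n, expected_visits_2d(n) / math.log(n))  # roughly constant (≈ 1/π)
```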
But In 3D
Pr[visit origin at time t] = Θ(1/√t)³ = Θ(1/t^{3/2}), so
lim_{n→∞} E[# of visits by time n] < K for some constant K.
Writing
E[visits in total] = Pr[return to origin] × (1 + E[visits in total]) + 0 × Pr[never return to origin],
it follows that Pr[never return to origin] > 1/(K + 1).
A Tale of Two Brothers
Carl and Hal (cburch@cs.cmu.edu, hburch@cs.cmu.edu): as competitive as they are similar. They decided to play a fair game once an hour. Each brother wins any event with probability 1/2, independently of all other events. Amazingly, Hal was ahead for the last 50 years of his life! Should we have expected this?
A Random Walk
Let x_k denote the difference in victories after k events. We can view x_0, x_1, x_2, …, x_n as a "random walk" that depends on the outcomes of the individual events.
Theorem: For even n, the number of sequences in which x_k is never negative equals the number that start and end at 0.
Proof: Put the two sets in one-to-one and onto correspondence.
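The theorem can also be verified by brute force for small even n (a numerical check, not the bijection itself):

```python
from itertools import product

def never_negative(steps):
    """True if all partial sums of the ±1 sequence stay >= 0."""
    total = 0
    for s in steps:
        total += s
        if total < 0:
            return False
    return True

# For even n, count walks that never go negative vs. walks ending at 0.
for n in (2, 4, 6, 8):
    walks = list(product((-1, 1), repeat=n))
    nonneg = sum(never_negative(w) for w in walks)
    end_at_zero = sum(sum(w) == 0 for w in walks)
    print(n, nonneg, end_at_zero)   # the two counts match
```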
Lemma: Suppose that Y is an integer random variable in [0, m] such that Pr[Y = i] = Pr[Y = m − i] for all i. Then E[Y] = m/2.
Proof: Pairing each value i with m − i gives E[Y] = ∑_i i Pr[Y = i] = ∑_i (m − i) Pr[Y = i], so 2E[Y] = m.
Lemma: For odd n, the number of sequences that last touch 0 at position i equals the number that last touch 0 at position n − i − 1.
Proof: Put these sets in one-to-one and onto correspondence, using the previous theorem to map A to A′ and B to B′.
Theorem: Let Y be the length of the longest suffix that does not touch 0. For odd n, E[Y] = (n − 1)/2.
Proof: Y is a random variable in the range [0, n − 1]. By the previous lemma, Pr[Y = i] = Pr[Y = (n − 1) − i], so the symmetry lemma gives E[Y] = (n − 1)/2.
A seemingly contradictory observation: the expected length of the final suffix that does not touch 0 is (n − 1)/2, but the expected number of times the sequence touches 0 in the last n/2 steps is Θ(√n).
Berthold's Triangle
The number of sequences with k 1's and n − k 0's in which the number of 1's is greater than or equal to the number of 0's in every prefix.
Examples (n = 3): 111; 110, 101.
n = 0:  1
n = 1:  0  1
n = 2:  0  1  1
n = 3:  0  0  2  1
n = 4:  0  0  2  3  1
n = 5:  0  0  0  5  4  1
n = 6:  0  0  0  5  9  5  1
n = 7:  0  0  0  0  14 14 6  1
Notice that the n'th row sums to C(n, ⌈n/2⌉).
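The row-sum observation can be checked by brute force over all 0/1 strings (a numerical check under the prefix condition stated above):

```python
from itertools import product
from math import comb, ceil

def prefix_dominant(bits):
    """True if every prefix has at least as many 1's as 0's."""
    ones = zeros = 0
    for b in bits:
        ones += b
        zeros += 1 - b
        if ones < zeros:
            return False
    return True

# Row n of the triangle sums over k, i.e. counts all valid strings of length n.
for n in range(1, 10):
    count = sum(prefix_dominant(bits) for bits in product((0, 1), repeat=n))
    print(n, count, comb(n, ceil(n / 2)))   # the two columns match
```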
Theorem: ∑_k c(n, k) = C(n, ⌈n/2⌉), where c(n, k) is the entry of the triangle.
Proof: By induction on n. Base cases: n = 0, 1. Inductive step: [equations not preserved in the transcript].
Another strange fact…
Let Z be the number of steps before the sequence returns to 0. Let Z′ be the number of steps before the sequence drops below 0.
This looks like trouble: E[Z′] isn't finite!
Here's What You Need to Know…
Random walk on a line; cover time of a graph; Markov's inequality.