1 Parrondo's Paradox

2 Two losing games can be combined to make a winning game. Game A: repeatedly flip a biased coin (coin a) that comes up heads with probability p_a < 1/2 and tails with probability 1 − p_a. You win a dollar each time it comes up heads, and lose a dollar otherwise.
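A minimal simulation sketch of Game A (the slides never state a concrete value for p_a; p_a = 0.49 below is an assumed illustrative value, not taken from the source):

```python
import random

def play_game_A(rounds, p_a=0.49, seed=0):
    """Simulate Game A: each round, win $1 with probability p_a, else lose $1."""
    rng = random.Random(seed)
    winnings = 0
    for _ in range(rounds):
        winnings += 1 if rng.random() < p_a else -1
    return winnings

print(play_game_A(1_000_000))  # typically well below 0 whenever p_a < 1/2
```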

3 Game B: repeatedly flip coins, but which coin is flipped depends on the previous outcomes. Let w be the number of your wins so far and l the number of your losses. Each round we bet one dollar, so w − l represents your winnings: if it is negative, you have lost money.

4 Here we use two biased coins (coin b and coin c). If 3 | (w − l), flip coin b, which comes up heads with probability p_b. Otherwise flip coin c, which comes up heads with probability p_c. Again, you win one dollar if it comes up heads, and lose one dollar otherwise.
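A similar sketch for Game B, using the values p_b = 0.09 and p_c = 0.74 introduced on a later slide (in Python, winnings % 3 == 0 tests divisibility correctly even when winnings is negative):

```python
import random

def play_game_B(rounds, p_b=0.09, p_c=0.74, seed=0):
    """Simulate Game B: coin b when w - l is divisible by 3, coin c otherwise."""
    rng = random.Random(seed)
    winnings = 0  # w - l
    for _ in range(rounds):
        p = p_b if winnings % 3 == 0 else p_c
        winnings += 1 if rng.random() < p else -1
    return winnings

print(play_game_B(1_000_000))  # tends to drift downward despite coin c being favorable
```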

5 (figure-only slide; no text in the transcript)

6 Suppose p_b = 0.09 and p_c = 0.74. If coin b were used 1/3 of the time (that is, if the winnings were a multiple of 3 in 1/3 of the rounds) and coin c the other 2/3 of the time, the probability of winning a round would be (1/3) p_b + (2/3) p_c = (1/3)(0.09) + (2/3)(0.74) ≈ 0.523 > 1/2.
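A quick sketch that checks this naive mixture and estimates how often coin b is actually used (reusing the Game B simulator idea above); the empirical fraction comes out noticeably above 1/3, which is the point of the next slide:

```python
import random

def fraction_coin_b(rounds, p_b=0.09, p_c=0.74, seed=0):
    """Fraction of Game B rounds in which w - l is a multiple of 3 (i.e. coin b is used)."""
    rng = random.Random(seed)
    winnings, hits = 0, 0
    for _ in range(rounds):
        if winnings % 3 == 0:
            hits += 1
            p = p_b
        else:
            p = p_c
        winnings += 1 if rng.random() < p else -1
    return hits / rounds

print((1/3) * 0.09 + (2/3) * 0.74)   # about 0.523, the naive estimate
print(fraction_coin_b(1_000_000))    # about 0.38, not 1/3
```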

7 But coin b may not be used 1/3 of the time! Intuitively, when starting with winnings of 0, you use coin b and most likely lose, after which you use coin c and most likely win. Thus you may spend a lot of time going back and forth between having lost one dollar and breaking even before either winning one dollar or losing two dollars.

8 Suppose we start playing Game B with winnings of 0 and continue until we have either lost three dollars or won three dollars. Note that if you are more likely to lose than win in this case, then by symmetry you are more likely to lose 3 dollars than to win 3 dollars whenever 3 | (w − l). Consider the Markov chain on the state space {−3, −2, −1, 0, 1, 2, 3}. We show that it is more likely to reach −3 than 3.

9 Let z_i be the probability that the game reaches −3 before reaching 3 when starting with winnings i. We want to calculate z_i for i = −3, ..., 3, especially z_0: z_0 > 1/2 means it is more likely to lose three dollars than to win three dollars starting from 0.

10 We have the following equations:
z_{-3} = 1,
z_{-2} = (1 − p_c) z_{-3} + p_c z_{-1},
z_{-1} = (1 − p_c) z_{-2} + p_c z_0,
z_0 = (1 − p_b) z_{-1} + p_b z_1,
z_1 = (1 − p_c) z_0 + p_c z_2,
z_2 = (1 − p_c) z_1 + p_c z_3,
z_3 = 0.
Solving this system of equations with p_b = 0.09 and p_c = 0.74 gives z_0 ≈ 0.555 > 1/2.
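A sketch that sets up and solves this linear system numerically (assuming numpy is available; the indexing below is just one convenient encoding of the states):

```python
import numpy as np

# Hitting probabilities z_i = P(reach -3 before 3 | start at i) for Game B.
p_b, p_c = 0.09, 0.74
states = [-3, -2, -1, 0, 1, 2, 3]
idx = {s: k for k, s in enumerate(states)}

A = np.zeros((7, 7))
b = np.zeros(7)
A[idx[-3], idx[-3]] = 1.0; b[idx[-3]] = 1.0   # z_{-3} = 1: already lost $3
A[idx[3], idx[3]] = 1.0;   b[idx[3]] = 0.0    # z_3 = 0: already won $3
for s in [-2, -1, 0, 1, 2]:
    p = p_b if s % 3 == 0 else p_c            # coin b only when winnings divisible by 3
    A[idx[s], idx[s]] = 1.0                   # z_s - p*z_{s+1} - (1-p)*z_{s-1} = 0
    A[idx[s], idx[s + 1]] -= p                # win: move up one dollar
    A[idx[s], idx[s - 1]] -= (1 - p)          # lose: move down one dollar

z = np.linalg.solve(A, b)
print(z[idx[0]])   # about 0.555 > 1/2, so losing $3 first is more likely
```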

11 Instead of solving the equations directly, there is a simpler way to determine the relative probability of reaching −3 or 3 first. Consider any sequence of moves that starts at 0 and ends at 3 before reaching −3. E.g. s = 0, 1, 2, 1, 2, 1, 0, −1, −2, −1, 0, 1, 2, 1, 2, 3.

12 Create a one-to-one and onto mapping f between such sequences and the sequences that start at 0 and end at −3 before 3, by negating every number starting from the last 0 in the sequence. E.g. f(s) = 0, 1, 2, 1, 2, 1, 0, −1, −2, −1, 0, −1, −2, −1, −2, −3.
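A small sketch of this mapping applied to the slide's example sequence (the function name parrondo_flip is just illustrative):

```python
def parrondo_flip(seq):
    """Negate every entry after the last 0, mapping a 0-to-3 path to a 0-to-(-3) path."""
    last_zero = max(i for i, x in enumerate(seq) if x == 0)
    return seq[:last_zero + 1] + [-x for x in seq[last_zero + 1:]]

s = [0, 1, 2, 1, 2, 1, 0, -1, -2, -1, 0, 1, 2, 1, 2, 3]
print(parrondo_flip(s))  # [0, 1, 2, 1, 2, 1, 0, -1, -2, -1, 0, -1, -2, -1, -2, -3]
```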

13 Lemma: For any sequence s of moves that starts at 0 and ends at 3 before −3, we have
Pr(s occurs) / Pr(f(s) occurs) = p_b p_c^2 / ((1 − p_b)(1 − p_c)^2).
Proof: Let
t_1 = the number of transitions from 0 to 1 in s;
t_2 = the number of transitions from 0 to −1 in s;
t_3 = the total number of transitions from −2 to −1, −1 to 0, 1 to 2, and 2 to 3;
t_4 = the total number of transitions from 2 to 1, 1 to 0, −1 to −2, and −2 to −3.

14 Then the probability that the sequence s occurs is
Pr(s) = p_b^{t_1} (1 − p_b)^{t_2} p_c^{t_3} (1 − p_c)^{t_4}.
Transform s to f(s). The transition (0 → 1) at the last 0 is changed to (0 → −1). After this point, in s the total number of transitions that move up 1 is 2 more than the number of transitions that move down 1, since s ends at 3. In f(s), the total number of transitions that move down 1 is two more than the number that move up 1.

15 It follows that the probability f(s) occurs is
Pr(f(s)) = p_b^{t_1 − 1} (1 − p_b)^{t_2 + 1} p_c^{t_3 − 2} (1 − p_c)^{t_4 + 2}.
Thus,
Pr(s) / Pr(f(s)) = p_b p_c^2 / ((1 − p_b)(1 − p_c)^2).

16 Let S be the set of all sequences of moves that start at 0 and end at 3 before −3. Then
Pr(reach 3 before −3) / Pr(reach −3 before 3) = Σ_{s ∈ S} Pr(s) / Σ_{s ∈ S} Pr(f(s)) = p_b p_c^2 / ((1 − p_b)(1 − p_c)^2),
which is 12321/15379 < 1. I.e. it is more likely to lose than win.
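An exact arithmetic check of this ratio using Python's standard fractions module:

```python
from fractions import Fraction

p_b, p_c = Fraction(9, 100), Fraction(74, 100)
ratio = (p_b * p_c**2) / ((1 - p_b) * (1 - p_c)**2)
print(ratio)   # 12321/15379, which is less than 1
```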

17 Another way of showing Game B is losing: consider the Markov chain on states {0, 1, 2}, where the state is (w − l) mod 3. Let π_i be the stationary probability of state i in this chain. The probability that we win a dollar in the stationary distribution is
p_b π_0 + p_c π_1 + p_c π_2 = p_b π_0 + p_c (1 − π_0) = p_c − (p_c − p_b) π_0.
We want to know whether this is > 1/2 or < 1/2.

18 The equations for the stationary distribution are:
π_0 + π_1 + π_2 = 1,
p_b π_0 + (1 − p_c) π_2 = π_1,
p_c π_1 + (1 − p_b) π_0 = π_2,
p_c π_2 + (1 − p_c) π_1 = π_0.
Plugging in p_b and p_c gives π_0 ≈ 0.3826, so p_c − (p_c − p_b) π_0 ≈ 0.491 < 1/2. Again, a losing game.
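A sketch that solves for the stationary distribution numerically (assuming numpy; the lstsq call simply solves the overdetermined system πP = π together with Σπ_i = 1):

```python
import numpy as np

def game_B_stationary(p_b=0.09, p_c=0.74):
    """Stationary distribution of the Game B chain on states {0, 1, 2} = (w - l) mod 3."""
    # One-step transition matrix P[i][j] = P(next state j | current state i).
    P = np.array([
        [0.0,      p_b,      1 - p_b],   # from state 0: coin b
        [1 - p_c,  0.0,      p_c],       # from state 1: coin c
        [p_c,      1 - p_c,  0.0],       # from state 2: coin c
    ])
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = game_B_stationary()
print(pi[0])                          # about 0.3826
print(0.74 - (0.74 - 0.09) * pi[0])   # about 0.491 < 1/2
```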

19 Combining Games A and B. Game C: repeatedly perform the following: flip a fair coin d; if d comes up heads, play one round as in Game A; if tails, play one round as in Game B. It seems Game C should be a losing game.
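A simulation sketch of Game C (again, p_a = 0.49 is an assumed illustrative value, since the slides do not fix p_a):

```python
import random

def play_game_C(rounds, p_a=0.49, p_b=0.09, p_c=0.74, seed=0):
    """Simulate Game C: a fair coin d decides each round between A's coin and B's rule."""
    rng = random.Random(seed)
    winnings = 0
    for _ in range(rounds):
        if rng.random() < 0.5:                      # coin d heads: play Game A's coin
            p = p_a
        else:                                       # coin d tails: play Game B's rule
            p = p_b if winnings % 3 == 0 else p_c
        winnings += 1 if rng.random() < p else -1
    return winnings

print(play_game_C(1_000_000))  # tends to drift upward even though A and B both drift down
```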

20 Analysis: If 3 | (w − l), then you win with probability (1/2) p_a + (1/2) p_b; otherwise you win with probability (1/2) p_a + (1/2) p_c. By the previous lemma (with these mixed probabilities in place of p_b and p_c), for p_a sufficiently close to 1/2 we have
((p_a + p_b)/2) · ((p_a + p_c)/2)^2 > (1 − (p_a + p_b)/2) · (1 − (p_a + p_c)/2)^2,
i.e., Game C appears to be a winning game!
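A sketch that evaluates the lemma's ratio for Game B and for the mixed probabilities of Game C (p_a = 0.49 is an assumed illustrative value):

```python
def lemma_ratio(p_zero, p_other):
    """Return p_zero * p_other^2 / ((1 - p_zero) * (1 - p_other)^2).
    A value above 1 means reaching +3 first is more likely than reaching -3 first."""
    return (p_zero * p_other**2) / ((1 - p_zero) * (1 - p_other)**2)

p_a, p_b, p_c = 0.49, 0.09, 0.74   # p_a = 0.49 is an assumed illustrative value
print(lemma_ratio(p_b, p_c))                            # about 0.80: Game B is losing
print(lemma_ratio((p_a + p_b) / 2, (p_a + p_c) / 2))    # about 1.04: Game C is winning
```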

21 Analysis: This seems odd, so we check it with the Markov chain approach, plugging in the values. If the winnings from a round of Games A, B and C are X_A, X_B and X_C respectively, then it seems that
E[X_C] = (1/2) E[X_A] + (1/2) E[X_B].
But E[X_A] < 0 and E[X_B] < 0, so how can E[X_C] > 0?

22 Reason: the above equation is wrong! Why? Consider the above-mentioned Markov chain on {0, 1, 2} for Games B and C. Letting s represent the current state, we have
E[X_C | s] = (1/2) E[X_A | s] + (1/2) E[X_B | s].

23 Linearity holds for any given step, but we must condition on the current state. Combining the games changes how much time the chain spends in each state, allowing two losing games to become a winning game!
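To illustrate this last point, a sketch comparing the stationary distributions of the {0, 1, 2} chains for Game B and Game C (with the same assumed p_a = 0.49): mixing in Game A changes how much time the chain spends in state 0, which is why the per-round win probabilities do not simply average.

```python
import numpy as np

def stationary_and_win_prob(p_zero, p_other):
    """Chain on {0, 1, 2}: the win probability is p_zero in state 0 and p_other otherwise.
    Returns (stationary distribution pi, per-round win probability under pi)."""
    P = np.array([
        [0.0,          p_zero,       1 - p_zero],
        [1 - p_other,  0.0,          p_other],
        [p_other,      1 - p_other,  0.0],
    ])
    A = np.vstack([P.T - np.eye(3), np.ones(3)])
    pi, *_ = np.linalg.lstsq(A, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)
    return pi, pi[0] * p_zero + (1 - pi[0]) * p_other

p_a, p_b, p_c = 0.49, 0.09, 0.74   # p_a = 0.49 is an assumed illustrative value
pi_B, win_B = stationary_and_win_prob(p_b, p_c)
pi_C, win_C = stationary_and_win_prob((p_a + p_b) / 2, (p_a + p_c) / 2)
print(pi_B[0], win_B)   # roughly 0.38 and 0.49: Game B loses per round
print(pi_C[0], win_C)   # a smaller pi_0 and a win probability above 1/2
```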

