Stat 35b: Introduction to Probability with Applications to Poker

Outline for the day:
1. PDFs and CDFs, for 6.12.
2. Random walks, ch 7.6. Reflection principle, ballot theorem, and avoiding 0.
3. Hellmuth and Farha.
4. Chip proportions and induction.

Homework 4, Stat 35, due Thu March 14, 11 am: 6.12, 7.2, 7.8, 7.14. Project B is due Tue, Mar. 12, 8pm, by email. Read ch 7. Also, read the pages on optimal play, though it's not on the final. If you want examples and descriptions of the arguments to your function, see projectBexamples.txt on the course website. Many questions on the final will be similar to midterm questions, so go over the midterm questions you missed. Ignore the extra-credit on the midterm, though.

1. Pdfs and cdfs, for 6.12.

Suppose X has the cdf F(c) = P(X ≤ c) = 1 - exp(-4c) for c ≥ 0, and F(c) = 0 for c < 0. To find the pdf f(c), take the derivative of F(c): f(c) = F'(c) = 4 exp(-4c) for c ≥ 0, and f(c) = 0 for c < 0. Thus X is exponential with λ = 4.

Now, what is E(X)? E(X) = ∫_{-∞}^{∞} c f(c) dc = ∫_0^∞ c {4 exp(-4c)} dc = ¼, after integrating by parts, or just by remembering that if X is exponential with rate λ = 4, then E(X) = ¼.

For 6.12, the key things to remember are:
1. f(c) = F'(c).
2. For an exponential random variable with rate λ (i.e., mean 1/λ), F(c) = 1 - exp(-λc).
3. For any Z, E(Z) = ∫ c f(c) dc, and V(Z) = E(Z²) - [E(Z)]², where E(Z²) = ∫ c² f(c) dc.
4. E(X) for an exponential = 1/λ.
5. E(X²) for an exponential = 2/λ².

For 7.14, the key things to remember are:
1. If p ≠ 0.5, then by Theorem 7.6.8, P(win tournament) = (1 - r^k)/(1 - r^n), where r = q/p.
2. Let x = r². If -x³ + 2x - 1 = 0, that means (x - 1)(-x² - x + 1) = 0. There are 3 solutions. One is x = 1. The others occur when x² + x - 1 = 0, so x = [-1 ± √5]/2 ≈ 0.618 or -1.618. So x ≈ -1.618, 0.618, or 1. Two of these possibilities can be ruled out: x = r² cannot be negative, and x = 1 would mean r = 1, i.e., p = 0.5. Remember that p ≠ 0.5.
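The 6.12 facts above are easy to sanity-check numerically. Below is a minimal Python sketch (mine, not part of the slides) that verifies E(X) = 1/λ and E(X²) = 2/λ² for λ = 4 by numerical integration, and finds the three roots of -x³ + 2x - 1 = 0 from the 7.14 hint.

```python
import numpy as np

lam = 4.0  # rate of the exponential, as in the worked example

# pdf f(c) = lam * exp(-lam * c) for c >= 0
c = np.linspace(0, 20, 200001)      # fine grid; the tail beyond 20 is negligible
f = lam * np.exp(-lam * c)

EX  = np.trapz(c * f, c)            # should be close to 1/lam = 0.25
EX2 = np.trapz(c**2 * f, c)         # should be close to 2/lam**2 = 0.125
print(EX, 1 / lam)                  # ~0.25
print(EX2, 2 / lam**2)              # ~0.125
print("V(X) =", EX2 - EX**2)        # ~1/lam**2 = 0.0625

# Roots of -x^3 + 2x - 1 = 0 (the equation from the 7.14 hint)
roots = np.roots([-1, 0, 2, -1])    # coefficients of -x^3 + 0x^2 + 2x - 1
print(np.sort(roots))               # approx -1.618, 0.618, 1.0
```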

2. Random walks, ch 7.6.

Suppose that X_1, X_2, … are iid, and S_k = X_0 + X_1 + … + X_k for k = 0, 1, 2, …. The totals {S_0, S_1, S_2, …} form a random walk. The classical (simple) case is when each X_i is 1 or -1 with probability ½ each.

Reflection principle: The number of paths from (0, X_0) to (n, y) that touch the x-axis = the number of paths from (0, -X_0) to (n, y), for any n, y, and X_0 > 0.

Ballot theorem: In n = a + b hands, if player A won a hands and B won b hands, where a > b, and if the hands are aired in random order, then P(A won more hands than B throughout the telecast) = (a - b)/n. [In an election, if candidate X gets x votes and candidate Y gets y votes, where x > y, then the probability that X always leads Y throughout the counting is (x - y)/(x + y).]

For a simple random walk, P(S_1 ≠ 0, S_2 ≠ 0, …, S_n ≠ 0) = P(S_n = 0), for any even n.
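A quick simulation makes both statements concrete. The sketch below (my illustration, not from the slides) checks the ballot theorem for a = 7, b = 4 and the avoiding-zero identity for n = 10, using a simple ±1 walk; the function names and parameter choices are mine.

```python
import random

def ballot_check(a=7, b=4, trials=200_000):
    """Estimate P(A is strictly ahead throughout) when a wins for A and b for B air in random order."""
    n = a + b
    hands = [1] * a + [-1] * b
    leads = 0
    for _ in range(trials):
        random.shuffle(hands)
        total, ahead = 0, True
        for h in hands:
            total += h
            if total <= 0:          # A not strictly ahead at some point
                ahead = False
                break
        leads += ahead
    return leads / trials, (a - b) / n   # simulated vs. theoretical (a-b)/n

def avoid_zero_check(n=10, trials=200_000):
    """Compare P(S_1, ..., S_n all nonzero) with P(S_n = 0) for a simple random walk."""
    never_zero = hits_zero_at_n = 0
    for _ in range(trials):
        s, nonzero = 0, True
        for _ in range(n):
            s += random.choice((1, -1))
            if s == 0:
                nonzero = False
        never_zero += nonzero
        hits_zero_at_n += (s == 0)
    return never_zero / trials, hits_zero_at_n / trials

print(ballot_check())      # both numbers should be near 3/11 ≈ 0.273
print(avoid_zero_check())  # the two estimates should be close to each other
```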

Reflection principle: The number of paths from (0, X_0) to (n, y) that touch the x-axis = the number of paths from (0, -X_0) to (n, y), for any n, y, and X_0 > 0.

For each path from (0, X_0) to (n, y) that touches the x-axis, you can reflect the first part of the path, up until the first time it touches the x-axis, to get a path from (0, -X_0) to (n, y), and vice versa. The total number of paths from (0, -X_0) to (n, y) is easy to count: it's just C(n, a), where a is the number of up-steps and b the number of down-steps [i.e., a - b = y - (-X_0) = y + X_0, and a + b = n, so b = n - a, 2a - n = y + X_0, and a = (n + y + X_0)/2].
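As a check on this counting argument, the brute-force sketch below (mine, not from the slides) enumerates every ±1 path of length n and confirms that the number starting at X_0 that touch 0 and end at y > 0 equals the number starting at -X_0 that end at y, which in turn equals C(n, a) with a = (n + y + X_0)/2. The numbers X_0 = 2, n = 10, y = 4 are an arbitrary choice.

```python
from itertools import product
from math import comb

def count_paths(start, n, end, must_touch_zero=False):
    """Count +/-1 step paths of length n from height `start` to height `end`."""
    total = 0
    for steps in product((1, -1), repeat=n):
        h, touched = start, False
        for s in steps:
            h += s
            if h == 0:
                touched = True
        if h == end and (touched or not must_touch_zero):
            total += 1
    return total

x0, n, y = 2, 10, 4
touching = count_paths(x0, n, y, must_touch_zero=True)   # paths from x0 that touch the axis
reflected = count_paths(-x0, n, y)                        # all paths from -x0 (they must cross 0)
a = (n + y + x0) // 2
print(touching, reflected, comb(n, a))   # all three should agree (45 for these values)
```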

Ballot theorem: In n = a + b hands, if player A won a hands and B won b hands, where a > b, and if the hands are aired in random order, then P(A won more hands than B throughout the telecast) = (a - b)/n.

Proof: After n = a + b hands, the difference in hands won is a - b. Let x = a - b. We want to count the number of paths from (1, 1) to (n, x) that do not touch the x-axis. By the reflection principle, the number of paths from (1, 1) to (n, x) that do touch the x-axis equals the total number of paths from (1, -1) to (n, x). So the number of paths from (1, 1) to (n, x) that do not touch the x-axis equals the number of paths from (1, 1) to (n, x) minus the number of paths from (1, -1) to (n, x)
= C(n-1, a-1) - C(n-1, a)
= (n-1)! / [(a-1)! (n-a)!] - (n-1)! / [a! (n-a-1)!]
= {n! / [a! (n-a)!]} [(a/n) - (n-a)/n]
= C(n, a) (a - b)/n.
Each path is equally likely and has probability 1/C(n, a), so P(going from (0, 0) to (n, x) without touching the x-axis after time 0) = (a - b)/n.
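The algebraic step in the middle of the proof is easy to verify numerically. A tiny sketch (mine) checking C(n-1, a-1) - C(n-1, a) = C(n, a)(a - b)/n over a range of a > b:

```python
from math import comb

for a in range(2, 12):
    for b in range(1, a):                 # require a > b
        n = a + b
        lhs = comb(n - 1, a - 1) - comb(n - 1, a)
        rhs = comb(n, a) * (a - b) // n   # exact: the left side shows n divides C(n,a)*(a-b)
        assert lhs == rhs, (a, b)
print("identity C(n-1,a-1) - C(n-1,a) = C(n,a)(a-b)/n verified")
```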

Avoiding zero. For a simple random walk, for any even n, P(S_1 ≠ 0, S_2 ≠ 0, …, S_n ≠ 0) = P(S_n = 0).

Proof: The number of paths from (0, 0) to (n, j) that don't touch the x-axis at positive times = the number of paths from (1, 1) to (n, j) that don't touch the x-axis at positive times = [paths from (1, 1) to (n, j)] - [paths from (1, -1) to (n, j)], by the reflection principle, = N_{n-1, j-1} - N_{n-1, j+1}.

Let Q_{n,j} = P(S_n = j). Then P(S_1 > 0, S_2 > 0, …, S_{n-1} > 0, S_n = j) = ½[Q_{n-1, j-1} - Q_{n-1, j+1}]. Summing from j = 2 to ∞,
P(S_1 > 0, S_2 > 0, …, S_{n-1} > 0, S_n > 0)
= ½[Q_{n-1,1} - Q_{n-1,3}] + ½[Q_{n-1,3} - Q_{n-1,5}] + ½[Q_{n-1,5} - Q_{n-1,7}] + …
= ½ Q_{n-1,1}
= ½ P(S_n = 0),
because to end up at (n, 0) you have to be at (n-1, ±1), so P(S_n = 0) = ½ Q_{n-1,1} + ½ Q_{n-1,-1} = Q_{n-1,1}.
By the same argument, P(S_1 < 0, S_2 < 0, …, S_{n-1} < 0, S_n < 0) = ½ P(S_n = 0). So, P(S_1 ≠ 0, S_2 ≠ 0, …, S_n ≠ 0) = P(S_n = 0).
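Both sides of the avoiding-zero identity can be computed exactly. The sketch below (mine) computes P(S_n = 0) = C(n, n/2)/2^n directly, and computes P(S_1 ≠ 0, …, S_n ≠ 0) by dynamic programming over the walk's position, confirming the two agree for several even n.

```python
from math import comb
from fractions import Fraction

def p_never_zero(n):
    """P(S_1 != 0, ..., S_n != 0) for a simple +/-1 random walk, by dynamic programming."""
    # dist[h] = probability of being at height h after k steps without ever having hit 0
    dist = {1: Fraction(1, 2), -1: Fraction(1, 2)}
    for _ in range(n - 1):
        nxt = {}
        for h, p in dist.items():
            for step in (1, -1):
                h2 = h + step
                if h2 != 0:                     # discard paths that return to zero
                    nxt[h2] = nxt.get(h2, Fraction(0)) + p / 2
        dist = nxt
    return sum(dist.values())

for n in (2, 4, 6, 8, 10, 12):
    lhs = p_never_zero(n)
    rhs = Fraction(comb(n, n // 2), 2**n)       # P(S_n = 0)
    print(n, lhs, rhs, lhs == rhs)
```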

3. Hellmuth and Farha. P(flush | suited)? P(flush | unsuited)? Consider only flushes of your suit.

Suited (3, 4, or 5 more cards of your suit on the board): [C(11,3)·C(39,2) + C(11,4)·39 + C(11,5)] / C(50,5) ≈ 6.40%.
Unsuited (two different suits, so two chances at 4 or 5 board cards of one of your suits): [C(12,4)·38 + C(12,4)·38 + C(12,5) + C(12,5)] / C(50,5) ≈ 1.85%.

4. Chip proportions and induction. P(win a tournament) is proportional to your number of chips.

Simplified scenario: suppose you either go up or down 1 chip each hand, with probability ½ each. Suppose there are n chips in play and you have k of them. P(win the tournament given k chips) = P(the random walk goes from k to n before hitting 0) = p_k.

Now, clearly p_0 = 0. Consider p_1. From 1, you will either go to 0 or 2, so p_1 = ½ p_0 + ½ p_2 = ½ p_2. That is, p_2 = 2 p_1. We have shown that p_j = j p_1 for j = 0, 1, and 2.

(Induction:) Suppose that p_j = j p_1 for j = 0, 1, 2, …, m. We will show that p_{m+1} = (m+1) p_1. Therefore p_j = j p_1 for all j; that is, P(win the tournament) is proportional to your number of chips. Indeed, p_m = ½ p_{m-1} + ½ p_{m+1}. If p_j = j p_1 for j ≤ m, then m p_1 = ½ (m-1) p_1 + ½ p_{m+1}, so p_{m+1} = 2m p_1 - (m-1) p_1 = (m+1) p_1.
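The two flush probabilities and the chips-proportional result can both be checked quickly. A small sketch (my own, assuming the "consider only flushes of your suit" convention above, with k = 3 and n = 10 chosen arbitrarily for the gambler's-ruin check):

```python
from math import comb
import random

# P(flush | suited hole cards): 3, 4, or 5 of the remaining 11 cards of your suit on the board
p_suited = (comb(11, 3) * comb(39, 2) + comb(11, 4) * 39 + comb(11, 5)) / comb(50, 5)

# P(flush | unsuited hole cards): for each of your two suits, 4 or 5 board cards of that suit
p_unsuited = 2 * (comb(12, 4) * 38 + comb(12, 5)) / comb(50, 5)
print(round(100 * p_suited, 2), round(100 * p_unsuited, 2))   # ~6.40 and ~1.85

# Gambler's-ruin check: with fair +/-1 steps, P(reach n before 0 | start at k) should be k/n
def p_win(k, n, trials=100_000):
    wins = 0
    for _ in range(trials):
        chips = k
        while 0 < chips < n:
            chips += random.choice((1, -1))
        wins += (chips == n)
    return wins / trials

print(p_win(3, 10), 3 / 10)   # the estimate should be close to 0.3
```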

Theorem 7.6.7, p. 152, describes another simplified scenario. Suppose that on each hand you play, you either double your chips or go to zero, each with probability ½. Again, P(win a tournament) is proportional to your number of chips.

Again p_0 = 0, and p_1 = ½ p_0 + ½ p_2 = ½ p_2, so again p_2 = 2 p_1. We have shown that p_j = j p_1 for j = 0, 1, and 2.

(Induction:) Suppose that p_j = j p_1 for j ≤ m. We will show that p_{2m} = 2m p_1. Therefore p_j = j p_1 for all j = 2^k; that is, P(win the tournament) is proportional to your number of chips. This time, p_m = ½ p_0 + ½ p_{2m}. If p_j = j p_1 for j ≤ m, then m p_1 = 0 + ½ p_{2m}, so p_{2m} = 2m p_1. Done.

Problem 7.14 refers to Theorem 7.6.8, p. 152. You have k of the n chips in play. Each hand, you gain 1 with probability p or lose 1 with probability q = 1 - p. Suppose 0 < p < 1 and p ≠ 0.5, and let r = q/p. Then P(you win the tournament) = (1 - r^k)/(1 - r^n). The proof is again by induction and is similar to the proof we did of Theorem 7.6.7. Notice that if k = 0, then (1 - r^k)/(1 - r^n) = 0, and if k = n, then (1 - r^k)/(1 - r^n) = 1.
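To see Theorem 7.6.8 in action, here is a minimal simulation sketch (mine, not from the text) that estimates P(reach n before 0) for a biased ±1 walk and compares it with (1 - r^k)/(1 - r^n); the values k = 4, n = 10, p = 0.55 are arbitrary.

```python
import random

def win_prob_exact(k, n, p):
    """Theorem 7.6.8: (1 - r^k) / (1 - r^n) with r = q/p."""
    r = (1 - p) / p
    return (1 - r**k) / (1 - r**n)

def win_prob_sim(k, n, p, trials=100_000):
    """Monte Carlo estimate of P(reach n chips before 0, starting from k)."""
    wins = 0
    for _ in range(trials):
        chips = k
        while 0 < chips < n:
            chips += 1 if random.random() < p else -1
        wins += (chips == n)
    return wins / trials

k, n, p = 4, 10, 0.55
print(win_prob_exact(k, n, p))   # ~0.64 for these values
print(win_prob_sim(k, n, p))     # should be close to the exact value
```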

Example (Chen and Ankenman, 2006). Suppose that a $100 winner-take-all tournament has 1024 = 2^10 players, so you need to double up 10 times to win, and the winner gets $102,400. Suppose you have probability p = 0.54 of doubling up each time, instead of 0.5. What is your expected profit in the tournament? (Assume only doubling up.)

P(winning) = 0.54^10 ≈ 0.0021, so your expected return ≈ 0.0021 × $102,400 ≈ $215.89, and your expected profit ≈ $215.89 - $100 = $115.89.

Optimal play, ch 6.3.
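The arithmetic in the Chen and Ankenman example is a one-liner; a tiny sketch (mine) for completeness:

```python
p, doublings, buy_in, prize = 0.54, 10, 100, 102_400

p_win = p ** doublings                  # ~0.0021
expected_return = p_win * prize         # ~$215.89
expected_profit = expected_return - buy_in
print(p_win, expected_return, expected_profit)   # ~0.0021, ~215.89, ~115.89
```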