Stat 35b: Introduction to Probability with Applications to Poker

Outline for the day:
1. Midterm.
2. Review of Bernoulli and binomial random variables.
3. Geometric random variables.
4. Negative binomial.
5. E(X+Y).
6. Harman, Negreanu, and running it twice.

1. Midterm.
The midterm is Thu Feb 21, in class, on chapters 1-5. I will be lecturing on some material from chapter 6 before then, but the midterm will cover only chapters 1-5. It will be mostly multiple choice, plus a few short-answer questions. You may use the book, plus just one page (double-sided) of 8.5 x 11 paper with notes on it. Keep that page of notes after the test.

2. Bernoulli and binomial random variables.
If X = 1 with probability p, and X = 0 otherwise, then X is Bernoulli(p). If X is Bernoulli(p), then µ = E(X) = p, and σ = √(pq), where q = 1 - p.
For example, suppose X = 1 if you have a pocket pair next hand, and X = 0 if not. Then p = 5.88%, so q = 94.12%.
Suppose now X = # of times something with probability p occurs, out of n independent trials. Then X is Binomial(n, p), e.g. the number of pocket pairs out of 10 hands.
pmf: P(X = k) = choose(n, k) p^k q^(n-k).
e.g. say n = 10, k = 3: P(X = 3) = choose(10, 3) p^3 q^7. Why? The three occurrences could come as 1110000000, or 0100101000, etc.: there are choose(10, 3) choices of places to put the 1's, and for each such arrangement the probability is p^3 q^7.
Key idea: X = Y1 + Y2 + … + Yn, where the Yi are independent and Bernoulli(p).
If X is Binomial(n, p), then µ = np, and σ = √(npq).
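The slide's choose(n, k) is R-style notation; as a quick sketch in Python (the function name binomial_pmf and the use of math.comb are my choices, not the course's), the pmf and the mean/SD formulas look like this:

```python
from math import comb, sqrt

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): choose(n, k) * p^k * q^(n-k)."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

p = 0.0588   # P(pocket pair on any given hand), from the slide
n = 10       # hands
print(binomial_pmf(3, n, p))    # P(exactly 3 pocket pairs in 10 hands)
print(n * p)                    # mean: np = 0.588
print(sqrt(n * p * (1 - p)))    # SD: sqrt(npq), about 0.744
```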

Binomial random variables, continued.
Suppose X = the number of pocket pairs you get in the next 100 hands. What's P(X = 4)? What's E(X)? σ?
X is Binomial(100, 5.88%). P(X = k) = choose(n, k) p^k q^(n-k), so
P(X = 4) = choose(100, 4) × 0.0588^4 × 0.9412^96 = 13.9%, or about 1 in 7.2.
E(X) = np = 100 × 0.0588 = 5.88.
σ = √(100 × 0.0588 × 0.9412) = 2.35.
So, out of 100 hands, you'd typically get about 5.88 pocket pairs, +/- around 2.35.
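The slide's numbers can be checked directly; a minimal sketch (variable names are mine):

```python
from math import comb, sqrt

p, q, n = 0.0588, 0.9412, 100
prob_4 = comb(n, 4) * p**4 * q**96   # P(X = 4)
mean = n * p                          # np
sd = sqrt(n * p * q)                  # sqrt(npq)
print(prob_4, 1 / prob_4)             # about 0.139, i.e. roughly 1 in 7.2
print(mean, sd)                       # 5.88 and about 2.35
```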

3. Geometric random variables, ch 5.3.
Suppose now X = # of trials until the first occurrence. (Again, each trial is independent, and each time the probability of an occurrence is p.) Then X is Geometric(p), e.g. the number of hands until you get your next pocket pair. [This includes the hand where you get the pocket pair: if you get it right away, then X = 1.] Now X could be 1, 2, 3, …, up to ∞.
pmf: P(X = k) = p q^(k-1).
e.g. say k = 5: P(X = 5) = p q^4. Why? The sequence must be four non-occurrences followed by an occurrence, so the probability is q × q × q × q × p.
If X is Geometric(p), then µ = 1/p, and σ = (√q) ÷ p.
e.g. Suppose X = the number of hands until your next pocket pair. P(X = 12)? E(X)? σ?
X is Geometric(5.88%).
P(X = 12) = p q^11 = 0.0588 × 0.9412^11 = 3.02%.
E(X) = 1/p = 1/0.0588 = 17.0.
σ = √(0.9412) ÷ 0.0588 = 16.5.
So, you'd typically expect it to take about 17 hands until your next pair, +/- around 16.5 hands.
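The same calculation in Python (a sketch; the function name geometric_pmf is mine):

```python
from math import sqrt

def geometric_pmf(k, p):
    """P(X = k) for X ~ Geometric(p): k - 1 failures, then the first success."""
    return p * (1 - p)**(k - 1)

p = 0.0588                     # P(pocket pair on any given hand), from the slide
print(geometric_pmf(12, p))    # about 0.0302
print(1 / p)                   # mean: about 17.0
print(sqrt(1 - p) / p)         # SD: about 16.5
```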

4. Negative binomial random variables, ch 5.4.
Recall: if each trial is independent, and each time the probability of an occurrence is p, and X = # of trials until the first occurrence, then X is Geometric(p): P(X = k) = p q^(k-1), µ = 1/p, σ = (√q) ÷ p.
Suppose now X = # of trials until the rth occurrence. Then X is negative binomial(r, p), e.g. the number of hands you have to play til you've gotten r = 3 pocket pairs. Now X could be 3, 4, 5, …, up to ∞.
pmf: P(X = k) = choose(k-1, r-1) p^r q^(k-r), for k = r, r+1, ….
e.g. say r = 3 and k = 7: P(X = 7) = choose(6, 2) p^3 q^4. Why? Out of the first 6 hands, there must be exactly r-1 = 2 pairs, and then a pair on the 7th. P(exactly 2 pairs on the first 6 hands) = choose(6, 2) p^2 q^4, and P(pair on the 7th) = p.
If X is negative binomial(r, p), then µ = r/p, and σ = (√(rq)) ÷ p.
e.g. Suppose X = the number of hands til your 12th pocket pair. P(X = 100)? E(X)? σ?
X is negative binomial(12, 5.88%).
P(X = 100) = choose(99, 11) p^12 q^88 = choose(99, 11) × 0.0588^12 × 0.9412^88 = 0.104%.
E(X) = r/p = 12/0.0588 = 204.1.
σ = √(12 × 0.9412) ÷ 0.0588 = 57.2.
So, you'd typically expect it to take about 204 hands til your 12th pair, +/- around 57.2 hands.
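The pocket-pair example above, in Python (a sketch; the function name neg_binomial_pmf is mine):

```python
from math import comb, sqrt

def neg_binomial_pmf(k, r, p):
    """P(X = k) for the r-th success arriving on trial k (k >= r)."""
    return comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)

p, r = 0.0588, 12                     # P(pocket pair) and target count, per the slide
print(neg_binomial_pmf(100, r, p))    # about 0.00104, i.e. 0.104%
print(r / p)                          # mean: about 204.1
print(sqrt(r * (1 - p)) / p)          # SD: about 57.2
```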

5) E(X+Y) = E(X) + E(Y), whether X and Y are independent or not!
Similarly, E(X + Y + Z + …) = E(X) + E(Y) + E(Z) + ….
And, if X and Y are independent, then V(X+Y) = V(X) + V(Y), so SD(X+Y) = √[SD(X)^2 + SD(Y)^2].
Example 1: Play 10 hands. X = your total number of pocket aces. What is E(X)?
X is Binomial(n, p) with n = 10 and p = choose(4, 2)/choose(52, 2) = 6/1326 ≈ 0.00452, so E(X) = np ≈ 0.0452.
Alternatively, X = (# of pocket aces on hand 1) + (# of pocket aces on hand 2) + …, so E(X) = (expected # of AA on hand 1) + (expected # of AA on hand 2) + …. Each term on the right = 1 × 0.00452 + 0 × 0.99548 = 0.00452, so E(X) = 0.00452 + … + 0.00452 ≈ 0.0452.
Example 2: Play 1 hand, against 9 opponents. X = total # of pocket-aces hands at the table. Now what is E(X)?
Note: not independent! If you have AA, then it's unlikely anyone else does too. Nevertheless, let X1 = 1 if player #1 has AA and 0 otherwise, X2 = 1 if player #2 has AA and 0 otherwise, etc. Then X = X1 + X2 + … + X10, so E(X) = E(X1) + E(X2) + … + E(X10) = 0.00452 + … + 0.00452 ≈ 0.0452. (Harman / Negreanu)
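Example 2 can be checked by simulation: deal two cards to each of 10 players from one shuffled deck, so the indicators really are dependent, and verify that the average count of pocket-aces hands still matches 10 × P(AA). This is a sketch I wrote to illustrate the point (the card encoding, with rank 12 standing for ace, is an arbitrary choice of mine):

```python
import random
from math import comb

p_aa = comb(4, 2) / comb(52, 2)   # P(pocket aces) = 6/1326, about 0.00452
print(10 * p_aa)                  # exact E(X) in BOTH examples, about 0.0452

# Monte Carlo check of Example 2: dependent indicators, same expectation.
random.seed(1)
ACE = 12                          # encode ranks 0..12; 12 = ace (my convention)
deck = [(rank, suit) for rank in range(13) for suit in range(4)]
trials = 50_000
total = 0
for _ in range(trials):
    random.shuffle(deck)
    for i in range(10):           # player i gets cards 2i and 2i+1
        a, b = deck[2 * i], deck[2 * i + 1]
        if a[0] == ACE and b[0] == ACE:
            total += 1
print(total / trials)             # close to 0.0452, matching 10 * p_aa
```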

6) Harman / Negreanu, and running it twice.
Harman has 10-7. Negreanu has K-Q. The flop is 10-7-K. Harman's all-in. $156,100 pot. P(Negreanu wins) = 28.69%. P(Harman wins) = 71.31%.
Let X = the amount Harman has after the hand.
If they run it once, E(X) = $156,100 × 71.31% + $0 × 28.69% = $111,314.90.
If they run it twice, what is E(X)? There's some probability p1 that Harman wins both times, in which case X = $156,100; some probability p2 that they each win one, in which case X = $78,050; and some probability p3 that Negreanu wins both, in which case X = $0. So
E(X) = $156,100 × p1 + $78,050 × p2 + $0 × p3.
If the two runs were independent, then p1 = P(Harman wins 1st run and 2nd run) would equal P(Harman wins 1st run) × P(Harman wins 2nd run) = 71.31% × 71.31% ≈ 50.85%. But they're not quite independent! It is very hard to compute p1 and p2.
However, you don't need p1 and p2! X = (amount Harman gets from the 1st run) + (amount she gets from the 2nd run), so
E(X) = E(amount from the 1st run) + E(amount from the 2nd run)
= $78,050 × P(Harman wins 1st run) + $0 × P(Harman loses 1st run) + $78,050 × P(Harman wins 2nd run) + $0 × P(Harman loses 2nd run)
= $78,050 × 71.31% + $0 × 28.69% + $78,050 × 71.31% + $0 × 28.69%
= $111,314.90.
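The "you don't need p1 and p2" point is just linearity of expectation, and is easy to verify numerically (a sketch; variable names are mine):

```python
pot = 156_100
p_win = 0.7131                      # P(Harman wins any single run), per the slide

# Run it once: all-or-nothing.
ev_once = pot * p_win + 0 * (1 - p_win)
print(ev_once)                      # 111314.91

# Run it twice: half the pot rides on each run. By linearity of
# expectation (no independence needed), the EV is unchanged.
half = pot / 2
ev_twice = half * p_win + half * p_win
print(ev_twice)                     # 111314.91
```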

HAND RECAP: Harman 10-7. Negreanu K-Q. Flop 10-7-K. Harman's all-in. $156,100 pot. P(Negreanu wins) = 28.69%. P(Harman wins) = 71.31%.
The standard deviation changes a lot! Say they run it once (see p. 127).
V(X) = E(X^2) - µ^2. µ = $111,314.90, so µ^2 ≈ $12.4 billion.
E(X^2) = ($156,100^2)(71.31%) + (0^2)(28.69%) ≈ $17.4 billion.
So V(X) ≈ $17.4 billion - $12.4 billion ≈ $5.0 billion, and the standard deviation σ = √($5.0 billion) ≈ $71,000.
So if they run it once, Harman expects to get back about $111,314.90, +/- around $71,000.
If they run it twice? Hard to compute exactly, but approximately, if each run were independent, then V(X1 + X2) = V(X1) + V(X2). So if X1 = the amount she gets from the 1st run, and X2 = the amount she gets from the 2nd run, then V(X1 + X2) ≈ V(X1) + V(X2) ≈ $1.25 billion + $1.25 billion = $2.5 billion, and the standard deviation σ = √($2.5 billion) ≈ $50,000.
So if they run it twice, Harman expects to get back about $111,314.90, +/- around $50,000.
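The variance comparison can likewise be computed directly; a sketch under the slide's own assumption that the two runs are approximately independent (variable names are mine):

```python
from math import sqrt

pot, p = 156_100, 0.7131
q = 1 - p

# Run it once: X = pot with probability p, else 0.  V(X) = E(X^2) - mu^2.
mu = pot * p
var_once = pot**2 * p - mu**2
print(sqrt(var_once))               # about 70,600

# Run it twice, treating the runs as approximately independent:
# each run is for half the pot, and variances of independent runs add.
half = pot / 2
var_half = half**2 * p - (half * p)**2
print(var_half)                     # about 1.25 billion per run
print(sqrt(2 * var_half))           # about 50,000
```

Note that the expected value is the same either way; running it twice only shrinks the spread around it.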