
Stat 35b: Introduction to Probability with Applications to Poker

Outline for the day:
1. Savage/Tyler, Kaplan/Gazes
2. P(flop a full house)
3. Bernoulli random variables
4. Binomial
5. Geometric
6. Negative binomial
7. Harman/Negreanu, P(flop two pairs)

2. P(flop a full house)? [Use the trick: consider all sets of 5 cards, i.e. your 2 hole cards and the flop together.]

P(flop full house) = # of different full houses / choose(52,5).

How many different full houses are possible? For instance, (5♦ 5♣ 5♥ 8♥ 8♠) or (8♥ 8♠ 8♦ 5♥ 5♠). There are 13 * choose(4,3) different choices for the triple, and for each such choice there are 12 * choose(4,2) choices left for the pair.

So, P(flop full house) = 13 * choose(4,3) * 12 * choose(4,2) / choose(52,5) ≈ 0.144%, or about 1 in 694.
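As a sanity check on the counting above, here is a minimal Python sketch (an illustration added to these notes, not part of the original slides) that evaluates the same formula with math.comb:

from math import comb

# Count full houses among all 5-card hands:
# 13 * C(4,3) choices for the triple, then 12 * C(4,2) for the pair.
full_houses = 13 * comb(4, 3) * 12 * comb(4, 2)
p_full_house = full_houses / comb(52, 5)
print(p_full_house)       # ~0.00144, i.e. about 0.144%
print(1 / p_full_house)   # ~694, i.e. about 1 in 694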

3. Bernoulli Random Variables.

If X = 1 with probability p, and X = 0 otherwise, then X = Bernoulli(p).

Probability mass function (pmf): P(X = 1) = p, P(X = 0) = q, where p + q = 100%.

If X is Bernoulli(p), then µ = E(X) = p, and σ = √(pq).

For example, suppose X = 1 if you have a pocket pair next hand, and X = 0 if not. Here p = 5.88%, so q = 94.12%. [Two ways to figure out p: (a) Out of choose(52,2) combinations for your two cards, 13 * choose(4,2) are pairs, and 13 * choose(4,2) / choose(52,2) = 5.88%. (b) Imagine ordering your 2 cards. No matter what your 1st card is, there are 51 equally likely choices for your 2nd card, and 3 of them give you a pocket pair: 3/51 = 5.88%.]

µ = E(X) = 0.0588. SD = σ = √(0.0588 * 0.9412) = 0.235.
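A short Python sketch (added for illustration; not from the slides) that reproduces p both ways, along with the Bernoulli mean and SD:

from math import comb, sqrt

p_counting = 13 * comb(4, 2) / comb(52, 2)  # pairs among C(52,2) starting hands
p_ordering = 3 / 51                         # condition on the first card
print(p_counting, p_ordering)               # both ~0.0588

p = p_counting
q = 1 - p
print(p)              # mean of Bernoulli(p) is p
print(sqrt(p * q))    # SD is sqrt(pq), ~0.235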

4. Binomial Random Variables.

Suppose now X = # of times something with probability p occurs, out of n independent trials. Then X = Binomial(n, p), e.g. the number of pocket pairs out of 10 hands.

Now X could be 0, 1, 2, 3, …, or n.

pmf: P(X = k) = choose(n, k) * p^k q^(n-k).

E.g. say n = 10, k = 3: P(X = 3) = choose(10,3) * p^3 q^7. Why? You could have 1110000000, or 1101000000, etc.: there are choose(10,3) choices of places to put the 1's, and for each the probability is p^3 q^7.

Key idea: X = Y_1 + Y_2 + … + Y_n, where the Y_i are independent and Bernoulli(p). If X is Bernoulli(p), then µ = p and σ = √(pq). If X is Binomial(n, p), then µ = np and σ = √(npq).

E.g. suppose X = the number of pocket pairs you get in the next 100 hands. What's P(X = 4)? What's E(X)? σ? X = Binomial(100, 5.88%). P(X = k) = choose(n, k) * p^k q^(n-k), so P(X = 4) = choose(100, 4) * 0.0588^4 * 0.9412^96 = 13.9%, or about 1 in 7.2. E(X) = np = 100 * 0.0588 = 5.88. σ = √(100 * 0.0588 * 0.9412) = 2.35. So, out of 100 hands, you'd typically get about 5.88 pocket pairs, +/- around 2.35.
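The same numbers in a short Python sketch (illustrative only; the helper name binom_pmf is mine, not from the slides):

from math import comb, sqrt

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p, n = 0.0588, 100
q = 1 - p
print(binom_pmf(4, n, p))   # ~0.139, i.e. about 13.9%
print(n * p)                # mean np = 5.88
print(sqrt(n * p * q))      # SD sqrt(npq) ~ 2.35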

5. Geometric Random Variables.

Suppose now X = # of trials until the first occurrence. (Again, each trial is independent, and each time the probability of an occurrence is p.) Then X = Geometric(p), e.g. the number of hands until you get your next pocket pair. [This includes the hand where you get the pocket pair: if you get it right away, then X = 1.]

Now X could be 1, 2, 3, …, up to ∞.

pmf: P(X = k) = p q^(k-1).

E.g. say k = 5: P(X = 5) = p q^4. Why? The sequence must be 0 0 0 0 1, and its probability is q * q * q * q * p.

If X is Geometric(p), then µ = 1/p, and σ = (√q) ÷ p.

E.g. suppose X = the number of hands until your next pocket pair. P(X = 12)? E(X)? σ? X = Geometric(5.88%). P(X = 12) = p q^11 = 0.0588 * 0.9412^11 = 3.02%. E(X) = 1/p = 17.0. σ = √(0.9412) / 0.0588 = 16.5. So, you'd typically expect it to take about 17 hands until your next pair, +/- around 16.5 hands.
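A matching Python sketch (again an added illustration; geom_pmf is just a name chosen here):

from math import sqrt

def geom_pmf(k, p):
    """P(X = k) for X ~ Geometric(p): k-1 failures, then a success."""
    return (1 - p)**(k - 1) * p

p = 0.0588
q = 1 - p
print(geom_pmf(12, p))   # ~0.0302, i.e. about 3.02%
print(1 / p)             # mean 1/p ~ 17.0
print(sqrt(q) / p)       # SD sqrt(q)/p ~ 16.5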

6. Negative Binomial Random Variables.

Suppose now X = # of trials until the rth occurrence. (Again, each trial is independent, and each time the probability of an occurrence is p.) Then X = Negative Binomial(r, p), e.g. the number of hands you have to play until you've gotten r = 3 pocket pairs.

X could be 3, 4, 5, …, up to ∞.

pmf: P(X = k) = choose(k-1, r-1) p^r q^(k-r).

E.g. say r = 3 and k = 7: P(X = 7) = choose(6,2) p^3 q^4. Why? Out of the first 6 hands, there must be exactly r-1 = 2 pairs, and then a pair on the 7th. P(exactly 2 pairs on first 6 hands) = choose(6,2) p^2 q^4, and P(pair on 7th) = p.

If X is Negative Binomial(r, p), then µ = r/p, and σ = √(rq) ÷ p.

E.g. suppose X = the number of hands until your 12th pocket pair. P(X = 100)? E(X)? σ? X = Negative Binomial(12, 5.88%). P(X = 100) = choose(99,11) p^12 q^88 = choose(99,11) * 0.0588^12 * 0.9412^88 = 0.104%. E(X) = r/p = 12/0.0588 = 204.1. σ = √(12 * 0.9412) / 0.0588 = 57.2. So, you'd typically expect it to take about 204 hands until your 12th pair, +/- around 57.2 hands.
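One more Python sketch along the same lines (illustrative; nbinom_pmf is an assumed helper name, not from the slides):

from math import comb, sqrt

def nbinom_pmf(k, r, p):
    """P(X = k) for X ~ Negative Binomial(r, p): rth success occurs on trial k."""
    return comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)

p, r = 0.0588, 12
q = 1 - p
print(nbinom_pmf(100, r, p))   # ~0.00104, i.e. about 0.104%
print(r / p)                   # mean r/p ~ 204.1
print(sqrt(r * q) / p)         # SD sqrt(rq)/p ~ 57.2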

7. Harman/Negreanu, P(flop two pairs)?

Tough one. Don't double-count (4♣ 4♦ 9♥ 9♠ Q♦) and (9♥ 9♠ 4♣ 4♦ Q♦)!

There are choose(13,2) possibilities for the NUMBERS of the two pairs. For each such choice (such as 4 & 9), there are choose(4,2) choices for the suits of the lower pair, and the same for the suits of the higher pair. So there are choose(13,2) * choose(4,2) * choose(4,2) different possibilities for the two pairs.

For each such choice, there are 44 [= 48 - 4] different possibilities for your fifth card, so that it's not a full house but simply two pairs.

So, P(flop two pairs) = choose(13,2) * choose(4,2) * choose(4,2) * 44 / choose(52,5) ≈ 4.75%, or about 1 in 21.
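And the corresponding Python check (an added sketch, not part of the slides):

from math import comb

# Two distinct ranks, choose(4,2) suits for each pair, then a 5th card from the
# 48 remaining cards excluding the 4 that would turn the hand into a full house.
two_pairs = comb(13, 2) * comb(4, 2) * comb(4, 2) * 44
p_two_pairs = two_pairs / comb(52, 5)
print(p_two_pairs)       # ~0.0475, i.e. about 4.75%
print(1 / p_two_pairs)   # ~21, i.e. about 1 in 21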