Stat 35b: Introduction to Probability with Applications to Poker

Stat 35b: Introduction to Probability with Applications to Poker

Outline for the day:
1. Bayes' rule.
2. Bernoulli random variables.
3. Binomial random variables.
4. Farha vs. Antonius, expected value and variance.

Midterm is Feb 21, in class, 50 min. Open book plus one page of notes, double-sided. Bring a calculator!

1. Bayes' rule, p. 49-52.

Suppose that B1, B2, …, Bn are disjoint events and that exactly one of them must occur. Suppose you want P(B1 | A), but you only know P(A | B1), P(A | B2), etc., and you also know P(B1), P(B2), …, P(Bn).

Bayes' rule: If B1, …, Bn are disjoint events with P(B1 or … or Bn) = 1, then
P(Bi | A) = P(A | Bi) P(Bi) ÷ [ ∑ P(A | Bj) P(Bj) ].

Why? Recall P(X | Y) = P(X & Y) ÷ P(Y), so P(X & Y) = P(X | Y) P(Y). Then
P(B1 | A) = P(A & B1) ÷ P(A)
= P(A & B1) ÷ [ P(A & B1) + P(A & B2) + … + P(A & Bn) ]
= P(A | B1) P(B1) ÷ [ P(A | B1)P(B1) + P(A | B2)P(B2) + … + P(A | Bn)P(Bn) ].
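The rule is easy to check numerically. Below is a minimal R sketch (not from the slides; bayes_rule is a hypothetical helper name): given the priors P(B1), …, P(Bn) and the likelihoods P(A | B1), …, P(A | Bn), it returns all of the posteriors P(Bi | A).

  # Posteriors P(Bi | A) from priors P(Bi) and likelihoods P(A | Bi).
  bayes_rule <- function(prior, likelihood) {
    joint <- likelihood * prior    # P(A & Bi) = P(A | Bi) * P(Bi)
    joint / sum(joint)             # divide by P(A) = sum over j of P(A & Bj)
  }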

Bayes' rule, continued.

Bayes' rule: If B1, …, Bn are disjoint events with P(B1 or … or Bn) = 1, then
P(Bi | A) = P(A | Bi) P(Bi) ÷ [ ∑ P(A | Bj) P(Bj) ].

See example 3.4.1, p. 50. If a test is 95% accurate and 1% of the population has a condition, then given a random person from the population,
P(she has the condition | she tests positive) = P(cond | +)
= P(+ | cond) P(cond) ÷ [ P(+ | cond) P(cond) + P(+ | no cond) P(no cond) ]
= 95% × 1% ÷ [ 95% × 1% + 5% × 99% ] ≈ 16.1%.

Tests for rare conditions must be extremely accurate.
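The same arithmetic, checked in R with the hypothetical bayes_rule helper sketched above:

  bayes_rule(prior = c(0.01, 0.99), likelihood = c(0.95, 0.05))
  # [1] 0.1610169 0.8389831   -> P(cond | +) is about 16.1%.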

Bayes' rule example.

Suppose P(your opponent has the nuts) = 1%, and P(opponent has a weak hand) = 10%. Your opponent makes a huge bet. Suppose she'd only do that with the nuts or a weak hand, and that P(huge bet | nuts) = 100% and P(huge bet | weak hand) = 30%. What is P(nuts | huge bet)?

P(nuts | huge bet)
= P(huge bet | nuts) P(nuts) ÷ [ P(huge bet | nuts) P(nuts) + P(huge bet | weak hand) P(weak hand) ]
= 100% × 1% ÷ [ 100% × 1% + 30% × 10% ]
= 25%.
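Checked the same way (a sketch; any other holding drops out of the denominator because P(huge bet | anything else) = 0):

  bayes_rule(prior = c(0.01, 0.10), likelihood = c(1.00, 0.30))
  # [1] 0.25 0.75   -> P(nuts | huge bet) = 25%.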

2. Bernoulli random variables, ch. 5.1.

If X = 1 with probability p, and X = 0 otherwise, then X = Bernoulli(p).
Probability mass function (pmf): P(X = 1) = p, P(X = 0) = q, where p + q = 100%.
If X is Bernoulli(p), then µ = E(X) = p, and σ = √(pq).

For example, suppose X = 1 if you have a pocket pair next hand, and X = 0 if not. p = 5.88%, so q = 94.12%.

[Two ways to figure out p:
(a) Out of choose(52,2) combinations for your two cards, 13 * choose(4,2) are pairs. 13 * choose(4,2) / choose(52,2) = 5.88%.
(b) Imagine ordering your 2 cards. No matter what your 1st card is, there are 51 equally likely choices for your 2nd card, and 3 of them give you a pocket pair. 3/51 = 5.88%.]

µ = E(X) = 0.0588. SD = σ = √(0.0588 × 0.9412) = 0.235.
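The pocket-pair numbers are quick to check in R (a sketch of the same arithmetic):

  p <- 13 * choose(4, 2) / choose(52, 2)   # 0.0588..., method (a)
  3 / 51                                    # 0.0588..., method (b)
  sqrt(p * (1 - p))                         # SD of a Bernoulli(p), about 0.235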

3. Binomial random variables, ch. 5.2.

Suppose now X = # of times something with prob. p occurs, out of n independent trials. Then X = Binomial(n, p). E.g. the number of pocket pairs, out of 10 hands. Now X could = 0, 1, 2, 3, …, or n.

pmf: P(X = k) = choose(n, k) * p^k q^(n-k).

E.g. say n = 10, k = 3: P(X = 3) = choose(10,3) * p^3 q^7. Why? Could have 1 1 1 0 0 0 0 0 0 0, or 1 0 1 1 0 0 0 0 0 0, etc.: choose(10, 3) choices of places to put the 1's, and for each the prob. is p^3 q^7.

Key idea: X = Y1 + Y2 + … + Yn, where the Yi are independent and Bernoulli(p).
If X is Bernoulli(p), then µ = p, and σ = √(pq).
If X is Binomial(n, p), then µ = np, and σ = √(npq).
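In R the binomial pmf is built in as dbinom(); a quick sketch, using p = 0.0588 for pocket pairs:

  p <- 0.0588
  choose(10, 3) * p^3 * (1 - p)^7           # P(X = 3) written out from the pmf
  dbinom(3, size = 10, prob = p)            # same thing, built in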

Binomial random variables, continued.

Suppose X = the number of pocket pairs you get in the next 100 hands. What's P(X = 4)? What's E(X)? σ?

X = Binomial(100, 5.88%).
P(X = k) = choose(n, k) * p^k q^(n-k).
So, P(X = 4) = choose(100, 4) * 0.0588^4 * 0.9412^96 = 13.9%, or 1 in 7.2.
E(X) = np = 100 * 0.0588 = 5.88.
σ = √(100 * 0.0588 * 0.9412) = 2.35.

So, out of 100 hands, you'd typically get about 5.88 pocket pairs, +/- around 2.35.
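The same numbers, checked in R:

  dbinom(4, size = 100, prob = 0.0588)      # about 0.139, i.e. 13.9%
  100 * 0.0588                               # E(X) = 5.88
  sqrt(100 * 0.0588 * 0.9412)                # SD, about 2.35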

4. Farha vs. Antonius, expected value and variance.

E(X+Y) = E(X) + E(Y), whether X & Y are independent or not!
Similarly, E(X + Y + Z + …) = E(X) + E(Y) + E(Z) + …
And, if X & Y are independent, then V(X+Y) = V(X) + V(Y), so SD(X+Y) = √[SD(X)^2 + SD(Y)^2].
Also, if Y = 9X, then E(Y) = 9E(X), SD(Y) = 9SD(X), and V(Y) = 81V(X).

Farha vs. Antonius: running it 4 times. Let X = chips you have after the hand, and let p be the prob. you win. X = X1 + X2 + X3 + X4, where X1 = chips won from the first "run", etc.

E(X) = E(X1) + E(X2) + E(X3) + E(X4) = 1/4 pot (p) + 1/4 pot (p) + 1/4 pot (p) + 1/4 pot (p) = pot (p), the same as E(Y), where Y = chips you have after the hand if you ran it once!

But the SD is smaller: clearly X1 = Y/4, so SD(X1) = SD(Y)/4, and V(X1) = V(Y)/16. V(X) ~ V(X1) + V(X2) + V(X3) + V(X4) = 4 V(X1) = 4 V(Y)/16 = V(Y)/4. So SD(X) = SD(Y)/2.
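A small simulation in R makes the comparison concrete. This is a sketch, not from the slides: the pot size and win probability are made-up values, and each quarter-pot run is treated as an independent win/lose trial with the same p (in a real hand the runs are only approximately independent, which is why the slide writes V(X) ~ rather than =).

  pot <- 100000; p <- 0.6; n <- 100000
  run_once <- pot * rbinom(n, 1, p)              # whole pot decided by a single run
  run_four <- (pot / 4) * rbinom(n, 4, p)        # a quarter of the pot on each of 4 runs
  c(mean(run_once), mean(run_four))               # both means are close to pot * p
  c(sd(run_once), sd(run_four))                   # the 4-run SD is about half the 1-run SD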