Tutorial 5: Discrete Probability I. Reference: http://www.cs.duke.edu/courses/fall07/cps102/lecture11.ppt. Steve Gu, Feb 15, 2008.


Language of Probability The formal language of probability is a very important tool for describing and analyzing probability distributions.

Probability Space A probability space has 3 elements:
– Sample Space Ω: all the possible individual outcomes of an experiment
– Event Space ₣: the set of all possible subsets of elements taken from Ω
– Probability Measure P: a mapping from the event space to the real numbers such that
  – P(E) ≥ 0 for any E in ₣
  – P(Ω) = 1
  – P(E ∪ F) = P(E) + P(F) for any disjoint E and F in ₣
We can write a probability space as (Ω, ₣, P).
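As a minimal sketch (not from the slides), a finite probability space can be modeled in Python with Ω as a set, point probabilities in a dictionary, and P obtained by summing over an event. The spinner example and the names `p` and `prob` are hypothetical, chosen only to illustrate the three axioms above.

```python
from fractions import Fraction

# A toy finite probability space: a biased 3-outcome spinner (hypothetical example).
omega = {"a", "b", "c"}                                               # sample space Ω
p = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 4)}   # point probabilities

def prob(event):
    """Probability measure P: sum the point probabilities of the outcomes in the event."""
    return sum(p[x] for x in event)

# The axioms from the slide hold for this space:
assert prob(omega) == 1                      # P(Ω) = 1
E, F = {"a"}, {"b", "c"}                     # two disjoint events
assert prob(E | F) == prob(E) + prob(F)      # additivity for disjoint E, F
print(prob({"a", "b"}))                      # 3/4
```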

Sample Space Ω (figure): the slide pictures the sample space Ω as a cloud of individual outcomes, with one highlighted outcome x of probability p(x) = 0.2.

Events Any set E ⊆ Ω is called an event. Pr_D[E] = Σ_{x ∈ E} p(x). (In the example pictured, Pr_D[E] = 0.4.)

Uniform Distribution If each element has equal probability, the distribution is said to be uniform. Then Pr_D[E] = Σ_{x ∈ E} p(x) = |E| / |Ω|.
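A quick sketch of the uniform rule Pr[E] = |E|/|Ω| using exact fractions; the die example and the helper name `uniform_prob` are my own, not from the slides.

```python
from fractions import Fraction

def uniform_prob(event, omega):
    """Under the uniform distribution, Pr[E] = |E| / |Ω|."""
    return Fraction(len(event & omega), len(omega))

omega = set(range(1, 7))           # a fair die: Ω = {1, ..., 6}
evens = {2, 4, 6}
print(uniform_prob(evens, omega))  # 1/2
```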

A fair coin is tossed 100 times in a row. What is the probability that we get exactly half heads?

Using the Language The sample space Ω is the set of all outcomes {H,T}^100. Each sequence in Ω is equally likely, and hence has probability 1/|Ω| = 1/2^100.

Visually Ω = all sequences of 100 tosses; x = HHTTT……TH; p(x) = 1/|Ω| = 1/2^100.

Ω = the set of all sequences {H,T}^100. Event E = the set of sequences with 50 H's and 50 T's. Probability of event E = proportion of E in Ω = (100 choose 50) / 2^100.
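To put a number on this (the numeric value is not shown on the slide), |E| = C(100, 50), so Pr[E] = C(100, 50)/2^100, which can be evaluated directly:

```python
from math import comb

heads_exactly_50 = comb(100, 50)   # |E| = number of length-100 sequences with 50 H's and 50 T's
total = 2 ** 100                   # |Ω| = number of length-100 H/T sequences
print(heads_exactly_50 / total)    # ≈ 0.0796 (surprisingly small)
```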

Suppose we roll a white die and a black die. What is the probability that the sum is 7 or 11?

Same Methodology! Ω = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2), (3,3), (3,4), (3,5), (3,6), (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }. Pr[E] = |E|/|Ω| = proportion of E in Ω = 8/36.
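The same count can be checked by brute-force enumeration of the 36 equally likely outcomes (a sketch of my own, not from the slides):

```python
from fractions import Fraction

omega = [(w, b) for w in range(1, 7) for b in range(1, 7)]   # all 36 (white, black) outcomes
E = [x for x in omega if sum(x) in (7, 11)]                  # sum is 7 or 11
print(len(E), Fraction(len(E), len(omega)))                  # 8 2/9  (i.e. 8/36)
```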

23 people are in a room. Suppose that all possible birthdays are equally likely. What is the probability that two people will have the same birthday?

And The Same Methods Again! Sample space Ω = {1, 2, 3, …, 366}^23. A sample point: x = (17, 42, 363, 1, …, 224, 177), a list of 23 numbers. Event E = { x ∈ Ω | two numbers in x are the same }. What is |E|? Count the complement instead!

The complement E^c = all sequences in Ω that have no repeated numbers. |E^c| = (366)(365)…(344), so |E^c|/|Ω| = (366)(365)…(344) / 366^23 = 0.494… and Pr[E] = |E|/|Ω| = 0.506…
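A short check of the birthday computation (assuming 366 equally likely birthdays, as the slide does):

```python
p_no_repeat = 1.0
for k in range(23):
    p_no_repeat *= (366 - k) / 366   # builds (366)(365)...(344) / 366^23

print(p_no_repeat)                   # ≈ 0.494 (all 23 birthdays differ)
print(1 - p_no_repeat)               # ≈ 0.506 (at least one shared birthday)
```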

More Language Of Probability The probability of event A given event B is written Pr[A | B] and is defined to be Pr[A ∩ B] / Pr[B], i.e., the proportion of A ∩ B within B.

Suppose we roll a white die and a black die. What is the probability that the white die is 1, given that the total is 7? Event A = {white die = 1}, event B = {total = 7}.

Ω = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,2), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2), (3,3), (3,4), (3,5), (3,6), (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }. Event A = {white die = 1}, event B = {total = 7}. Pr[B] = |B|/|Ω| = 6/36 = 1/6 and Pr[A ∩ B] = |A ∩ B|/|Ω| = 1/36, so Pr[A | B] = Pr[A ∩ B] / Pr[B] = (1/36) / (1/6) = 1/6.
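The conditional probability can also be verified by counting; this is a sketch of mine, with illustrative names:

```python
from fractions import Fraction

omega = [(w, b) for w in range(1, 7) for b in range(1, 7)]
A = {x for x in omega if x[0] == 1}            # white die = 1
B = {x for x in omega if sum(x) == 7}          # total = 7

pr_A_given_B = Fraction(len(A & B), len(B))    # Pr[A | B] = |A ∩ B| / |B| under uniform Ω
print(pr_A_given_B)                            # 1/6
```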

Independence! A and B are independent events if Pr[A | B] = Pr[A] ⇔ Pr[A ∩ B] = Pr[A] Pr[B] ⇔ Pr[B | A] = Pr[B].
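For instance (an illustration not on the slide), in the two-dice space above the events A = {white die = 1} and B = {total = 7} happen to be independent, which the product rule confirms:

```python
from fractions import Fraction

omega = [(w, b) for w in range(1, 7) for b in range(1, 7)]
A = {x for x in omega if x[0] == 1}             # white die = 1
B = {x for x in omega if sum(x) == 7}           # total = 7

pr = lambda E: Fraction(len(E), len(omega))     # uniform probability on Ω
print(pr(A & B) == pr(A) * pr(B))               # True: Pr[A ∩ B] = Pr[A] Pr[B]
```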

Independence! A_1, A_2, …, A_k are independent events if knowing whether some of them occurred does not change the probability that any of the others occur. E.g., {A_1, A_2, A_3} are independent events if:
Pr[A_1 | A_2 ∩ A_3] = Pr[A_1], Pr[A_2 | A_1 ∩ A_3] = Pr[A_2], Pr[A_3 | A_1 ∩ A_2] = Pr[A_3],
Pr[A_1 | A_2] = Pr[A_1], Pr[A_1 | A_3] = Pr[A_1],
Pr[A_2 | A_1] = Pr[A_2], Pr[A_2 | A_3] = Pr[A_2],
Pr[A_3 | A_1] = Pr[A_3], Pr[A_3 | A_2] = Pr[A_3].

Silver and Gold One bag has two silver coins, another has two gold coins, and the third has one of each. One bag is selected at random. One coin from it is selected at random. It turns out to be gold. What is the probability that the other coin is gold?

Let G_1 be the event that the first coin is gold. Pr[G_1] = 1/2. Let G_2 be the event that the second coin is gold. Pr[G_2 | G_1] = Pr[G_1 and G_2] / Pr[G_1] = (1/3) / (1/2) = 2/3. Note: G_1 and G_2 are not independent.
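A quick Monte Carlo sketch of the three-bag experiment (the simulation and its names are mine, not the slide's); the estimate comes out near 2/3:

```python
import random

BAGS = [("silver", "silver"), ("gold", "gold"), ("silver", "gold")]

def trial():
    """Pick a bag at random, then draw its two coins in random order."""
    bag = list(random.choice(BAGS))
    random.shuffle(bag)
    first, other = bag
    return first, other

results = [trial() for _ in range(100_000)]
# Condition on the first coin being gold, then look at the other coin.
gold_first = [other for first, other in results if first == "gold"]
print(sum(o == "gold" for o in gold_first) / len(gold_first))   # ≈ 2/3
```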

The Monty Hall Problem (presented over three slides of figures; only the title survives in the transcript).
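Since the Monty Hall slides are figures, here is a small simulation of my own showing that switching wins about 2/3 of the time while staying wins about 1/3:

```python
import random

def monty_hall(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)      # door hiding the car
        choice = random.randrange(3)     # contestant's first pick
        # Host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != choice and d != prize)
        if switch:
            # Switch to the remaining unopened door.
            choice = next(d for d in range(3) if d != choice and d != opened)
        wins += (choice == prize)
    return wins / trials

print("stay:  ", monty_hall(switch=False))   # ≈ 1/3
print("switch:", monty_hall(switch=True))    # ≈ 2/3
```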

Still in doubt? Try playing the game online: …nty_hall_game.asp (the full link was lost in the transcript).

Thank you! Q&A