Great Theoretical Ideas In Computer Science

Probability Theory I
Victor Adamchik and Danny Sleator
CS 15-251: Great Theoretical Ideas in Computer Science
Lecture 11, Feb. 16, 2010, Carnegie Mellon University

We will consider chance experiments with a finite number of possible outcomes ω1, ω2, …, ωn. The sample space Ω of the experiment is the set of all possible outcomes. The roll of a die: Ω = {1,2,3,4,5,6}. Each subset of a sample space is defined to be an event. The event: E = {2,4,6}.

Probability of an Event. Let X be a random variable which denotes the value of the outcome of a certain experiment. We will assign probabilities to the possible outcomes of an experiment. We do this by assigning to each outcome ωj a nonnegative number m(ωj) in such a way that m(ω1) + … + m(ωn) = 1. The function m(ωj) is called the distribution function of the random variable X.

Probabilities. For any subset E of Ω (an event), we define the probability of E to be the number P(E) given by P(E) = Σ_{ω ∈ E} m(ω), the sum of the distribution function over the outcomes in the event.
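To make the definition concrete, here is a minimal Python sketch (mine, not from the slides) that represents the distribution function m as a dict and computes P(E) by summing over the event:

```python
from fractions import Fraction

# Distribution function m for a fair die: m(w) = 1/6 for each outcome.
m = {w: Fraction(1, 6) for w in range(1, 7)}

def P(event, m):
    """P(E): the sum of m(w) over all outcomes w in the event E."""
    return sum(m[w] for w in event)

E = {2, 4, 6}                 # the event "the roll is even"
print(P(E, m))                # 1/2
assert sum(m.values()) == 1   # m(w1) + ... + m(wn) = 1
```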

From Random Variables to Events. For any random variable X and value a, we can define the event E that X = a: P(E) = P(X=a) = P({t ∈ Ω | X(t) = a}). E.g., if A is the event that both coins are heads, then X = 1 when both coins are heads and X = 0 otherwise.

From Random Variables to Events. It's a function on the sample space, Ω → [0, ∞). It's a variable with a probability distribution on its values. You should be comfortable with both views.

Example 1. Consider an experiment in which a coin is tossed twice. Ω = {HH, TT, HT, TH}, and m(HH) = m(TT) = m(HT) = m(TH) = 1/4. Let E = {HH, HT, TH} be the event that at least one head comes up. Then the probability of E is P(E) = m(HH) + m(HT) + m(TH) = 3/4. Notice that it is an immediate consequence that P({ω}) = m(ω).

Example 2. Three people, A, B, and C, are running for the same office, and we assume that one and only one of them wins. Suppose that A and B have the same chance of winning, but that C has only 1/2 the chance of A or B. So m(A) = m(B) = 2/5 and m(C) = 1/5. Let E be the event that either A or C wins. P(E) = m(A) + m(C) = 2/5 + 1/5 = 3/5.

Theorem. The probabilities satisfy the following properties:
P(Ω) = 1
P(A ∪ B) = P(A) + P(B), for disjoint A and B
P(Aᶜ) = 1 − P(A), where Aᶜ is the complement of A

P(A ∪ B) = P(A) + P(B), for disjoint A and B. Proof: since disjoint events share no outcomes, the sum of m(ω) over A ∪ B splits into the sum over A plus the sum over B.

P(AB) = P(A) + P(B) - P(A  B) More Theorems For any events A and B P(A) = P(A  B) + P(A  B) For any events A and B P(AB) = P(A) + P(B) - P(A  B)

Uniform Distribution. When a coin is tossed or a die is rolled, we assign an equal probability to each outcome. The uniform distribution on a sample space containing n elements is the function m defined by m(ω) = 1/n for every ω ∈ Ω.

Example. Consider the experiment that consists of rolling a pair of dice. What is the probability of getting a sum of 7 or a sum of 11? We assume that each of the 36 outcomes is equally likely.

Example. Let E be the event that the sum is 7 and F the event that the sum is 11.
S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
      (2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
      (3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
      (4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
      (5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
      (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }
P(E) = 6 × 1/36 and P(F) = 2 × 1/36, and since E and F are disjoint, P(E ∪ F) = 8/36.
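A quick enumeration sketch (not part of the original slides) confirms these counts:

```python
from itertools import product

S = list(product(range(1, 7), repeat=2))   # all 36 equally likely outcomes
E = [s for s in S if sum(s) == 7]          # sum of 7
F = [s for s in S if sum(s) == 11]         # sum of 11
# E and F are disjoint, so P(E u F) = P(E) + P(F) = 6/36 + 2/36 = 8/36
print(len(E), len(F), (len(E) + len(F)) / len(S))   # 6 2 0.222...
```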

A fair coin is tossed 100 times in a row. What is the probability that we get exactly half heads?

The sample space Ω is the set of all outcomes (sequences) {H,T}^100. Each sequence in Ω is equally likely, and hence has probability 1/|Ω| = 1/2^100.

Visually: Ω = all sequences of 100 tosses. For a sequence t = HHTTT……TH, P(t) = 1/|Ω|.

Event E = the set of sequences with 50 H's and 50 T's, within the set of all 2^100 sequences {H,T}^100. Probability of event E = proportion of E in Ω = C(100, 50) / 2^100.
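Evaluating C(100, 50)/2^100 numerically, as a sketch (math.comb requires Python 3.8+):

```python
import math

# C(100, 50) / 2^100: even "exactly half heads" is fairly unlikely.
print(math.comb(100, 50) / 2**100)   # about 0.0796
```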

Birthday Paradox. How many people do we need to have in a room to make it a favorable bet (probability of success greater than 1/2) that two people in the room will have the same birthday?

And The Same Methods Again! With x people in the room, the sample space is Ω = {1, …, 365}^x. We must find the sequences that have no duplication of birthdays. Event E = { ω ∈ Ω | no two coordinates are the same }.

Birthday Paradox: probability that all x birthdays are distinct

Number of People   Probability
21                 0.556
22                 0.524
23                 0.492
24                 0.461

At 23 people the probability of no shared birthday drops below 1/2, so the bet becomes favorable.
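A small sketch (my code, not the lecture's) reproduces the table entries, up to rounding, from the product P(all distinct) = (365/365) × (364/365) × … × ((365 − x + 1)/365):

```python
def p_all_distinct(x, days=365):
    """Probability that x people all have different birthdays."""
    p = 1.0
    for k in range(x):
        p *= (days - k) / days
    return p

for x in (21, 22, 23, 24):
    print(x, round(p_all_distinct(x), 3))   # 0.556, 0.524, 0.493, 0.462
```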

Infinite Sample Spaces. A coin is tossed until the first time that a head turns up. Ω = {1, 2, 3, 4, …}. A distribution function: m(n) = 2^(−n).

Infinite Sample Spaces Let E be the event that the first time a head turns up is after an even number of tosses. E = {2, 4, 6, …}
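Summing the geometric series (a step the slide leaves implicit): P(E) = m(2) + m(4) + m(6) + … = 1/4 + 1/16 + 1/64 + … = (1/4) / (1 − 1/4) = 1/3.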

Conditional Probability. Consider our voting example: three candidates A, B, and C are running for office. We decided that A and B have an equal chance of winning and C is only 1/2 as likely to win as A. Suppose that before the election is held, A drops out of the race. What are the new probabilities of the events B and C? P(B | Aᶜ) = 2/3 and P(C | Aᶜ) = 1/3, where Aᶜ is the event that A does not win.

Conditional Probability. Let Ω = {ω1, ω2, …, ωr} be the original sample space with distribution function m(ωk). Suppose we learn that the event E has occurred. We want to assign a new distribution function m(ωk | E) to reflect this fact. It is reasonable to assume that the probabilities for ωk in E should have the same relative magnitudes that they had before we learned that E had occurred: m(ωk | E) = c m(ωk), where c is some constant. Also, if ωk is not in E, we want m(ωk | E) = 0.

Conditional Probability. By the probability law, the new probabilities must sum to 1: Σ_{ωk ∈ E} c m(ωk) = c P(E) = 1, so c = 1/P(E). From here, m(ωk | E) = m(ωk) / P(E) for ωk in E.

Conditional Probability. The probability of event F given event E is written P(F | E) and is defined by P(F | E) = P(F ∩ E) / P(E): the proportion of F ∩ E to E.

Suppose we roll a white die and a black die. What is the probability that the white is 1 given that the total is 7? Event A = {white die = 1}, event B = {total = 7}.

S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
      (2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
      (3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
      (4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
      (5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
      (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }
Event A = {white die = 1}, event B = {total = 7}.
P(A | B) = P(A ∩ B) / P(B) = |A ∩ B| / |B| = 1/6.
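Counting directly in code, a sketch under the same setup:

```python
from itertools import product

S = list(product(range(1, 7), repeat=2))   # outcomes as (white, black)
B = [s for s in S if s[0] + s[1] == 7]     # total = 7
A_and_B = [s for s in B if s[0] == 1]      # white = 1 within B
print(len(A_and_B), len(B), len(A_and_B) / len(B))   # 1 6 0.1666...
```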

Independence! A and B are independent events if P(A | B) = P(A), or equivalently P(A ∩ B) = P(A) P(B), or equivalently P(B | A) = P(B).

Silver and Gold. One bag has two silver coins, another has two gold coins, and the third has one of each. One bag is selected at random. One coin from it is selected at random. It turns out to be gold. What is the probability that the other coin is gold?

Let G1 be the event that the first coin is gold: P(G1) = 1/2. Let G2 be the event that the second coin is gold: P(G2 | G1) = P(G1 ∩ G2) / P(G1) = (1/3) / (1/2) = 2/3. Note: G1 and G2 are not independent.
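A simulation sketch (my setup, not from the lecture) agrees with the 2/3 answer:

```python
import random

def draw():
    """Pick a bag at random, then a coin from it; return (drawn, other)."""
    bag = random.choice([("S", "S"), ("G", "G"), ("S", "G")])
    drawn, other = random.sample(bag, 2)
    return drawn, other

trials = [draw() for _ in range(100_000)]
gold_first = [t for t in trials if t[0] == "G"]
print(sum(t[1] == "G" for t in gold_first) / len(gold_first))   # near 2/3
```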

Monty Hall Problem. The game show host hides a car behind one of 3 doors at random. Behind the other two doors are goats. You select a door. The host opens one of the other doors, revealing no prize. You can decide to keep or switch. What to do? To switch or not to switch?

This is Tricky? We are inclined to think: “After one door is opened, the others are equally likely…”

Monty Hall Problem. Sample space = { car behind door 1, car behind door 2, car behind door 3 }. Each has probability 1/3. Staying, we win if we chose the correct door. Switching, we win if we chose the incorrect door. P(choosing correct door) = 1/3, P(choosing incorrect door) = 2/3.

Monty Hall Problem. Let the doors be called X, Y and Z. Let Cx, Cy, Cz be the events that the car is behind door X and so on. Let Hx, Hy, Hz be the events that the host opens door X and so on. Supposing that you choose door X, the probability that you win a car if you switch is P(Hz ∩ Cy) + P(Hy ∩ Cz) = P(Hz | Cy) P(Cy) + P(Hy | Cz) P(Cz) = 1 × 1/3 + 1 × 1/3 = 2/3.
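A short simulation sketch of the switching strategy (the deterministic host tie-break below does not change the switching win rate, since switching wins exactly when the first pick was wrong):

```python
import random

def switch_wins():
    car, pick = random.randrange(3), random.randrange(3)
    # The host opens a door that is neither your pick nor the car.
    host = next(d for d in range(3) if d != pick and d != car)
    # Switching means taking the one remaining closed door.
    switched = next(d for d in range(3) if d != pick and d != host)
    return switched == car

print(sum(switch_wins() for _ in range(100_000)) / 100_000)   # near 2/3
```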

Another Paradox Consider a family with two children. Given that one of the children is a boy, what is the probability that both children are boys? 1/3

Tree Diagram: the first child is a girl or a boy with probability 1/2 each, and likewise the second. Each of the four leaves (girl-girl, girl-boy, boy-girl, boy-boy) has probability 1/4.

Tree Diagram, conditioned on at least one boy: the girl-girl leaf is removed, and each of the three remaining leaves (girl-boy, boy-girl, boy-boy) has conditional probability 1/3. Hence P(both boys | at least one boy) = 1/3.
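Enumerating the four equally likely families in code (a sketch) reproduces the 1/3:

```python
from itertools import product

families = list(product(["boy", "girl"], repeat=2))   # 4 equally likely
at_least_one_boy = [f for f in families if "boy" in f]
both_boys = [f for f in at_least_one_boy if f == ("boy", "boy")]
print(len(both_boys) / len(at_least_one_boy))          # 1/3
```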

Bayes’ Formula. Suppose we have a set of disjoint events (hypotheses) with Ω = H1 ∪ H2 ∪ … ∪ Hk. We also have an event E called the evidence. We want to find the probabilities for the hypotheses given the evidence:

P(Hi | E) = P(E | Hi) P(Hi) / Σj P(E | Hj) P(Hj)

Here P(Hi) is the prior probability and P(Hi | E) is the posterior probability.
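As a sketch, here is Bayes’ formula applied to the earlier silver-and-gold example, with the three bags as hypotheses and “the drawn coin is gold” as the evidence (my worked example, not from the slides):

```python
# Hypotheses: which bag was chosen; evidence E: the drawn coin is gold.
prior = {"SS": 1/3, "GG": 1/3, "SG": 1/3}          # P(H)
likelihood = {"SS": 0.0, "GG": 1.0, "SG": 0.5}     # P(E | H)

evidence = sum(likelihood[h] * prior[h] for h in prior)              # P(E) = 1/2
posterior = {h: likelihood[h] * prior[h] / evidence for h in prior}  # P(H | E)
print(posterior["GG"])   # 2/3, matching the earlier conditional computation
```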

Expected Value

A Quick Calculation… I flip a coin 3 times. What is the expected (average) number of heads? Let X denote the number of heads which appear: X = 0, 1, 2, or 3. The corresponding probabilities are 1/8, 3/8, 3/8, and 1/8. E(X) = (1/8)×0 + (3/8)×1 + (3/8)×2 + (1/8)×3 = 1.5. This is a frequency interpretation of probability.

Definition: Expectation. The expectation, or expected value, of a random variable X with a sample space Ω and a distribution function m(x) is written as E(X) or E[X], and is

E[X] = Σ_{x ∈ Ω} x m(x)
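The three-flip calculation above, rewritten against this definition (a sketch):

```python
from fractions import Fraction

# X = number of heads in 3 fair flips; m is the distribution of X.
m = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
print(sum(x * p for x, p in m.items()))   # 3/2
```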

Exercise Suppose that we toss a fair coin until a head first comes up, and let X represent the number of tosses which were made. What is E[X]?

Here’s What You Need to Know… The language of probability: events, conditional probability P(A | B), independence, random variables, expectation.