Lecture 7: Discrete Probability (Sections 5.1 and 5.2)


Lecture 7: Discrete Probability

5.1 Probability is important in the study of the complexity of algorithms and in modeling the uncertain world: information, data. Applications include error-correcting codes, data compression, data restoration, medical expert systems, search engines, etc. Modern AI deals with uncertainty in the world (was my measurement correct, were my assumptions correct?); probability theory is the tool for reasoning about that uncertainty. The theory was originally developed in the context of gambling.

5.1 Q: If we roll a die VERY (infinitely) often, what fraction of the rolls comes up "1"? A: 1/6 (if the die is fair), since each side has an equal chance of landing face up. Sample space: the set of all possible outcomes, S = {1, 2, 3, 4, 5, 6}. Event: a subset of the sample space, e.g. E = {1}. Probability of an event: p(E) = |E|/|S| = 1/6. In this example all outcomes are equally likely! This is not the case in general, as we will see later. Q: If we roll 2 dice, what is the probability that the sum is 7? A: |S| = 6 x 6 = 36. E = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}, so |E| = 6 and p(E) = 6/36 = 1/6.
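As a sanity check, the 36 equally likely outcomes can be enumerated directly; this is a minimal sketch (not from the slides) that confirms p(sum = 7) = 1/6:

    from fractions import Fraction
    from itertools import product

    # Sample space: all ordered pairs of faces of two fair dice.
    S = list(product(range(1, 7), repeat=2))

    # Event: the two faces sum to 7.
    E = [outcome for outcome in S if sum(outcome) == 7]

    # Uniform distribution: p(E) = |E| / |S|.
    print(Fraction(len(E), len(S)))   # 1/6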

5.1 It's all counting again! Example: A lottery awards a big prize if you correctly guess all four digits of a 4-digit number, in the right order. A small prize is won if you guess exactly 3 digits correctly, each at its correct position. |S| = 10^4. |E-big| = 1. |E-small| = ? There are 4 ways to choose which 3 digits are correct (and therefore which one digit is wrong). For each of these the number of possibilities is 1 x 1 x 1 x 9 (9 choices for the incorrect digit). |E-small| = 4 x 9 = 36, so p(E-small) = 36 / 10^4.
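The count of 36 is easy to verify by brute force; here is a small sketch (the winning number 1234 is an arbitrary choice, just for illustration):

    from itertools import product

    winning = (1, 2, 3, 4)   # hypothetical winning digits; any fixed choice works

    def matches(guess, target):
        """Number of positions where the guess agrees with the target."""
        return sum(g == t for g, t in zip(guess, target))

    # All 10^4 possible guesses.
    guesses = list(product(range(10), repeat=4))

    small = sum(matches(g, winning) == 3 for g in guesses)
    print(small, small / len(guesses))   # 36 0.0036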

5.1 Examples: 1) What is the probability of drawing a full house from a deck of cards (3 cards of one kind and 2 cards of another kind)? First pick the kind for the triple, then the kind for the pair (order matters here). P(13,2) is the number of ways to pick 2 different kinds out of 13 kinds, in order. C(4,3) is the number of ways to pick 3 cards of the first kind among its 4 suits (order doesn't matter). C(4,2) is the number of ways to pick 2 cards of the second kind among its 4 suits (order doesn't matter). C(52,5) is the total number of 5-card hands drawn from a deck of 52. Solution: P(13,2) x C(4,3) x C(4,2) / C(52,5). 2) What is the probability of drawing the sequence 11, 4, 17, 39, 23 out of 50 numbers when sampling a) without replacement, b) with replacement? a) |E| = 1, |S| = P(50,5), so P(E) = 1/P(50,5). (Note the two different P's here: permutations vs. probability!) b) |E| = 1, |S| = 50^5, so P(E) = 1/50^5.
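With Python's math module the full-house formula can be evaluated directly (a sketch, not part of the slides):

    from math import comb, perm

    # P(13,2) ordered choices of (triple kind, pair kind),
    # C(4,3) suits for the triple, C(4,2) suits for the pair,
    # divided by C(52,5) possible 5-card hands.
    p_full_house = perm(13, 2) * comb(4, 3) * comb(4, 2) / comb(52, 5)
    print(p_full_house)          # about 0.00144

    # Part 2: one specific ordered sequence of 5 numbers out of 50.
    print(1 / perm(50, 5))       # without replacement
    print(1 / 50**5)             # with replacement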

5.1 Theorem: Let Ē denote the complement of E (all outcomes in S that are not in E). Then p(Ē) = 1 - p(E). Proof: |Ē| = |S| - |E|, so p(Ē) = |Ē|/|S| = (|S| - |E|)/|S| = 1 - p(E). Example: We generate a bit-string of length 10. What is the probability that at least one bit is a 0? E is large and hard to enumerate directly. Ē is small: no zeros in the bit-string, i.e. all ones; only one possibility. p(E) = 1 - p(Ē) = 1 - 1/2^10 = 1023/1024.
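A quick enumeration over all 2^10 bit-strings confirms the complement trick (a sketch for illustration):

    from fractions import Fraction
    from itertools import product

    strings = list(product((0, 1), repeat=10))          # all 2^10 bit-strings

    # Direct count of E: strings containing at least one 0.
    at_least_one_zero = sum(0 in s for s in strings)
    print(Fraction(at_least_one_zero, len(strings)))    # 1023/1024

    # Complement: only the all-ones string has no zero.
    print(1 - Fraction(1, 2**10))                       # 1023/1024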

5.1 Set S1 is the set of all Irish citizens with blue eyes. Set S2 is the set of all Irish citizens with black hair. Set S3 is the set of all Irish citizens. |S1| = 1000, |S2| = 3000, |S3| = 10,000. Q: If we meet a random Irish citizen in the streets of Dublin, what is the probability that he/she has blue eyes OR black hair? A: Total number of possibilities: |S3| = 10,000. The total "area covered" by S1 U S2 is |E| = |S1 U S2| = |S1| + |S2| - |S1 ∩ S2| (the overlap is subtracted so it is not counted twice). Hence p(E) = p(S1 U S2) = (|S1| + |S2| - |S1 ∩ S2|) / |S3| = p(S1) + p(S2) - p(S1 ∩ S2). (Venn diagram: S1 and S2 overlapping inside S3.) Theorem: p(E1 U E2) = p(E1) + p(E2) - p(E1 ∩ E2).
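The slide leaves |S1 ∩ S2| unspecified; assuming, purely for illustration, that 500 citizens have both blue eyes and black hair, inclusion-exclusion gives:

    # Counts from the slide; only the overlap (500) is an assumption made here.
    n_blue, n_black, n_total = 1000, 3000, 10_000
    n_both = 500

    p_blue = n_blue / n_total
    p_black = n_black / n_total
    p_both = n_both / n_total

    # p(S1 U S2) = p(S1) + p(S2) - p(S1 ∩ S2)
    p_union = p_blue + p_black - p_both
    print(p_union)   # 0.35 under the assumed overlap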

5.1 A famous example (the Monty Hall problem): You participate in a game where there are 3 doors, only one of which hides a big prize. You pick a door. The game show host (who knows where the prize is) opens another, empty door and offers you the chance to switch. Should you? You don't switch: you have probability 1/3 of having picked the correct door, and if you don't switch that doesn't change (imagine doing the experiment a million times with this strategy). You switch: if you originally picked the correct door (prob. 1/3) and you switch, you lose. If you picked a wrong door (prob. 2/3) and you switch, you win!
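A short simulation (a sketch, not from the slides) makes the 1/3 vs. 2/3 split easy to believe:

    import random

    def play(switch, trials=100_000):
        """Fraction of games won with the given strategy."""
        wins = 0
        for _ in range(trials):
            prize = random.randrange(3)
            pick = random.randrange(3)
            # Host always opens a door that is neither the pick nor the prize.
            opened = next(d for d in range(3) if d != pick and d != prize)
            if switch:
                # Switch to the one remaining closed door.
                pick = next(d for d in range(3) if d != pick and d != opened)
            wins += (pick == prize)
        return wins / trials

    print(play(switch=False))   # about 1/3
    print(play(switch=True))    # about 2/3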

5.2 Until now, we have only dealt with outcomes that are equally likely. What about this example: Q: We have a coin that is heavier on one side than on the other. Consequently it comes up heads ¾ of the time. What is the frequency with which we see tails? A: If we see it heads up ¾ of the time, we'll see it tails up ¼ of the time. Formally: let S be the sample space (a set) and s an element of S. To each element s of S we assign a probability p(s) that fulfils 2 conditions: 1) 0 <= p(s) <= 1 for every s in S, and 2) the sum of p(s) over all s in S equals 1.
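For the biased coin these two conditions are easy to check programmatically (a minimal sketch):

    from fractions import Fraction

    # Biased coin from the slide: heads 3/4 of the time, tails 1/4.
    dist = {"H": Fraction(3, 4), "T": Fraction(1, 4)}

    # Condition 1: every probability lies in [0, 1].
    assert all(0 <= p <= 1 for p in dist.values())
    # Condition 2: the probabilities sum to 1.
    assert sum(dist.values()) == 1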

5.2 p(s) is called a probability distribution. This is a generalization of the definition of probability in the previous section, where it was assumed that all elements are equally likely: p(s) = 1/|S|. That special case is called the uniform distribution. Definition: assume that E is a subset of S. The total probability of the event is now given by p(E) = the sum of p(s) over all s in E.
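In code, the definition is just a sum over the event's elements; this sketch shows it for the uniform die:

    from fractions import Fraction

    def prob(event, dist):
        """p(E) = sum of p(s) for s in E, for any distribution given as a dict."""
        return sum(dist[s] for s in event)

    uniform_die = {s: Fraction(1, 6) for s in range(1, 7)}
    print(prob({1, 3, 5}, uniform_die))   # 1/2: an odd roll of a fair die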

5.2 Example: We have a loaded die such that 3 appears twice as often as each other number, and the other sides appear equally often. What is the probability of rolling an odd number? p(1) = p(2) = p(4) = p(5) = p(6) = q, p(3) = 2q. The probabilities must sum to 1: 5q + 2q = 1, so q = 1/7. Odd outcomes: E = {1, 3, 5}, so p(E) = p(1) + p(3) + p(5) = 1/7 + 2/7 + 1/7 = 4/7.
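The same element-wise sum for the loaded die confirms 4/7 (again a sketch):

    from fractions import Fraction

    q = Fraction(1, 7)
    loaded_die = {1: q, 2: q, 3: 2 * q, 4: q, 5: q, 6: q}

    assert sum(loaded_die.values()) == 1          # a valid distribution
    print(sum(loaded_die[s] for s in {1, 3, 5}))  # 4/7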

5.2 Theorem: Let Ē denote the complement of E. Then p(Ē) = 1 - p(E). This is still true for a general distribution because p(Ē) = the sum of p(s) over all s not in E = 1 - the sum of p(s) over all s in E = 1 - p(E), where we used that the probabilities of all elements of S sum to 1.

5.2 Theorem: When E1, E2, ..., En are pairwise disjoint subsets of S, we have p(E1 U E2 U ... U En) = p(E1) + p(E2) + ... + p(En). The situation is more difficult if the subsets overlap. For 2 subsets we have: Theorem: p(E1 U E2) = p(E1) + p(E2) - p(E1 ∩ E2).
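For a non-uniform distribution the same inclusion-exclusion holds element-wise; a quick check with the loaded die and two overlapping events (chosen here only for illustration):

    from fractions import Fraction

    q = Fraction(1, 7)
    loaded_die = {1: q, 2: q, 3: 2 * q, 4: q, 5: q, 6: q}

    def p(event):
        return sum(loaded_die[s] for s in event)

    E1, E2 = {1, 2, 3}, {3, 4}            # overlapping events, picked for illustration

    lhs = p(E1 | E2)                      # direct probability of the union
    rhs = p(E1) + p(E2) - p(E1 & E2)      # inclusion-exclusion
    assert lhs == rhs
    print(lhs)                            # 5/7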

5.2 Exercises. Ex. 20 p. 377: Find the smallest number of people in a room such that the probability that someone has their birthday today is larger than 1/2, assuming a year has 366 days and all birthdays are equally likely. Let's say there are n people. S = {all combinations of birthdays for n people}, so |S| = 366^n. Listing the events E1 = {exactly one person has their birthday today}, E2 = {exactly two people have their birthday today}, E3 = ... quickly gets complicated, so use the complement instead: Ē = {none of the people has their birthday today}, with |Ē| = 365^n. We need p(E) = 1 - (365/366)^n > 1/2, which first happens at n = 254.
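The threshold n = 254 can be found with a few lines (a sketch):

    # p(no one has their birthday today) = (365/366)^n
    n = 1
    while 1 - (365 / 366) ** n <= 0.5:
        n += 1
    print(n)   # 254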

5.2 Exercises. Exercise 12 p. 377: p(E) = 0.7, p(F) = 0.5. p(E U F) = p(E) + p(F) - p(E ∩ F). We know p(E) and p(F), and we can bound the intersection: p(E ∩ F) <= 0.5 (maximal overlap: F is a subset of E), and p(E ∩ F) >= 0.2 (minimal overlap, since p(E U F) = p(E) + p(F) - p(E ∩ F) <= 1). Therefore 0.7 <= p(E U F) <= 1.
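A tiny numeric scan over feasible overlap values (a sketch) confirms the bounds:

    # Scan candidate values of p(E ∩ F) consistent with p(E) = 0.7, p(F) = 0.5.
    unions = []
    for i in range(0, 1001):
        p_int = i / 1000
        p_union = 0.7 + 0.5 - p_int
        # Feasible only if the intersection fits inside F and the union stays <= 1.
        if p_int <= 0.5 and p_union <= 1:
            unions.append(p_union)
    print(min(unions), max(unions))   # 0.7 1.0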