Independence and Counting


Independence and Counting
Berlin Chen
Department of Computer Science & Information Engineering, National Taiwan Normal University

Reference: D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, Sections 1.5-1.6

Independence (1/2)

Recall that the conditional probability P(A | B) captures the partial information that event B provides about event A. A special case arises when the occurrence of B provides no such information and does not alter the probability that A has occurred:

P(A | B) = P(A)

In this case we say that A is independent of B. Since P(A | B) = P(A ∩ B) / P(B), this is equivalent to

P(A ∩ B) = P(A) P(B)

By symmetry, B is then also independent of A.

Independence (2/2)

A and B are independent => A and B are disjoint? No! Why? If A and B are disjoint, then P(A ∩ B) = 0. However, if P(A) > 0 and P(B) > 0, then P(A) P(B) > 0 ≠ P(A ∩ B). Two disjoint events A and B with P(A) > 0 and P(B) > 0 are never independent.
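The disjoint-vs-independent distinction can be checked numerically. The following sketch (not part of the original slides) uses two disjoint events on a fair six-sided die roll:

```python
from fractions import Fraction

# Two disjoint events on a fair six-sided die roll:
# A = {1, 2}, B = {3, 4}.  Each has positive probability.
omega = set(range(1, 7))
A, B = {1, 2}, {3, 4}

def p(E):
    """Probability of event E under the discrete uniform law."""
    return Fraction(len(E & omega), len(omega))

print(p(A & B))       # 0: the events are disjoint
print(p(A) * p(B))    # 1/9: nonzero, so A and B are NOT independent
```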

Independence: An Example (1/3)

Example 1.19. Consider an experiment involving two successive rolls of a 4-sided die, in which all 16 possible outcomes are equally likely and each has probability 1/16.

(a) Are the events Ai = {1st roll results in i} and Bj = {2nd roll results in j} independent? Using the discrete uniform probability law: P(Ai ∩ Bj) = 1/16, while P(Ai) = 4/16 = 1/4 and P(Bj) = 4/16 = 1/4, so P(Ai ∩ Bj) = P(Ai) P(Bj) and the events are independent.

Independence: An Example (2/3)

(b) Are the events A = {1st roll is a 1} and B = {sum of the two rolls is a 5} independent? Not quite obvious: P(A ∩ B) = P({(1,4)}) = 1/16, P(A) = 4/16 = 1/4, and P(B) = P({(1,4), (2,3), (3,2), (4,1)}) = 4/16 = 1/4, so P(A ∩ B) = P(A) P(B) and the events are independent.

Independence: An Example (3/3)

(c) Are the events A = {maximum of the two rolls is 2} and B = {minimum of the two rolls is 2} independent? No: P(A ∩ B) = P({(2,2)}) = 1/16, while P(A) = 3/16 and P(B) = 5/16, so P(A ∩ B) ≠ P(A) P(B).
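All three parts of Example 1.19 can be verified by brute-force enumeration of the 16 equally likely outcomes; the following sketch (not part of the original slides) does exactly that:

```python
from fractions import Fraction
from itertools import product

# All 16 equally likely outcomes of two rolls of a 4-sided die
outcomes = list(product(range(1, 5), repeat=2))

def prob(event):
    """Probability of an event given as a predicate on outcomes (w1, w2)."""
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

def independent(e1, e2):
    """Check P(E1 ∩ E2) == P(E1) * P(E2)."""
    return prob(lambda w: e1(w) and e2(w)) == prob(e1) * prob(e2)

# (a) A_1 = {1st roll is 1}, B_3 = {2nd roll is 3}
print(independent(lambda w: w[0] == 1, lambda w: w[1] == 3))      # True
# (b) A = {1st roll is 1}, B = {sum of the rolls is 5}
print(independent(lambda w: w[0] == 1, lambda w: sum(w) == 5))    # True
# (c) A = {max of the rolls is 2}, B = {min of the rolls is 2}
print(independent(lambda w: max(w) == 2, lambda w: min(w) == 2))  # False
```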

Conditional Independence (1/2)

Given an event C, the events A and B are called conditionally independent if

P(A ∩ B | C) = P(A | C) P(B | C)

By the multiplication rule, we also know that

P(A ∩ B | C) = P(B | C) P(A | B ∩ C)

If P(B ∩ C) > 0, we therefore have an alternative way to express conditional independence:

P(A | B ∩ C) = P(A | C)

This relation states that if C is known to have occurred, the additional knowledge that B also occurred does not change the probability of A.

Conditional Independence (2/2)

Notice that independence of two events A and B with respect to the unconditional probability law does not imply conditional independence, and vice versa.

If A and B are independent, the same holds for (i) A and B^c, and (ii) A^c and B^c. How can we verify it? (See Problem 43.)

Conditional Independence: Examples (1/2)

Example 1.20. Consider two independent fair coin tosses, in which all four possible outcomes are equally likely. Let H1 = {1st toss is a head}, H2 = {2nd toss is a head}, D = {the two tosses have different results}. Using the discrete uniform probability law, H1 and H2 are independent, but they are not conditionally independent given D: P(H1 ∩ H2 | D) = 0, while P(H1 | D) P(H2 | D) = (1/2)(1/2) = 1/4.
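Example 1.20 can be checked by enumeration as well; this sketch (not part of the original slides) compares the unconditional and conditional cases:

```python
from fractions import Fraction
from itertools import product

# The 4 equally likely outcomes of two fair coin tosses
outcomes = list(product("HT", repeat=2))

def prob(event, given=None):
    """P(event) or, if `given` is supplied, P(event | given)."""
    space = [w for w in outcomes if given is None or given(w)]
    return Fraction(sum(1 for w in space if event(w)), len(space))

H1 = lambda w: w[0] == "H"          # 1st toss is a head
H2 = lambda w: w[1] == "H"          # 2nd toss is a head
D  = lambda w: w[0] != w[1]         # the tosses differ

both = lambda w: H1(w) and H2(w)
print(prob(both) == prob(H1) * prob(H2))           # True: independent
print(prob(both, D) == prob(H1, D) * prob(H2, D))  # False: not conditionally independent
```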

Conditional Independence: Examples (2/2)

Example 1.21. There are two coins, a blue and a red one. We choose one of the two at random, each being chosen with probability 1/2, and proceed with two independent tosses. The coins are biased: with the blue coin, the probability of heads in any given toss is 0.99, whereas for the red coin it is 0.01.

Let B be the event that the blue coin was selected, and let Hi be the event that the i-th toss resulted in heads. Given the choice of a coin, the events H1 and H2 are independent.

conditional case: P(H1 ∩ H2 | B) = P(H1 | B) P(H2 | B) = (0.99)^2

unconditional case: P(H1 ∩ H2) = (1/2)(0.99)^2 + (1/2)(0.01)^2 = 0.4901, whereas P(H1) P(H2) = (1/2)(1/2) = 1/4, so H1 and H2 are not unconditionally independent.
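The numbers in Example 1.21 can be reproduced directly; a small sketch (not part of the original slides):

```python
# Example 1.21: given the coin choice the tosses are independent,
# but unconditionally they are not.
p_blue, p_red = 0.99, 0.01   # P(heads) for each coin; each coin chosen w.p. 1/2

p_h = 0.5 * p_blue + 0.5 * p_red            # P(H_i) = 0.5
p_h1h2 = 0.5 * p_blue**2 + 0.5 * p_red**2   # P(H_1 ∩ H_2) ≈ 0.4901

print(p_h1h2)      # ≈ 0.4901
print(p_h * p_h)   # 0.25 -- differs, so H1 and H2 are NOT unconditionally independent
```

Seeing one head sharply raises the probability that the blue coin was chosen, which in turn makes a second head much more likely; that is exactly why the tosses are dependent once the coin choice is averaged out.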

Independence of a Collection of Events

We say that the events A1, A2, ..., An are independent if

P(∩_{i in S} Ai) = ∏_{i in S} P(Ai), for every subset S of {1, 2, ..., n}

For example, the independence of three events A1, A2, A3 amounts to satisfying the four conditions

P(A1 ∩ A2) = P(A1) P(A2)
P(A1 ∩ A3) = P(A1) P(A3)
P(A2 ∩ A3) = P(A2) P(A3)
P(A1 ∩ A2 ∩ A3) = P(A1) P(A2) P(A3)

In general, this amounts to 2^n − n − 1 conditions.

Independence of a Collection of Events: Examples (1/4)

Example 1.22. Pairwise independence does not imply independence. Consider two independent fair coin tosses, and the following events: H1 = {1st toss is a head}, H2 = {2nd toss is a head}, D = {the two tosses have different results}. Each pair of these events is independent, but P(H1 ∩ H2 ∩ D) = 0 ≠ P(H1) P(H2) P(D) = 1/8, so the three events are not independent.
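Example 1.22 can be verified exhaustively; this sketch (not part of the original slides) checks every pair and then the triple condition:

```python
from fractions import Fraction
from itertools import product, combinations

# The 4 equally likely outcomes of two fair coin tosses
outcomes = list(product("HT", repeat=2))

def prob(*events):
    """Probability that all the given events occur."""
    hits = sum(1 for w in outcomes if all(e(w) for e in events))
    return Fraction(hits, len(outcomes))

H1 = lambda w: w[0] == "H"
H2 = lambda w: w[1] == "H"
D  = lambda w: w[0] != w[1]

# Every pair is independent...
print(all(prob(e1, e2) == prob(e1) * prob(e2)
          for e1, e2 in combinations([H1, H2, D], 2)))    # True
# ...but the triple condition fails: P(H1 ∩ H2 ∩ D) = 0, not 1/8
print(prob(H1, H2, D) == prob(H1) * prob(H2) * prob(D))   # False
```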

Independence of a Collection of Events: Examples (2/4)

Example 1.23. The equality P(A ∩ B ∩ C) = P(A) P(B) P(C) is not enough for independence. Consider two independent rolls of a fair six-sided die, and the following events: A = {1st roll is 1, 2, or 3}, B = {1st roll is 3, 4, or 5}, C = {the sum of the two rolls is 9}. Here P(A ∩ B ∩ C) = 1/36 = P(A) P(B) P(C), yet P(A ∩ B) = 1/6 ≠ P(A) P(B) = 1/4.
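Example 1.23 is the mirror image of Example 1.22, and enumeration confirms it; a sketch (not part of the original slides):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two rolls of a fair six-sided die
outcomes = list(product(range(1, 7), repeat=2))

def prob(*events):
    """Probability that all the given events occur."""
    hits = sum(1 for w in outcomes if all(e(w) for e in events))
    return Fraction(hits, len(outcomes))

A = lambda w: w[0] in (1, 2, 3)   # 1st roll is 1, 2, or 3
B = lambda w: w[0] in (3, 4, 5)   # 1st roll is 3, 4, or 5
C = lambda w: sum(w) == 9         # sum of the two rolls is 9

print(prob(A, B, C) == prob(A) * prob(B) * prob(C))   # True: triple product holds
print(prob(A, B) == prob(A) * prob(B))                # False: pairwise condition fails
```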

Independence of a Collection of Events: Examples (3/4)

Example 1.24. Network connectivity. A computer network connects two nodes A and B through intermediate nodes C, D, E, F (see next slide). For every pair of directly connected nodes, say i and j, there is a given probability p_ij that the link from i to j is up. We assume that link failures are independent of each other. What is the probability that there is a path connecting A and B in which all links are up?

Independence of a Collection of Events: Examples (4/4) Example 1.24. (cont.)
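The computation in Example 1.24 reduces the network by combining links in series and in parallel. The exact topology is in the figure (not reproduced here), so the following sketch only illustrates the two reduction rules with assumed link probabilities:

```python
from math import prod

def series(*ps):
    """A series path is up only if every link on it is up."""
    return prod(ps)

def parallel(*ps):
    """A parallel group is up if at least one branch is up."""
    return 1 - prod(1 - p for p in ps)

# Assumed toy case: two parallel paths, each made of two links
# that are up with probability 0.9 independently.
p_path = series(0.9, 0.9)          # ≈ 0.81
print(parallel(p_path, p_path))    # ≈ 1 - 0.19**2 = 0.9639
```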

Recall: Counting in Probability Calculation

Two applications of the discrete uniform probability law:

When the sample space has a finite number n of equally likely outcomes, the probability of any event A is given by P(A) = (number of elements of A) / n.

When we want to calculate the probability of an event A whose outcomes are finite in number and equally likely, each with an already known probability p, the probability of A is given by P(A) = p × (number of elements of A). The calculation of p is often easier than the calculation of the number of elements of A. E.g., in the calculation of the probability of k heads in n coin tosses, the probability of any given sequence with k heads is easy to calculate, while counting the number of all such sequences is somewhat intricate.

The Counting Principle

Consider a process that consists of r stages. Suppose that:
(a) There are n1 possible results for the first stage.
(b) For every possible result of the first stage, there are n2 possible results at the second stage.
(c) More generally, for all possible results of the first i−1 stages, there are ni possible results at the i-th stage.

Then, the total number of possible results of the r-stage process is n1 ‧ n2 ‧ ‧ ‧ nr.
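The principle is just a product over stages. For instance (an assumed toy example, not from the slides), counting passwords made of 2 letters followed by 3 digits:

```python
from math import prod

# Each entry is the number of possible results n_i at stage i:
# 2 letter positions (26 choices each), then 3 digit positions (10 each).
stages = [26, 26, 10, 10, 10]

# Counting Principle: total results = n_1 * n_2 * ... * n_r
print(prod(stages))   # 676000
```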

Common Types of Counting

Permutations of n objects: n!
k-permutations of n objects: n! / (n − k)!
Combinations of k out of n objects: n! / (k! (n − k)!)
Partitions of n objects into r groups, with the i-th group having ni objects: n! / (n1! n2! ‧ ‧ ‧ nr!)
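Python's standard library covers the first three formulas directly, and the multinomial coefficient for partitions is a one-liner; a sketch (not part of the original slides) with n = 6, k = 2:

```python
from math import comb, factorial, perm, prod

n, k = 6, 2
print(factorial(n))   # permutations of n objects: 6! = 720
print(perm(n, k))     # k-permutations of n objects: n!/(n-k)! = 30
print(comb(n, k))     # combinations of k out of n objects: 15

# Partitions of n objects into r groups of sizes n_1, ..., n_r
# (the multinomial coefficient n! / (n_1! ... n_r!)):
groups = [2, 2, 2]
print(factorial(n) // prod(factorial(g) for g in groups))   # 90
```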

Summary of Chapter 1 (1/2)

A probability problem can usually be broken down into a few basic steps:
1. The description of the sample space, i.e., the set of possible outcomes of a given experiment.
2. The (possibly indirect) specification of the probability law (the probability of each event).
3. The calculation of probabilities and conditional probabilities of various events of interest.

Summary of Chapter 1 (2/2)

Three common methods for calculating probabilities:
- The counting method: used when the number of outcomes is finite and all outcomes are equally likely.
- The sequential method: the use of the multiplication (chain) rule along a tree of possible outcomes.
- The divide-and-conquer method: the probability of an event B is obtained from a set of conditional probabilities as P(B) = P(A1) P(B | A1) + ... + P(An) P(B | An), where A1, ..., An are disjoint events that form a partition of the sample space.
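The divide-and-conquer method is the total probability theorem in action. A minimal sketch (not part of the original slides), reusing the two-coin numbers of Example 1.21 as the partition {blue chosen, red chosen}:

```python
# Total probability: if A_1, ..., A_n partition the sample space,
# P(B) = sum_i P(A_i) * P(B | A_i).
p_coin = [0.5, 0.5]           # P(blue), P(red)
p_heads_given = [0.99, 0.01]  # P(heads | blue), P(heads | red)

p_heads = sum(pa * pb for pa, pb in zip(p_coin, p_heads_given))
print(p_heads)   # ≈ 0.5
```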

Recitation

SECTION 1.5 Independence: Problems 37, 38, 39, 40, 42