Introduction to Algorithms Probability Review – Appendix C CIS 670.


Notions of Discrete Probability

Discrete Probability
The language of probability helps count all possible outcomes.
Definitions:
- Random Experiment (or Process): the result (outcome) is not fixed; multiple outcomes are possible. Ex: throwing a fair die.
- Sample Space S: the set of all possible outcomes of a random experiment. Ex: {1, 2, 3, 4, 5, 6} when a die is thrown.
- Elementary Event: a possible outcome, an element x ∈ S. Ex: 2, a throw of a fair die resulting in 2.
- Event E: a subset of S, E ⊆ S. Ex: a throw of the die resulting in {x > 3} = {4, 5, 6}. Certain event: S. Null event: ∅.
- Mutual Exclusion: events A and B are mutually exclusive if A ∩ B = ∅.
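The definitions above can be checked directly with finite sets. The following is a minimal Python sketch for the die example; the names S, E, A, and B mirror the slide, and exact fractions are used so probabilities stay rational:

```python
from fractions import Fraction

# Sample space for one throw of a fair die
S = {1, 2, 3, 4, 5, 6}

# Event E: the die shows more than 3
E = {x for x in S if x > 3}
assert E == {4, 5, 6}

# Mutually exclusive events: their intersection is the null event
A = {1, 2}
B = {5, 6}
assert A & B == set()

# Under the uniform distribution, Pr{E} = |E| / |S|
pr_E = Fraction(len(E), len(S))
assert pr_E == Fraction(1, 2)
```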

Flip a Coin
Running example: flip a coin twice and write down the outcomes (heads or tails).
Sample space = set of elementary events = possible outcomes of the experiment = {HH, HT, TH, TT}.
Events = subsets of the sample space. For example, {HH, HT, TH} (at least one heads).
Each event has a certain probability.
Uniform probability distribution: all elementary events have equal probability: Pr{HH} = Pr{HT} = Pr{TH} = Pr{TT} = 1/4.
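The running example can be generated rather than written out by hand. A short Python sketch (the string encoding 'H'/'T' is an illustrative choice, not from the slides):

```python
from itertools import product
from fractions import Fraction

# Sample space for two coin flips: all strings over {H, T} of length 2
S = [''.join(p) for p in product('HT', repeat=2)]
assert sorted(S) == ['HH', 'HT', 'TH', 'TT']

# Uniform distribution: every elementary event has probability 1/4
pr = {s: Fraction(1, len(S)) for s in S}

# Event "at least one heads" = {HH, HT, TH}, with probability 3/4
at_least_one_H = {s for s in S if 'H' in s}
assert sum(pr[s] for s in at_least_one_H) == Fraction(3, 4)
```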

Axioms of Probability
A probability distribution Pr{} on a sample space S is a mapping from events of S to real numbers such that the following are satisfied:
1. Pr{A} ≥ 0 for any event A,
2. Pr{S} = 1, and
3. Pr{A ∪ B} = Pr{A} + Pr{B} for any two mutually exclusive events A and B.
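The three axioms can be verified mechanically for a concrete finite distribution. A minimal sketch, using the fair-die distribution from earlier; the helper `prob` is introduced here for illustration:

```python
from fractions import Fraction

# Uniform distribution for a fair die
S = {1, 2, 3, 4, 5, 6}
pr = {x: Fraction(1, 6) for x in S}

def prob(event):
    """Probability of an event = sum of its elementary-event probabilities."""
    return sum(pr[x] for x in event)

# Axiom 1: Pr{A} >= 0 for every event
assert all(p >= 0 for p in pr.values())

# Axiom 2: Pr{S} = 1
assert prob(S) == 1

# Axiom 3: for mutually exclusive A and B, Pr{A ∪ B} = Pr{A} + Pr{B}
A, B = {1, 2}, {5}
assert A & B == set()
assert prob(A | B) == prob(A) + prob(B)
```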

Discrete Probability - Example

Axioms of Probability & Conclusions

Conditional Probability

Sometimes we have some prior partial knowledge about the outcome of an experiment. For example, suppose that a friend has flipped two fair coins and has told you that at least one of the coins showed a head. What is the probability that both coins are heads?
The conditional probability of event A given event B is defined as Pr{A | B} = Pr{A ∩ B} / Pr{B}, provided Pr{B} ≠ 0.
In the example above, A is the event that both coins are heads, and B is the event that at least one coin is a head. Since A ⊆ B, we have Pr{A ∩ B} = Pr{A} = 1/4, and thus Pr{A | B} = (1/4)/(3/4) = 1/3.
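The two-coin example can be confirmed by enumerating the sample space and applying the ratio Pr{A ∩ B}/Pr{B} directly. A minimal Python sketch (the helper `prob` assumes the uniform distribution):

```python
from itertools import product
from fractions import Fraction

# Sample space for two fair coin flips, uniform distribution
S = [''.join(p) for p in product('HT', repeat=2)]

def prob(event):
    return Fraction(len(event), len(S))

A = {s for s in S if s == 'HH'}   # both coins are heads
B = {s for s in S if 'H' in s}    # at least one coin is a head

# Conditional probability: Pr{A | B} = Pr{A ∩ B} / Pr{B}
pr_A_given_B = prob(A & B) / prob(B)
assert pr_A_given_B == Fraction(1, 3)
```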

Independent Events
Events A and B are independent if Pr{A ∩ B} = Pr{A} · Pr{B}.
Example 1: Experiment: rolling two independent dice.
Event A: Die 1 < 3. Event B: Die 2 > 3. A and B are independent.
Example 2: Suppose that two fair coins are flipped and that the outcomes are independent. Then the probability of two heads is (1/2)(1/2) = 1/4.
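Example 1 can be checked by enumeration: over the 36 equally likely pairs, the product rule holds exactly. A minimal sketch:

```python
from itertools import product
from fractions import Fraction

# Two independent dice: 36 equally likely ordered pairs
S = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(len(event), len(S))

A = {(d1, d2) for (d1, d2) in S if d1 < 3}   # Die 1 < 3, probability 1/3
B = {(d1, d2) for (d1, d2) in S if d2 > 3}   # Die 2 > 3, probability 1/2

# Independence: Pr{A ∩ B} = Pr{A} · Pr{B}
assert prob(A & B) == prob(A) * prob(B) == Fraction(1, 6)
```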

Discrete Random Variables
A random variable X is a function from a sample space S to the real numbers. If the sample space is finite or countably infinite, X is called a discrete random variable. It maps each possible outcome of an experiment to a real number.

Discrete Random Variables - Example
Example: rolling 2 dice.
X: sum of the values on the two dice.
Pr{X = 7} = Pr{(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)} = 6/36 = 1/6.
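Treating X literally as a function on the sample space, the 1/6 can be recovered by counting. A minimal sketch:

```python
from itertools import product
from fractions import Fraction

# Sample space: 36 equally likely ordered pairs of die values
S = list(product(range(1, 7), repeat=2))

def X(outcome):
    """Random variable X: the sum of the two dice (a function S -> R)."""
    return outcome[0] + outcome[1]

# Pr{X = 7}: fraction of outcomes that X maps to 7
pr_X_7 = Fraction(sum(1 for s in S if X(s) == 7), len(S))
assert pr_X_7 == Fraction(1, 6)
```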

Expected Value (Mean) of a Random Variable

Expected Value - Properties

Linearity of expectation: E[X + Y] = E[X] + E[Y] for any two random variables X and Y, even when they are not independent.
For mutually independent random variables X1, X2, …, Xn:
E[X1 · X2 ⋯ Xn] = E[X1] · E[X2] ⋯ E[Xn].
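The product rule can be verified exactly for two independent fair dice, where each die has expectation 7/2. A minimal sketch (the variable names E_X1, E_X2 are illustrative):

```python
from itertools import product
from fractions import Fraction

# Two independent fair dice; X1 and X2 are their values
S = list(product(range(1, 7), repeat=2))
n = Fraction(len(S))

E_X1 = sum(Fraction(d1) for d1, _ in S) / n            # 7/2
E_X2 = sum(Fraction(d2) for _, d2 in S) / n            # 7/2
E_X1X2 = sum(Fraction(d1 * d2) for d1, d2 in S) / n

# For independent random variables, E[X1 X2] = E[X1] E[X2]
assert E_X1 == E_X2 == Fraction(7, 2)
assert E_X1X2 == E_X1 * E_X2 == Fraction(49, 4)
```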

Indicator Random Variables
In order to analyze many algorithms, including the hiring problem, we use indicator random variables. They provide a convenient method for converting between probabilities and expectations, and are a simple yet powerful technique for computing the expected value of a random variable.
An indicator random variable takes only the values 1 and 0.
Given a sample space S and an event A, the indicator random variable I{A} associated with event A is defined as: I{A} = 1 if A occurs, and I{A} = 0 if A does not occur.
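The key property that makes indicators useful, E[I{A}] = Pr{A}, can be checked on the two-coin sample space. A minimal sketch (the event "first coin is heads" is chosen here for illustration):

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips, uniform distribution
S = [''.join(p) for p in product('HT', repeat=2)]
pr = {s: Fraction(1, 4) for s in S}

def I_A(outcome):
    """Indicator I{A} for the event A = 'first coin is heads'."""
    return 1 if outcome[0] == 'H' else 0

# The bridge between expectation and probability: E[I{A}] = Pr{A}
E_IA = sum(I_A(s) * pr[s] for s in S)
pr_A = sum(pr[s] for s in S if s[0] == 'H')
assert E_IA == pr_A == Fraction(1, 2)
```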

Indicator Random Variable - Example