Probability Refresher


Probability Refresher

Events
- Sample space Ω consists of all possible outcomes (discrete or continuous) of an experiment
  - Single throw of a die: {1, 2, 3, 4, 5, 6}
- Events are subsets of the sample space
- Combinations (unions and intersections) of events are also events:
  - E = E1 OR E2 = E1 ∪ E2
  - E = E1 AND E2 = E1 ∩ E2
- Mutually exclusive events: E1 ∩ E2 = ∅
- Partition (of sample space Ω): a set of mutually exclusive events that covers the entire sample space
- Complementary events: E and F are complements of each other if E ∩ F = ∅ and E ∪ F = Ω
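The set operations above map directly onto Python sets. This is a minimal sketch using the single-die sample space; the event choices are illustrative, not from the slides.

```python
# Event algebra on the sample space of one die throw, using Python sets.
S = {1, 2, 3, 4, 5, 6}   # sample space
E1 = {2, 4, 6}           # event: outcome is even
E2 = {3, 6}              # event: outcome is a multiple of 3

union = E1 | E2          # E1 OR E2  (E1 ∪ E2)
intersection = E1 & E2   # E1 AND E2 (E1 ∩ E2)
complement = S - E1      # complement of E1: the odd outcomes

# E1 and its complement partition the sample space:
assert E1 & complement == set() and E1 | complement == S
print(union, intersection, complement)
```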

Events & Probabilities – (1)
- P{E}: probability that event E occurs (P{Ω} = 1)
- Union law: P{E ∪ F} = P{E} + P{F} − P{E ∩ F}
  - P{E ∪ F} ≤ P{E} + P{F}, with equality only if E and F are mutually exclusive (complementary events are mutually exclusive)
- Conditional probability: P{E|F} = P{E ∩ F}/P{F} ⇒ P{E ∩ F} = P{E|F}P{F}
- Fair-die example: 6 possible outcomes, each with probability 1/6
  - Probability of a 2 given that the outcome is an even number?
  - (1/6)/(1/6 + 1/6 + 1/6) = 1/3
  - 2 is one of three equally likely outcomes that yield an even number
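The fair-die conditional probability above can be checked mechanically with exact fractions; the `prob` helper is my own naming, not from the slides.

```python
# Check P{E|F} = P{E ∩ F}/P{F} on the fair-die example, with exact fractions.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = {s: Fraction(1, 6) for s in S}   # fair die: each outcome has probability 1/6

def prob(event):
    """Probability of an event = sum of its sample-point probabilities."""
    return sum(P[s] for s in event)

E = {2}                # outcome is a 2
F = {2, 4, 6}          # outcome is even
p_E_given_F = prob(E & F) / prob(F)
print(p_E_given_F)     # 1/3, matching the slide
```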

Events & Probabilities – (2)
- Independence: E and F are independent if P{E ∩ F} = P{E}P{F}
- Examples (die throwing):
  - ODD and MULTIPLE OF 3 are independent: P{ODD ∩ MULTIPLE OF 3} = P{3} = 1/6; and P{ODD} = 1/2, P{MULTIPLE OF 3} = 1/3, so that P{ODD}·P{MULTIPLE OF 3} = 1/6 as well
  - ODD and ≤5 are not independent: P{ODD ∩ ≤5} = P{1} + P{3} + P{5} = 1/2; and P{ODD} = 1/2, P{≤5} = P{1} + P{2} + P{3} + P{4} + P{5} = 5/6, so that P{ODD}·P{≤5} = 5/12 ≠ 1/2
- Conditional independence: E and F are conditionally independent given G if P{E ∩ F|G} = P{E|G}P{F|G}
  - P{≥2 ∩ ≤4|EVEN} = P{2|EVEN} + P{4|EVEN} = 2/3; and P{≥2|EVEN} = 1, P{≤4|EVEN} = 2/3, so that P{≥2|EVEN}·P{≤4|EVEN} = 2/3 as well
  - Note that ≥2 and ≤4 are not (unconditionally) independent: P{≥2 ∩ ≤4} = 1/2 ≠ P{≤4}·P{≥2} = (4/6)·(5/6) = 5/9
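All three checks above (independence, non-independence, and conditional independence given EVEN) can be verified exactly; the helper names are my own.

```python
# Verify independence and conditional independence on the fair die.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
P = {s: Fraction(1, 6) for s in S}

def prob(ev):
    return sum(P[s] for s in ev)

def cprob(ev, given):
    """Conditional probability P{ev | given}."""
    return prob(ev & given) / prob(given)

ODD, MULT3, LE5 = {1, 3, 5}, {3, 6}, {1, 2, 3, 4, 5}
# ODD and MULTIPLE OF 3 are independent:
assert prob(ODD & MULT3) == prob(ODD) * prob(MULT3)
# ODD and <=5 are not: 1/2 != 5/12
assert prob(ODD & LE5) != prob(ODD) * prob(LE5)

EVEN, GE2, LE4 = {2, 4, 6}, {2, 3, 4, 5, 6}, {1, 2, 3, 4}
# >=2 and <=4 are conditionally independent given EVEN ...
assert cprob(GE2 & LE4, EVEN) == cprob(GE2, EVEN) * cprob(LE4, EVEN)
# ... but not unconditionally independent: 1/2 != 5/9
assert prob(GE2 & LE4) != prob(GE2) * prob(LE4)
```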

Events & Probabilities – (3)
- Law of total probability: for any partition F1, …, FN of the sample space (∪i Fi = Ω),
  P{E} = Σi P{E|Fi}P{Fi}
- Bayes' law:
  P{F|E} = P{E|F}P{F}/P{E}
  - Prove using the definition of conditional probability
- Combining Bayes' law and the law of total probability:
  P{Fi|E} = P{E|Fi}P{Fi} / Σj P{E|Fj}P{Fj}

Bayes' Law
- Conditional probability:
  P{E|F} = P{E ∩ F}/P{F} ⇒ P{E ∩ F} = P{E|F}P{F}   (1)
  P{F|E} = P{E ∩ F}/P{E}   (2)
- Using the expression for P{E ∩ F} from (1) in (2) gives
  P{F|E} = P{E|F}P{F}/P{E}

Example – Anti-virus s/w Test
- We know that our s/w is 95% accurate, i.e.,
  - P{positive | virus} = 0.95 (true positive) and P{negative | virus} = 0.05
  - P{negative | no virus} = 0.95 (true negative) and P{positive | no virus} = 0.05
- We also know that on average 1 out of every 1,000 computers is infected with a virus, i.e., P{virus} = 0.001
- What is the probability that a computer that tests positive is infected with a virus? p = P{virus | positive}
- Bayes' law: p = P{positive | virus}P{virus}/P{positive}
- Bayes' law + law of total probability: replace P{positive} with
  P{positive} = P{positive | virus}P{virus} + P{positive | no virus}P{no virus}
  P{positive} = 0.95 × 0.001 + 0.05 × 0.999 = 0.0509
- This gives p = 0.95 × 0.001/0.0509 ≈ 0.0187, i.e., less than 2%
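The anti-virus calculation can be spelled out in a few lines; all the numbers come from the slide, only the variable names are my own.

```python
# Bayes' law + law of total probability for the anti-virus test.
p_pos_given_virus = 0.95   # true positive rate
p_pos_given_clean = 0.05   # false positive rate
p_virus = 0.001            # prior: 1 in 1,000 computers infected

# Denominator via the law of total probability:
p_pos = p_pos_given_virus * p_virus + p_pos_given_clean * (1 - p_virus)
# Bayes' law:
p_virus_given_pos = p_pos_given_virus * p_virus / p_pos
print(round(p_pos, 4), round(p_virus_given_pos, 4))   # 0.0509 0.0187
```

The counterintuitive result (a positive test means under 2% chance of infection) comes from the prior: false positives from the 99.9% of clean machines swamp the true positives.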

Random Variables
- Basically a mapping of outcomes to numbers
- Typical notation: X (random variable), x (value)
- Cumulative distribution function (c.d.f.): FX(a) = P{X ≤ a}
  - Note that by definition FX(∞) = 1
- Complementary distribution: F̄X(a) = 1 − FX(a) = P{X > a}
- Discrete and continuous r.v.'s
  - Probability mass function (p.m.f.) vs. probability density function (p.d.f.)
- Expectation & higher moments
  - Discrete r.v.: E[X] = Σx x PX(x)
  - Continuous r.v.: E[X] = ∫ x fX(x) dx
- Variance: Var(X) = E[(X − E[X])²] = E[X²] − E²[X] (by linearity of expectation – more on this soon)
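The discrete formulas above applied to a fair die, again with exact fractions; the die example and variable names are my illustration.

```python
# E[X], E[X^2], Var(X) and the c.d.f. for a fair die.
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}

EX  = sum(x * p for x, p in pmf.items())       # E[X]   = 7/2
EX2 = sum(x**2 * p for x, p in pmf.items())    # E[X^2] = 91/6
Var = EX2 - EX**2                              # Var(X) = 35/12

def cdf(a):
    """F_X(a) = P{X <= a}."""
    return sum(p for x, p in pmf.items() if x <= a)

print(EX, Var, cdf(3))   # 7/2 35/12 1/2
```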

Joint Probability
- Discrete r.v.: joint probability mass function PX,Y(x,y)
  - PX,Y(x,y) = P{X = x AND Y = y}
  - Marginals: PX(x) = Σy PX,Y(x,y) and PY(y) = Σx PX,Y(x,y)
- Continuous r.v.: joint density function fX,Y(x,y)
- If X and Y are independent r.v.'s (X ⊥ Y):
  - Discrete: PX,Y(x,y) = PX(x)PY(y)
  - Continuous: fX,Y(x,y) = fX(x)fY(y)
  - E[XY] = E[X]E[Y]
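A quick check of marginalization and the product rule E[XY] = E[X]E[Y] on two independent fair dice (my choice of example):

```python
# Joint pmf of two independent fair dice; marginals by summing out the
# other variable, and E[XY] = E[X]E[Y] verified exactly.
from fractions import Fraction
from itertools import product

joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# Marginal P_X(x) = sum over y of P_{X,Y}(x, y):
PX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in range(1, 7)}

EX  = sum(x * p for x, p in PX.items())
EXY = sum(x * y * p for (x, y), p in joint.items())
assert EXY == EX * EX   # independence: E[XY] = E[X]E[Y] = 49/4
print(EXY)
```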

Conditional Probabilities & Expectations
- Discrete r.v.: conditional p.m.f. of X given an event A
  PX|A(x) = P{X = x, A}/P{A}
- Conditional expectation of X given A
  E[X|A] = Σx x PX|A(x)
- Continuous r.v.: conditional p.d.f.
  fX|Y(x|y) = fX,Y(x,y)/fY(y)
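A sketch of the conditional p.m.f. and conditional expectation for a fair die given the event A = {outcome is even}; the example is mine, not from the slides.

```python
# Conditional pmf P_{X|A}(x) and conditional expectation E[X|A]
# for a fair die, given A = {2, 4, 6}.
from fractions import Fraction

P = {x: Fraction(1, 6) for x in range(1, 7)}
A = {2, 4, 6}
pA = sum(P[x] for x in A)

# P_{X|A}(x) = P{X = x, A} / P{A}; zero for x outside A.
cond_pmf = {x: (P[x] / pA if x in A else Fraction(0)) for x in P}
E_given_A = sum(x * p for x, p in cond_pmf.items())
print(E_given_A)   # 4: the average of the equally likely values 2, 4, 6
```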

More on Expectation
- Expected value from conditional expectation
  - Discrete r.v.: E[X] = Σy E[X|Y=y]P{Y=y}
    - More generally: E[g(X)] = Σy E[g(X)|Y=y]P{Y=y}
  - Continuous r.v.: E[X] = ∫y E[X|Y=y]fY(y) dy
    - More generally: E[g(X)] = ∫y E[g(X)|Y=y]fY(y) dy
- Linearity of expectation: E[X+Y] = E[X] + E[Y]
- Linearity of variance for independent r.v.'s
  - If X ⊥ Y then Var(X+Y) = Var(X) + Var(Y)
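The discrete law of total expectation can be illustrated by recovering E[X] for a fair die by conditioning on its parity (an example of my choosing): E[X|odd] = 3, E[X|even] = 4, and (1/2)·3 + (1/2)·4 = 7/2.

```python
# E[X] = sum_y E[X|Y=y] P{Y=y}, conditioning a fair die on parity.
from fractions import Fraction

P = {x: Fraction(1, 6) for x in range(1, 7)}
partition = {"odd": {1, 3, 5}, "even": {2, 4, 6}}

EX = Fraction(0)
for A in partition.values():
    pA = sum(P[x] for x in A)
    E_given_A = sum(x * P[x] for x in A) / pA   # E[X | A]
    EX += E_given_A * pA                        # weight by P{A}
print(EX)   # 7/2, the unconditional mean
```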

Random Sum of Random Variables
- Let X1, X2, X3, … be i.i.d. random variables and N be a non-negative, integer-valued random variable, independent of the Xi's
- Define S = X1 + X2 + … + XN = Σi=1..N Xi
- Find expressions for E[S] and Var(S)
  - Condition on N and use linearity of expectation
  - For the variance, use the fact that Var(S|N = n) = nVar(X)

Mean of Random Sum of Random Variables
Conditioning on N and using linearity of expectation:
  E[S] = Σn E[S|N = n]P{N = n}
       = Σn E[X1 + … + Xn]P{N = n}
       = Σn nE[X]P{N = n}
       = E[X] Σn nP{N = n}
       = E[X]E[N]

Variance of Random Sum of Random Variables
Start with Var(S|N = n):
  Var(S|N = n) = nVar(X) = E[S²|N = n] − (E[S|N = n])² = E[S²|N = n] − n²E²[X]
So that
  E[S²|N = n] = nVar(X) + n²E²[X]
  E[S²] = Σn E[S²|N = n]P{N = n} = E[N]Var(X) + E[N²]E²[X]
Hence
  Var(S) = E[S²] − E²[S] = E[N]Var(X) + E[N²]E²[X] − E²[N]E²[X]
         = E[N]Var(X) + Var(N)E²[X]
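Both formulas can be sanity-checked by Monte Carlo simulation. This is my illustration, not from the slides: N and the Xi are all independent fair dice, so E[X] = E[N] = 7/2 and Var(X) = Var(N) = 35/12, giving E[S] = 12.25 and Var(S) = (7/2)(35/12) + (7/2)²(35/12) ≈ 45.94.

```python
# Monte Carlo check of E[S] = E[N]E[X] and
# Var(S) = E[N]Var(X) + E^2[X]Var(N), with N and the X_i fair dice.
import random

random.seed(0)
TRIALS = 200_000

samples = []
for _ in range(TRIALS):
    n = random.randint(1, 6)                                     # draw N
    samples.append(sum(random.randint(1, 6) for _ in range(n)))  # S = X_1+...+X_N

mean = sum(samples) / TRIALS
var = sum((s - mean) ** 2 for s in samples) / TRIALS

EX = EN = 3.5          # mean of a fair die
VX = VN = 35 / 12      # variance of a fair die
print(mean, var)                          # empirical estimates
print(EN * EX, EN * VX + EX**2 * VN)      # theory: 12.25 and ~45.94
```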