Two Interpretations of Probability

The Frequentist Interpretation: Probability(outcome k) = relative frequency of k, i.e. if
S = set of all outcomes,
n = # of trials,
N_k(n) = # of times k occurs in n trials,
then
P(k) = lim_{n→∞} N_k(n) / n.

The Probabilistic/Axiomatic Approach: Probabilities are numerical values assigned to outcomes of a set S such that the axioms of probability are satisfied. The axioms are:
a) All probabilities are between 0 and 1.
b) The probabilities sum to 1 over the set S.
c) Probabilities of mutually exclusive events are additive.
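The frequentist interpretation can be sketched in a few lines of Python: estimate P(heads) for a fair coin as the relative frequency N_k(n)/n. The trial count and seed are arbitrary illustrative choices.

```python
# Frequentist sketch: relative frequency of heads approaches P(heads).
import random

random.seed(0)  # fixed seed so the run is reproducible

n = 100_000
heads = sum(1 for _ in range(n) if random.random() < 0.5)
rel_freq = heads / n
print(rel_freq)  # close to 0.5, and closer as n grows
```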
Definitions

Random Experiment: An experiment with random results.
Outcome: An elementary result of a random experiment.
Sample Space: The set S of all possible outcomes of a random experiment. Sample spaces can be:
- Finite or countably infinite
- Uncountable
Event: A subset of S. Certain event = S; Null event = ∅.

e.g. Random Experiment: Toss a coin twice.
Outcomes: HH, HT, TH, TT
Sample Space: S = {HH, HT, TH, TT}
Event: "At least one head" = {HH, HT, TH}
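The coin-toss example above can be sketched directly as Python sets:

```python
# Sample space S for two coin tosses and the event "at least one head".
from itertools import product

S = {''.join(t) for t in product('HT', repeat=2)}   # {'HH','HT','TH','TT'}
at_least_one_head = {w for w in S if 'H' in w}      # {'HH','HT','TH'}

# With equally likely outcomes, P(event) = |event| / |S|.
p = len(at_least_one_head) / len(S)
print(p)  # 0.75
```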
The Axioms of Probability

1) 0 ≤ P(A) ≤ 1 for every event A
2) P(S) = 1
3) If A ∩ B = ∅ (A and B mutually exclusive), then P(A ∪ B) = P(A) + P(B): probabilities of disjoint events add together.

For infinite sample spaces, (3) must be replaced by:
3a) If A_1, A_2, ... are pairwise mutually exclusive (A_j ∩ A_k = ∅ for j ≠ k), then
P(∪_{k=1}^∞ A_k) = Σ_{k=1}^∞ P(A_k),
since Axiom 3 only lets us evaluate probabilities of finite unions, which may not equal the probability of a countably infinite union.
Some Elementary Consequences

1) P(A^c) = 1 − P(A)
2) P(A) ≤ 1
3) P(∅) = 0
4) P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
5) Inclusion-exclusion for n events:
P(A_1 ∪ ... ∪ A_n) = Σ_j P(A_j) − Σ_{j<k} P(A_j ∩ A_k) + Σ_{j<k<l} P(A_j ∩ A_k ∩ A_l) − ... + (−1)^{n+1} P(A_1 ∩ ... ∩ A_n)
i.e. subtract even-numbered combinations, add odd-numbered ones.

[Venn diagram: events A1, A2, A3 inside S]
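Consequence 5 (inclusion-exclusion) can be checked by brute force on one roll of a fair die; the events A1, A2, A3 below are illustrative choices.

```python
# Verify inclusion-exclusion for three events by direct enumeration.
from fractions import Fraction

S = set(range(1, 7))
A1, A2, A3 = {1, 2, 3}, {2, 4, 6}, {3, 6}

def P(E):
    return Fraction(len(E & S), len(S))   # equally likely outcomes

lhs = P(A1 | A2 | A3)
rhs = (P(A1) + P(A2) + P(A3)
       - P(A1 & A2) - P(A1 & A3) - P(A2 & A3)
       + P(A1 & A2 & A3))
print(lhs, rhs)  # both 5/6
```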
Conditional Probability: P(A | B) = P(A ∩ B) / P(B), for P(B) > 0
= Probability that A occurs given that B has occurred.

Independent Events: A and B are independent if
P(A ∩ B) = P(A) P(B),
which leads to P(A | B) = P(A) and P(B | A) = P(B).

Note that independence ≠ mutual exclusion. In fact, if P(A), P(B) > 0, independent A, B are not mutually exclusive, since then P(A ∩ B) = P(A)P(B) > 0.
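The last point can be demonstrated concretely: for one roll of a fair die, A = "even" and B = "at most 2" (illustrative events) satisfy the product rule yet overlap.

```python
# Independence is not mutual exclusion: A and B below are independent
# (P(A∩B) = P(A)P(B)) but their intersection is nonempty.
from fractions import Fraction

S = set(range(1, 7))
A = {2, 4, 6}          # "even"
B = {1, 2}             # "at most 2"

def P(E):
    return Fraction(len(E), len(S))

assert P(A & B) == P(A) * P(B)   # 1/6 == (1/2)(1/3): independent
assert A & B != set()            # ...but not mutually exclusive
print(P(A & B))                  # 1/6
```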
Bayes’ Rule: If B_1, B_2, ..., B_n form a partition of S, then
P(B_j | A) = P(A | B_j) P(B_j) / Σ_{k=1}^n P(A | B_k) P(B_k).
This form is widely used in estimation. The P(B_j) are called prior probabilities.
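A minimal numeric sketch of Bayes’ rule with a two-event partition; the priors and likelihoods below are made-up illustrative values, not from the notes.

```python
# Bayes' rule for a partition {B1, B2}: posterior from priors and likelihoods.
priors = [0.3, 0.7]          # P(B1), P(B2): hypothetical priors
likelihoods = [0.9, 0.2]     # P(A|B1), P(A|B2): hypothetical likelihoods

# Total probability: P(A) = sum_k P(A|Bk) P(Bk)
p_A = sum(l * p for l, p in zip(likelihoods, priors))

# Posterior: P(B1|A) = P(A|B1) P(B1) / P(A)
posterior_B1 = likelihoods[0] * priors[0] / p_A
print(round(posterior_B1, 4))
```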
Bernoulli Trial: A single trial of an experiment whose only outcomes are “success” and “failure” (or {0, 1}, etc.).
If p = P(success), then
P(k successes in n independent Bernoulli trials) = C(n, k) p^k (1 − p)^{n−k}.
This is called the binomial probability law.

Example: Coin toss with “success” = heads; fair coin, so p = P(heads) = 0.5.
P(3 heads in 5 tosses) = C(5, 3) (0.5)^3 (0.5)^2 = 10/32 = 0.3125
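The binomial law and the worked example translate directly:

```python
# Binomial probability law: P(k successes in n independent Bernoulli trials).
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p3 = binom_pmf(3, 5, 0.5)
print(p3)  # C(5,3) * 0.5^5 = 10/32 = 0.3125
```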
Logic of the Binomial Law:
P(k successes in n trials) = [number of ways to place k successes among n attempts] × [prob of k successes] × [prob of n − k failures]
= C(n, k) · p^k · (1 − p)^{n−k}

Multinomial Probability Law: If B_1, ..., B_m is a partition of S with p_k = P(B_k), and
r_k = number of times B_k occurs in n trials (r_1 + ... + r_m = n),
then
P(r_1, ..., r_m) = [n! / (r_1! r_2! ... r_m!)] p_1^{r_1} p_2^{r_2} ... p_m^{r_m}.
The binomial law is the m = 2 case.
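A sketch of the multinomial law; checking the m = 2 case against the binomial result above confirms the reduction the notes mention.

```python
# Multinomial probability law for counts (r1,...,rm) over a partition.
from math import factorial, prod

def multinomial_pmf(counts, probs):
    n = sum(counts)
    coeff = factorial(n) // prod(factorial(r) for r in counts)
    return coeff * prod(p**r for p, r in zip(probs, counts))

# m = 2 reduces to the binomial law: 3 heads, 2 tails in 5 fair tosses.
print(multinomial_pmf([3, 2], [0.5, 0.5]))  # 0.3125
```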
Random Variables

A Random Variable X is a function that assigns a real number, X(ζ), to each outcome ζ of a random experiment:
X : S → R

Example: Random Experiment = 3 coin tosses.
Substitute H = 1 and T = 0 and read the result as a binary number. So
X(HHH) = 7 (111), X(TTT) = 0 (000), X(HTH) = 5 (101), etc.

Example: Random Experiment = Examine a patient.
Outcomes = {Healthy, Sick}; X(Healthy) = 0, X(Sick) = 1.

A random variable is a deterministic function, not a random one: the randomness comes from the outcome ζ it is applied to.

Why bother? Example:
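The coin-toss random variable above is an ordinary (deterministic) function:

```python
# The slide's random variable: H -> 1, T -> 0, read as a binary number.
from itertools import product

def X(outcome):
    """Deterministic map from an outcome in S to a real number."""
    return int(outcome.replace('H', '1').replace('T', '0'), 2)

S = [''.join(t) for t in product('HT', repeat=3)]
print(X('HHH'), X('TTT'), X('HTH'))  # 7 0 5
```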
Cumulative Distribution Function (CDF)

The CDF describes how probability accumulates over the range of a random variable:
F_X(x) = P(X ≤ x), −∞ < x < ∞.
Thus, e.g., in the coin-toss example with a fair coin, X is equally likely to be any of 0, 1, ..., 7, and F_X(x) jumps by 1/8 at each of those values.

Example:
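A sketch of that CDF, assuming the fair-coin case so X is uniform on {0, ..., 7}:

```python
# CDF of the coin-toss random variable: steps of 1/8 at x = 0, 1, ..., 7.
from fractions import Fraction
from math import floor

def F_X(x):
    """F_X(x) = P(X <= x) for X uniform on {0,...,7}."""
    if x < 0:
        return Fraction(0)
    return Fraction(min(floor(x), 7) + 1, 8)

print(F_X(-1), F_X(0), F_X(3.5), F_X(7))  # 0 1/8 1/2 1
```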
Properties of the CDF:
1) 0 ≤ F_X(x) ≤ 1
2) lim_{x→∞} F_X(x) = 1
3) lim_{x→−∞} F_X(x) = 0
4) Nondecreasing in x: if a < b, then F_X(a) ≤ F_X(b)
5) Right-continuous: F_X(a) = lim_{x→a+} F_X(x) = F_X(a+)
6) P(a < X ≤ b) = F_X(b) − F_X(a)
7) P(X = a) = F_X(a) − F_X(a−)

The RHS of (6) is obtained from P(X ≤ b) = P(X ≤ a) + P(a < X ≤ b).
If F_X(x) is also left-continuous at a, then P(X = a) = F_X(a) − F_X(a−) = 0.

For F_X(x) continuous at a and b:
P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a ≤ X ≤ b) = F_X(b) − F_X(a).
If F_X(x) is not continuous at a, b, these equalities do not hold.
Three Types of Random Variables

Discrete: F_X(x) is a staircase function, piecewise constant with jumps at a countable set of points.
Continuous: F_X(x) is continuous everywhere, so P(X = x) = 0 for every x.
Mixed: F_X(x) has both jumps and intervals of continuous increase.
Probability Mass Function (PMF): For a discrete random variable X with range {x_1, x_2, ...},
p_X(x_k) = P(X = x_k), k = 1, 2, ...

Probability Density Function (PDF): For differentiable F_X(x),
f_X(x) = dF_X(x)/dx,
i.e. f_X(x) dx ≈ P(x < X ≤ x + dx).
Properties of the PDF:
1) f_X(x) ≥ 0
2) ∫_{−∞}^{∞} f_X(x) dx = 1
3) P(a < X ≤ b) = ∫_a^b f_X(x) dx
4) F_X(x) = ∫_{−∞}^x f_X(t) dt
5) Any function g(x) ≥ 0 such that ∫_{−∞}^{∞} g(x) dx = c < ∞ can form a valid pdf: f_X(x) = g(x)/c.

For discrete random variables the pdf is defined using the delta function:
f_X(x) = Σ_k p_X(x_k) δ(x − x_k)
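Property 5 in practice: any nonnegative g with a finite integral can be normalized into a valid pdf. Here g(x) = x on [0, 2] is an illustrative choice, normalized numerically with a crude midpoint rule.

```python
# Normalize g(x) = x on [0, 2] into a valid pdf f(x) = g(x) / ∫g.
g = lambda x: x   # nonnegative on [0, 2]

def integrate(fn, a, b, n=100_000):
    # midpoint-rule quadrature, adequate for this sketch
    h = (b - a) / n
    return sum(fn(a + (i + 0.5) * h) for i in range(n)) * h

Z = integrate(g, 0, 2)     # ∫ g = 2
f = lambda x: g(x) / Z     # valid pdf: f >= 0 and ∫ f = 1
print(round(integrate(f, 0, 2), 6))  # 1.0
```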
[Figure: pmf of a discrete random variable and the corresponding staircase cdf]
Conditional CDF’s and PDF’s

For an event A with P(A) > 0:
F_X(x | A) = P(X ≤ x | A) = P({X ≤ x} ∩ A) / P(A), and f_X(x | A) = dF_X(x | A)/dx.

Example 1:
Example 2: Rolling a fair 6-sided die.
[Figure: F_X(x) rises in steps of 1/6 at x = 1, 2, ..., 6; the conditional CDF F_X(x | X even) rises in steps of 1/3 at x = 2, 4, 6.]
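Example 2 can be computed directly: the unconditional CDF steps by 1/6 at each face, while conditioning on "X even" renormalizes so the steps at 2, 4, 6 become 1/3.

```python
# CDF of a fair die and the conditional CDF given the event "X even".
from fractions import Fraction

S = range(1, 7)
even = {2, 4, 6}

def F(x):
    return Fraction(sum(1 for k in S if k <= x), 6)

def F_given_even(x):
    # F_X(x | X even) = P({X <= x} ∩ {X even}) / P(X even)
    return Fraction(sum(1 for k in even if k <= x), len(even))

print(F(3), F_given_even(3))  # 1/2 1/3
```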