Probability Review: Definitions/Identities, Random Variables, Expected Value, Joint Distributions, Conditional Probabilities
Probability Defined: An event (experiment) has a set of possible outcomes, each with a probability that measures its relative (anticipated) frequency of occurrence; the probabilities are normalized so they sum to 1.
Probability Identities: Events and outcomes: an event has possible outcomes x1, ..., xn. Probability of each outcome: 0 <= P(xi) <= 1. Probability distribution: sum over i of P(xi) = 1.
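These identities can be checked mechanically; a minimal Python sketch with made-up outcome names and probabilities:

```python
# Minimal sketch (assumed example): a discrete distribution over outcomes.
P = {"x1": 0.25, "x2": 0.25, "x3": 0.5}

# Each outcome's probability lies in [0, 1].
assert all(0.0 <= p <= 1.0 for p in P.values())

# The distribution is normalized: the probabilities sum to 1.
assert abs(sum(P.values()) - 1.0) < 1e-12

print(P)
```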
Joint Distributions: Two (or more) events, each with its own outcome. The joint distribution stipulates the probability of every combination of outcomes (a sketch follows below).
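A minimal Python sketch of a joint distribution over two events; the events, outcomes, and numbers are assumed for illustration, and independence is used here only to keep the arithmetic simple (joint distributions in general need not factor this way):

```python
from itertools import product

# Assumed example: two events, each with its own outcomes.
p_weather = {"rain": 0.3, "dry": 0.7}
p_coin = {"heads": 0.5, "tails": 0.5}

# The joint distribution assigns a probability to every combination of outcomes.
joint = {(w, c): p_weather[w] * p_coin[c]
         for w, c in product(p_weather, p_coin)}

for (w, c), p in joint.items():
    print(f"P(weather={w}, coin={c}) = {p}")

# All combinations together account for probability 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12
```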
Two Events
Random Variables
Multiple Random Variables
Joint probability matrix
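A joint probability matrix can be written down directly; the numbers below are assumed for illustration, with rows indexing X and columns indexing Y:

```python
import numpy as np

# Assumed joint probability matrix: P[i, j] = P(X = x_i, Y = y_j).
P = np.array([[0.10, 0.05, 0.15],
              [0.20, 0.05, 0.45]])

# All entries together sum to 1.
assert abs(P.sum() - 1.0) < 1e-12

# Marginal distributions come from summing rows or columns.
P_X = P.sum(axis=1)   # P(X = x_i)
P_Y = P.sum(axis=0)   # P(Y = y_j)
print("P(X):", P_X)
print("P(Y):", P_Y)
```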
Conditional Probability: Random variables are often NOT independent. Examples: P(rain in Pittsburgh) vs. P(rain in Monroeville), P(rain in New York), P(rain in Hong Kong); P(Heads up) vs. P(Tails down); P(D1=5) vs. P(D2=6); P(D1=1) vs. P(D1 + D2=2).
Dice Example
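A worked sketch of the dice example, enumerating all 36 equally likely outcomes of two fair dice to check that the events D1=1 and D1+D2=2 are not independent:

```python
from fractions import Fraction
from itertools import product

# Two fair dice D1 and D2: 36 equally likely outcomes.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(outcomes))

def prob(event):
    """Probability of an event, given as a predicate over (d1, d2)."""
    return sum(p for d1, d2 in outcomes if event(d1, d2))

p_d1_is_1 = prob(lambda d1, d2: d1 == 1)                 # 1/6
p_sum_is_2 = prob(lambda d1, d2: d1 + d2 == 2)           # 1/36
p_both = prob(lambda d1, d2: d1 == 1 and d1 + d2 == 2)   # 1/36

# Not independent: P(both) != P(D1=1) * P(D1+D2=2).
print(p_d1_is_1, p_sum_is_2, p_both, p_d1_is_1 * p_sum_is_2)
```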
Conditional Probability: P(A|B) = P(AB) / P(B)
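The same formula applied to a small joint table (numbers assumed for illustration), dividing each column of the joint matrix by the corresponding marginal P(B):

```python
import numpy as np

# Assumed joint table: rows index A in {a0, a1}, columns index B in {b0, b1}.
P = np.array([[0.10, 0.30],
              [0.20, 0.40]])

P_B = P.sum(axis=0)        # marginal P(B = b_j)
P_A_given_B = P / P_B      # P(A|B): each column divided by its marginal

print(P_A_given_B)
# Each column of P(A|B) is a distribution over A, so columns sum to 1.
assert np.allclose(P_A_given_B.sum(axis=0), 1.0)
```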
Example: p(y1) = 0.2, p(y2) = 0.1, p(y3) = 0.7
Markov Processes: State transition probabilities, represented as a matrix or a diagram. Matrix multiplication predicts multi-step transition probabilities: M^k gives the k-step probabilities, and M^k converges to a steady state.
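A minimal sketch of these properties with an assumed 2-state transition matrix:

```python
import numpy as np

# Assumed 2-state Markov process.
# M[i, j] = P(next state = j | current state = i); each row sums to 1.
M = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# M^k gives the k-step transition probabilities.
Mk = np.linalg.matrix_power(M, 50)
print(Mk)   # every row approaches the same steady-state distribution

# The steady state pi satisfies pi = pi @ M.
pi = Mk[0]
assert np.allclose(pi, pi @ M)
```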