Math Camp 2: Probability Theory
Sasha Rakhlin
Outline:
- Introduction
- $\sigma$-algebra
- Measure
- Lebesgue measure
- Probability measure
- Expectation and variance
- Convergence
- Convergence in probability and almost surely
- Law of Large Numbers. Central Limit Theorem
- Useful probability inequalities: Jensen's inequality, Markov's inequality, Chebyshev's inequality, Cauchy-Schwarz inequality, Hoeffding's inequality
$\sigma$-algebra
Let $\Omega$ be a set. Then a $\sigma$-algebra $\mathcal{F}$ is a nonempty collection of subsets of $\Omega$ such that the following hold:
- If $E \in \mathcal{F}$, then $\Omega \setminus E \in \mathcal{F}$
- If $F_i \in \mathcal{F}$ for $i = 1, 2, \ldots$, then $\bigcup_i F_i \in \mathcal{F}$
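As a small worked example (not on the original slide): the smallest $\sigma$-algebra on $\Omega = \{a, b, c\}$ that contains the set $\{a\}$ is
$$\mathcal{F} = \big\{\, \emptyset,\ \{a\},\ \{b, c\},\ \Omega \,\big\},$$
since $\{a\}^c = \{b, c\}$ must be included, and every union of these sets is again one of them.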
Measure
A measure $\mu$ is a function defined on a $\sigma$-algebra $\mathcal{F}$ over a set $\Omega$ with values in $[0, \infty]$ s.t.
- $\mu(\emptyset) = 0$
- $\mu(E) = \sum_i \mu(E_i)$ whenever $E = \bigcup_i E_i$ is a countable union of pairwise disjoint sets $E_i \in \mathcal{F}$
$(\Omega, \mathcal{F}, \mu)$ is called a measure space.
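A quick sanity check of the definition (my addition): the counting measure $\mu(E) = |E|$, the number of elements of $E$ (with $\mu(E) = \infty$ for infinite $E$), is a measure on any $\sigma$-algebra, since $\mu(\emptyset) = 0$ and
$$\mu\Big(\bigcup_i E_i\Big) = \sum_i |E_i| = \sum_i \mu(E_i) \quad \text{for pairwise disjoint } E_i.$$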
Lebesgue measure
The Lebesgue measure $\lambda$ is the unique complete translation-invariant measure on a $\sigma$-algebra containing the intervals of $\mathbb{R}$ s.t. $\lambda([0,1]) = 1$.
Probability measure
A probability measure $P$ is a positive measure over $(\Omega, \mathcal{F})$ s.t. $P(\Omega) = 1$.
$(\Omega, \mathcal{F}, P)$ is called a probability space.
A random variable is a measurable function $X : \Omega \to \mathbb{R}$.
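A minimal concrete instance (added for illustration): a fair coin toss is modeled by
$$(\Omega, \mathcal{F}, P) = \big(\{H, T\},\ 2^{\{H, T\}},\ P\big), \qquad P(\{H\}) = P(\{T\}) = \tfrac{1}{2},$$
and the indicator of heads, $X(\omega) = \mathbf{1}\{\omega = H\}$, is a random variable on this space.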
Expectation and variance
If $X$ is a random variable over a probability space $(\Omega, \mathcal{F}, P)$, the expectation of $X$ is defined as
$$\mathbb{E}[X] = \int_\Omega X \, dP.$$
The variance of $X$ is
$$\mathrm{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big] = \mathbb{E}[X^2] - (\mathbb{E}[X])^2.$$
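Continuing the coin example (my addition): if $X$ is Bernoulli with $P(X = 1) = p$ and $P(X = 0) = 1 - p$, then
$$\mathbb{E}[X] = 1 \cdot p + 0 \cdot (1 - p) = p, \qquad \mathrm{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2 = p - p^2 = p(1 - p).$$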
Convergence
A sequence of reals $x_n \to x$ if for every $\epsilon > 0$ there exists $N$ s.t. $|x_n - x| < \epsilon$ for all $n \geq N$.
$X_n \xrightarrow{P} X$ ($X_n$ converges to $X$ in probability) if for every $\epsilon > 0$,
$$\lim_{n \to \infty} P\big(|X_n - X| > \epsilon\big) = 0.$$
Convergence in probability and almost surely
Any event with probability 1 is said to happen almost surely. A sequence of real random variables $X_n$ converges almost surely to a random variable $X$ iff
$$P\big(\lim_{n \to \infty} X_n = X\big) = 1.$$
Convergence almost surely implies convergence in probability.
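The converse fails; a standard counterexample (added here, not from the slide): take independent $X_n$ with $P(X_n = 1) = \tfrac{1}{n}$ and $P(X_n = 0) = 1 - \tfrac{1}{n}$. Then $P(|X_n - 0| > \epsilon) = \tfrac{1}{n} \to 0$, so $X_n \to 0$ in probability, but since $\sum_n \tfrac{1}{n} = \infty$, the second Borel-Cantelli lemma gives $X_n = 1$ infinitely often with probability 1, so $X_n \not\to 0$ almost surely.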
Law of Large Numbers. Central Limit Theorem
Weak LLN: if $X_1, X_2, \ldots$ is an infinite sequence of i.i.d. random variables with $\mu = \mathbb{E}[X_1] = \mathbb{E}[X_2] = \cdots$, then the sample average $\bar{X}_n = \frac{1}{n}(X_1 + \cdots + X_n)$ converges to $\mu$ in probability, that is, for every $\epsilon > 0$,
$$\lim_{n \to \infty} P\big(|\bar{X}_n - \mu| > \epsilon\big) = 0.$$
CLT: if in addition $\sigma^2 = \mathrm{Var}(X_1) < \infty$, then
$$\lim_{n \to \infty} P\left(\frac{\sqrt{n}\,(\bar{X}_n - \mu)}{\sigma} \leq z\right) = \Phi(z),$$
where $\Phi$ is the cdf of $N(0,1)$.
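A minimal simulation sketch (my addition; it assumes NumPy is available and uses Uniform(0,1) samples, which are not part of the slides) illustrating both statements: sample means concentrate around $\mu = 1/2$, and standardized sample means behave like $N(0,1)$.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)      # mean and std of Uniform(0, 1)

# Weak LLN: the sample average approaches mu as n grows.
for n in [10, 100, 10_000]:
    x_bar = rng.uniform(size=n).mean()
    print(f"n = {n:6d}   sample mean = {x_bar:.4f}   (mu = {mu})")

# CLT: sqrt(n) * (X_bar - mu) / sigma is approximately N(0, 1) for large n.
n, reps = 1_000, 20_000
z = np.sqrt(n) * (rng.uniform(size=(reps, n)).mean(axis=1) - mu) / sigma
print(f"standardized means: mean = {z.mean():.3f}, std = {z.std():.3f}  (N(0,1): 0, 1)")
print(f"P(|Z| <= 1) = {np.mean(np.abs(z) <= 1):.3f}  (for N(0,1): about 0.683)")
```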
Jensen's inequality
If $\varphi$ is a convex function, then
$$\varphi\big(\mathbb{E}[X]\big) \leq \mathbb{E}\big[\varphi(X)\big].$$
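For instance (my addition), taking the convex function $\varphi(x) = x^2$ recovers the fact that the variance is nonnegative:
$$(\mathbb{E}[X])^2 \leq \mathbb{E}[X^2] \quad\Longleftrightarrow\quad \mathrm{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2 \geq 0.$$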
Markov's inequality
If $X \geq 0$ and $t > 0$,
$$P(X \geq t) \leq \frac{\mathbb{E}[X]}{t}.$$
Chebyshev's inequality
If $X$ is a random variable and $t > 0$,
$$P\big(|X - \mathbb{E}[X]| \geq t\big) \leq \frac{\mathrm{Var}(X)}{t^2},$$
e.g. with $t = k\sigma$, where $\sigma^2 = \mathrm{Var}(X)$,
$$P\big(|X - \mathbb{E}[X]| \geq k\sigma\big) \leq \frac{1}{k^2}.$$
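A quick Monte Carlo check of both Markov's and Chebyshev's inequalities above (my addition; the choice of Exponential(1) samples, which have $\mathbb{E}[X] = \mathrm{Var}(X) = 1$ and $X \geq 0$, is arbitrary, and NumPy is assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=1_000_000)   # E[X] = 1, Var(X) = 1, X >= 0

for t in [2.0, 3.0, 5.0]:
    markov_lhs = np.mean(x >= t)                 # empirical P(X >= t)
    cheb_lhs = np.mean(np.abs(x - 1.0) >= t)     # empirical P(|X - E[X]| >= t)
    print(f"Markov   : P(X >= {t}) = {markov_lhs:.4f}  <=  E[X]/t   = {1.0 / t:.4f}")
    print(f"Chebyshev: P(|X - 1| >= {t}) = {cheb_lhs:.4f}  <=  Var/t^2 = {1.0 / t**2:.4f}")
```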
Cauchy-Schwarz inequality
If $\mathbb{E}[X^2]$ and $\mathbb{E}[Y^2]$ are finite,
$$\big|\mathbb{E}[XY]\big| \leq \sqrt{\mathbb{E}[X^2]\,\mathbb{E}[Y^2]}.$$
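Applied to the centered variables $X - \mathbb{E}[X]$ and $Y - \mathbb{E}[Y]$ (my addition), the inequality gives
$$\big|\mathrm{Cov}(X, Y)\big| \leq \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)},$$
so the correlation coefficient always lies in $[-1, 1]$.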
Hoeffding's inequality
Let $X_1, \ldots, X_n$ be independent with $a_i \leq X_i \leq b_i$ for $i = 1, \ldots, n$. Let $S_n = \sum_{i=1}^n X_i$. Then for any $t > 0$,
$$P\big(S_n - \mathbb{E}[S_n] \geq t\big) \leq \exp\left(-\frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$
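A simulation sketch (my addition; fair-coin Bernoulli variables, so $a_i = 0$, $b_i = 1$ and $\sum_i (b_i - a_i)^2 = n$; NumPy is assumed) comparing the empirical tail of $S_n - \mathbb{E}[S_n]$ with the bound $\exp(-2t^2/n)$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 100, 200_000

# X_i ~ Bernoulli(1/2) in [0, 1], so sum_i (b_i - a_i)^2 = n.
s_n = rng.integers(0, 2, size=(reps, n)).sum(axis=1)
deviation = s_n - n * 0.5                        # S_n - E[S_n]

for t in [5, 10, 15]:
    empirical = np.mean(deviation >= t)
    bound = np.exp(-2.0 * t**2 / n)
    print(f"t = {t:2d}   P(S_n - E[S_n] >= t) = {empirical:.4f}   <=   exp(-2t^2/n) = {bound:.4f}")
```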