Math Camp 2: Probability Theory
Sasha Rakhlin

Outline: Introduction; σ-algebra; Measure; Lebesgue measure; Probability measure; Expectation and variance; Convergence; Convergence in probability and almost surely; Law of Large Numbers and the Central Limit Theorem; Useful probability inequalities (Jensen's, Markov's, Chebyshev's, Cauchy-Schwarz, Hoeffding's).

σ-algebra

Let Ω be a set. Then a σ-algebra Σ is a nonempty collection of subsets of Ω such that the following hold:
Ω ∈ Σ
If E ∈ Σ, then Ω − E ∈ Σ
If F_i ∈ Σ for all i, then ∪_i F_i ∈ Σ
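As an illustrative sketch (my addition, not from the slides), the three closure axioms can be checked mechanically for a finite Ω, where a σ-algebra is just a finite collection of subsets closed under complement and union:

```python
# Check the sigma-algebra axioms for a collection of subsets of a
# finite Omega. Sets are frozensets so they can live inside a set.
omega = frozenset({1, 2, 3, 4})
# Candidate collection: {empty set, {1,2}, {3,4}, Omega}
sigma = {frozenset(), frozenset({1, 2}), frozenset({3, 4}), omega}

def is_sigma_algebra(omega, sigma):
    if omega not in sigma:              # must contain Omega
        return False
    for e in sigma:                     # closed under complement
        if omega - e not in sigma:
            return False
    for e in sigma:                     # closed under union
        for f in sigma:                 # (pairwise suffices when finite)
            if e | f not in sigma:
                return False
    return True

print(is_sigma_algebra(omega, sigma))  # → True
```

Dropping any one of the four sets breaks closure; for instance {∅, {1}, Ω} fails because the complement {2, 3, 4} is missing.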

Measure

A measure μ is a function defined on a σ-algebra Σ over a set Ω with values in [0, ∞] s.t.
μ(∅) = 0
μ(E) = ∑_i μ(E_i) if E is the disjoint union of countably many E_i ∈ Σ
(Ω, Σ, μ) is called a measure space.

Lebesgue measure

The Lebesgue measure λ is the unique complete translation-invariant measure on a σ-algebra containing the intervals of ℝ s.t. λ([0, 1]) = 1.

Probability measure

A probability measure is a positive measure μ over (Ω, Σ) s.t. μ(Ω) = 1. (Ω, Σ, μ) is called a probability space. A random variable is a measurable function X: Ω → ℝ.

Expectation and variance

If X is a random variable over a probability space (Ω, Σ, μ), the expectation of X is defined as E[X] = ∫_Ω X dμ. The variance of X is Var(X) = E[(X − E[X])²].
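A quick numeric sketch (my addition): for X uniform on [0, 1] the integrals evaluate to E[X] = 1/2 and Var(X) = 1/12, and Monte Carlo averages recover both:

```python
import random

random.seed(0)
n = 200_000
xs = [random.random() for _ in range(n)]   # X ~ Uniform(0, 1)
mean = sum(xs) / n                         # estimates E[X] = 1/2
var = sum((x - mean) ** 2 for x in xs) / n # estimates Var(X) = 1/12
print(mean, var)
```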

Convergence

x_n → x if ∀ε > 0 ∃N s.t. |x_n − x| < ε for all n > N.
X_n → X in probability (X_n converges to X in probability) if ∀ε > 0, lim_{n→∞} P(|X_n − X| > ε) = 0.
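To see convergence in probability numerically (an illustrative sketch, not from the slides), take X_n = Z/n with Z standard normal; the empirical frequency of |X_n| > ε shrinks as n grows:

```python
import random

random.seed(1)
eps = 0.1

def freq_exceed(n, trials=10_000):
    """Empirical estimate of P(|Z/n| > eps) for Z ~ N(0, 1)."""
    return sum(abs(random.gauss(0, 1) / n) > eps for _ in range(trials)) / trials

f1, f10, f100 = freq_exceed(1), freq_exceed(10), freq_exceed(100)
print(f1, f10, f100)  # decreasing toward 0
```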

Convergence in probability and almost surely

Any event with probability 1 is said to happen almost surely. A sequence of real random variables X_n converges almost surely to a random variable X iff P(lim_{n→∞} X_n = X) = 1. Convergence almost surely implies convergence in probability.

Law of Large Numbers. Central Limit Theorem

Weak LLN: if X_1, X_2, … is an infinite sequence of i.i.d. random variables with μ = E(X_1) = E(X_2) = …, then the sample mean X̄_n = (1/n) ∑_{i=1}^n X_i converges to μ in probability; that is, ∀ε > 0, lim_{n→∞} P(|X̄_n − μ| > ε) = 0.
CLT: if in addition σ² = Var(X_1) < ∞, then √n (X̄_n − μ)/σ converges in distribution to N(0, 1); that is, P(√n (X̄_n − μ)/σ ≤ z) → Φ(z) for every z, where Φ is the cdf of N(0, 1).
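Both theorems can be sketched with fair coin flips (my addition, not from the slides): sample means of Bernoulli(1/2) variables concentrate at 1/2, and the standardized means behave like N(0, 1), putting roughly 68% of mass in [−1, 1]:

```python
import math
import random

random.seed(2)
n, reps = 1000, 2000
# reps independent sample means of n Bernoulli(1/2) coin flips
means = [sum(random.random() < 0.5 for _ in range(n)) / n for _ in range(reps)]

# Weak LLN: nearly all sample means land within 0.05 of 1/2
close = sum(abs(m - 0.5) < 0.05 for m in means) / reps

# CLT: standardize with sigma = 1/2 (sd of one Bernoulli(1/2) flip);
# about 68% should fall in [-1, 1], as for a standard normal
sigma = 0.5
zs = [(m - 0.5) * math.sqrt(n) / sigma for m in means]
in_one_sd = sum(-1 <= z <= 1 for z in zs) / reps
print(close, in_one_sd)
```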

Jensen's inequality

If φ is a convex function, then φ(E[X]) ≤ E[φ(X)].
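For example (a sketch of my own, with the convex choice φ(x) = x²), the inequality holds even for the empirical distribution of a sample:

```python
import random

random.seed(3)
xs = [random.uniform(-1, 2) for _ in range(100_000)]

def phi(x):
    return x * x  # convex

lhs = phi(sum(xs) / len(xs))              # phi(E[X]), about 0.25
rhs = sum(phi(x) for x in xs) / len(xs)   # E[phi(X)], about 1.0
print(lhs, rhs)
```

Here φ(E[X]) = E[X]² while E[φ(X)] = E[X²] = Var(X) + E[X]², so the gap is exactly the variance.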

Markov's inequality

If X ≥ 0 and t > 0, then P(X ≥ t) ≤ E[X]/t.
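A numeric sketch (my addition): for X exponential with rate 1 and t = 3, the true tail is e⁻³ ≈ 0.05, while Markov only guarantees 1/3 — valid but loose:

```python
import random

random.seed(4)
xs = [random.expovariate(1.0) for _ in range(100_000)]  # X >= 0
t = 3.0
tail = sum(x >= t for x in xs) / len(xs)   # empirical P(X >= t), ~ e^-3
bound = (sum(xs) / len(xs)) / t            # Markov bound E[X]/t, ~ 1/3
print(tail, bound)
```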

Chebyshev's inequality

If X is a random variable and t > 0, then P(|X − E[X]| ≥ t) ≤ Var(X)/t². E.g. with t = kσ, where σ² = Var(X): P(|X − E[X]| ≥ kσ) ≤ 1/k².
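The k-standard-deviation form can be checked numerically (an illustrative sketch): for a standard normal and k = 2, the true two-sided tail is about 0.046, safely below the Chebyshev bound of 1/4:

```python
import random

random.seed(5)
xs = [random.gauss(0, 1) for _ in range(100_000)]  # E[X] = 0, sd = 1
k = 2.0
tail = sum(abs(x) >= k for x in xs) / len(xs)  # empirical P(|X| >= 2)
bound = 1 / k ** 2                             # Chebyshev: 1/k^2 = 0.25
print(tail, bound)
```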

Cauchy-Schwarz inequality

If E(X²) and E(Y²) are finite, then |E(XY)| ≤ √(E(X²) E(Y²)).
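A quick check of my own with correlated variables Y = X + noise; the inequality holds exactly for the empirical moments, since it is the vector Cauchy-Schwarz inequality in disguise:

```python
import math
import random

random.seed(6)
n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]        # Y = X + independent noise

exy = sum(x * y for x, y in zip(xs, ys)) / n     # E[XY] ~ 1
ex2 = sum(x * x for x in xs) / n                 # E[X^2] ~ 1
ey2 = sum(y * y for y in ys) / n                 # E[Y^2] ~ 2
print(abs(exy), math.sqrt(ex2 * ey2))
```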

Hoeffding's inequality

Let X_1, …, X_n be independent random variables with a_i ≤ X_i ≤ b_i for i = 1, …, n, and let S_n = ∑_i X_i. Then for any t > 0,
P(|S_n − E[S_n]| ≥ t) ≤ 2 exp(−2t² / ∑_i (b_i − a_i)²).
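As a final sketch (my addition), for n fair coins in {0, 1} each (b_i − a_i)² = 1, so the bound reads 2 exp(−2t²/n); the empirical deviation probability sits well below it:

```python
import math
import random

random.seed(7)
n, reps, t = 100, 20_000, 10
# Empirical P(|S_n - E[S_n]| >= t) for S_n a sum of n fair {0,1} coins,
# so E[S_n] = n/2
dev = sum(
    abs(sum(random.random() < 0.5 for _ in range(n)) - n / 2) >= t
    for _ in range(reps)
) / reps
bound = 2 * math.exp(-2 * t * t / n)   # Hoeffding: 2 exp(-2) ~ 0.27
print(dev, bound)
```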