Probability Review for Financial Engineers


Probability Review for Financial Engineers Part 2

Conditional Probability

The conditional probability that E occurs given that F has occurred is denoted by P(E|F). If P(F) > 0, then

P(E|F) = P(EF) / P(F)
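As a minimal sketch (not from the slides), the definition can be checked by enumerating a small sample space. The events E = "roll is even" and F = "roll is at least 4" on a fair die are illustrative choices:

```python
from fractions import Fraction

# Sample space of a fair six-sided die; every outcome has probability 1/6.
omega = range(1, 7)
E = {w for w in omega if w % 2 == 0}   # event: roll is even
F = {w for w in omega if w >= 4}       # event: roll is at least 4

def prob(event):
    return Fraction(len(event), 6)

# Definition: P(E|F) = P(EF) / P(F), valid since P(F) > 0.
p_E_given_F = prob(E & F) / prob(F)
print(p_E_given_F)  # 2/3: of {4, 5, 6}, the even outcomes are {4, 6}
```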

Example – 2 Dice

Two dice are rolled, a red die and a green die. What is the probability distribution of the total?

2 – 1/36
3 – 2/36
…
7 – 6/36
…
11 – 2/36
12 – 1/36

2 Dice Example, Continued

What is the expected value of the total? 7.

What is the probability distribution of the total given that the green die showed a 3, that is, P(T | G = 3)?

4 – 1/6
5 – 1/6
6 – 1/6
7 – 1/6
8 – 1/6
9 – 1/6
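The unconditional and conditional distributions above can be reproduced by brute-force enumeration of the 36 equally likely outcomes; this is a sketch, with the `pmf` helper an invented name:

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (red, green) outcomes.
outcomes = list(product(range(1, 7), repeat=2))

def pmf(condition=lambda rg: True):
    """PMF of the total, restricted to outcomes satisfying `condition`."""
    kept = [r + g for r, g in outcomes if condition((r, g))]
    return {t: Fraction(kept.count(t), len(kept)) for t in set(kept)}

total = pmf()
print(total[7])                               # 1/6, i.e. 6/36
print(sum(t * p for t, p in total.items()))   # expected value: 7

# Condition on the green die showing 3: totals 4..9, each with probability 1/6.
given_g3 = pmf(lambda rg: rg[1] == 3)
print(sorted(given_g3))                       # [4, 5, 6, 7, 8, 9]
```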

Example – Playing Cards

Selecting a card from a standard 52-card deck:

What is the probability of getting an ace? 4/52 = 1/13.

What is the probability of getting an ace given that someone already removed a jack from the deck? 4/51, since removing a jack removes a known non-ace from the deck.

What is the probability of getting an ace given that someone already removed a spade from the deck? 1/13, since the removed card's suit is independent of its rank.
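The third answer can feel surprising, so here is a sketch that verifies it by conditioning on the rank of the removed spade: with probability 1/13 it was the ace of spades (leaving 3 aces), otherwise 4 aces remain among 51 cards.

```python
from fractions import Fraction

# A 52-card deck contains 4 aces.
p_ace = Fraction(4, 52)
print(p_ace)           # 1/13

# A jack (a known non-ace) is removed: 4 aces remain among 51 cards.
p_ace_after_jack = Fraction(4, 51)

# A spade is removed, rank unknown.  Average over the removed card's rank:
p_ace_after_spade = (Fraction(1, 13) * Fraction(3, 51)    # spade was the ace
                     + Fraction(12, 13) * Fraction(4, 51))  # spade was not
print(p_ace_after_spade)  # 1/13, unchanged
```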

Joint Cumulative Distributions

F(a, b) = P(X ≤ a, Y ≤ b)

The distribution of X can be obtained from the joint distribution of X and Y as follows:

F_X(a) = P(X ≤ a)
       = P(X ≤ a, Y < ∞)
       = P(lim_{b→∞} {X ≤ a, Y ≤ b})
       = lim_{b→∞} P(X ≤ a, Y ≤ b)
       = lim_{b→∞} F(a, b)
       = F(a, ∞)

Example – Time Between Arrivals

A market buy order and a market sell order each arrive at a time uniformly distributed between 1pm and 2pm. Each trader puts a 10-minute time limit on their order. What is the probability that the trade is not executed because of a timeout?

Measuring time in minutes after 1pm, with B and S the buy and sell arrival times, this is

P(B + 10 < S) + P(S + 10 < B) = 2 P(B + 10 < S)
  = 2 ∬_{b+10<s} f(b, s) db ds
  = 2 ∬_{b+10<s} f_B(b) f_S(s) db ds
  = 2 ∫_{10}^{60} ∫_{0}^{s−10} (1/60)² db ds
  = (2/60²) ∫_{10}^{60} (s − 10) ds
  = 25/36
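A quick Monte Carlo sketch can sanity-check the result 25/36 ≈ 0.694. The trial count and seed are arbitrary choices:

```python
import random

random.seed(0)
trials = 200_000
timeouts = 0
for _ in range(trials):
    # Arrival times in minutes after 1pm, each uniform on [0, 60].
    b = random.uniform(0, 60)
    s = random.uniform(0, 60)
    if abs(b - s) > 10:   # one order expires before the other arrives
        timeouts += 1

estimate = timeouts / trials
print(estimate)           # close to 25/36 ≈ 0.694
```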

Expected Values of Joint Densities

Suppose X and Y are independent, so their joint density factors as f(x, y) = f_X(x) f_Y(y). Then

E[g(X) h(Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x) h(y) f(x, y) dx dy
             = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x) h(y) f_X(x) f_Y(y) dx dy
             = ∫_{−∞}^{∞} h(y) f_Y(y) dy · ∫_{−∞}^{∞} g(x) f_X(x) dx
             = E[h(Y)] E[g(X)]
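A discrete sketch of the same factorization, with arbitrary choices of g, h, and marginals (a fair die and a fair coin): summing g(x)h(y) against the product pmf gives exactly E[g(X)] E[h(Y)].

```python
from fractions import Fraction
from itertools import product

# Independent discrete X (fair die) and Y (fair coin coded 0/1).
px = {x: Fraction(1, 6) for x in range(1, 7)}
py = {0: Fraction(1, 2), 1: Fraction(1, 2)}

g = lambda x: x * x       # arbitrary function of X
h = lambda y: 3 * y + 1   # arbitrary function of Y

# Left side: sum against the joint pmf f(x, y) = px[x] * py[y].
lhs = sum(g(x) * h(y) * px[x] * py[y] for x, y in product(px, py))

# Right side: E[g(X)] * E[h(Y)].
rhs = sum(g(x) * p for x, p in px.items()) * sum(h(y) * p for y, p in py.items())
print(lhs == rhs)  # True
```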

Covariance of 2 Random Variables

Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
          = E[XY − E[X] Y − X E[Y] + E[X] E[Y]]
          = E[XY] − E[X] E[Y] − E[X] E[Y] + E[X] E[Y]
          = E[XY] − E[X] E[Y]

Note that if X and Y are independent, then the covariance is 0.
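Both forms, the definition and the shortcut, can be checked against a small hypothetical joint pmf on binary X and Y (chosen here so the variables are dependent):

```python
from fractions import Fraction

# A small joint pmf on (x, y) pairs; X and Y are dependent.
joint = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
         (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8)}

E = lambda f: sum(f(x, y) * p for (x, y), p in joint.items())
ex, ey = E(lambda x, y: x), E(lambda x, y: y)

# Definition vs. the shortcut formula.
cov_def      = E(lambda x, y: (x - ex) * (y - ey))
cov_shortcut = E(lambda x, y: x * y) - ex * ey
print(cov_def, cov_shortcut)  # both 1/16
```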

Variance of a Sum of Random Variables

Var(X + Y) = E[(X + Y − E[X + Y])²]
           = E[(X + Y − E[X] − E[Y])²]
           = E[((X − E[X]) + (Y − E[Y]))²]
           = E[(X − E[X])² + (Y − E[Y])² + 2(X − E[X])(Y − E[Y])]
           = E[(X − E[X])²] + E[(Y − E[Y])²] + 2 E[(X − E[X])(Y − E[Y])]

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
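The identity can be verified numerically on a small hypothetical joint pmf (the same dependent pair used for the covariance check works well, since Cov(X, Y) ≠ 0 there):

```python
from fractions import Fraction

# Verify Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) on a dependent pair.
joint = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
         (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8)}

E   = lambda f: sum(f(x, y) * p for (x, y), p in joint.items())
var = lambda f: E(lambda x, y: f(x, y) ** 2) - E(f) ** 2

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
lhs = var(lambda x, y: x + y)
rhs = var(lambda x, y: x) + var(lambda x, y: y) + 2 * cov
print(lhs == rhs)  # True: both sides equal 39/64
```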

Correlation of 2 Random Variables

As long as Var(X) and Var(Y) are both positive, the correlation of X and Y is denoted by

ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y))

It can be shown that −1 ≤ ρ(X, Y) ≤ 1. The correlation coefficient is a measure of the degree of linearity between X and Y:

ρ(X, Y) = 0 means very little linearity
ρ(X, Y) near +1 means X and Y increase and decrease together
ρ(X, Y) near −1 means X and Y increase and decrease inversely
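A sketch of the sample version of this formula on made-up data: exact linear relationships hit the boundary values ±1, illustrating the linearity interpretation above.

```python
import math

def corr(xs, ys):
    """Sample correlation: Cov(X,Y) / sqrt(Var(X) Var(Y))."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / math.sqrt(vx * vy)

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
print(corr(xs, [2 * x + 1 for x in xs]))    # increasing linear: 1.0
print(corr(xs, [-3 * x + 7 for x in xs]))   # decreasing linear: -1.0
```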

Central Limit Theorem

Loosely put, the sum of a large number of independent random variables has approximately a normal distribution.

Let X₁, X₂, … be a sequence of independent and identically distributed random variables, each having mean μ and variance σ². Then the distribution of

(X₁ + … + Xₙ − nμ) / (σ√n)

tends to a standard normal as n → ∞, that is,

P((X₁ + … + Xₙ − nμ) / (σ√n) ≤ a) → (1/√(2π)) ∫_{−∞}^{a} e^{−x²/2} dx
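A simulation sketch, with n, the trial count, and the seed all arbitrary: standardize sums of n Uniform(0, 1) variables (mean 1/2, variance 1/12) and check how often the standardized sum falls at or below 0, which the CLT predicts should approach Φ(0) = 1/2.

```python
import math
import random

random.seed(1)
n, trials = 50, 20_000
mu = 0.5                      # mean of Uniform(0, 1)
sigma = math.sqrt(1 / 12)     # its standard deviation

below = 0
for _ in range(trials):
    # Standardized sum: (X1 + ... + Xn - n*mu) / (sigma * sqrt(n))
    z = (sum(random.random() for _ in range(n)) - n * mu) / (sigma * math.sqrt(n))
    if z <= 0:
        below += 1

print(below / trials)  # close to 0.5
```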