CS723 - Probability and Stochastic Processes

Lecture No. 12

In Previous Lectures
- Discussion of discrete random variables
- Covered many of the topics related to discrete random variables without going into the details of continuous random variables:
- Conditional probabilities
- Conditional distributions
- Joint distributions
- Marginal distributions
- Expected values
- Transformation of random variables
- Expected values of transformations of random variables

Transformation of RV's
- Map points on the real line to new points on the real line, then assign probabilities
- The transformation could be R→R, with the further restriction of being invertible
- Examples of invertible R→R mappings: Y = 2X, Y = 2X + 3, Y = exp(X)
- Examples of non-invertible mappings: Y = |X|, Y = X², Y = sin(X)
- Other mappings could be R²→R or R³→R

Transformation of RV's
- Take a RV X with the PMF given below and find the PMF of Y = 2X + 1
- X takes values in the range [-4, 6] and Y in [-7, 13]
- Pr( X = x ) = Pr( Y = y = 2x + 1 )
- Pr( Y = y ) = Pr( X = x = (y - 1)/2 )
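
As a quick sanity check, here is a minimal sketch in Python of this mapping. The PMF values for X are the ones used in the worked expectation later in this lecture; the dictionary representation is just an illustrative choice.

```python
from fractions import Fraction as F

# PMF of X, as used in the worked expectation later in this lecture
pmf_x = {-4: F(1, 15), -3: F(2, 15), -1: F(2, 15), 0: F(3, 15),
         1: F(3, 15), 2: F(2, 15), 3: F(1, 15), 6: F(1, 15)}

# Y = 2X + 1 is invertible, so every mass point of X maps to exactly
# one mass point of Y and carries its probability along unchanged.
pmf_y = {2 * x + 1: p for x, p in pmf_x.items()}

print(pmf_y)  # keys lie in [-7, 13]; the probability values are identical
```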

Transformation of RV's
[Figure: each mass point of X is shown with its PMF value, and then the corresponding point of Y = 2X + 1 is shown carrying the same probability]

Transformation of RV's
- Take a RV X with the PMF given below and find the PMF of Y = |X - 1|
- X takes values in the range [-4, 6] and Y in [0, 5]
- Pr( Y = 1 ) = Pr( X = 2 ) + Pr( X = 0 ) = 2/15 + 3/15 = 5/15

Transformation of RV's
- In general, Pr( X = x ) ≠ Pr( Y = y = |x - 1| ); e.g. Pr( X = 2 ) = 2/15 ≠ Pr( Y = |2 - 1| = 1 ) = 5/15
- Pr( Y = y ) = ∑ Pr( X = x such that |x - 1| = y )
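
In code, the non-invertible case differs only in that the probabilities of all x values landing on the same y must be accumulated. A sketch, reusing the same assumed PMF for X:

```python
from collections import defaultdict
from fractions import Fraction as F

pmf_x = {-4: F(1, 15), -3: F(2, 15), -1: F(2, 15), 0: F(3, 15),
         1: F(3, 15), 2: F(2, 15), 3: F(1, 15), 6: F(1, 15)}

# Y = |X - 1| is not invertible: several x values can land on the same y,
# so Pr(Y = y) is the sum of Pr(X = x) over all x with |x - 1| = y.
pmf_y = defaultdict(F)  # F() defaults to Fraction(0)
for x, p in pmf_x.items():
    pmf_y[abs(x - 1)] += p

print(dict(pmf_y))  # e.g. Pr(Y = 1) = 3/15 + 2/15 = 5/15
```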

Transformation of RV's
[Figures: the PMF of X (above) and the PMF of the transformed RV Y (below), shown as paired graphs]

Expected Value of Transformed RV
- We learned the following two formulas: E(X) = ∑ xᵢ p(xᵢ) and E(Y) = ∑ g(xᵢ) p(xᵢ)
- E(X) comes out to be 4/15
- If Y = 2X + 1 then E(Y) = 23/15, and it obeys the formula E(Y) = 2E(X) + 1
- If Y = |X - 1| then E(Y) = 29/15, and it does not obey E(Y) = |E(X) - 1|

Let's briefly revisit what we learned in the last lecture about the expected value of a transformed random variable. Applying the formula from the last lecture, the expected value of our original random variable X comes out to be

E(X) = -4(1/15) - 3(2/15) - 1(2/15) + 0(3/15) + 1(3/15) + 2(2/15) + 3(1/15) + 6(1/15) = (-4 - 6 - 2 + 0 + 3 + 4 + 3 + 6)/15 = 4/15

If we take Y = 2X + 1 and find the expected value of Y explicitly, either from the PMF of Y or from the other formula that uses the PMF of X and the transformed values of X, it comes out to be

E(Y) = -7(1/15) - 5(2/15) - 1(2/15) + 1(3/15) + 3(3/15) + 5(2/15) + 7(1/15) + 13(1/15) = (-7 - 10 - 2 + 3 + 9 + 10 + 7 + 13)/15 = 23/15 = 2(4/15) + 1

Hence the expected value undergoes the same mapping as the random variable when the mapping is linear and invertible. If we instead take Y = |X - 1| and find the expected value of Y from the PMF of Y, it comes out to be

E(Y) = 0(3/15) + 1(5/15) + 2(3/15) + 4(2/15) + 5(2/15) = (0 + 5 + 6 + 8 + 10)/15 = 29/15 ≠ |4/15 - 1| = 11/15

Hence the expected value does not obey the same mapping when the mapping is nonlinear. The expectation operator maps exactly through a linear transformation because expectation is a summation, and summation commutes with linear transformations.
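
The same arithmetic can be checked mechanically. A small sketch that evaluates E(g(X)) = ∑ g(xᵢ) p(xᵢ) for the identity, the linear mapping, and the absolute-value mapping:

```python
from fractions import Fraction as F

pmf_x = {-4: F(1, 15), -3: F(2, 15), -1: F(2, 15), 0: F(3, 15),
         1: F(3, 15), 2: F(2, 15), 3: F(1, 15), 6: F(1, 15)}

def expect(g, pmf):
    """E[g(X)] = sum of g(x) * p(x) over the mass points of X."""
    return sum(g(x) * p for x, p in pmf.items())

e_x = expect(lambda x: x, pmf_x)             # 4/15
e_lin = expect(lambda x: 2 * x + 1, pmf_x)   # 23/15 == 2*(4/15) + 1
e_abs = expect(lambda x: abs(x - 1), pmf_x)  # 29/15 != |4/15 - 1| = 11/15

print(e_x, e_lin, e_abs, abs(e_x - 1))
```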

Conditional Expectation
- Expected value of a random variable with partial knowledge about it
- The conditioning event must have a non-zero probability
- The conditional expectation is found using the conditional PMF
- An unconditional expectation is a number, but a conditional expectation could be a parameterized random variable
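
As an illustration, here is a sketch of conditioning on a hypothetical event A = {X > 0} (this event is my choice, not one from the slides), using the same assumed PMF: restrict the mass to A, renormalize by Pr(A), then take the ordinary expectation of the conditional PMF.

```python
from fractions import Fraction as F

pmf_x = {-4: F(1, 15), -3: F(2, 15), -1: F(2, 15), 0: F(3, 15),
         1: F(3, 15), 2: F(2, 15), 3: F(1, 15), 6: F(1, 15)}

# Hypothetical conditioning event A = {X > 0}; it must have Pr(A) > 0.
in_a = [x for x in pmf_x if x > 0]
pr_a = sum(pmf_x[x] for x in in_a)                # Pr(X > 0) = 7/15

# Conditional PMF: p(x | A) = p(x) / Pr(A) for x in A, and 0 elsewhere.
cond_pmf = {x: pmf_x[x] / pr_a for x in in_a}

e_cond = sum(x * p for x, p in cond_pmf.items())  # E(X | X > 0) = 16/7
print(pr_a, e_cond)
```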

Conditional Expectation
[Figures: paired graphs illustrating the conditional PMF and the resulting conditional expectation]

Covariance/Correlation
- To investigate the relationships among different expectations of jointly distributed RVs
- From the joint PMF we find the marginal PMFs to compute E(X) and E(Y)
- We also find the joint expected value E(XY) = ∑∑ xᵢ yⱼ Pr( X = xᵢ & Y = yⱼ )
- Cov(X,Y) = E(XY) - E(X) E(Y)
- ρ = Cor(X,Y) = Cov(X,Y) / √( Var(X) Var(Y) )
- X and Y are uncorrelated if Cov(X,Y) = 0
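
To make the recipe concrete, here is a sketch with a small hypothetical joint PMF (the lecture's actual joint PMF lives in a slide graphic, so these numbers are purely illustrative): marginalize, compute E(X), E(Y), and E(XY), then form Cov(X,Y) and ρ.

```python
from fractions import Fraction as F
from math import sqrt

# Hypothetical joint PMF of (X, Y), for illustration only
joint = {(0, 0): F(3, 8), (0, 1): F(1, 8),
         (1, 0): F(1, 8), (1, 1): F(3, 8)}

# Marginal PMFs, obtained by summing the joint PMF over the other variable
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, F(0)) + p
    py[y] = py.get(y, F(0)) + p

e_x = sum(x * p for x, p in px.items())                 # E(X)  = 1/2
e_y = sum(y * p for y, p in py.items())                 # E(Y)  = 1/2
e_xy = sum(x * y * p for (x, y), p in joint.items())    # E(XY) = 3/8

cov = e_xy - e_x * e_y                                  # Cov(X,Y) = 1/8
var_x = sum((x - e_x) ** 2 * p for x, p in px.items())  # Var(X) = 1/4
var_y = sum((y - e_y) ** 2 * p for y, p in py.items())  # Var(Y) = 1/4
rho = cov / sqrt(var_x * var_y)                         # correlation = 0.5

print(cov, rho)
```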