Transform Analysis.



Two Types of Transforms

- Discrete random variable: z-transform
  X(z) = G_p(z) = E[z^X] = Σ_{i=0}^{∞} p(i) z^i, where P{X = i} = p(i)
  Note that X(1) = 1
- Continuous random variable: Laplace transform
  X(s) = L_f(s) = E[e^{−sX}] = ∫_{t=0}^{∞} e^{−st} f_X(t) dt, where f_X(·) is the p.d.f. of X
  Note that X(0) = 1
  Note also that X_x(s) = e^{−sx} when P{X = x} = 1
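
The normalization properties X(1) = 1 and X(0) = 1 can be checked numerically. A minimal sketch in Python, using a Poisson(λ) discrete example and an Exp(μ) continuous one (the parameter values λ = 3 and μ = 2.5 are illustrative):

```python
import math

def z_transform_poisson(z, lmbda, K=200):
    """Truncated z-transform of a Poisson(lmbda) r.v.:
    X(z) = sum_{i=0}^{K-1} p(i) z^i, with terms built iteratively to avoid overflow."""
    term = math.exp(-lmbda)       # p(0) * z^0
    total = term
    for i in range(1, K):
        term *= lmbda * z / i     # p(i) z^i = p(i-1) z^(i-1) * (lmbda*z / i)
        total += term
    return total

def laplace_exponential(s, mu):
    """Laplace transform of an Exp(mu) r.v., known in closed form: mu/(mu+s)."""
    return mu / (mu + s)

# Sanity checks from the slide: X(1) = 1 and X(0) = 1
print(z_transform_poisson(1.0, 3.0))   # ≈ 1.0
print(laplace_exponential(0.0, 2.5))   # 1.0
```

For the Poisson case the closed form is E[z^X] = e^{−λ(1−z)}, which the truncated sum reproduces to machine precision.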

Basic Transform Properties

- Let Z = X + Y, with X and Y two discrete independent r.v.'s: Z(z) = X(z)Y(z)
- Let Z = X + Y, with X and Y two continuous independent r.v.'s with p.d.f.'s f_X(t) and f_Y(t), t ≥ 0: Z(s) = X(s)Y(s)
- Let X = A with prob. p and X = B with prob. 1 − p; then
  X(z) = pA(z) + (1 − p)B(z), if A and B are discrete r.v.'s
  X(s) = pA(s) + (1 − p)B(s), if A and B are continuous r.v.'s
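
The product rule for independent sums can be verified by brute force: convolve two pmfs and compare the z-transform of the convolution with the product of the individual transforms. A small sketch, using two arbitrary (made-up) pmfs on {0, 1, 2}:

```python
def z_transform(pmf, z):
    """z-transform X(z) = sum_i p(i) z^i of a pmf given as a list."""
    return sum(p * z**i for i, p in enumerate(pmf))

def convolve(p, q):
    """pmf of X + Y for independent X ~ p and Y ~ q."""
    r = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

p = [0.2, 0.5, 0.3]   # hypothetical pmf of X
q = [0.6, 0.1, 0.3]   # hypothetical pmf of Y
z = 0.7
lhs = z_transform(convolve(p, q), z)          # Z(z) with Z = X + Y
rhs = z_transform(p, z) * z_transform(q, z)   # X(z) * Y(z)
print(lhs, rhs)
```

The two printed values agree to floating-point precision.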

Other Transform Properties – (1a)

- If Y is a discrete r.v. with probabilities p_k, k = 0, 1, 2, …, and X_Y is a discrete/continuous r.v. that depends on Y, then
  X_Y(z) = Σ_{k=0}^{∞} X_k(z) p_k (discrete)
  X_Y(s) = Σ_{k=0}^{∞} X_k(s) p_k (continuous)
- In the discrete case, we have
  X_Y(z) = E[z^{X_Y}] = Σ_{k=0}^{∞} E[z^{X_Y} | Y = k] p_k = Σ_{k=0}^{∞} E[z^{X_k}] p_k = Σ_{k=0}^{∞} X_k(z) p_k
- In the continuous case, we have
  X_Y(s) = E[e^{−sX_Y}] = Σ_{k=0}^{∞} E[e^{−sX_Y} | Y = k] p_k = Σ_{k=0}^{∞} E[e^{−sX_k}] p_k = Σ_{k=0}^{∞} X_k(s) p_k

Other Transform Properties – (1b)

- If Y is a continuous r.v. with p.d.f. f_Y(y), and X_Y is a discrete/continuous r.v. that depends on Y, then
  X_Y(z) = ∫_{y=0}^{∞} X_y(z) f_Y(y) dy (discrete)
  X_Y(s) = ∫_{y=0}^{∞} X_y(s) f_Y(y) dy (continuous)
- In the discrete case, we have
  X_Y(z) = E[z^{X_Y}] = ∫_{y=0}^{∞} E[z^{X_Y} | Y = y] f_Y(y) dy = ∫_{y=0}^{∞} E[z^{X_y}] f_Y(y) dy = ∫_{y=0}^{∞} X_y(z) f_Y(y) dy
- In the continuous case, we have
  X_Y(s) = E[e^{−sX_Y}] = ∫_{y=0}^{∞} E[e^{−sX_Y} | Y = y] f_Y(y) dy = ∫_{y=0}^{∞} E[e^{−sX_y}] f_Y(y) dy = ∫_{y=0}^{∞} X_y(s) f_Y(y) dy

Applying to # Arrivals in a Service Time

- Number of Poisson arrivals A (of rate λ) during a (service) time with distribution S(t)
  A is a discrete random variable and S a continuous one
- A_S(z) = ∫_{t=0}^{∞} A_S(z | S = t) f_S(t) dt
         = ∫_{t=0}^{∞} A_t(z) f_S(t) dt
         = ∫_{t=0}^{∞} e^{−λ(1−z)t} f_S(t) dt (see next slide)
         = S(λ(1−z)) – a more direct derivation
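
The identity A_S(z) = S(λ(1−z)) can be sanity-checked by simulation. A sketch assuming Exp(μ) service, so that S(s) = μ/(μ + s); the rates λ = 2 and μ = 3 are arbitrary choices:

```python
import random

random.seed(1)
lam, mu = 2.0, 3.0  # assumed arrival and service rates

def arrivals_during_service():
    """Count Poisson(lam) arrivals during one Exp(mu) service time."""
    t = random.expovariate(mu)
    # Poisson arrivals in [0, t]: count exponential inter-arrival gaps until they pass t
    n, elapsed = 0, random.expovariate(lam)
    while elapsed <= t:
        n += 1
        elapsed += random.expovariate(lam)
    return n

samples = [arrivals_during_service() for _ in range(200000)]
z = 0.6
empirical = sum(z**n for n in samples) / len(samples)  # Monte Carlo estimate of E[z^A]
exact = mu / (mu + lam * (1 - z))                      # S(lam(1-z)) for Exp(mu) service
print(empirical, exact)
```

With 200,000 samples the Monte Carlo estimate agrees with the closed form to about two decimal places.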

z-Transform of Poisson Arrivals

- Given Poisson arrivals of rate λ, what is the z-transform of the number of arrivals in t?
- A_t(z) = E[z^{A_t}] = Σ_{k=0}^{∞} [e^{−λt} (λt)^k / k!] z^k = e^{−λt} Σ_{k=0}^{∞} (λtz)^k / k! = e^{−λt} e^{λtz} = e^{−λ(1−z)t}

Other Transform Properties – (2)

- Let Z = Y_1 + Y_2 + … + Y_X, where the Y_i's are discrete/continuous i.i.d. r.v.'s and X is a discrete r.v. independent of the Y_i's
  Z(z) = X(Y(z)) – the z-transform of Z is the z-transform of X evaluated at the z-transform of Y (discrete)
  Z(s) = X(Y(s)) – the Laplace transform of Z is the z-transform of X evaluated at the Laplace transform of Y (continuous)
- This follows from two facts: the transform of a sum of i.i.d. r.v.'s is the product of their transforms, and the transform is an expectation that can be computed by conditioning on X
- Example: assume the Y_i's are exponential with parameter μ, and X is Geom(p)
  X(z) = zp / [1 − z(1 − p)]; Y(s) = μ/(μ + s)
  Z(s) = X(Y(s)) = μp / (μ + s − μ(1 − p)) = μp / (s + μp)
  Z is exponential with parameter μp
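
The closing example (a Geom(p) number of i.i.d. Exp(μ) phases sums to an Exp(μp) r.v.) is easy to check by simulation; the values μ = 4 and p = 0.25 below are arbitrary and give Exp(1):

```python
import random

random.seed(7)
mu, p = 4.0, 0.25  # assumed parameter values; mu * p = 1

def geometric_sum():
    """Z = Y_1 + ... + Y_X with Y_i ~ Exp(mu) i.i.d. and X ~ Geom(p) on {1, 2, ...}."""
    total = random.expovariate(mu)      # X >= 1, so at least one phase
    while random.random() > p:          # with prob. 1 - p, add another phase
        total += random.expovariate(mu)
    return total

samples = [geometric_sum() for _ in range(200000)]
mean = sum(samples) / len(samples)
print(mean)  # should be close to 1/(mu*p) = 1.0
```

The sample mean lands within a couple of standard errors of the Exp(μp) mean 1/(μp).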

Number in System for M/G/1 – (1)

- Consider the queue state at departure times; this forms an embedded DTMC
- Recall that the statistics seen by departures, by arrivals, and at random times are identical (PASTA and a level-crossing argument)
- π_j = π_0 a_j + Σ_{i=1}^{j+1} π_i a_{j−i+1}, where a_j is the probability of j arrivals during one service time
- Recall that we know A_S(z) = S(λ(1−z)), so we can express N(z) – the z-transform of the number in the system – in terms of the Laplace transform of the service time evaluated at λ(1−z)
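
The balance equations can be checked in the one case where everything is known in closed form: for M/M/1, a_j = (μ/(λ+μ))(λ/(λ+μ))^j and π_j = (1−ρ)ρ^j. A quick numerical check, with illustrative rates:

```python
lam, mu = 1.0, 2.0   # assumed rates; rho = 0.5
rho = lam / mu

def a(j):
    """P{j Poisson(lam) arrivals during one Exp(mu) service} (geometric for M/M/1)."""
    return (mu / (lam + mu)) * (lam / (lam + mu))**j

def pi(j):
    """Stationary distribution of the embedded chain: pi_j = (1 - rho) * rho**j."""
    return (1 - rho) * rho**j

# Verify pi_j = pi_0 * a_j + sum_{i=1}^{j+1} pi_i * a_{j-i+1} for small j
for j in range(6):
    rhs = pi(0) * a(j) + sum(pi(i) * a(j - i + 1) for i in range(1, j + 2))
    print(j, pi(j), rhs)
```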

Number in System for M/G/1 – (2)

- We know that π_0 = 1 − ρ, so that we have
  N(z) = (1 − ρ)(1 − z) S(λ(1−z)) / (S(λ(1−z)) − z)
- From this transform we can now get all the moments of N
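
The standard Pollaczek–Khinchine transform, N(z) = (1−ρ)(1−z)S(λ(1−z)) / (S(λ(1−z)) − z), can be spot-checked against M/M/1, where the number in system is geometric and N(z) = (1−ρ)/(1−ρz) in closed form. A sketch with illustrative rates:

```python
lam, mu = 1.0, 2.0   # assumed rates; rho = 0.5
rho = lam / mu

def S_laplace(s):
    """Laplace transform of the Exp(mu) service time: mu/(mu+s)."""
    return mu / (mu + s)

def N_z(z):
    """P-K transform: (1-rho)(1-z) S(lam(1-z)) / (S(lam(1-z)) - z)."""
    Sz = S_laplace(lam * (1 - z))
    return (1 - rho) * (1 - z) * Sz / (Sz - z)

# For M/M/1 this should collapse to the geometric transform (1-rho)/(1-rho*z)
for z in (0.0, 0.3, 0.7):
    print(z, N_z(z), (1 - rho) / (1 - rho * z))
```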

Time in System for M/G/1

- Let T be a job's time in the system
- The z-transform of the number of Poisson arrivals of rate λ during T is given by A_T(z) = T(λ − λz)
- But the number of arrivals during T is the number of jobs in the system seen by a departure, so T(λ − λz) = N(z)
- Hence, substituting s = λ − λz (i.e., z = 1 − s/λ), we have
  T(s) = (1 − ρ) s S(s) / (s − λ + λS(s))
- Similarly, since T = S + T_Q with S and T_Q independent, we get
  T_Q(s) = T(s)/S(s) = (1 − ρ) s / (s − λ + λS(s))
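
For M/M/1 the time in system is known to be Exp(μ−λ), so the time-in-system transform (1−ρ)sS(s)/(s − λ + λS(s)) can be spot-checked: with S(s) = μ/(μ+s) it should collapse to (μ−λ)/(μ−λ+s). Illustrative rates below:

```python
lam, mu = 1.0, 3.0   # assumed rates; rho = 1/3
rho = lam / mu

def S_star(s):
    """Laplace transform of Exp(mu) service: mu / (mu + s)."""
    return mu / (mu + s)

def T_star(s):
    """M/G/1 time-in-system transform: (1 - rho) s S(s) / (s - lam + lam S(s))."""
    return (1 - rho) * s * S_star(s) / (s - lam + lam * S_star(s))

# Should match the Exp(mu - lam) transform (mu - lam)/(mu - lam + s)
for s in (0.5, 1.0, 2.0):
    print(s, T_star(s), (mu - lam) / (mu - lam + s))
```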

Re-deriving the P-K Formula

- Differentiating N(z) at z = 1 (equivalently, T(s) at s = 0) yields the Pollaczek–Khinchine mean-value formula:
  E[N] = ρ + λ² E[S²] / (2(1 − ρ))
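
As a consistency check on the P-K mean-value formula E[N] = ρ + λ²E[S²]/(2(1−ρ)): plugging in the Exp(μ) second moment E[S²] = 2/μ² must recover the familiar M/M/1 result ρ/(1−ρ). Illustrative rates:

```python
lam, mu = 1.0, 2.5   # assumed rates; rho = 0.4
rho = lam / mu

ES2 = 2 / mu**2                            # second moment of Exp(mu) service
EN = rho + lam**2 * ES2 / (2 * (1 - rho))  # P-K mean-value formula
print(EN, rho / (1 - rho))                 # both should be 2/3
```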

Other Transform Properties – (3)

- Given a continuous r.v. with p.d.f. b(·) and c.d.f. B(·), then B(s) = b(s)/s
  i.e., the Laplace transform of the distribution function is the Laplace transform of the density divided by s
- Note also that the Laplace transform of the complementary c.d.f., 1 − B(t), is (1 − b(s))/s
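
Both identities can be verified by crude numerical integration for an Exp(μ) example, where b(s) = μ/(μ+s) in closed form (the values μ = 2 and s = 1.5 are illustrative):

```python
import math

mu, s = 2.0, 1.5   # assumed parameter values

def num_laplace(f, s, T=40.0, n=200000):
    """Midpoint-rule approximation of integral_0^T e^{-st} f(t) dt."""
    h = T / n
    return sum(math.exp(-s * (k + 0.5) * h) * f((k + 0.5) * h) for k in range(n)) * h

B = lambda t: 1 - math.exp(-mu * t)          # c.d.f. of Exp(mu)
Bc = lambda t: math.exp(-mu * t)             # complementary c.d.f.

b_star = mu / (mu + s)                       # Laplace transform of the density
print(num_laplace(B, s), b_star / s)         # distribution: b(s)/s
print(num_laplace(Bc, s), (1 - b_star) / s)  # complementary: (1 - b(s))/s
```

The truncation at T = 40 is harmless here since e^{−sT} is negligible.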