Slide 1: Multivariate distributions
Suppose we are measuring two or more different properties of a system, e.g.:
- rotational and radial velocities of stars in a cluster
- colours and magnitudes of stars in a cluster
- redshifts and peak apparent magnitudes of distant supernovae
To what extent does knowing the value of one random variable X inform us about the other?
[Figure: example scatter plots of v sin i vs V_r, B-V vs m_v, and m_v vs z, labelled as independent or dependent pairs]
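A minimal sketch of what "knowing X informs us about the other" means in practice, using synthetic draws rather than real cluster or supernova data (the distributions and the coefficient 2.0 below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Independent pair: two unrelated Gaussian variables
x_ind = rng.normal(size=n)
y_ind = rng.normal(size=n)

# Dependent pair: Y is a noisy linear function of X
x_dep = rng.normal(size=n)
y_dep = 2.0 * x_dep + rng.normal(scale=0.5, size=n)

# Sample correlation: near 0 for the independent pair, near +1 for the dependent pair
print("independent pair:", np.corrcoef(x_ind, y_ind)[0, 1])
print("dependent pair:  ", np.corrcoef(x_dep, y_dep)[0, 1])
```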
Slide 2: Joint distribution of X and Y
Suppose X and Y are two random variables. Their joint probability distribution is f(X,Y).
Normalisation: ∫∫ f(X,Y) dX dY = 1
Projection on to the marginals f(X) and f(Y): f(X) = ∫ f(X,Y) dY and f(Y) = ∫ f(X,Y) dX
[Figure: contour plot of f(X,Y) with the marginal distributions f(X) and f(Y) projected on to the axes]
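A sketch of the normalisation and projection relations on a discretised grid, assuming a correlated bivariate Gaussian as the joint PDF (the slide itself does not specify a particular f(X,Y)):

```python
import numpy as np

# Grid over (X, Y)
x = np.linspace(-5, 5, 401)
y = np.linspace(-5, 5, 401)
dx, dy = x[1] - x[0], y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")

# Assumed joint PDF: unit-variance bivariate Gaussian with correlation rho = 0.6
rho = 0.6
f_xy = np.exp(-(X**2 - 2 * rho * X * Y + Y**2) / (2 * (1 - rho**2))) \
       / (2 * np.pi * np.sqrt(1 - rho**2))

# Normalisation: the double integral of f(X,Y) should be 1
print("integral of f(X,Y) dX dY:", f_xy.sum() * dx * dy)

# Projections (marginals): integrate out the other variable
f_x = f_xy.sum(axis=1) * dy   # f(X) = integral of f(X,Y) dY
f_y = f_xy.sum(axis=0) * dx   # f(Y) = integral of f(X,Y) dX
print("integral of f(X) dX:", f_x.sum() * dx)
```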
Slide 3: Independence and correlation
Independence:
- knowing about X does not inform about Y
- definition: f(X,Y) = f(X) f(Y)
Partial correlation:
- knowing about X tells you something about Y
- (but maybe not vice versa)
Correlation:
- knowing X determines Y
[Figure: three scatter plots of Y against X illustrating independence, partial correlation, and full correlation]
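A sketch of the independence definition as a numerical check: when the joint PDF factorises, f(X,Y) = f(X) f(Y) everywhere; for a correlated PDF the factorisation fails (assumed Gaussian examples):

```python
import numpy as np

x = np.linspace(-5, 5, 201)
y = np.linspace(-5, 5, 201)
dx, dy = x[1] - x[0], y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")

def bivariate_gaussian(X, Y, rho):
    """Unit-variance bivariate Gaussian with correlation coefficient rho."""
    norm = 1.0 / (2 * np.pi * np.sqrt(1 - rho**2))
    return norm * np.exp(-(X**2 - 2 * rho * X * Y + Y**2) / (2 * (1 - rho**2)))

for rho in (0.0, 0.8):
    f_xy = bivariate_gaussian(X, Y, rho)
    f_x = f_xy.sum(axis=1) * dy          # marginal f(X)
    f_y = f_xy.sum(axis=0) * dx          # marginal f(Y)
    factorised = np.outer(f_x, f_y)      # candidate f(X) f(Y)
    print(f"rho = {rho}: max |f(X,Y) - f(X)f(Y)| = "
          f"{np.max(np.abs(f_xy - factorised)):.2e}")
# rho = 0.0 gives ~0 (independent); rho = 0.8 gives a clearly non-zero residual
```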
Slide 4: Linear transformations: scaling
Scaling a random variable X by a constant a:
- Mean: E[aX] = a E[X]
- Variance: Var(aX) = a² Var(X)
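A quick Monte Carlo check of the scaling rules E[aX] = a E[X] and Var(aX) = a² Var(X); the exponential distribution and a = 3 are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)   # any distribution will do
a = 3.0

print("a E[X]    =", a * x.mean(),   "   E[aX]   =", (a * x).mean())
print("a² Var(X) =", a**2 * x.var(), "   Var(aX) =", (a * x).var())
```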
Slide 5: Linear transformations: addition
Adding together two random variables X and Y:
E[X + Y] = E[X] + E[Y]
This is true for any joint PDF, whether or not X and Y are independent!
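A sketch checking that E[X+Y] = E[X] + E[Y] even when X and Y are strongly correlated (assumed correlated Gaussian draws with arbitrary means):

```python
import numpy as np

rng = np.random.default_rng(1)
mean = [1.0, -2.0]
cov = [[1.0, 0.9],    # strongly correlated X and Y
       [0.9, 1.0]]
x, y = rng.multivariate_normal(mean, cov, size=200_000).T

print("E[X] + E[Y] =", x.mean() + y.mean())
print("E[X + Y]    =", (x + y).mean())   # agrees, regardless of the correlation
```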
Slide 6: Why it works...
The centre of gravity of a joint PDF has a well-defined position, independent of the choice of coordinates: e.g. we could use either (X,Y) or (X+Y, X-Y).
[Figure: joint PDF shown in both the (X,Y) and the rotated (X+Y, X-Y) coordinate systems]
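Writing the centre-of-gravity argument out explicitly (a standard linearity-of-expectation step, not copied from the slide):

```latex
\mathrm{E}[X+Y]
  = \iint (x+y)\,f(x,y)\,\mathrm{d}x\,\mathrm{d}y
  = \iint x\,f(x,y)\,\mathrm{d}x\,\mathrm{d}y + \iint y\,f(x,y)\,\mathrm{d}x\,\mathrm{d}y
  = \mathrm{E}[X] + \mathrm{E}[Y]
```

Each term reduces to an expectation over the corresponding marginal from slide 2, so no assumption about independence is needed.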
Slide 7: Variance and covariance
The variance of X+Y depends on whether X and Y are independent or correlated:
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X,Y)
where Cov(X,Y) = E[(X - E[X])(Y - E[Y])]. If X and Y are independent, Cov(X,Y) = 0 and the variances simply add.
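A sketch verifying Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) for an independent and a correlated case (assumed Gaussian draws):

```python
import numpy as np

rng = np.random.default_rng(2)
for rho in (0.0, 0.7):
    cov = [[1.0, rho], [rho, 1.0]]
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T
    lhs = (x + y).var()
    rhs = x.var() + y.var() + 2 * np.cov(x, y)[0, 1]
    print(f"rho = {rho}: Var(X+Y) = {lhs:.3f}, Var(X)+Var(Y)+2Cov(X,Y) = {rhs:.3f}")
```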
Slide 8: Linear transformations of many variables
Summation of many scaled random variables, Z = Σ_i a_i X_i:
E[Z] = Σ_i a_i E[X_i]
Var(Z) = Σ_i Σ_j a_i a_j Cov(X_i, X_j)
Note that we can express this in matrix form: Var(Z) = a^T C a, where C is the covariance matrix with elements C_ij = Cov(X_i, X_j).
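A sketch of the matrix form for Z = Σ_i a_i X_i: with covariance matrix C, Var(Z) = a^T C a (the weights and covariance matrix below are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed covariance matrix for three correlated variables (illustrative values)
C = np.array([[2.0, 0.6, 0.3],
              [0.6, 1.0, 0.2],
              [0.3, 0.2, 1.5]])
a = np.array([0.5, -1.0, 2.0])   # arbitrary scaling coefficients

# Variance of Z = sum_i a_i X_i in matrix form
var_matrix = a @ C @ a

# Monte Carlo check
X = rng.multivariate_normal(np.zeros(3), C, size=300_000)
Z = X @ a
print("a^T C a       =", var_matrix)
print("sample Var(Z) =", Z.var())
```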
Slide 9: Correlation
The correlation coefficient R is defined as R = Cov(X,Y) / (σ_X σ_Y), so that -1 ≤ R ≤ +1.
Correlation matrix for several random variables: R_ij = Cov(X_i, X_j) / (σ_i σ_j), with R_ii = 1 on the diagonal.
Hence the variance of a weighted sum: Var(Σ_i a_i X_i) = Σ_i Σ_j a_i a_j σ_i σ_j R_ij.
[Figure: scatter plots illustrating R = -1, R = +1 and R = 0]
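A sketch relating the covariance and correlation matrices, R_ij = Cov(X_i, X_j) / (σ_i σ_j), and recovering the variance of a weighted sum from either form (the covariance matrix and weights are illustrative values, not from the slides):

```python
import numpy as np

# Assumed covariance matrix for three correlated variables
C = np.array([[ 1.0, -0.8, 0.2],
              [-0.8,  2.0, 0.5],
              [ 0.2,  0.5, 1.0]])
sigma = np.sqrt(np.diag(C))          # standard deviations sigma_i

# Correlation matrix: R_ij = C_ij / (sigma_i sigma_j); diagonal elements are 1
R = C / np.outer(sigma, sigma)
print("correlation matrix:\n", R)

# Variance of a weighted sum, written via R: sum_ij a_i a_j sigma_i sigma_j R_ij
a = np.array([1.0, 0.5, -2.0])
var_from_R = a @ (np.outer(sigma, sigma) * R) @ a
var_from_C = a @ C @ a               # identical, since C = outer(sigma, sigma) * R
print("variance via R:", var_from_R, " via C:", var_from_C)
```

When working from samples rather than a known covariance matrix, np.corrcoef returns the same correlation matrix estimated from the data.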