Multiple Random Variables and Joint Distributions
The conditional dependence between random variables serves as a foundation for time series analysis. When multiple random variables are related, they are described by their joint distribution and density functions.
Conditional and Joint Probability
Definition of conditional probability: P(D|E) = P(D ∩ E) / P(E)
Bayes Rule: P(D|E) = P(E|D) P(D) / P(E)
If D and E are independent: P(D ∩ E) = P(D) P(E), so P(D|E) = P(D)
For a partition of the domain into non-overlapping sets D1, D2, D3, …, the law of total probability gives P(E) = Σi P(E|Di) P(Di), leading to the larger form of Bayes Rule: P(Dj|E) = P(E|Dj) P(Dj) / Σi P(E|Di) P(Di)
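The partition form of Bayes Rule can be checked numerically; a minimal sketch, with the partition probabilities P(Di) and likelihoods P(E|Di) chosen purely for illustration:

```python
# Hypothetical numbers for illustration: a domain partitioned into three
# non-overlapping sets D1, D2, D3, and an observed event E.
p_D = [0.5, 0.3, 0.2]          # P(Di), summing to 1 over the partition
p_E_given_D = [0.1, 0.4, 0.8]  # P(E|Di)

# Law of total probability: P(E) = sum_i P(E|Di) P(Di)
p_E = sum(pe * pd for pe, pd in zip(p_E_given_D, p_D))

# Larger form of Bayes Rule: P(Dj|E) = P(E|Dj) P(Dj) / sum_i P(E|Di) P(Di)
posterior = [pe * pd / p_E for pe, pd in zip(p_E_given_D, p_D)]

print(round(p_E, 2))                       # 0.33
print([round(p, 3) for p in posterior])    # posteriors sum to 1
```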
Conditional and joint density functions
Conditional density function: f(x|y) = f(x, y) / f(y)
Marginal density function: f(x) = ∫ f(x, y) dy
If X and Y are independent: f(x, y) = f(x) f(y), so f(x|y) = f(x)
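These relationships can be verified numerically on a tabulated joint density; a sketch assuming a bivariate normal joint with correlation 0.6 as the example:

```python
import numpy as np

# Assumed example: a bivariate normal joint density with correlation 0.6,
# tabulated on a grid, used to recover marginal and conditional densities.
n = 401
x = np.linspace(-4, 4, n)
dx = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")
rho = 0.6
f_xy = np.exp(-(X**2 - 2 * rho * X * Y + Y**2) / (2 * (1 - rho**2))) \
       / (2 * np.pi * np.sqrt(1 - rho**2))

# Marginal density: f(x) = integral of f(x, y) over y (Riemann sum here)
f_x = f_xy.sum(axis=1) * dx

# Conditional density at y0 = 0: f(x|y0) = f(x, y0) / f(y0)
j0 = n // 2
f_x_given_y0 = f_xy[:, j0] / (f_xy[:, j0].sum() * dx)

# Both are valid densities: they integrate to (approximately) one
print(round(f_x.sum() * dx, 3), round(f_x_given_y0.sum() * dx, 3))
```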
Marginal Distribution
Conditional Distribution
Expectation and moments of multivariate random variables
Covariance and Correlation are Measures of Linear Dependence: Cov(X, Y) = E[(X − µX)(Y − µY)], and the correlation ρ = Cov(X, Y) / (σX σY), which is dimensionless and bounded, −1 ≤ ρ ≤ 1.
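A small numeric check of these measures (the linear example y = 2x + noise is an assumption for illustration):

```python
import numpy as np

# Assumed example: y depends linearly on x, so covariance and correlation
# pick up the dependence. True values: Cov = 2, rho = 2/sqrt(5) ~ 0.894.
rng = np.random.default_rng(5)
x = rng.normal(size=10_000)
y = 2.0 * x + rng.normal(size=10_000)

cov = np.cov(x, y)[0, 1]        # sample Cov(X, Y) = E[(X - muX)(Y - muY)]
rho = np.corrcoef(x, y)[0, 1]   # sample rho = Cov(X, Y) / (sigmaX sigmaY)

print(round(cov, 1), round(rho, 2))
```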
Mutual Information. Is there a relationship between the two variables plotted? Correlation, the linear measure of dependence, is 0. How can we quantify that a relationship exists?
Entropy. Entropy is a measure of randomness: the more random a variable is, the more entropy it has. For a density f(x), the (differential) entropy is H(X) = −∫ f(x) log f(x) dx.
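A minimal sketch with two assumed four-outcome distributions, showing that the more spread-out one has more entropy:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # most random four-outcome distribution
peaked = [0.97, 0.01, 0.01, 0.01]    # nearly deterministic

print(entropy(uniform))   # 2.0 bits, the maximum for four outcomes
print(entropy(peaked))    # much smaller
```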
Mutual Information. Mutual information is a general information-theoretic measure of the dependence between two variables: the amount of information gained about X when Y is learned, and vice versa. In terms of entropy, I(X,Y) = H(X) + H(Y) − H(X,Y). I(X,Y) = 0 if and only if X and Y are independent.
Mutual Information Sample Statistic. The sample estimate of mutual information requires a Monte-Carlo procedure to determine significance (see later).
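A sketch of one common sample statistic, a binned (histogram) estimate of I(X,Y), together with the Monte-Carlo (permutation) significance test mentioned above; the quadratic example data and the bin count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def mutual_info(x, y, bins=10):
    """Binned sample estimate of I(X,Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                     # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)       # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Nonlinear dependence with (near) zero correlation: y = x^2 + noise
x = rng.normal(size=2000)
y = x**2 + 0.1 * rng.normal(size=2000)
i_obs = mutual_info(x, y)

# Monte-Carlo significance: shuffling y destroys any dependence, so the
# permuted statistics approximate the null distribution of the estimator.
null = [mutual_info(x, rng.permutation(y)) for _ in range(200)]
p_value = float(np.mean([m >= i_obs for m in null]))

print(round(np.corrcoef(x, y)[0, 1], 2))   # small: the linear measure misses it
print(i_obs > max(null), p_value)          # the dependence is significant
```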
The theoretical basis for time series models
A random process is a sequence of random variables indexed in time. A random process is fully described by defining the (infinite) joint probability distribution of the random process at all times.
Random Processes. A sequence of random variables indexed in time, described by an infinite joint probability distribution; equivalently, in recursive form: xt+1 = g(xt, xt−1, …) + random innovation (errors or unknown random inputs).
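The recursive form can be simulated directly; a minimal sketch where the choice g(xt) = 0.7·xt with standard normal innovations is an assumed example, not a prescribed model:

```python
import random

random.seed(1)

def g(x_prev):
    # Assumed example: dependence on the previous value only
    return 0.7 * x_prev

x = [0.0]
for t in range(500):
    innovation = random.gauss(0.0, 1.0)   # error / unknown random input
    x.append(g(x[-1]) + innovation)

print(len(x))   # 501 values: one realization of the random process
```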
Classification of Random Quantities
A time series constitutes a possible realization of a random process completely described by the full (infinite) joint probability distribution Bras, R. L. and I. Rodriguez-Iturbe, (1985), Random Functions and Hydrology, Addison-Wesley, Reading, MA, 559 p.
The infinite set of all possible realizations is called the Ensemble.
Random process properties are formally defined with respect to the ensemble.
First order marginal density function f(x(t)), from which the mean and variance can be evaluated.
Stationarity. A strictly stationary stochastic process {xt1, xt2, xt3, …} has the same joint distribution as the shifted series {xt1+h, xt2+h, xt3+h, …} for any given value of h. This applies for any number of time points N, i.e. for all orders of the joint distribution function.
Stationarity of a specific order
1st Order. A random process is first-order stationary if its first-order probability density function is unchanged by a shift of the time origin: f(x(t1)) = f(x(t1+h)) (equality in distribution) for any value of h.
2nd Order. A random process is second-order stationary if its second-order probability density function is unchanged by a time shift applied to both values: f(x(t1), x(t2)) = f(x(t1+h), x(t2+h)) (equality in distribution) for any value of h. This means that the joint distribution is not a function of the absolute values of t1 and t2, but only of the lag τ = t2 − t1.
First order stationarity
f(x(t1)) = f(x(t2)) (equality in distribution) for all t1, t2. First order stationarity implies stationarity of the first order moments: the mean and variance are the same at all times.
Second order density function
f(x(t1), x(t2)), from which second order moments, such as the covariance and correlation between x(t1) and x(t2), can be evaluated.
Second order stationarity
f(x(t1), x(t2)) is not a function of the absolute values of t1 and t2, but only of the lag τ = t2 − t1. Second order stationarity implies second moment stationarity.
Stationarity of the moments (weak or wide sense stationarity)
2nd Moment. A random process is classified as 2nd moment stationary if its first and second moments are not a function of the specific time:
mean: µ(t) = µ
variance: σ²(t) = σ²
covariance: Cov(X(t1), X(t2)) = Cov(X(t1+h), X(t2+h))
This means that the covariance is not a function of the absolute values of t1 and t2, but only of the lag τ = t2 − t1. 2nd moment stationarity is a subset of (is implied by) 2nd order stationarity. For a Gaussian process it is equivalent to 2nd order stationarity.
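Second moment stationarity can be checked against an ensemble of simulated realizations; a sketch assuming an AR(1) generating model with coefficient 0.7 as the example:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, n_real, burn, n_time = 0.7, 20_000, 50, 60

# Simulate an ensemble of AR(1) realizations (assumed example model)
x = np.zeros((n_real, burn + n_time))
for t in range(1, burn + n_time):
    x[:, t] = phi * x[:, t - 1] + rng.normal(size=n_real)
x = x[:, burn:]   # discard spin-up so the process has settled

def ens_cov(a, b):
    """Covariance across the ensemble at two fixed times."""
    return float(np.mean((a - a.mean()) * (b - b.mean())))

# Same lag (5), two different time origins: the covariances should agree
c1 = ens_cov(x[:, 10], x[:, 15])   # Cov(X(t1), X(t1 + 5))
c2 = ens_cov(x[:, 40], x[:, 45])   # Cov(X(t2), X(t2 + 5))
print(round(c1, 2), round(c2, 2))  # approximately equal: a function of lag only
```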
Periodic Stationarity
In hydrology it is common to work with data subject to a seasonal cycle, i.e. data that is formally non-stationary but is stationary once the periodicity is recognized. Periodic variable xy,m with y = year, m = month.
Periodic first order stationarity: f(xy1,m) = f(xy2,m) (equality in distribution) for all y1, y2, for each month m.
Periodic second moment stationarity: Cov(Xy,m1, Xy+τ,m2) = Cov(m1, m2, τ), a function of the months m1, m2 and the lag in years τ, not of the absolute year y.
Workshop 5 on cycles and trends will address incorporating trend components as part of the model.
Ergodicity. The definitions given above are with respect to the ensemble. In practice it is often possible to observe only one realization. How can statistics be estimated from one realization? The ergodicity assumption for stationary processes asserts that averaging over the ensemble is equivalent to averaging over time within a single realization.
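The ergodicity assumption can be illustrated numerically; a sketch on an assumed stationary AR(1) example (coefficient 0.7, standard normal innovations):

```python
import numpy as np

rng = np.random.default_rng(7)
phi = 0.7

# One realization, averaged over time (after discarding spin-up)
x = np.zeros(100_000)
for t in range(1, x.size):
    x[t] = phi * x[t - 1] + rng.normal()
time_avg = x[1000:].mean()

# Many realizations, averaged across the ensemble at one fixed time
ens = np.zeros(5000)
for t in range(300):
    ens = phi * ens + rng.normal(size=5000)
ensemble_avg = ens.mean()

print(round(time_avg, 2), round(ensemble_avg, 2))  # both near the true mean 0
```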
Discrete representation
A continuous random process can only be observed at discrete intervals over a finite domain. The observed value Zt may be an average from t−1 to t (e.g. rainfall) or an instantaneous measurement at t (e.g. streamflow).
Markov Property. The infinite joint PDF construct is not practical. A process is Markov of order d if the joint PDF characterizing the dependence structure has dimension no more than d+1, i.e. the conditional distribution of the next value given the entire past depends only on the d most recent values. The assumption of the Markov property is the basis for simulation of time series as sequences of later values conditioned on earlier values.
Linear approach to time series modeling
e.g. the AR(1) model Xt = φXt−1 + Wt. Model structure and parameters are identified to match second moment properties. Skewness is accommodated using skewed residuals or a normalizing transformation (e.g. log, Box-Cox). Seasonality is handled through seasonally varying parameters.
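A sketch of the moment-matching identification step for the AR(1) model, on synthetic data (the true coefficient 0.6 and series length are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
true_phi, n = 0.6, 5000

# Synthetic record generated from a known AR(1) model
x = np.zeros(n)
for t in range(1, n):
    x[t] = true_phi * x[t - 1] + rng.normal()

# Second moment matching: phi is estimated by the lag-1 sample correlation,
# and the innovation standard deviation follows from the variance relation
# Var(W) = Var(X) (1 - phi^2)
phi_hat = np.corrcoef(x[:-1], x[1:])[0, 1]
sigma_w = np.std(x) * np.sqrt(1 - phi_hat**2)

print(round(phi_hat, 1))   # close to the true value 0.6
```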
Nonparametric/Nonlinear approach to time series modeling
e.g. the NP1 model: the multivariate density is estimated nonparametrically, directly from the data, and then used to obtain the conditional density for simulation. Second moments and skewness are inherited from the estimated distribution. Seasonality is handled through a separate distribution for each season. Other variants: the conditional distribution estimated directly using the nearest neighbor (KNN) method, or a local polynomial trend function plus resampled residuals.
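A sketch of the KNN variant: the next value is resampled from the successors of the k nearest historical neighbors of the current state. The synthetic record, k, and the rank-based weights are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(11)
history = np.cumsum(rng.normal(size=1000)) * 0.1 + 10  # assumed observed series

def knn_next(current, series, k=10):
    """Sample a successor of one of the k nearest neighbors of `current`."""
    dist = np.abs(series[:-1] - current)   # distance in state space
    nbrs = np.argsort(dist)[:k]            # indices of the k nearest states
    w = 1.0 / np.arange(1, k + 1)          # rank-based weights: nearer = likelier
    w /= w.sum()
    pick = rng.choice(nbrs, p=w)
    return series[pick + 1]                # successor of the chosen neighbor

# Simulate a synthetic trace conditioned on the observed history
sim = [history[-1]]
for _ in range(100):
    sim.append(knn_next(sim[-1], history))

print(len(sim))   # 101 values resampled from observed transitions
```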