Multiple Random Variables and Joint Distributions


Multiple Random Variables and Joint Distributions The conditional dependence between random variables serves as a foundation for time series analysis. When multiple random variables are related, they are described by their joint distribution and density functions.

Conditional and Joint Probability Definition: P(D|E) = P(D∩E) / P(E). Bayes Rule: P(D|E) = P(E|D) P(D) / P(E). If D and E are independent: P(D∩E) = P(D) P(E). For a partition of the domain into non-overlapping sets D1, D2, D3 intersecting an event E, the larger form of Bayes Rule is P(Di|E) = P(E|Di) P(Di) / Σj P(E|Dj) P(Dj).
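As a quick numeric check of these formulas, here is a minimal sketch using hypothetical probabilities for a two-event example (the events and values are illustrative, not from the lecture). It applies the law of total probability over the partition {D, not-D} and then Bayes Rule:

```python
# Hypothetical example: D = "storm occurs", E = "gauge alarm fires".
p_D = 0.2                 # prior P(D)
p_E_given_D = 0.9         # likelihood P(E|D)
p_E_given_notD = 0.1      # false-alarm rate P(E|not D)

# Law of total probability over the partition {D, not D}:
p_E = p_E_given_D * p_D + p_E_given_notD * (1 - p_D)

# Bayes Rule: P(D|E) = P(E|D) P(D) / P(E)
p_D_given_E = p_E_given_D * p_D / p_E
```

With these numbers P(E) = 0.26 and the posterior P(D|E) rises to about 0.69 from the prior 0.2.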

Conditional and joint density functions Conditional density function: f(x|y) = f(x,y) / f(y). Marginal density function: f(y) = ∫ f(x,y) dx. If X and Y are independent: f(x,y) = f(x) f(y), so that f(x|y) = f(x).
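The same relationships can be verified for a discrete joint pmf. The 3×3 table below is a made-up example (not data from the lecture); marginals come from summing the joint, conditionals from dividing by a marginal, and independence would require the joint to equal the outer product of the marginals:

```python
import numpy as np

# Hypothetical 3x3 joint pmf p(x, y): rows index X, columns index Y.
p_xy = np.array([[0.10, 0.05, 0.05],
                 [0.05, 0.20, 0.15],
                 [0.05, 0.15, 0.20]])

p_x = p_xy.sum(axis=1)      # marginal of X: sum over y
p_y = p_xy.sum(axis=0)      # marginal of Y: sum over x

# Conditional pmf of X given Y = y: p(x|y) = p(x, y) / p(y)
p_x_given_y = p_xy / p_y    # broadcasts p_y across the rows

# Independence would require p(x, y) = p(x) p(y) in every cell.
independent = np.allclose(p_xy, np.outer(p_x, p_y))
```

Each column of `p_x_given_y` sums to 1 (it is a proper conditional pmf), and for this table `independent` is False.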

Marginal Distribution

Conditional Distribution

Expectation and moments of multivariate random variables

Covariance and Correlation are measures of Linear Dependence Covariance: Cov(X,Y) = E[(X − μX)(Y − μY)]. Correlation: ρ = Cov(X,Y) / (σX σY), with −1 ≤ ρ ≤ 1.
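A minimal sketch of these sample statistics on synthetic data (the linear relationship y = 2x + noise is assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 2.0 * x + rng.normal(size=5000)   # y depends linearly on x

# Sample covariance and correlation
cov_xy = np.cov(x, y)[0, 1]           # theoretical value: 2 * Var(x) = 2
rho = np.corrcoef(x, y)[0, 1]         # theoretical value: 2 / sqrt(5) ~ 0.894
```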

Mutual Information Is there a relationship between the two variables plotted? Correlation, the linear measure of dependence, is 0. How can we quantify that a relationship nevertheless exists?

Entropy Entropy is a measure of randomness: the more random a variable is, the more entropy it has. For a continuous variable with density f(x), H(X) = −∫ f(x) log f(x) dx.
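For the discrete case the same idea can be sketched directly: a uniform distribution (maximally random) has the largest entropy, while a sharply peaked one has much less. The two distributions below are made up for illustration:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return float(-(p * np.log(p)).sum())

h_uniform = entropy(np.full(8, 1 / 8))   # maximally random: log(8) nats
h_peaked = entropy([0.9, 0.02, 0.02, 0.02, 0.02, 0.01, 0.005, 0.005])
```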

Mutual Information Mutual information is a general information-theoretic measure of the dependence between two variables: the amount of information gained about X when Y is learned, and vice versa. I(X,Y) = ∫∫ f(x,y) log[ f(x,y) / (f(x) f(y)) ] dx dy. I(X,Y) = 0 if and only if X and Y are independent.
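A plug-in estimate of this integral can be sketched by binning the data into a 2-D histogram. The quadratic relationship below is an assumed example of the kind of dependence the slide describes: correlation is essentially zero, yet mutual information is clearly positive:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=20000)
y = x**2 + 0.05 * rng.normal(size=20000)  # nonlinear dependence, ~zero correlation

rho = np.corrcoef(x, y)[0, 1]

def mutual_info(x, y, bins=20):
    """Plug-in mutual information estimate (nats) from a 2-D histogram."""
    p_xy, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy /= p_xy.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x (column vector)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y (row vector)
    nz = p_xy > 0
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

mi = mutual_info(x, y)
```

Note that the plug-in estimator is biased upward for small samples, which is one reason significance must be assessed by resampling (next slide).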

Mutual Information Sample Statistic Determining the significance of a sample mutual information estimate requires a Monte Carlo procedure (see later).
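One common Monte Carlo procedure (a sketch, not necessarily the one used in the lecture) is a permutation test: shuffling y destroys any dependence on x, so repeatedly shuffling and re-estimating mutual information builds a null distribution against which the observed value is compared:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, size=2000)
y = x**2 + 0.1 * rng.normal(size=2000)   # assumed nonlinear relationship

def mi_hist(x, y, bins=10):
    """Plug-in mutual information (nats) from a 2-D histogram."""
    p, _, _ = np.histogram2d(x, y, bins=bins)
    p /= p.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

mi_obs = mi_hist(x, y)

# Null distribution: MI of x against randomly permuted y
mi_null = np.array([mi_hist(x, rng.permutation(y)) for _ in range(200)])
p_value = float((mi_null >= mi_obs).mean())
```

A small p-value indicates the observed dependence is unlikely to arise from independent variables.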

The theoretical basis for time series models A random process is a sequence of random variables indexed in time. A random process is fully described by defining the (infinite) joint probability distribution of the random process at all times.

Random Processes A sequence of random variables indexed in time, with an infinite joint probability distribution. xt+1 = g(xt, xt−1, …) + random innovation (errors or unknown random inputs)

Classification of Random Quantities

A time series constitutes a possible realization of a random process, completely described by the full (infinite) joint probability distribution. Bras, R. L. and I. Rodriguez-Iturbe, (1985), Random Functions and Hydrology, Addison-Wesley, Reading, MA, 559 p.

The infinite set of all possible realizations is called the Ensemble. Bras, R. L. and I. Rodriguez-Iturbe, (1985), Random Functions and Hydrology, Addison-Wesley, Reading, MA, 559 p.

Random process properties are formally defined with respect to the ensemble. The first-order marginal density function f(x(t)) is the density from which the mean and variance at time t can be evaluated.

Stationarity A strictly stationary stochastic process {x(t1), x(t2), …, x(tN)} has the same joint distribution as the shifted series {x(t1+h), x(t2+h), …, x(tN+h)} for any given value of h. This applies for all values of N, i.e. for all orders of the joint distribution function.

Stationarity of a specific order 1st Order. A random process is first-order stationary if its first-order probability density function is unchanged by any shift of the time origin: f(x(t1)) =d f(x(t1+h)) for any value of h (equality in distribution). 2nd Order. A random process is second-order stationary if its second-order probability density function is unchanged when both times are shifted: f(x(t1), x(t2)) =d f(x(t1+h), x(t2+h)) for any value of h. This means that the joint distribution is not a function of the absolute values of t1 and t2, but only a function of the lag τ = (t2 − t1).

First order stationarity f(x(t1)) =d f(x(t2)) for all t1, t2 (equality in distribution). This implies stationarity of the first-order moments: μ(t) = μ and σ²(t) = σ² for all t.

Second order density function f(x(t1), x(t2)). Second order moments: Cov(X(t1), X(t2)) = E[(X(t1) − μ(t1))(X(t2) − μ(t2))]. Correlation: ρ(t1, t2) = Cov(X(t1), X(t2)) / (σ(t1) σ(t2)).

Second order stationarity f(x(t1), x(t2)) is not a function of the absolute values of t1 and t2, but only a function of the lag τ = (t2 − t1). Second order stationarity implies second moment stationarity.

Stationarity of the moments (weak or wide sense stationarity) 2nd Moment. A random process is 2nd-moment stationary if its first and second moments are not a function of the specific time: mean μ(t) = μ, variance σ²(t) = σ², and covariance Cov(X(t1), X(t2)) = Cov(X(t1+h), X(t2+h)). This means that the covariance is not a function of the absolute values of t1 and t2, but only a function of the lag τ = (t2 − t1). This is a weaker condition than (a subset of) 2nd order stationarity; for a Gaussian process it is equivalent to 2nd order stationarity.
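These ensemble definitions can be illustrated by simulating many realizations of a stationary process and checking that the moments agree at different times. The sketch below uses an AR(1) process (an assumed example; the AR1 model appears later in the lecture) started from its stationary distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
phi, n_series, n_time = 0.7, 2000, 300

# Ensemble of AR(1) realizations: x_t = phi * x_{t-1} + w_t,
# initialized from the stationary distribution with Var = 1 / (1 - phi^2).
x = np.empty((n_series, n_time))
x[:, 0] = rng.normal(scale=np.sqrt(1 / (1 - phi**2)), size=n_series)
for t in range(1, n_time):
    x[:, t] = phi * x[:, t - 1] + rng.normal(size=n_series)

# Ensemble moments at two different times should agree (2nd-moment stationarity)
mean_t10, mean_t200 = x[:, 10].mean(), x[:, 200].mean()
var_t10, var_t200 = x[:, 10].var(), x[:, 200].var()
# Lag-1 covariance depends only on the lag, not on absolute time
cov_lag1_early = np.cov(x[:, 10], x[:, 11])[0, 1]
cov_lag1_late = np.cov(x[:, 200], x[:, 201])[0, 1]
```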

Periodic Stationarity In hydrology it is common to work with data subject to a seasonal cycle, i.e. data that are formally non-stationary but become stationary once the period is recognized. Periodic variable x(y,m), y = year, m = month. Periodic first order stationarity: f(x(y1,m)) =d f(x(y2,m)) for all y1, y2 and for each m. Periodic second moment stationarity: Cov(X(y,m1), X(y+τ,m2)) = Cov(m1, m2, τ). Workshop 5 on cycles and trends will address incorporating trend components as part of the model.
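A minimal sketch of the idea, using a synthetic monthly series with an assumed sinusoidal seasonal mean: the series is non-stationary overall, but the statistics computed month by month are stable and recover the seasonal cycle:

```python
import numpy as np

rng = np.random.default_rng(7)
n_years = 50

# Hypothetical monthly series: seasonal mean cycle plus noise
months = np.tile(np.arange(12), n_years)
seasonal_mean = 10 + 5 * np.sin(2 * np.pi * np.arange(12) / 12)
x = seasonal_mean[months] + rng.normal(size=12 * n_years)

# Estimate the first-moment statistics separately for each month
monthly_means = np.array([x[months == m].mean() for m in range(12)])
```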

Ergodicity The definitions given so far are with respect to the ensemble, yet it is often possible to observe only one realization. How can statistics be estimated from one realization? The ergodicity assumption for stationary processes asserts that averaging over the ensemble is equivalent to averaging over a single realization.
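The assumption can be demonstrated on a simulated stationary AR(1) process (an assumed example): the time average and time variance computed over one long realization converge to the ensemble mean (0) and ensemble variance (1 / (1 − φ²)):

```python
import numpy as np

rng = np.random.default_rng(4)
phi = 0.5
n = 100000

# One long realization of a stationary AR(1) process
x = np.empty(n)
x[0] = rng.normal(scale=np.sqrt(1 / (1 - phi**2)))
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Under ergodicity, time averages over this single realization
# approximate the corresponding ensemble moments.
time_mean = x.mean()
time_var = x.var()
ensemble_var = 1 / (1 - phi**2)   # theoretical stationary variance
```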

Discrete representation A continuous random process can only be observed at discrete intervals over a finite domain. Zt may be an average from t−1 to t (e.g. rainfall) or an instantaneous measurement at t (e.g. streamflow).

Markov Property The infinite joint PDF construct is not practical. A process is Markov of order d if the joint PDF characterizing its dependence structure is of dimension no more than d+1. Joint distribution: f(xt, xt−1, …, xt−d). Conditional distribution: f(xt | xt−1, …, xt−d). Assumption of the Markov property is the basis for simulation of time series as sequences of later values conditioned on earlier values.

Linear approach to time series modeling e.g. the AR1 model Xt = φXt−1 + Wt. Model structure and parameters are identified to match second moment properties. Skewness is accommodated using skewed residuals or a normalizing transformation (e.g. log, Box-Cox). Seasonality is handled through seasonally varying parameters.
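A minimal sketch of the AR1 model: simulate Xt = φXt−1 + Wt and recover φ by matching the lag-1 sample autocorrelation (a method-of-moments fit; the chosen φ = 0.8 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
phi_true = 0.8
n = 20000

# Simulate x_t = phi * x_{t-1} + w_t with standard normal innovations
w = rng.normal(size=n)
x = np.empty(n)
x[0] = w[0] / np.sqrt(1 - phi_true**2)   # draw from the stationary distribution
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + w[t]

# For an AR(1) process the lag-1 autocorrelation equals phi,
# so the sample lag-1 correlation is a moment-matching estimate.
phi_hat = np.corrcoef(x[:-1], x[1:])[0, 1]
```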

Nonparametric/Nonlinear approach to time series modeling e.g. NP1: a multivariate nonparametric density f(xt, xt−1) is estimated directly from the data and then used to obtain the conditional density f(xt | xt−1). Second moments and skewness are inherited from the estimated distribution. Seasonality is handled through a separate distribution for each season. Other variants: the conditional distribution estimated directly using the nearest neighbor method (KNN); a local polynomial trend function plus residual.
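The KNN variant can be sketched as conditional resampling: to generate the next value, find the k historical states nearest the current value and resample one of their observed successors. The weighting scheme (1/rank) and all parameter choices below are illustrative assumptions, and a synthetic AR(1) series stands in for observed data:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "historical" series standing in for observed data
n = 5000
hist = np.empty(n)
hist[0] = 0.0
for t in range(1, n):
    hist[t] = 0.7 * hist[t - 1] + rng.normal()

def knn_step(x_now, series, k=10, rng=rng):
    """Sample the next value: locate the k historical states nearest to
    x_now and resample one of their observed successors."""
    d = np.abs(series[:-1] - x_now)
    nbrs = np.argsort(d)[:k]            # indices of the k nearest predecessors
    w = 1.0 / np.arange(1, k + 1)       # common choice: weight by 1/rank
    w /= w.sum()
    pick = rng.choice(nbrs, p=w)
    return series[pick + 1]             # the successor that followed in history

# Generate a short synthetic trajectory conditioned on the last observation
sim = [hist[-1]]
for _ in range(200):
    sim.append(knn_step(sim[-1], hist))
sim = np.array(sim)
```

Because every simulated value is resampled from the history, the generated series automatically inherits the marginal distribution (including skewness) of the data.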