(4) Random Variables

Introduction Random errors are present in every measurement operation. If we measure the same variable several times, the measured value will differ in each repeated trial. In a modeling context, the randomness or unpredictability of a variable's value generally arises from the limits of scientific knowledge or from the desire to work with models of low complexity. In contrast to a deterministic signal, a random signal cannot be reproduced exactly. The study of system identification therefore requires a good background in random variables and signals.

A random signal example An example of an unpredictable signal is the acceleration measured on the rear wheel axis of a car during a particular time interval while it is driving on a highway. The signal is nondeterministic: there is no prescribed formula to generate such a time record synthetically. The recording of the acceleration will be different when it is measured over a different period of time, even with the car driving at the same speed over the same road segment.

Generation of random signals Artificial generation of random signals is of interest for simulation and prediction purposes. Since these signals are nondeterministic, reproducing them exactly is not possible. A valid alternative is to generate a time sequence that has features similar to those of the original random signal. An example of such a feature is the sample mean of all 2000 samples of the time record in the previous figure: m = (1/2000) Σ_{k=1}^{2000} x(k), where x(k) is the acceleration at time instant k. There are other features that describe random variables and signals, as we will see next.
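As an illustrative sketch (not part of the original lecture), the sample mean above can be computed directly. Since the measured acceleration record is not available here, a synthetic zero-mean Gaussian sequence stands in for it:

```python
import random

# Hypothetical stand-in for the 2000-sample acceleration record
# (the real measurement is not available here)
random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(2000)]

# Sample mean: (1/N) * sum of x(k) over k = 1..N
N = len(x)
sample_mean = sum(x) / N
print(sample_mean)  # close to 0 for a zero-mean signal
```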

Cumulative distribution function (CDF) The CDF FX(x) of a random variable X gives the probability of the event {X ≤ x}, which is denoted by FX(x) = P(X ≤ x).

Cumulative distribution function (CDF) From the axioms of probability, the CDF has the following properties: 0 ≤ FX(x) ≤ 1; FX(x) is nondecreasing in x; FX(x) → 0 as x → −∞; and FX(x) → 1 as x → +∞.

Probability density function (PDF) A more frequently used characterization of a random variable is the PDF. The PDF is the derivative of the CDF: fX(x) = dFX(x)/dx. The PDF has the property ∫_{−∞}^{∞} fX(x) dx = 1, and the probability of the event {a < X ≤ b} is given by P(a < X ≤ b) = ∫_a^b fX(x) dx = FX(b) − FX(a).
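As a small numerical check (a sketch, not from the original slides), for a standard Gaussian the interval probability P(a < X ≤ b) can be obtained either by numerically integrating the PDF or as the CDF difference FX(b) − FX(a); the two agree:

```python
import math

def gauss_pdf(x, mu=0.0, sigma=1.0):
    # PDF of a Gaussian random variable
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def gauss_cdf(x, mu=0.0, sigma=1.0):
    # Gaussian CDF expressed via the error function
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

a, b = -1.0, 1.0
# Numerical integration of the PDF over (a, b] (midpoint rule)
n = 10000
h = (b - a) / n
p_int = sum(gauss_pdf(a + (i + 0.5) * h) for i in range(n)) * h
p_cdf = gauss_cdf(b) - gauss_cdf(a)
print(p_int, p_cdf)  # both ≈ 0.6827, the familiar one-sigma probability
```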

The expected value of a random variable The CDF and PDF fully specify the behavior of a random variable in the sense that they determine the probabilities of events involving that random variable. However, these functions cannot be determined experimentally in a trivial way. Fortunately, in many problems it is sufficient to specify the behavior of a random variable in terms of certain features, such as its expected value and variance.

Expected value The expected value, also called the mean, of a random variable X is given by E[X] = ∫_{−∞}^{∞} x fX(x) dx. The expected value is the average of all values, weighted by their probability; it is the value "expected" beforehand given the probability distribution. The expected value is often called the first-order moment. Higher-order moments of a random variable can also be obtained; the nth-order moment is given by E[X^n] = ∫_{−∞}^{∞} x^n fX(x) dx.

Variance The variance of a random variable X is its second-order centered moment: var(X) = E[(X − E[X])^2]. Sometimes the standard deviation σX = sqrt(var(X)) is used instead. The expression for the variance can be simplified as follows: var(X) = E[X^2] − (E[X])^2. This shows that, for a zero-mean random variable (E[X] = 0), the variance equals its second-order moment E[X^2].
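The identity var(X) = E[X^2] − (E[X])^2 also holds exactly for sample moments; a quick numerical sketch (synthetic data, mean 3 and standard deviation 2 chosen arbitrarily):

```python
import random

random.seed(1)
x = [random.gauss(3.0, 2.0) for _ in range(100000)]
N = len(x)

mean = sum(x) / N
second_moment = sum(v * v for v in x) / N           # E[X^2]
var_centered = sum((v - mean) ** 2 for v in x) / N  # E[(X - E[X])^2]
var_identity = second_moment - mean ** 2            # E[X^2] - (E[X])^2

print(var_centered, var_identity)  # both ≈ sigma^2 = 4
```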

Gaussian random variable (normal) A Gaussian random variable X has the PDF fX(x) = (1/(σ sqrt(2π))) exp(−(x − µ)^2 / (2σ^2)); its CDF has no closed-form expression and is usually written in terms of the error function.

Gaussian random variable (normal) The PDF of a Gaussian RV is completely specified by the mean µ and the variance σ^2. MATLAB command: randn.

Uniform Distribution A uniformly distributed random variable assumes values in a range from a to b with equal probability; its PDF is fX(x) = 1/(b − a) for a ≤ x ≤ b and zero elsewhere. The MATLAB command rand generates a uniformly distributed random variable in the range from 0 to 1.
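For readers working outside MATLAB, the standard library equivalents of randn and rand can be sketched as follows; the sample statistics match the theoretical values (mean 0 for the Gaussian; mean 1/2 and variance 1/12 for the uniform on [0, 1)):

```python
import random

random.seed(2)
N = 100000

g = [random.gauss(0.0, 1.0) for _ in range(N)]  # like randn: N(0, 1) samples
u = [random.random() for _ in range(N)]         # like rand: uniform on [0, 1)

mean_g = sum(g) / N
mean_u = sum(u) / N
var_u = sum((v - mean_u) ** 2 for v in u) / N

print(mean_g)  # ≈ 0
print(mean_u)  # ≈ 0.5
print(var_u)   # ≈ 1/12 ≈ 0.0833
```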

Multiple random variables It often occurs in practice that several random variables are measured at the same time. This may be an indication that these random variables are related. The probability of events that involve the joint behavior of multiple random variables is described by the joint CDF or joint PDF functions.

Correlation With the joint PDF of two random variables defined, the expectation of functions of two random variables can be defined as well. Two relevant expectations are the correlation and the covariance of two random variables. The correlation of two random variables X and Y is rXY = E[XY].

Covariance Let µX = E[X] and µY = E[Y] denote the means of the random variables X and Y, respectively. Then the covariance of X and Y is cov(X, Y) = E[(X − µX)(Y − µY)] = E[XY] − µX µY. Intuition: the covariance describes how much the two variables "change together" around their means (positive if they tend to move in the same direction, negative if in opposite directions).
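A small sketch of the "change together" intuition (synthetic data: y is constructed to follow x, so a positive covariance is expected):

```python
import random

random.seed(3)
N = 100000

x = [random.gauss(0.0, 1.0) for _ in range(N)]
# y moves together with x, plus a little independent noise
y = [2.0 * v + random.gauss(0.0, 0.5) for v in x]

mx = sum(x) / N
my = sum(y) / N
cov_xy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / N
print(cov_xy)  # ≈ 2, since Cov(X, 2X + noise) = 2 * Var(X) = 2
```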

Uncorrelated Random Variables Two random variables X and Y are uncorrelated if their covariance is zero, i.e., E[XY] = E[X]E[Y]. Examples: the education level of a person is correlated with their income, whereas hair color may be uncorrelated with income (at least in an ideal world).

Vector of Random Variables The case of two RVs can be extended to the vector case. Let X be a vector with entries Xi, i = 1, 2, . . . , n, that jointly have a Gaussian distribution with mean vector µ = E[X] = [µ1, µ2, . . . , µn]^T.

Vector of Random Variables And the covariance matrix of X is CX = E[(X − µ)(X − µ)^T]. Remarks: CX is symmetric; its diagonal entries are the variances of the individual entries Xi; and it is positive semidefinite.

Random signals A random signal, or stochastic process, x is a sequence of random variables x(1), x(2), . . . , x(N), where the index k has the meaning of a time step. Observing the process for a certain interval of time yields a sequence of numbers, or record, that is called a realization of the stochastic process. In system identification, signals (e.g., inputs and outputs) will often be stochastic processes evolving over discrete time.

Stationary Process Signal values at different time steps can be correlated (e.g., when they are the output of some dynamic system). Nevertheless, signals are usually required to be stationary, in the sense that E[x(k)] = µx for all k, and E[(x(k) − µx)(x(k + τ) − µx)] depends only on the lag τ. That is, the mean is constant in time, whereas the covariance depends only on the relative positions of time steps, not on their absolute positions. Stationarity implies that the dynamics of the system generating the signal are invariant in time.

Ergodicity Ergodicity offers an empirical tool for deriving estimates of the expected values of a random signal that in practice can be observed only via a single realization. For a stationary random signal {x(k)} = {x(1), x(2), . . . , x(N)}, the time average (1/N) Σ_{k=1}^{N} x(k) converges with probability one to the mean value µx as the number of observations N goes to infinity.
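The convergence of the time average can be observed numerically; a sketch (one synthetic realization per length N, with a hypothetical mean µ = 5):

```python
import random

random.seed(4)
MU = 5.0  # hypothetical mean of the stationary signal

def time_average(N):
    # Time average (1/N) * sum x(k) over one realization of length N
    return sum(random.gauss(MU, 1.0) for _ in range(N)) / N

for N in (10, 1000, 100000):
    print(N, time_average(N))  # drifts toward MU = 5 as N grows
```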

Auto-covariance & Cross-covariance The auto-covariance matrix of {x(k)} is Cx(τ) = E[(x(k) − µx)(x(k + τ) − µx)^T]. Similarly, the cross-covariance matrix of two discrete vector signals {x(k)} and {y(k)} is Cxy(τ) = E[(x(k) − µx)(y(k + τ) − µy)^T].

Auto-correlation & Cross-correlation functions The autocorrelation function ruu(τ) of a stationary signal {u(k)} can be estimated as ruu(τ) ≈ (1/N) Σ_{k=1}^{N−τ} u(k) u(k + τ), where τ is the lag. Similarly, the cross-correlation function ruy(τ) between two signals {u(k)} and {y(k)} can be estimated as ruy(τ) ≈ (1/N) Σ_{k=1}^{N−τ} u(k) y(k + τ). The MATLAB function xcorr can be used to calculate the sample auto- and cross-correlation functions.
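The estimator above is simple to implement directly; a sketch (the function name xcorr_est is our own, not a library routine), applied to a synthetic white sequence where only the zero-lag value should be significant:

```python
import random

def xcorr_est(u, y, max_lag):
    # Sample cross-correlation: r_uy(tau) = (1/N) * sum_k u(k) * y(k + tau)
    # Setting y = u gives the sample autocorrelation.
    N = len(u)
    return [sum(u[k] * y[k + tau] for k in range(N - tau)) / N
            for tau in range(max_lag + 1)]

random.seed(5)
N = 20000
u = [random.gauss(0.0, 1.0) for _ in range(N)]

r_uu = xcorr_est(u, u, 5)  # autocorrelation: y = u
print(r_uu[0])  # ≈ 1, the variance of u
print(r_uu[1])  # ≈ 0 for white noise
```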

Zero-mean white noise Let the stochastic process {e(k)} be a scalar sequence {e(1), e(2), ...}. The sequence {e(k)} is zero-mean white noise if E[e(k)] = 0 (zero mean) and E[e(k)e(l)] = 0 for k ≠ l (values at different time steps are uncorrelated), with E[e(k)^2] = σ^2. White noise is one of the most important signals in system identification.

Example: White noise A uniformly distributed white noise sequence u, generated with the MATLAB function rand, is shown: u = -0.5 + rand(125,1); Also shown is the associated normalized autocorrelation function, that is, ruu(τ)/ruu(0) for τ = 0, 1, 2, . . . , calculated using xcorr(u). The autocorrelation function indicates that only at zero lag is the autocorrelation significant.

Example Consider two random sequences S(k) and R(k), k = 1, . . . , 1000, generated using the MATLAB commands: S = 3*randn(1000, 1) + 10; R = 6*randn(1000, 1) + 20; 1. What is the approximate mean value of each signal? 2. What is the empirical variance of each signal? 3. What is the empirical covariance matrix of the vector sequence Z = [S R]?

Answer Using MATLAB, we can calculate the empirical (sample) mean values and covariances of the signals:

mS = mean(S)
mR = mean(R)
covS = cov(S)
covR = cov(R)
Z = [S R]; covZ = cov(Z)

which yields:

mS = 9.9021
mR = 20.2296
covS = 8.9814
covR = 36.3954
covZ = [ 8.9814  0.5103
         0.5103 36.3954]

Note: it is highly recommended to write your own code to calculate the variance, covariance, and correlation functions instead of relying only on MATLAB built-in functions.
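In the spirit of the note above, the same experiment can be coded by hand; a Python sketch (the exact numbers differ from the MATLAB run because the random draws differ, but the theoretical values are mean 10 and 20, variance 9 and 36, and near-zero cross-covariance):

```python
import random

random.seed(6)
N = 1000
S = [3.0 * random.gauss(0.0, 1.0) + 10.0 for _ in range(N)]
R = [6.0 * random.gauss(0.0, 1.0) + 20.0 for _ in range(N)]

mS = sum(S) / N                                 # expected ≈ 10
mR = sum(R) / N                                 # expected ≈ 20
covS = sum((s - mS) ** 2 for s in S) / (N - 1)  # expected ≈ 9
covR = sum((r - mR) ** 2 for r in R) / (N - 1)  # expected ≈ 36
# Off-diagonal entry of the covariance matrix of Z = [S R]
covSR = sum((s - mS) * (r - mR) for s, r in zip(S, R)) / (N - 1)  # ≈ 0

print(mS, mR, covS, covR, covSR)
```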

Input Signals Random sequences are used as input signals in system identification. The input signals used in the identification experiment have a significant influence on the resulting parameter estimates, so some conditions on the input sequence must be imposed to yield reasonable identification results. As a trivial illustration, consider an input that is identically zero: such an input cannot yield full information about the input/output relationship of the system.

The input should excite the typical modes of the system; this property is called persistent excitation. In mathematical terms it is related to a full-rank condition on the matrix of autocorrelations of the input. One of the simplest input signals is the step function (ones.m). For systems with a large signal-to-noise ratio, a step input can give information about the dynamics, such as the dominant time constant, rise time, overshoot, and static gain. However, a step input does not provide sufficient excitation to cover high frequency ranges.

Another input signal is the sum of sinusoids u(k) = Σ_{j=1}^{m} a_j sin(ω_j k + φ_j), with m frequencies ω_j satisfying 0 ≤ ω1 < ω2 < ... < ωm ≤ π. Gaussian white noise (randn.m) can also be used as an input signal, as it excites all frequencies. In practice, however, white noise can take very large values that cannot be implemented due to physical restrictions. Amplitude-constrained signals, such as a uniformly distributed random sequence (rand.m) or a pseudo-random binary sequence (PRBS), are therefore preferred.
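A sum-of-sinusoids input is easy to generate; a sketch with hypothetical frequencies and amplitudes (phases taken as zero, and the function name multisine is our own):

```python
import math

def multisine(N, freqs, amps):
    # u(k) = sum_j a_j * sin(w_j * k), phases set to zero for simplicity
    return [sum(a * math.sin(w * k) for w, a in zip(freqs, amps))
            for k in range(N)]

# Hypothetical excitation: three frequencies spread over (0, pi)
u = multisine(200, freqs=[0.1, 0.5, 1.5], amps=[1.0, 0.5, 0.25])
print(u[0])                    # 0.0: every sine term is zero at k = 0
print(max(abs(v) for v in u))  # bounded by 1.0 + 0.5 + 0.25 = 1.75
```

Note how the amplitude is bounded by the sum of the a_j, which makes such signals easy to keep within actuator limits.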

A PRBS switches between two values and is generated using a specific algorithm such that its autocorrelation function approximates that of white noise. MATLAB command: u = idinput(N,'prbs'). Filtered white noise (colored noise) can also be used, in order to emphasize the identification in particular frequency ranges (filter.m).
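One common way to generate a PRBS is a linear feedback shift register; a minimal sketch of that idea (the tap positions (7, 6) are one standard maximal-length choice for a 7-bit register, giving period 2^7 − 1 = 127; this is an illustration, not the algorithm used inside idinput):

```python
def prbs(N, taps=(7, 6), nbits=7):
    # Fibonacci LFSR: XOR the tapped bits, shift them in at the front,
    # and output the last bit mapped from {0, 1} to the levels {-1, +1}.
    state = [1] * nbits  # any nonzero initial state works
    out = []
    for _ in range(N):
        bit = state[taps[0] - 1] ^ state[taps[1] - 1]
        out.append(2 * state[-1] - 1)
        state = [bit] + state[:-1]
    return out

u = prbs(254)
print(sorted(set(u)))  # only two values: -1 and +1
print(u[:127] == u[127:254])  # the sequence repeats with period 127
```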