
Basic Random Processes

Introduction
Annual summer rainfall in Rhode Island is a physical process that has been ongoing for all time and will continue. We would like to study the probabilistic characteristics of the rainfall for all times. Let X[n] be a RV that denotes the annual summer rainfall for year n. We will be interested in the behavior of the infinite tuple of RVs (…, X[-1], X[0], X[1], …).

Introduction
Given our interest in the annual summer rainfall, what types of questions are pertinent? A meteorologist might wish to determine whether the rainfall totals are increasing with time (i.e., whether there is a trend in the data). We might also wish to assess the probability that the following year the rainfall will be 12 inches or more, given that we know the entire past history of rainfall totals (prediction). As another example of a random process, the Korea Composite Stock Price Index, or KOSPI (코스피지수), is the index of common stocks traded on the Stock Market Division.

What is a random process
Assume we toss a coin and then repeat the subexperiment at one-second intervals for all time. Letting n denote the time in seconds, we generate outcomes at n = 0, 1, …. Since there are two possible outcomes, a head (X = 1) with probability p and a tail (X = 0) with probability 1 − p, the process is termed a Bernoulli RP.

S = {(H,H,T,…), (H,T,H,…), (T,T,H,…), …}
S_X = {(1,1,0,…), (1,0,1,…), (0,0,1,…), …}

(Figure: a random process generator maps each experimental outcome to a realization (x[0], x[1], …) of the process (X[0], X[1], …).)
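
The Bernoulli RP described above is easy to simulate. The slides use Matlab; the following is a minimal Python sketch, with the function name and parameters chosen here for illustration.

```python
import random

def bernoulli_rp(p, n_samples, seed=None):
    """Generate one realization of a Bernoulli random process:
    X[n] = 1 (head) with probability p, X[n] = 0 (tail) otherwise."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n_samples)]

# One realization (one member of the ensemble); a different seed
# would produce a different realization of the same process.
x = bernoulli_rp(p=0.5, n_samples=20, seed=1)
```

Each call with a different seed produces a different member of the ensemble, while the underlying probabilistic description (the PMF of each sample) stays the same.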

What is a random process
Each realization is a sequence of numbers. The set of all realizations is called the ensemble of realizations.

(Figure: several realizations of the process, one per experimental outcome s1, s2, s3.)

What is a random process
The probability density/mass function describes the general distribution of the magnitude of the random process, but it gives no information about the time or frequency content of the process.

(Figure: a realization x(t) versus time t, alongside its PDF f_X(x).)

Types of random processes
- Discrete-time/discrete-valued (DTDV)
- Discrete-time/continuous-valued (DTCV)
- Continuous-time/discrete-valued (CTDV)
- Continuous-time/continuous-valued (CTCV)

(Figure: example realizations for each type; the slide's examples include the Bernoulli RP, the binomial RP, and Gaussian RPs.)

Random Walk
Let U_i for i = 1, 2, …, N be independent RVs with the same PMF

P[U_i = −1] = 1/2,   P[U_i = +1] = 1/2.

At each "time" n the new RV X_n changes from the old RV X_{n−1} by ±1, since X_n = X_{n−1} + U_n. Because X_n depends on the past only through X_{n−1} (conditional probability of independent events), the chain rule gives the joint PMF

p_{X_1,…,X_N}(x_1, …, x_N) = p_{X_1}(x_1) ∏_{n=2}^{N} p_{X_n|X_{n−1}}(x_n | x_{n−1}),

where we define X_0 = 0, so that X_1 = U_1.

Random Walk
Note that p_{X_n|X_{n−1}}(x_n | x_{n−1}) can be found by observing that X_n = X_{n−1} + U_n, and therefore if X_{n−1} = x_{n−1} we have

p_{X_n|X_{n−1}}(x_n | x_{n−1}) = P[U_n = x_n − x_{n−1}] = p_U(x_n − x_{n−1}).

Step 1 is due to X_n = X_{n−1} + U_n; step 2 is due to independence and the fact that the U_n's have the same PMF. Finally,

p_{X_1,…,X_N}(x_1, …, x_N) = p_U(x_1) ∏_{n=2}^{N} p_U(x_n − x_{n−1}).

(Figure: a realization of the U_n's and the corresponding random-walk realization of the X_n's.)
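
The random walk just described can be simulated directly from its defining recursion X_n = X_{n−1} + U_n. A short Python sketch (the slides themselves use Matlab):

```python
import random

def random_walk(n_steps, seed=None):
    """One realization of the random walk X_n = X_{n-1} + U_n,
    where the steps U_n are IID with P[U = +1] = P[U = -1] = 1/2."""
    rng = random.Random(seed)
    x, path = 0, []
    for _ in range(n_steps):
        u = 1 if rng.random() < 0.5 else -1  # draw U_n
        x += u                               # X_n = X_{n-1} + U_n
        path.append(x)
    return path

path = random_walk(1000, seed=0)
```

Plotting `path` against n reproduces the kind of realization sketched on the slide: every step moves the process up or down by exactly 1.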

The important property of stationarity
The simplest type of RP is the independent and identically distributed (IID) process (for example, the Bernoulli RP). The joint PMF of any finite number of samples is

p_{X[n_1],…,X[n_N]}(x_1, …, x_N) = ∏_{i=1}^{N} p_X(x_i).

For example, the probability of the first 10 samples being 1,0,1,0,1,0,1,0,1,0 is p^5 (1 − p)^5. Being able to specify the joint PMF for any finite number of sample times is referred to as being able to specify the finite-dimensional distribution (FDD). If the FDD does not change with the time origin, the process is called stationary.
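
The slide's example, that the first 10 samples being 1,0,1,0,… has probability p^5(1 − p)^5, can be checked by Monte Carlo simulation. A Python sketch (function name and trial count are illustrative):

```python
import random

def pattern_probability(p, pattern, n_trials, seed=0):
    """Monte Carlo estimate of P[first len(pattern) samples of an IID
    Bernoulli(p) process equal the given pattern]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        sample = [1 if rng.random() < p else 0 for _ in pattern]
        if sample == list(pattern):
            hits += 1
    return hits / n_trials

p = 0.5
pattern = [1, 0] * 5             # 1,0,1,0,... for 10 samples
exact = p**5 * (1 - p)**5        # joint PMF factors by independence
est = pattern_probability(p, pattern, n_trials=200_000)
```

For p = 1/2 the exact value is (1/2)^10 ≈ 0.000977, and the estimate converges to it as the number of trials grows.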

IID random process is stationary
To prove that the IID RP is a special case of a stationary RP, we must show that the following equality holds for any shift n_0:

p_{X[n_1+n_0],…,X[n_N+n_0]}(x_1, …, x_N) = p_{X[n_1],…,X[n_N]}(x_1, …, x_N).

This follows from

p_{X[n_1+n_0],…,X[n_N+n_0]}(x_1, …, x_N) = ∏_{i=1}^{N} p_{X[n_i+n_0]}(x_i)   (by independence)
= ∏_{i=1}^{N} p_X(x_i)   (by identical distribution)
= p_{X[n_1],…,X[n_N]}(x_1, …, x_N)   (by independence).

If a RP is stationary, then all its joint moments, and more generally all expected values of the RP, must be stationary, since they are determined by the joint PMF/PDF.

Non-stationary processes
RPs that are not stationary are ones whose means and/or variances change in time, which implies that the marginal PMFs/PDFs change with time.

(Figure: one realization whose mean increases with n and another whose variance decreases with n.)

Sum random process
Similar to the random walk, we have

X[n] = X[n−1] + U[n] = ∑_{i=1}^{n} U[i].

The difference is that U[i] can have any PMF, as long as it is the same for all i. The sum random process is not stationary, since its mean and variance change with n:

E[X[n]] = n E[U],   var(X[n]) = n var(U).

It is sometimes possible to transform a nonstationary RP into a stationary one.
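
The linear growth of the mean and variance with n can be checked empirically. The sketch below uses a hypothetical die-roll PMF for U[i] (uniform on {1, …, 6}, so E[U] = 3.5 and var(U) = 35/12), an assumption made here purely for illustration:

```python
import random

def sum_process_stats(n, n_trials, seed=0):
    """Monte Carlo mean and variance of X[n] = U[1] + ... + U[n],
    where the U[i] are IID uniform on {1,...,6} (hypothetical PMF)."""
    rng = random.Random(seed)
    samples = [sum(rng.randint(1, 6) for _ in range(n)) for _ in range(n_trials)]
    mean = sum(samples) / n_trials
    var = sum((s - mean) ** 2 for s in samples) / n_trials
    return mean, var

# Theory: E[X[n]] = 3.5*n and var(X[n]) = (35/12)*n, so both grow with n,
# confirming that the sum process is not stationary.
m10, v10 = sum_process_stats(10, 5000)
m100, v100 = sum_process_stats(100, 5000)
```

Both the estimated mean and the estimated variance scale by roughly a factor of 10 between n = 10 and n = 100, as the formulas predict.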

Transformation of a nonstationary RP into a stationary one
Example: for the sum RP this can be done by "reversing" the sum. The difference or increment RVs

U[n] = X[n] − X[n−1]

are IID. More generally, nonoverlapping increments

X[n_2] − X[n_1]  and  X[n_4] − X[n_3],   with n_1 < n_2 ≤ n_3 < n_4,

for a sum RP are independent. If, furthermore, n_4 − n_3 = n_2 − n_1, then the increments have the same PMF, since they are composed of the same number of IID RVs. Such a sum process is said to have stationary independent increments (the random walk is one example).

Binomial counting random process
Consider the repeated coin-tossing experiment, where we are interested in the number of heads that occur. Let U[n] be a Bernoulli random process with P[U[n] = 1] = p and P[U[n] = 0] = 1 − p. The number of heads is given by the binomial counting (sum) process

X[n] = ∑_{i=0}^{n} U[i],   or equivalently   X[n] = X[n−1] + U[n].

This RP has stationary and independent increments.

Binomial counting random process
Let's determine p_{X[1],X[2]}[1,2] = P[X[1] = 1, X[2] = 2]. Note that the event X[1] = 1, X[2] = 2 is equivalent to the event Y_1 = X[1] − X[−1] = 1, Y_2 = X[2] − X[1] = 1, where X[−1] is defined to be identically zero. Y_1 and Y_2 are nonoverlapping increments (but of unequal length), making them independent RVs. Thus

p_{X[1],X[2]}[1,2] = P[Y_1 = 1] P[Y_2 = 1] = [2p(1 − p)] · p = 2p^2(1 − p),

since Y_1 = U[0] + U[1] is binomial with two trials and Y_2 = U[2] is a single Bernoulli trial.
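
The independent-increments computation above can be verified by simulating the underlying Bernoulli trials directly. A Python sketch (trial count and seed are illustrative):

```python
import random

def joint_prob_mc(p, n_trials, seed=0):
    """Monte Carlo estimate of P[X[1] = 1, X[2] = 2] for the binomial
    counting process X[n] = U[0] + ... + U[n], U[i] IID Bernoulli(p)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        u = [1 if rng.random() < p else 0 for _ in range(3)]  # U[0], U[1], U[2]
        if u[0] + u[1] == 1 and u[0] + u[1] + u[2] == 2:      # X[1] = 1, X[2] = 2
            hits += 1
    return hits / n_trials

p = 0.5
exact = 2 * p**2 * (1 - p)   # independent increments: [2p(1-p)] * p
est = joint_prob_mc(p, 200_000)
```

For p = 1/2 the exact value is 2(1/2)^2(1/2) = 1/4, and the Monte Carlo estimate agrees closely.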

Example: Randomly phased sinusoid
Consider the DTCV RP given as

X[n] = cos(2π f_0 n + Θ),   where Θ ~ U(0, 2π).

This RP is frequently used to model an analog sinusoid whose phase is unknown and that has been sampled by an analog-to-digital converter. Once two successive samples are observed, all the remaining ones are known.

(Figure: two realizations, for θ = 3.43 and θ = 6.01, generated by the Matlab code on the slide.)
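
Realizations like the two shown on the slide can be generated by drawing a phase and evaluating the cosine. A Python sketch (the slide's own code is Matlab; the frequency f_0 = 0.1 below is an assumed value, since the slide does not fix it):

```python
import math
import random

def phased_sinusoid(f0, theta, n_samples):
    """One realization x[n] = cos(2*pi*f0*n + theta), n = 0..n_samples-1."""
    return [math.cos(2 * math.pi * f0 * n + theta) for n in range(n_samples)]

rng = random.Random(7)
f0 = 0.1                              # assumed frequency (illustrative)
theta = rng.uniform(0, 2 * math.pi)   # one draw of the random phase Theta
x = phased_sinusoid(f0, theta, 50)
```

Each new draw of `theta` gives a different realization; all realizations are shifted copies of the same sinusoid, which is why two successive samples determine the whole sequence.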

Joint moments
The first moment (mean), second central moment (variance), and the covariance between two samples can always be estimated in practice, in contrast to the joint PMF, which may be difficult to determine. The mean and variance sequences are defined as

μ_X[n] = E[X[n]],   σ²_X[n] = var(X[n]).

The covariance sequence is defined as

c_X[n_1, n_2] = E[(X[n_1] − μ_X[n_1])(X[n_2] − μ_X[n_2])].

Note that the usual symmetry property of the covariance holds:

c_X[n_1, n_2] = c_X[n_2, n_1],   and   c_X[n, n] = σ²_X[n].

Example: Randomly phased sinusoid
Recalling that the phase is uniformly distributed, Θ ~ U(0, 2π), so that p_Θ(θ) = 1/(2π) for 0 ≤ θ < 2π, we have

μ_X[n] = E[cos(2π f_0 n + Θ)] = ∫_0^{2π} cos(2π f_0 n + θ) (1/(2π)) dθ = 0

for all n.

Example: Randomly phased sinusoid
Noting that the mean sequence is zero, the covariance sequence becomes

c_X[n_1, n_2] = E[X[n_1] X[n_2]] = E[cos(2π f_0 n_1 + Θ) cos(2π f_0 n_2 + Θ)] = (1/2) cos[2π f_0 (n_2 − n_1)].

The covariance sequence depends only on the spacing between the two samples, i.e., on n_2 − n_1.

Example: Randomly phased sinusoid
Note the symmetry of the covariance sequence about Δn = 0, where Δn = n_2 − n_1. The variance follows as

σ²_X[n] = c_X[n, n] = 1/2

for all n.
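
The moment results for the randomly phased sinusoid, zero mean, covariance (1/2)cos[2π f_0 (n_2 − n_1)], and variance 1/2, can be checked by averaging over many phase draws. A Python sketch (the specific f_0 and sample times are assumed values for illustration):

```python
import math
import random

def mc_moments(f0, n1, n2, n_trials=100_000, seed=3):
    """Monte Carlo estimates of E[X[n1]] and cov(X[n1], X[n2]) for
    X[n] = cos(2*pi*f0*n + Theta), Theta ~ U(0, 2*pi)."""
    rng = random.Random(seed)
    s1 = s2 = s12 = 0.0
    for _ in range(n_trials):
        theta = rng.uniform(0, 2 * math.pi)
        x1 = math.cos(2 * math.pi * f0 * n1 + theta)
        x2 = math.cos(2 * math.pi * f0 * n2 + theta)
        s1 += x1
        s2 += x2
        s12 += x1 * x2
    m1, m2 = s1 / n_trials, s2 / n_trials
    return m1, s12 / n_trials - m1 * m2   # sample mean, sample covariance

f0, n1, n2 = 0.1, 3, 8
mean, cov = mc_moments(f0, n1, n2)
theory = 0.5 * math.cos(2 * math.pi * f0 * (n2 - n1))   # = -0.5 here
```

Setting n1 = n2 in the same function estimates the variance, which comes out near 1/2 as derived above.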

Real-world Example – Statistical Data Analysis
Earlier we discussed an apparent increase in the annual summer rainfall totals. A key question is whether this increase supports global warming or not. Let's find the exact increase of the rainfall by fitting a line an + b to the historical data.

Real-world Example – Statistical Data Analysis
We estimate a and b by fitting a straight line to the data set using a least squares procedure that minimizes the least squares error (LSE)

J(a, b) = ∑_{n=0}^{N−1} (x[n] − (an + b))².

To find a and b we set

∂J/∂a = 0,   ∂J/∂b = 0.

This results in two simultaneous linear equations, where N = 108 for our data set. We use an approach similar to the one used when predicting a random variable outcome.

Real-world Example – Statistical Data Analysis
In vector/matrix form the two simultaneous equations are

(∑ n²) a + (∑ n) b = ∑ n x[n],
(∑ n) a + N b = ∑ x[n],

with the sums taken over n = 0, …, N−1. Solving this system gives the estimates of a and b. Note that the mean indeed appears to be increasing with time. The error sequence of the fit is defined as the difference between the data and the fitted line; the error can be quite large.
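
The normal equations above can be solved directly. A Python sketch (the slide's own computation is in Matlab; the synthetic data below is hypothetical, used only to sanity-check the solver):

```python
def fit_line(x):
    """Least squares fit of x[n] ~ a*n + b, n = 0..N-1, by solving
    the two normal equations with Cramer's rule."""
    N = len(x)
    sn = sum(range(N))                   # sum of n
    sn2 = sum(n * n for n in range(N))   # sum of n^2
    sx = sum(x)                          # sum of x[n]
    snx = sum(n * xn for n, xn in enumerate(x))  # sum of n*x[n]
    # Solve:  a*sn2 + b*sn = snx   and   a*sn + b*N = sx
    det = sn2 * N - sn * sn
    a = (snx * N - sn * sx) / det
    b = (sn2 * sx - sn * snx) / det
    return a, b

# Sanity check on noiseless synthetic data (hypothetical slope/intercept):
a, b = fit_line([0.2 * n + 1.0 for n in range(108)])
```

On noiseless linear data the fit recovers the slope and intercept essentially exactly; on the rainfall data the same solver yields the estimates discussed on the slide.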

Real-world Example – Statistical Data Analysis
The estimated increase is a = per year, for a total increase of about 1.85 inches over the course of 108 years. Is it possible that the true value of a is zero? Let's assume that a is zero and then generate 20 realizations, assuming the true model is

X[n] = b + U[n],

where U[n] is a uniformly distributed process with var(U) =. Estimating a and b for each realization, we find that some of the estimated values of a are even negative.

Real-world Example – Statistical Data Analysis
(Matlab code for the simulation shown on the slide.)

Homework
1. Describe a random process that you are likely to encounter in the following situations:
   a. listening to the daily weather forecast
   b. paying the monthly telephone bill
   c. leaving for work in the morning
   Why is each process a random one?
2. For a Bernoulli RP, determine the probability that we will observe an alternating sequence of 1's and 0's for the first 100 samples, with the first sample being a 1. What is the probability that we will observe an alternating sequence of 1's and 0's for all n?
3. Classify the following random processes as discrete-time/discrete-valued, discrete-time/continuous-valued, continuous-time/discrete-valued, or continuous-time/continuous-valued:
   a. temperature in Rhode Island
   b. outcomes for continued spins of a roulette wheel
   c. daily weight of a person
   d. number of cars stopped at an intersection

Homework
4. A random process X[n] is stationary. If it is known that E[X[10]] = 10 and var(X[10]) = 1, determine E[X[100]] and var(X[100]).
5. A Bernoulli random process X[n] that takes on values 0 or 1, each with probability p = 1/2, is transformed using Y[n] = (−1)^n X[n]. Is the random process Y[n] IID?
6. For the randomly phased sinusoid (see slide 19), determine the minimum mean square estimate of X[10] based on observing x[0]. How accurate do you think this prediction will be?
7. For a random process X[n] the mean sequence μ_X[n] and covariance sequence c_X[n_1, n_2] are known. It is desired to predict k samples into the future. If x[n_0] is observed, find the minimum mean square estimate of X[n_0 + k]. Next assume that μ_X[n] = cos(2π f_0 n) and c_X[n_1, n_2] = 0.9^|n_2 − n_1|, and evaluate the estimate. Finally, what happens to your prediction as k → ∞, and why?

Homework
8. Verify that by differentiating the LSE J(b) = ∑_{n=0}^{N−1} (x[n] − b)² with respect to b, setting the derivative equal to zero, and solving for b, we obtain the sample mean.