Published by Susan Bennett. Modified over 9 years ago.
1 EE571 PART 3: Random Processes. Huseyin Bilgekul, EENG 571 Probability and Stochastic Processes, Department of Electrical and Electronic Engineering, Eastern Mediterranean University
2 EE571 Random Processes
3 EE571 Kinds of Random Processes
4 EE571 A RANDOM VARIABLE X is a rule for assigning to every outcome ζ of an experiment a number X(ζ). – Note: X denotes a random variable and X(ζ) denotes a particular value. A RANDOM PROCESS X(t) is a rule for assigning to every ζ a function X(t, ζ). – Note: for notational simplicity we often omit the dependence on ζ. Random Processes
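As a toy illustration of these definitions, a sketch is given below; the process X(t, ζ) = ζ·cos t and the uniform outcome ζ are inventions for this example, not from the slides. Fixing ζ yields a specific time function, while fixing t yields a random variable:

```python
import math
import random

# Toy random process X(t, zeta) = zeta * cos(t); the outcome zeta is a
# random amplitude drawn once per experiment (an illustrative assumption).
def sample_outcome(rng):
    return rng.uniform(-1.0, 1.0)   # the experimental outcome zeta

def X(t, zeta):
    return zeta * math.cos(t)       # the waveform assigned to zeta

rng = random.Random(0)
zeta = sample_outcome(rng)
# Fixed zeta: a specific time function (one sample function of the ensemble).
waveform = [X(0.1 * k, zeta) for k in range(5)]
# Fixed t = 0: a random variable (one number per experimental outcome).
ensemble_at_0 = [X(0.0, sample_outcome(rng)) for _ in range(3)]
```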
5 EE571 Conceptual Representation of RP
6 EE571 The set of all possible functions is called the ENSEMBLE. Ensemble of Sample Functions
7 EE571 A general random or stochastic process can be described as: – A collection of time functions (signals) corresponding to various outcomes of random experiments. – A collection of random variables observed at different times. Examples of random processes in communications: – Channel noise, – Information generated by a source, – Interference. Random Processes
8 EE571 Let ζ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ζ) is assigned. The collection of such waveforms forms a stochastic process. The set {ζ} and the time index t can be continuous or discrete (countably infinite or finite). For fixed ζ ∈ S (the set of all experimental outcomes), X(t, ζ) is a specific time function. For fixed t, X(t, ζ) is a random variable. The ensemble of all such realizations over time represents the stochastic process. Random Processes
9 EE571 Random Process for a Continuous Sample Space
10 EE571 Random Processes
11 EE571 Wiener Process Sample Function
12 EE571
13 EE571 Sample Sequence for Random Walk
14 EE571 Sample Function of the Poisson Process
15 EE571 Random Binary Waveform
16 EE571 Autocorrelation Function of the Random Binary Signal
17 EE571 Example
18 EE571
19 EE571 Random Processes Introduction (1)
20 EE571 Introduction A random process is a process (i.e., a variation in time or one-dimensional space) whose behavior is not completely predictable and can be characterized by statistical laws. Examples of random processes: – Daily stream flow – Hourly rainfall of storm events – Stock index
21 EE571 Random Variable A random variable is a mapping function which assigns outcomes of a random experiment to real numbers. Occurrence of the outcomes follows a certain probability distribution. Therefore, a random variable is completely characterized by its probability density function (PDF).
22 EE571 STOCHASTIC PROCESS
23 EE571 STOCHASTIC PROCESS
24 EE571 STOCHASTIC PROCESS
25 EE571 STOCHASTIC PROCESS The term “stochastic process” appears mostly in statistical textbooks, whereas the term “random process” is frequently used in books on engineering applications.
26 EE571 STOCHASTIC PROCESS
27 EE571 DENSITY OF STOCHASTIC PROCESSES First-order densities of a random process A stochastic process is defined to be completely or totally characterized if the joint densities for the random variables X(t₁), X(t₂), …, X(tₙ) are known for all times t₁, t₂, …, tₙ and all n. In general, a complete characterization is practically impossible, except in rare cases. As a result, it is desirable to define and work with various partial characterizations. Depending on the objectives of the application, a partial characterization often suffices to ensure the desired outputs.
28 EE571 For a specific t, X(t) is a random variable with distribution F(x, t) = P[X(t) ≤ x]. The function F(x, t) is defined as the first-order distribution of the random variable X(t). Its derivative with respect to x, f(x, t) = ∂F(x, t)/∂x, is the first-order density of X(t). DENSITY OF STOCHASTIC PROCESSES
29 EE571 If the first-order densities defined for all time t, i.e. f(x,t), are all the same, then f(x,t) does not depend on t and we call the resulting density the first-order density of the random process ; otherwise, we have a family of first-order densities. The first-order densities (or distributions) are only a partial characterization of the random process as they do not contain information that specifies the joint densities of the random variables defined at two or more different times. DENSITY OF STOCHASTIC PROCESSES
30 EE571 Mean and variance of a random process The first-order density of a random process, f(x, t), gives the probability density of the random variables X(t) defined for all time t. The mean of a random process, m_X(t), is thus a function of time specified by m_X(t) = E[X(t)] = ∫ x f(x, t) dx. MEAN AND VARIANCE OF RP For the case where the mean of X(t) does not depend on t, we have m_X(t) = E[X(t)] = m_X (a constant). The variance of a random process, also a function of time, is defined by σ_X²(t) = E[(X(t) − m_X(t))²].
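A minimal sketch of estimating m_X(t) and σ_X²(t) at each time t from an ensemble of realizations; the simple ±1 random-walk process used here is an illustrative choice, not one fixed by the slides:

```python
import random

# Estimate the time-dependent mean and variance of a random process by
# averaging across an ensemble of N realizations.
def realization(rng, T):
    # X(t): running sum of +/-1 steps (a simple random walk), t = 0..T-1
    x, path = 0, []
    for _ in range(T):
        x += rng.choice((-1, 1))
        path.append(x)
    return path

rng = random.Random(1)
N, T = 2000, 10
ensemble = [realization(rng, T) for _ in range(N)]

# m_X(t) and var_X(t) as functions of time; for this walk the position at
# index t is a sum of t+1 steps, so theory gives mean 0 and variance t+1.
mean_t = [sum(p[t] for p in ensemble) / N for t in range(T)]
var_t = [sum((p[t] - mean_t[t]) ** 2 for p in ensemble) / N for t in range(T)]
```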
31 EE571 Second-order densities of a random process For any pair of random variables X(t₁) and X(t₂), we define the second-order density of a random process as f(x₁, x₂; t₁, t₂) or F(x₁, x₂; t₁, t₂). Nth-order densities of a random process The nth-order density functions for X(t) at times t₁, t₂, …, tₙ are given by f(x₁, …, xₙ; t₁, …, tₙ) or F(x₁, …, xₙ; t₁, …, tₙ). HIGHER ORDER DENSITY OF RP
32 EE571 Given two random variables X(t₁) and X(t₂), a measure of the linear relationship between them is specified by E[X(t₁)X(t₂)]. For a random process, t₁ and t₂ go through all possible values; therefore, E[X(t₁)X(t₂)] can change and is a function of t₁ and t₂. The autocorrelation function of a random process is thus defined by R_X(t₁, t₂) = E[X(t₁)X(t₂)]. Autocorrelation function of RP
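The definition R_X(t₁, t₂) = E[X(t₁)X(t₂)] can be estimated by ensemble averaging; the ±1 walk below is only an illustrative process (for it, theory gives R_X(t₁, t₂) = min(t₁, t₂) + 1 with zero-based indexing, since index t holds a sum of t + 1 steps):

```python
import random

# Estimate the autocorrelation R_X(t1, t2) = E[X(t1) X(t2)] by averaging
# the product X(t1) X(t2) over an ensemble of realizations.
def walk(rng, T):
    x, path = 0, []
    for _ in range(T):
        x += rng.choice((-1, 1))
        path.append(x)
    return path

rng = random.Random(2)
N, T = 5000, 8
ensemble = [walk(rng, T) for _ in range(N)]

def R(t1, t2):
    return sum(p[t1] * p[t2] for p in ensemble) / N

# For this walk, R(2, 5) should be close to min(2, 5) + 1 = 3.
```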
33 EE571 Autocovariance Functions of RP
34 EE571 Strict-sense stationarity seldom holds for random processes, except for some Gaussian processes. Therefore, weaker forms of stationarity are needed. Stationarity of Random Processes
35 EE571 [Figure: sample functions X(t) versus time t, with the same PDF of X(t) at every time] Stationarity of Random Processes
36 EE571 Wide Sense Stationarity (WSS) of Random Processes
37 EE571 Equality Note that “x(t, ζᵢ) = y(t, ζᵢ) for every ζᵢ” is not the same as “x(t, ζᵢ) = y(t, ζᵢ) with probability 1”. Equality and Continuity of RP
38 EE571 Equality and Continuity of RP
39 EE571 Mean square equality Mean Square Equality of RP
40 EE571 Equality and Continuity of RP
41 EE571
42 EE571 Random Processes Introduction (2)
43 EE571 Stochastic Continuity
44 EE571 Stochastic Continuity
45 EE571 Stochastic Continuity
46 EE571 Stochastic Continuity
47 EE571 Stochastic Continuity
48 EE571 Stochastic Continuity
49 EE571 A random sequence or a discrete-time random process is a sequence of random variables {X₁(ζ), X₂(ζ), …, Xₙ(ζ), …} = {Xₙ(ζ)}, ζ ∈ S. For a specific ζ, {Xₙ(ζ)} is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations. Stochastic Convergence
50 EE571 The sequence of random variables {Xₙ(ζ)} converges surely to the random variable X(ζ) if the sequence of functions Xₙ(ζ) converges to X(ζ) as n → ∞ for all ζ ∈ S, i.e., Xₙ(ζ) → X(ζ) as n → ∞ for all ζ ∈ S. Sure Convergence (Convergence Everywhere)
51 EE571 Stochastic Convergence
52 EE571 Stochastic Convergence
53 EE571 Almost-sure convergence (Convergence with probability 1)
54 EE571 Almost-sure Convergence (Convergence with probability 1)
55 EE571 Mean-square Convergence
56 EE571 Convergence in Probability
57 EE571 Convergence in Distribution
58 EE571 Convergence with probability one applies to the individual realizations of the random process. Convergence in probability does not. The weak law of large numbers is an example of convergence in probability. The strong law of large numbers is an example of convergence with probability 1. The central limit theorem is an example of convergence in distribution. Remarks
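The weak law can be illustrated numerically: the probability that a fair-coin sample mean deviates from 1/2 by more than 0.1 shrinks as n grows, which is convergence in probability. The coin example and the parameter values are illustrative choices:

```python
import random

# Empirical frequency of large deviations of the sample mean of n iid
# Bernoulli(1/2) variables; it should drop sharply as n increases.
def sample_mean(rng, n):
    return sum(rng.random() < 0.5 for _ in range(n)) / n

rng = random.Random(3)
trials = 400

def freq_dev(n, eps):
    # fraction of trials in which |mean - 0.5| exceeds eps
    return sum(abs(sample_mean(rng, n) - 0.5) > eps for _ in range(trials)) / trials

p_small_n = freq_dev(10, 0.1)    # deviations > 0.1 are common for n = 10
p_large_n = freq_dev(1000, 0.1)  # and rare for n = 1000
```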
59 EE571 Weak Law of Large Numbers (WLLN)
60 EE571 Strong Law of Large Numbers (SLLN)
61 EE571 The Central Limit Theorem
62 EE571 Venn Diagram of Relation of Types of Convergence Note that even sure convergence may not imply mean square convergence.
63 EE571 Example
64 EE571 Example
65 EE571 Example
66 EE571 Example
67 EE571 Ergodic Theorem
68 EE571 Ergodic Theorem
69 EE571 The Mean-Square Ergodic Theorem
70 EE571 The above theorem shows that one can expect a sample average to converge to a constant in mean square sense if and only if the average of the means converges and if the memory dies out asymptotically, that is, if the covariance decreases as the lag increases. The Mean-Square Ergodic Theorem
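A small sketch of the idea: for an iid uniform sequence (whose memory dies out trivially, so the theorem's condition holds; the process choice is illustrative) the time average of a single realization settles near the ensemble mean E[X] = 0.5:

```python
import random

# Time average of one realization of an iid Uniform(0, 1) sequence; by
# mean ergodicity it should approach the ensemble mean 0.5 as n grows.
rng = random.Random(4)
realization = [rng.random() for _ in range(20000)]  # one sample path

def time_average(path, n):
    return sum(path[:n]) / n

short_avg = time_average(realization, 10)      # noisy for small n
long_avg = time_average(realization, 20000)    # close to E[X] = 0.5
```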
71 EE571 Mean-Ergodic Process
72 EE571 Strong or Individual Ergodic Theorem
73 EE571 Strong or Individual Ergodic Theorem
74 EE571 Strong or Individual Ergodic Theorem
75 EE571 Examples of Stochastic Processes iid random process A discrete-time random process {X(t), t = 1, 2, …} is said to be independent and identically distributed (iid) if any finite number, say k, of random variables X(t₁), X(t₂), …, X(t_k) are mutually independent and have a common cumulative distribution function F_X(x).
76 EE571 The joint cdf for X(t₁), X(t₂), …, X(t_k) is given by F(x₁, x₂, …, x_k) = F_X(x₁) F_X(x₂) ⋯ F_X(x_k). It also yields p(x₁, x₂, …, x_k) = p(x₁) p(x₂) ⋯ p(x_k), where p(x) represents the common probability mass function. iid Random Stochastic Processes
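The product form of the joint pmf can be checked numerically for a Bernoulli iid process; p = 0.3 is an arbitrary illustrative value:

```python
# For an iid Bernoulli(p) process, the joint pmf of (X1, ..., Xk) is the
# product of the common marginal pmfs, as stated on the slide.
p = 0.3

def pmf(x):                  # common marginal pmf p(x)
    return p if x == 1 else 1 - p

def joint_pmf(xs):           # product form of the joint pmf
    out = 1.0
    for x in xs:
        out *= pmf(x)
    return out

prob = joint_pmf([1, 0, 1])  # p * (1 - p) * p = 0.3 * 0.7 * 0.3 = 0.063
```

Summing the joint pmf over all 2³ outcomes of (X₁, X₂, X₃) returns 1, as any pmf must.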
77 EE571 Bernoulli Random Process
78 EE571 Random walk process
79 EE571 Let p₀ denote the probability mass function of X₀. The joint probability of X₀, X₁, …, Xₙ is P(X₀ = x₀, X₁ = x₁, …, Xₙ = xₙ) = p₀(x₀) P(X₁ = x₁ | X₀ = x₀) ⋯ P(Xₙ = xₙ | Xₙ₋₁ = xₙ₋₁). Random walk process
80 EE571 Random walk process
81 EE571 The property P(Xₙ₊₁ = xₙ₊₁ | Xₙ = xₙ, …, X₀ = x₀) = P(Xₙ₊₁ = xₙ₊₁ | Xₙ = xₙ) is known as the Markov property. A special case of the random walk is Brownian motion. Random walk process
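A quick simulation of the ±1 random walk discussed on these slides: since the steps are iid with zero mean and unit variance, E[Xₙ] = 0 and Var[Xₙ] = n. The step count n = 50 and the number of trials are arbitrary:

```python
import random

# Simulate the random walk X_n = Z_1 + ... + Z_n with iid steps
# Z_i = +/-1, each with probability 1/2; check E[X_n] = 0, Var[X_n] = n.
rng = random.Random(5)

def final_position(n):
    return sum(rng.choice((-1, 1)) for _ in range(n))

n, trials = 50, 4000
endpoints = [final_position(n) for _ in range(trials)]
mean = sum(endpoints) / trials                        # expect ~0
var = sum((x - mean) ** 2 for x in endpoints) / trials  # expect ~n = 50
```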
82 EE571 Gaussian process A random process {X(t)} is said to be a Gaussian random process if all finite collections of the random process, X₁ = X(t₁), X₂ = X(t₂), …, X_k = X(t_k), are jointly Gaussian random variables for all k and all choices of t₁, t₂, …, t_k. Joint pdf of jointly Gaussian random variables X₁, X₂, …, X_k with mean vector m and covariance matrix C: f(x) = (2π)^(−k/2) |C|^(−1/2) exp[−(1/2)(x − m)ᵀ C⁻¹ (x − m)].
83 EE571 Gaussian process
84 EE571 Time series – AR random process
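The slide names autoregressive (AR) time series without giving details, so the following AR(1) recursion Xₙ = a·Xₙ₋₁ + Wₙ with white noise Wₙ is a hedged sketch of the standard model, not necessarily the exact form used in the course. With |a| < 1 the process is asymptotically stationary with variance σ²/(1 − a²):

```python
import random

# AR(1) sketch: X_n = a * X_{n-1} + W_n, W_n ~ N(0, sigma^2) iid.
# For |a| < 1 the stationary variance is sigma^2 / (1 - a^2).
rng = random.Random(6)
a, sigma = 0.8, 1.0

x, series = 0.0, []
for _ in range(50000):
    x = a * x + rng.gauss(0.0, sigma)
    series.append(x)

sample_mean = sum(series) / len(series)            # expect ~0
sample_var = sum(v * v for v in series) / len(series)
# theory: sigma^2 / (1 - a^2) = 1 / 0.36, about 2.78
```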
85 EE571 The Brownian motion (one-dimensional, also known as random walk) Consider a particle that randomly moves on the real line. Suppose that at small time intervals Δt the particle jumps a small distance Δx, randomly and equally likely to the left or to the right. Let X(t) be the position of the particle on the real line at time t.
86 EE571 Assume the initial position of the particle is at the origin, i.e., X(0) = 0. The position of the particle at time t can be expressed as X(t) = Δx (Z₁ + Z₂ + ⋯ + Z_⌊t/Δt⌋), where Z₁, Z₂, … are independent random variables, each having probability 1/2 of equaling +1 and −1. (⌊t/Δt⌋ represents the largest integer not exceeding t/Δt.) The Brownian motion
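This construction can be simulated directly; choosing step length Δx = √Δt (a standard scaling assumption for this sketch) makes X(t) come out with mean ≈ 0 and variance ≈ t:

```python
import math
import random

# Approximate Brownian motion via the scaled walk on the slide:
# X(t) = dx * (Z_1 + ... + Z_floor(t/dt)), with dx = sqrt(dt).
rng = random.Random(7)
dt = 0.01
dx = math.sqrt(dt)

def brownian_at(t):
    steps = int(t / dt)  # floor(t / dt)
    return dx * sum(rng.choice((-1, 1)) for _ in range(steps))

t, trials = 2.0, 3000
samples = [brownian_at(t) for _ in range(trials)]
mean = sum(samples) / trials                          # expect ~0
var = sum((s - mean) ** 2 for s in samples) / trials  # expect ~t = 2
```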
87 EE571 Distribution of X(t) Let the step length Δx equal √Δt; then X(t) has mean 0 and variance ⌊t/Δt⌋ Δt. For fixed t, if Δt is small then the distribution of X(t) is approximately normal with mean 0 and variance t, i.e., X(t) ~ N(0, t).
88 EE571 Graphical illustration of the distribution of X(t) [Figure: sample paths X(t) versus time t, with the PDF of X(t) spreading as t grows]
89 EE571 If t and h are fixed and Δt is sufficiently small, then the displacement X(t + h) − X(t) is approximately normal with mean 0 and variance h.
90 EE571 Graphical illustration of the distribution of the displacement X(t + h) − X(t). The random variable X(t + h) − X(t) is normally distributed with mean 0 and variance h, i.e., X(t + h) − X(t) ~ N(0, h).
91 EE571 The variance of X(t) depends on t, while the variance of the displacement X(t + h) − X(t) does not. Displacements over non-overlapping time intervals are independent random variables.
92 EE571 [Figure: a sample path of Brownian motion, X versus t]
93 EE571 Covariance and correlation functions of X(t)
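For Brownian motion, the independent increments give Cov(X(t₁), X(t₂)) = min(t₁, t₂). A hedged numeric check using the scaled-walk approximation from the earlier slides (t₁ = 1, t₂ = 3 are arbitrary choices):

```python
import math
import random

# Estimate Cov(X(t1), X(t2)) for approximate Brownian motion; by the
# independent-increments property it should be near min(t1, t2).
rng = random.Random(8)
dt = 0.01
dx = math.sqrt(dt)

def path_at(t1, t2):
    # one realization observed at t1 < t2, built from shared increments
    n1, n2 = int(t1 / dt), int(t2 / dt)
    x = dx * sum(rng.choice((-1, 1)) for _ in range(n1))
    y = x + dx * sum(rng.choice((-1, 1)) for _ in range(n2 - n1))
    return x, y

t1, t2, trials = 1.0, 3.0, 3000
pairs = [path_at(t1, t2) for _ in range(trials)]
cov = sum(x * y for x, y in pairs) / trials  # means are ~0; expect min(t1, t2) = 1
```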