Estimation of Random Variables

Two types of estimation:
1) Estimating parameters/statistics of a random variable (or several) from data.
2) Estimating the value of an inaccessible random variable X based on observations of another random variable Y, e.g. estimating the future price of a stock based on its present (and past) price.
Two conditional estimators:

1) Maximum A Posteriori Probability (MAP) Estimator:

$\hat{x}_{MAP} = \arg\max_x P(X = x \mid Y = y) = \arg\max_x \dfrac{P(Y = y \mid X = x)\, P(X = x)}{P(Y = y)}$

So we need to know the probabilities on the right-hand side to do this estimate, especially P(X = x), which may not be available (remember, X is hard to observe). If X and Y are jointly continuous:

$\hat{x}_{MAP} = \arg\max_x f_{X \mid Y}(x \mid y)$
2) Maximum Likelihood (ML) Estimator:

$\hat{x}_{ML} = \arg\max_x P(Y = y \mid X = x)$

i.e. find the X value under which the observation is likeliest. This is useful when P(Y = y | X = x) is available, i.e. the likelihood of observing a Y value given the value of X is known, e.g. the probability of receiving a 0 on a communication channel given that a 0 or a 1 was sent. If X and Y are jointly continuous:

$\hat{x}_{ML} = \arg\max_x f_{Y \mid X}(y \mid x)$
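As an illustration (not from the slides), here is a minimal sketch of MAP vs. ML estimation of a transmitted bit over a binary symmetric channel; the prior p0, crossover probability eps, and observed value y are made-up numbers chosen to make the two estimators disagree:

```python
# MAP vs. ML estimation of a transmitted bit X from a received bit Y
# over a binary symmetric channel. The prior and crossover probability
# are assumed values for illustration only.

p0 = 0.9    # assumed prior: P(X = 0)
eps = 0.2   # assumed crossover probability: P(Y != X)

prior = {0: p0, 1: 1 - p0}
likelihood = lambda y, x: (1 - eps) if y == x else eps  # P(Y = y | X = x)

def ml_estimate(y):
    # ML: maximize P(Y = y | X = x) over x
    return max((0, 1), key=lambda x: likelihood(y, x))

def map_estimate(y):
    # MAP: maximize P(Y = y | X = x) * P(X = x) over x
    # (the denominator P(Y = y) does not depend on x)
    return max((0, 1), key=lambda x: likelihood(y, x) * prior[x])

y = 1
print("ML estimate: ", ml_estimate(y))   # -> 1 (trusts the observation)
print("MAP estimate:", map_estimate(y))  # -> 0 (the strong prior wins)
```

Note how the MAP estimate uses the prior P(X = x) while the ML estimate does not: with a heavily skewed prior, MAP can overrule the observation.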
Example 6.26: X, Y are jointly Gaussian. Then $f_{X \mid Y}(x \mid y)$ is Gaussian in x with mean $m_X + \rho_{XY}\frac{\sigma_X}{\sigma_Y}(y - m_Y)$, and a Gaussian density peaks at its mean, so

$\hat{x}_{MAP} = m_X + \rho_{XY}\dfrac{\sigma_X}{\sigma_Y}(y - m_Y)$
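A quick numerical sanity check of this result (a sketch; the means, standard deviations, correlation, and observed y below are assumed values for illustration):

```python
# Check that the MAP estimate for jointly Gaussian X, Y is the
# conditional mean. All parameter values are assumed for illustration.
import numpy as np

m_x, m_y = 1.0, 2.0   # assumed means
s_x, s_y = 1.5, 0.5   # assumed standard deviations
rho = 0.8             # assumed correlation coefficient
y = 3.0               # assumed observed value of Y

# f_{X|Y}(x|y) is Gaussian with these conditional moments:
cond_mean = m_x + rho * (s_x / s_y) * (y - m_y)
cond_var = s_x**2 * (1 - rho**2)

# Maximize the (unnormalized) conditional density over a grid of x
x = np.linspace(m_x - 10, m_x + 10, 200001)
f = np.exp(-(x - cond_mean)**2 / (2 * cond_var))
x_map = x[np.argmax(f)]

print("peak of f_{X|Y}:  ", x_map)      # ~3.4
print("conditional mean:", cond_mean)   # 1 + 0.8*(1.5/0.5)*(3-2) = 3.4
```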
Minimum Mean Square Estimator (MMSE): estimate X given observations of Y by the function g(Y) that minimizes the mean square error

$e = E[(X - g(Y))^2]$

Case 1: Constant estimator g(Y) = a. Setting the derivative of $E[(X - a)^2]$ with respect to a to zero gives $a^* = E[X]$.
i.e. the best constant MMSE estimate of X is its mean. The estimation error in this case is

$e = E[(X - E[X])^2] = \mathrm{Var}(X)$

Case 2: Linear estimator g(Y) = aY + b. Minimizing $E[(X - aY - b)^2]$ over a and b gives

$a^* = \rho_{XY}\dfrac{\sigma_X}{\sigma_Y} = \dfrac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(Y)}, \qquad b^* = E[X] - a^* E[Y]$
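A sketch of estimating these coefficients from samples; the data-generating model (X linear in Y plus noise, with made-up parameters) is assumed for illustration:

```python
# Estimate the linear MMSE coefficients a*, b* from samples.
# The data-generating parameters are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
y = rng.normal(2.0, 0.5, n)                          # assumed Y ~ N(2, 0.5^2)
x = 1.0 + 2.4 * (y - 2.0) + rng.normal(0, 0.9, n)    # X linear in Y plus noise

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
a_star = cov_xy / y.var()             # a* = Cov(X, Y) / Var(Y)
b_star = x.mean() - a_star * y.mean() # b* = E[X] - a* E[Y]

print(f"a* ~ {a_star:.3f}, b* ~ {b_star:.3f}")  # close to 2.4 and 1 - 2.4*2 = -3.8
```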
The error achieved by the best linear estimator is

$e = E[(X - a^* Y - b^*)^2] = \sigma_X^2 (1 - \rho_{XY}^2)$
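A Monte Carlo check of this error formula (a sketch; the means, standard deviations, and correlation are assumed values):

```python
# Check that the linear MMSE error equals sigma_X^2 (1 - rho^2).
# Distribution parameters are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
rho, s_x, s_y = 0.8, 1.5, 0.5

# Generate jointly Gaussian (X, Y) with correlation rho
z1, z2 = rng.normal(size=(2, n))
y = 2.0 + s_y * z1
x = 1.0 + s_x * (rho * z1 + np.sqrt(1 - rho**2) * z2)

a = rho * s_x / s_y            # a* from the formula above
b = x.mean() - a * y.mean()    # b* = E[X] - a* E[Y]
mse = np.mean((x - (a * y + b))**2)

print("empirical MSE:      ", mse)                       # ~0.81
print("sigma_X^2 (1-rho^2):", s_x**2 * (1 - rho**2))     # 2.25 * 0.36 = 0.81
```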
When $\rho_{XY} = \pm 1$ the error is zero; when $\rho_{XY} = 0$ the error is $\sigma_X^2$, so a high-variance X is harder to estimate. The best linear estimator reduces to the constant estimator $E[X]$ in this case.
Heuristic explanation of the linear MMSE: rewrite it as

$\hat{X} = E[X] + \rho_{XY}\,\sigma_X \dfrac{Y - E[Y]}{\sigma_Y}$

where $(Y - E[Y])/\sigma_Y$ is the standardized version of Y (zero mean, unit variance). The estimator starts from $E[X]$ and corrects it by the standardized observation scaled by $\rho_{XY}\sigma_X$: the stronger the correlation, the more weight the observation receives.
Case 3: Nonlinear estimator g(Y). Using conditional expectation, the function that minimizes $E[(X - g(Y))^2]$ over all g is

$g^*(y) = E[X \mid Y = y]$
$g^*(y) = E[X \mid Y = y]$ is called the regression curve. The error achieved by the regression curve is

$e = E\big[(X - E[X \mid Y])^2\big] = E[\mathrm{Var}(X \mid Y)]$
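A sketch approximating the regression curve from data by binning Y and averaging X within each bin; the model X = Y^2 + noise is made up for illustration, chosen because the best estimator is genuinely nonlinear there:

```python
# Approximate the regression curve E[X | Y = y] by binning Y and
# averaging X in each bin. The model X = Y^2 + noise is assumed
# for illustration; its true regression curve is y^2.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
y = rng.normal(0.0, 1.0, n)
x = y**2 + rng.normal(0.0, 0.5, n)   # E[X | Y = y] = y^2

bins = np.linspace(-3, 3, 61)
idx = np.digitize(y, bins)
centers = 0.5 * (bins[:-1] + bins[1:])
reg_curve = np.array([x[idx == i].mean() for i in range(1, len(bins))])

# Compare the binned estimate with the true curve at a few points
for c, g in zip(centers[::15], reg_curve[::15]):
    print(f"y ~ {c:+.2f}: binned E[X|Y] ~ {g:.2f}, true y^2 = {c*c:.2f}")
```

In this example Cov(X, Y) = 0 (since E[Y^3] = 0), so the best linear estimator is just the constant E[X], while the regression curve tracks y^2 and achieves a much smaller error.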
For jointly Gaussian X, Y, the optimal nonlinear MMSE estimator is

$E[X \mid Y = y] = m_X + \rho_{XY}\dfrac{\sigma_X}{\sigma_Y}(y - m_Y)$

which is the same as the linear MMSE. Thus, for jointly Gaussian X, Y the linear MMSE estimator is optimal, and it coincides with the MAP estimator.