Introduction to Signal Estimation
Outline (94/10/14)
Scenario

From physical considerations, we know that the voltage θ lies between −V and V volts. The measurement is corrupted by noise that may be modeled as an independent additive zero-mean Gaussian random variable n. The observed variable is r, so

  r = θ + n.

The probability density governing the observation process is then

  p(r | θ) = (2πσ²)^(−1/2) exp{ −(r − θ)² / (2σ²) }.
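As a quick sanity check of this measurement model, the sketch below simulates r = θ + n (the values θ = 0.5 and σ = 0.2 are illustrative choices, not from the slides) and confirms that, because the noise is zero-mean, the observations average out to the true voltage:

```python
import random

def observe(theta, sigma, rng):
    """One noisy measurement r = theta + n, with n ~ N(0, sigma^2)."""
    return theta + rng.gauss(0.0, sigma)

rng = random.Random(0)
theta, sigma = 0.5, 0.2          # illustrative true voltage and noise level
samples = [observe(theta, sigma, rng) for _ in range(10_000)]
mean_r = sum(samples) / len(samples)
# Since E[n] = 0, the sample mean of r should be close to theta.
```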
Estimation model

o Parameter space Λ: the output of the source is a parameter θ ∈ Λ.
o Probabilistic mapping from parameter space to observation space: the probability law p(y | θ) that governs the effect of the parameter on the observation space.
o Observation space Γ: a finite-dimensional space in which the observation Y takes its values.
o Estimation rule: a mapping θ̂(·) of the observation space into an estimate of θ.
Parameter Estimation Problem

o In the composite hypothesis-testing problem, we have a family of distributions on the observation space, indexed by a parameter or set of parameters, and we want to make a binary decision about the parameter.
o In the estimation problem, we want to determine the values of the parameters as accurately as possible from the information embodied in the observation.
o Estimation design philosophies differ according to:
  o the amount of prior information known about the parameter;
  o the performance criteria applied.
Basic approaches to parameter estimation

o Bayesian estimation: the parameters under estimation are assumed to be random quantities related statistically to the observation.
o Nonrandom parameter estimation: the parameters under estimation are assumed to be unknown, without being endowed with any probabilistic structure.
Bayesian Parameter Estimation

Statement of the problem:
o Λ: the parameter space; θ ∈ Λ is the parameter.
o Γ: the observation space of the random variable Y.
o p(y | θ): a distribution on the observation space for each θ, i.e. a mapping from Λ to the distributions on Γ.
o Goal: find a function θ̂ : Γ → Λ such that θ̂(y) is the best guess of the true value of θ based on the observation Y = y.
Bayesian Parameter Estimation

Performance evaluation:
o The solution of this problem depends on the criterion of goodness by which we measure estimation performance, i.e. on the cost-function assignment: C[a, θ] is the cost of estimating a true value θ as a, for a, θ ∈ Λ.
o The conditional risk is this cost averaged over Y for each θ:
  R_θ(θ̂) = E_θ{ C[θ̂(Y), θ] }.
o The Bayes risk: if we adopt the interpretation that the actual parameter value is the realization of a random variable Θ, the Bayes (average) risk is defined as
  r(θ̂) = E{ R_Θ(θ̂) }.
Bayesian Parameter Estimation

o Here w(θ) is the prior density for the random variable Θ. The appropriate design goal is to find an estimator minimizing r(θ̂); such an estimator is known as a Bayes estimate of θ.
o Actually, the conditional risk can be rewritten as
  R_θ(θ̂) = ∫_Γ C[θ̂(y), θ] p(y | θ) dy.
o The Bayes risk can then be formulated as
  r(θ̂) = ∫_Λ R_θ(θ̂) w(θ) dθ.
o The Bayes risk can also be written as
  r(θ̂) = ∫_Γ E{ C[θ̂(y), Θ] | Y = y } p(y) dy.
Bayesian Parameter Estimation

o The last form of the Bayes risk,
  r(θ̂) = ∫_Γ E{ C[θ̂(y), Θ] | Y = y } p(y) dy,
  suggests that for each y ∈ Γ, the Bayes estimate θ̂(y) can be found by minimizing the posterior cost E{ C[a, Θ] | Y = y } over a.
o Assuming that Θ has a conditional density w(θ | y) given Y = y for each y, the posterior cost given Y = y is
  E{ C[a, Θ] | Y = y } = ∫_Λ C[a, θ] w(θ | y) dθ.
o By Bayes' formula, w(θ | y) = p(y | θ) w(θ) / p(y), so the posterior can be derived if we know the prior w(θ) and the law p(y | θ).
o The performance of Bayesian estimation thus depends on the choice of cost function.
Minimum-Mean-Squared-Error (MMSE) Estimation

o Measure the performance of an estimator in terms of the square of the estimation error:
  C[a, θ] = (a − θ)².
o The corresponding Bayes risk, E{ [θ̂(Y) − Θ]² }, is the mean squared error, and the Bayes estimator for this cost is the minimum-mean-squared-error (MMSE) estimator.
o The posterior cost given Y = y under this cost function is
  E{ (a − Θ)² | Y = y } = a² − 2a E{Θ | Y = y} + E{Θ² | Y = y}.
Minimum-Mean-Squared-Error (MMSE) Estimation

o The posterior cost is a quadratic, and hence convex, function of a.
o Therefore it achieves its unique minimum at the point where its derivative with respect to a is zero:
  θ̂_MMSE(y) = E{ Θ | Y = y },
o i.e. the conditional mean of Θ given Y = y. The estimator is also sometimes termed the conditional-mean estimator (CME).
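A small numerical illustration of this claim (the discrete posterior below is hypothetical, chosen only for the demo): scanning the quadratic posterior cost a² − 2a E{Θ|y} + E{Θ²|y} over a grid of candidate estimates, the minimum lands at the conditional mean:

```python
# Hypothetical discrete posterior of Theta given Y = y: value -> probability
posterior = {0.0: 0.2, 1.0: 0.5, 2.0: 0.3}

def posterior_cost(a):
    """E{(a - Theta)^2 | Y = y} under the posterior above."""
    return sum(p * (a - th) ** 2 for th, p in posterior.items())

cond_mean = sum(th * p for th, p in posterior.items())   # E{Theta | Y = y}
grid = [i / 100 for i in range(201)]                     # candidate estimates in [0, 2]
best = min(grid, key=posterior_cost)
# The grid minimizer coincides with the conditional mean (1.1 here).
```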
Minimum-Mean-Squared-Error (MMSE) Estimation

Another derivation (completing the square):
  E{ (a − Θ)² | Y = y } = (a − E{Θ | Y = y})² + Var(Θ | Y = y),
which is minimized over a at a = E{Θ | Y = y}, leaving the conditional variance as the minimum posterior cost.
Minimum-Mean-Absolute-Error (MMAE) Estimation

o Measure the performance of an estimator in terms of the absolute value of the estimation error:
  C[a, θ] = |a − θ|.
o The corresponding Bayes risk, E{ |θ̂(Y) − Θ| }, is the mean absolute error, and the Bayes estimator for this cost is the minimum-mean-absolute-error (MMAE) estimator.
Minimum-Mean-Absolute-Error (MMAE) Estimation

From the definition,
  E{ |a − Θ| | Y = y } = ∫_{−∞}^{a} (a − θ) w(θ | y) dθ + ∫_{a}^{∞} (θ − a) w(θ | y) dθ.
Changing variables (t = a − θ and t = θ − a in the first and the second integral, respectively) gives
  E{ |a − Θ| | Y = y } = ∫_0^∞ P(Θ < a − t | Y = y) dt + ∫_0^∞ P(Θ > a + t | Y = y) dt.
Minimum-Mean-Absolute-Error (MMAE) Estimation

The expression above is a differentiable function of a, and it can be shown that
  (d/da) E{ |a − Θ| | Y = y } = P(Θ < a | Y = y) − P(Θ > a | Y = y).
o The derivative is a non-decreasing function of a.
o As a approaches −∞, its value approaches −1.
o As a approaches +∞, its value approaches +1.
o E{ |a − Θ| | Y = y } achieves its minimum over a at the point where its derivative changes sign.
Minimum-Mean-Absolute-Error (MMAE) Estimation

The condition for the minimum can also be expressed as
  P(Θ ≤ θ̂_MMAE(y) | Y = y) = P(Θ ≥ θ̂_MMAE(y) | Y = y) = 1/2.
o The minimum-mean-absolute-error estimate is thus the median of the conditional density of Θ given Y = y.
o The MMAE estimator is therefore also termed the conditional-median estimator.
o For a given conditional density of Θ, if its mean and median coincide, then the MMSE and MMAE estimators coincide, i.e. they produce the same estimate even though the criteria adopted differ.
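The mean/median distinction matters for skewed posteriors. The sketch below (the sample-based posterior is hypothetical, with one outlier to induce skew) grid-searches the empirical absolute-error cost and finds its minimum at the median, not at the mean:

```python
# Hypothetical skewed posterior, represented by samples (one outlier at 3.0)
theta_samples = [0.1, 0.4, 0.5, 0.9, 3.0]

def mean_abs_error(a):
    """Empirical E{|a - Theta| | Y = y} over the samples."""
    return sum(abs(a - th) for th in theta_samples) / len(theta_samples)

post_mean = sum(theta_samples) / len(theta_samples)           # 0.98
post_median = sorted(theta_samples)[len(theta_samples) // 2]  # 0.5
grid = [i / 100 for i in range(301)]                          # candidates in [0, 3]
best = min(grid, key=mean_abs_error)
# The absolute-error cost is minimized at the median for this skewed posterior.
```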
Maximum A Posteriori Probability (MAP) Estimation

Assume the uniform cost function
  C[a, θ] = 0 if |a − θ| ≤ Δ, and C[a, θ] = 1 if |a − θ| > Δ, for some Δ > 0.
The average posterior cost, given Y = y, of the estimate a is then
  E{ C[a, Θ] | Y = y } = 1 − P(|Θ − a| ≤ Δ | Y = y).
Maximum A Posteriori Probability (MAP) Estimation

Consideration I:
o Assume Θ is a discrete random variable taking values in a finite set, with Δ smaller than the spacing between those values. The average posterior cost is then
  E{ C[a, Θ] | Y = y } = 1 − P(Θ = a | Y = y),
which suggests that to minimize the average posterior cost, the Bayes estimate is given, for each y, by any value of a that maximizes the posterior probability P(Θ = a | Y = y); i.e. the Bayes estimate is the value of θ that has the maximum a posteriori probability of occurring given Y = y.
Maximum A Posteriori Probability (MAP) Estimation

Consideration II:
o Θ is a continuous random variable with conditional density w(θ | y) given Y = y. The posterior cost then becomes
  E{ C[a, Θ] | Y = y } = 1 − ∫_{a−Δ}^{a+Δ} w(θ | y) dθ,
which suggests that the average posterior cost is minimized over a by maximizing the area under w(θ | y) over the interval [a − Δ, a + Δ]. For small Δ and smooth w(· | y), this area is approximately maximized by choosing a to be a point of maximum of w(θ | y); letting Δ be as small as we like, we obtain
  θ̂(y) = argmax_θ w(θ | y),
i.e. θ̂(y) is the value of θ maximizing w(θ | y) over Λ.
Maximum A Posteriori Probability (MAP) Estimation

The MAP estimator can be formulated as
  θ̂_MAP(y) = argmax_{θ ∈ Λ} w(θ | y).
o The uniform cost criterion thus leads to the procedure of estimating θ as the value maximizing the a posteriori density; this is known as the maximum a posteriori probability (MAP) estimate, denoted θ̂_MAP.
o It approximates the Bayes estimate for the uniform cost with small Δ.
Maximum A Posteriori Probability (MAP) Estimation

o MAP estimates are often easier to compute than MMSE, MMAE, or other Bayes estimates.
o A point at which a density achieves its maximum value is termed a mode of the corresponding distribution.
o Therefore, the MMSE, MMAE, and MAP estimates are the mean, median, and mode of the posterior distribution, respectively.
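For a discrete parameter, MAP estimation is a direct maximization of the posterior, and the unconditional density p(y) can be ignored. A minimal sketch (the binary prior and likelihood values below are hypothetical):

```python
# Hypothetical binary parameter with a skewed prior and a noisy observation model
prior = {0: 0.7, 1: 0.3}                     # w(theta)
likelihood = {(0, 0): 0.9, (1, 0): 0.1,      # p(y | theta), keyed by (y, theta)
              (0, 1): 0.2, (1, 1): 0.8}

def map_estimate(y):
    """argmax over theta of p(y | theta) * w(theta); p(y) is not needed."""
    return max(prior, key=lambda th: likelihood[(y, th)] * prior[th])
```

Here observing y = 1 is strong enough evidence to overturn the prior, which by itself favors θ = 0.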
Remarks: modeling the estimation problem

Given:
o the conditional probability density p(y | θ) of Y given θ,
o the prior distribution w(θ) for Θ,
o the conditional probability density w(θ | y) of Θ given Y = y:
o the MMSE estimator is θ̂_MMSE(y) = E{Θ | Y = y};
o the MMAE estimator is the conditional median, P(Θ ≤ θ̂_MMAE(y) | Y = y) = 1/2.
Remarks: modeling the estimation problem

The MAP estimator: θ̂_MAP(y) = argmax_θ w(θ | y).
o For the MAP estimator it is not necessary to compute p(y), because the unconditional density of Y does not affect the maximization over θ.
o θ̂_MAP(y) can therefore be found by maximizing p(y | θ) w(θ) over θ.
o Because the logarithm is an increasing function, θ̂_MAP(y) also maximizes log p(y | θ) + log w(θ) over θ.
o If Θ is a continuous random variable given Y = y, then for sufficiently smooth p(y | θ) and w(θ), a necessary condition for the MAP estimate is the MAP equation:
  ∂/∂θ [ log p(y | θ) + log w(θ) ] = 0 at θ = θ̂_MAP(y).
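Maximizing the log-posterior is how MAP estimation is typically done in practice. A sketch under an assumed conjugate Gaussian model (Θ ~ N(0, s²), Y | θ ~ N(θ, σ²); this specific model is an illustration, not from the slides), where the grid maximizer can be checked against the closed form y·s²/(s² + σ²):

```python
# Assumed conjugate model: Theta ~ N(0, s2), Y given theta ~ N(theta, sigma2)
s2, sigma2 = 1.0, 1.0

def log_posterior_unnorm(theta, y):
    """log p(y | theta) + log w(theta), dropping terms that do not involve theta."""
    return -(y - theta) ** 2 / (2 * sigma2) - theta ** 2 / (2 * s2)

y = 2.0
grid = [i / 1000 for i in range(-4000, 4001)]    # candidates in [-4, 4]
theta_map = max(grid, key=lambda th: log_posterior_unnorm(th, y))
# Closed form for this model: y * s2 / (s2 + sigma2) = 1.0; since the Gaussian
# posterior's mode equals its mean, this also equals the MMSE estimate here.
```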
Example

Probability density of the observation given θ (exponential):
  p(y | θ) = θ e^{−θy}, y ≥ 0.
Prior probability density of Θ (exponential):
  w(θ) = α e^{−αθ}, θ ≥ 0.
The posterior probability density of Θ given Y = y is
  w(θ | y) = p(y | θ) w(θ) / p(y) = (y + α)² θ e^{−(y+α)θ}, θ ≥ 0,
where
  p(y) = ∫_0^∞ p(y | θ) w(θ) dθ = α / (y + α)².
Example

The MMSE estimate is the conditional mean:
  θ̂_MMSE(y) = E{Θ | Y = y} = ∫_0^∞ θ w(θ | y) dθ = 2 / (y + α).
o The Bayes risk is the average of the posterior cost, so the minimum MSE is the average of the conditional variance of Θ:
  MMSE = E{ Var(Θ | Y) }.
Example

The conditional variance given Y = y is
  Var(Θ | Y = y) = E{Θ² | Y = y} − (E{Θ | Y = y})² = 2 / (y + α)²,
so the minimum MSE is
  MMSE = E{ 2 / (Y + α)² }.
Example

MMAE:
o From the definition, the MMAE estimate is the median of w(θ | y):
  ∫_0^{θ̂_MMAE(y)} w(θ | y) dθ = 1/2.
o Because Θ is a continuous random variable given Y = y, the MMAE estimate can be obtained by solving this equation.
o Changing the variable to x = (y + α)θ gives
  ∫_0^T x e^{−x} dx = 1/2, where T = (y + α) θ̂_MMAE(y).
Example

o The solution is θ̂_MMAE(y) = T / (y + α), where T solves 1 − (1 + T) e^{−T} = 1/2 and T ≈ 1.68.
The MAP estimate is the mode of the posterior:
o θ̂_MAP(y) = argmax_θ (y + α)² θ e^{−(y+α)θ} = 1 / (y + α).
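The constant T ≈ 1.68 quoted on the slide can be checked numerically. Assuming the median condition takes the form (1 + T) e^{−T} = 1/2 (the median equation of a gamma density with shape 2; this reading is inferred from the quoted value, since the original equation image is missing), a bisection sketch:

```python
import math

def f(t):
    """(1 + t) * exp(-t) - 1/2: the assumed median condition, decreasing in t."""
    return (1.0 + t) * math.exp(-t) - 0.5

lo, hi = 0.0, 10.0            # f(0) = +0.5 and f(10) < 0, so a root lies between
for _ in range(60):           # bisection down to machine-level precision
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid
T = 0.5 * (lo + hi)
# T comes out near 1.68, matching the value quoted on the slide.
```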
Example: multiple observations

With n independent observations Y = (Y₁, …, Yₙ), the likelihood becomes p(y | θ) = θⁿ e^{−θ Σᵢ yᵢ}, and the posterior is again a gamma density, now with shape n + 1 and rate Σᵢ yᵢ + α; the MMSE, MMAE, and MAP estimates follow from its mean, median, and mode as before.
Nonrandom (real) parameter estimation

o A problem in which we have a parameter, indexing the class of observation statistics, that is not modeled as a random variable but is nevertheless unknown.
o We do not have enough prior information about the parameter to assign a prior probability distribution to it.
o We want to treat the estimation of such parameters in an organized manner.
Statement of the problem

Given the observation Y = y, what is the best estimate of θ?
o θ is real and no prior information about its true value is available, so the only averaging of cost that can be done is with respect to the distribution of Y given θ, i.e. the conditional risk
  R_θ(θ̂) = E_θ{ (θ̂(Y) − θ)² }.
o We cannot generally expect to minimize the conditional risk uniformly in θ.
o For any particular value θ₀, the conditional mean-squared error at θ₀ can be made zero by choosing θ̂(y) ≡ θ₀ for all observations y.
o However, such an estimator performs poorly if θ₀ is not near the true value of θ.
Statement of the problem

o Unless an estimator achieves minimum conditional mean-squared error for every θ, it is not a good estimator in this uniform sense.
o Consider the conditional mean E_θ{ θ̂(Y) }:
o if E_θ{ θ̂(Y) } = θ for all θ, we say that the estimate is unbiased;
o in general we have a biased estimate, with bias b_θ(θ̂) = E_θ{ θ̂(Y) } − θ;
o the variance of the estimator is
  Var_θ(θ̂) = E_θ{ (θ̂(Y) − E_θ{θ̂(Y)})² }.
MVUE

For an unbiased estimator, the variance equals the conditional mean-squared error under θ:
  E_θ{ (θ̂(Y) − θ)² } = Var_θ(θ̂) when E_θ{ θ̂(Y) } = θ.
The best we can generally hope for is the minimum-variance unbiased estimator (MVUE): the unbiased estimator whose variance is smallest for every value of θ.
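A Monte Carlo check of unbiasedness (the sample mean for i.i.d. Gaussian observations Yᵢ = θ + nᵢ is a standard unbiased estimator; the parameter values are illustrative): across many repeated experiments its average matches θ, and its variance matches σ²/n.

```python
import random

def sample_mean(xs):
    """The sample mean: an unbiased estimator of theta when Y_i = theta + N(0, sigma^2)."""
    return sum(xs) / len(xs)

rng = random.Random(42)
theta, sigma, n, trials = 3.0, 1.0, 25, 4000
estimates = [sample_mean([theta + rng.gauss(0.0, sigma) for _ in range(n)])
             for _ in range(trials)]
avg = sum(estimates) / trials                         # approx theta (unbiased)
var = sum((e - avg) ** 2 for e in estimates) / trials # approx sigma^2 / n = 0.04
```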
94/10/1435 Q & A