General Classes of Lower Bounds on Outage Error Probability and MSE in Bayesian Parameter Estimation. Tirza Routtenberg, Dept. of ECE, Ben-Gurion University of the Negev.


General Classes of Lower Bounds on Outage Error Probability and MSE in Bayesian Parameter Estimation. Tirza Routtenberg, Dept. of ECE, Ben-Gurion University of the Negev. Supervisor: Dr. Joseph Tabrikian.

Outline
■ Introduction
■ Derivation of a new class of lower bounds on the probability of outage error
■ Derivation of a new class of lower bounds on the MSE
■ Bound properties: tightness conditions, relation to the ZZLB
■ Examples
■ Conclusion

Introduction: Bayesian parameter estimation
Goal: estimate the unknown parameter θ based on the observation vector x.
Assumptions: θ and x are random variables; the observation cdf and the posterior pdf are known.
Applications: radar/sonar, communications, biomedical, audio/speech, ...

Introduction: parameter estimation criteria
Two criteria are considered:
■ Mean-square error (MSE)
■ Probability of outage error
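The slide formulas were lost in the transcript; the two criteria written out explicitly, following the h-outage convention of Routtenberg and Tabrikian (a window of width h centered at the estimate), are:

\[
\mathrm{MSE}(\hat{\theta}) = E\big[(\hat{\theta}(\mathbf{x}) - \theta)^2\big],
\qquad
P_e(h) = \Pr\big(|\hat{\theta}(\mathbf{x}) - \theta| \ge h/2\big), \quad h > 0 .
\]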

Introduction: parameter estimation criteria
Advantages of the probability of outage error criterion:
■ Provides meaningful information in the presence of large errors.
■ Governed by the entire error distribution, not only its second moment.
■ Enables prediction of the operation region.
[Plots: MSE vs. SNR and probability of outage error vs. SNR, each exhibiting a small-errors region, a threshold, and a large-errors region.]

Introduction: MMSE estimation
The minimum MSE is attained by the MMSE estimator:
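The missing formulas are the standard results: the MMSE estimator is the posterior mean, and the minimum MSE is the expected posterior variance:

\[
\hat{\theta}_{\mathrm{MMSE}}(\mathbf{x}) = E[\theta \mid \mathbf{x}],
\qquad
\mathrm{MSE}_{\min} = E\big[\operatorname{var}(\theta \mid \mathbf{x})\big].
\]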

Introduction: h-MAP estimation
The h-MAP estimator maximizes the posterior probability mass in a window of length h centered at the estimate; the corresponding minimum probability of h-outage error follows by evaluating the outage probability at this estimator.
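A reconstruction of the missing expressions, which follows directly from the h-outage definition above (since Pr(|θ̂ − θ| < h/2 | x) is exactly the posterior mass in the window):

\[
\hat{\theta}_{h\text{-MAP}}(\mathbf{x})
= \arg\max_{\vartheta} \int_{\vartheta - h/2}^{\vartheta + h/2} f_{\theta\mid\mathbf{x}}(u \mid \mathbf{x})\, du,
\qquad
P_e^{\min}(h)
= 1 - E\!\left[ \max_{\vartheta} \int_{\vartheta - h/2}^{\vartheta + h/2} f_{\theta\mid\mathbf{x}}(u \mid \mathbf{x})\, du \right].
\]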

Performance lower bounds: motivation
■ Performance analysis
■ Threshold prediction
■ System design
■ Feasibility study
[Plot: a performance measure vs. SNR or number of samples, with the threshold bound marked.]

Performance lower bounds: desired features
■ Computational simplicity
■ Tightness: asymptotically coincides with the optimal performance
■ Validity: independent of the estimator
[Plot: a performance measure vs. SNR or number of samples, with the threshold bound marked.]

Previous work: probability of outage error bounds
Most existing bounds on the probability of outage error are based on its relation to the probability of error in a (binary or multiple-hypothesis) decision procedure. The Kotelnikov inequality provides a lower bound for a uniformly distributed unknown parameter.

Previous work: Bayesian MSE bounds
Bayesian MSE bounds fall into two classes:
■ Weiss–Weinstein class, derived from the covariance inequality: Bayesian Cramér–Rao (Van Trees, 1968), Bayesian Bhattacharyya (Van Trees, 1968), Bobrovsky–Zakai (1976), Weiss–Weinstein (1985), Reuven–Messer (1997).
■ Ziv–Zakai class, derived from the relation to the probability of error in a decision problem: Ziv–Zakai (ZZLB, 1969), Bellini–Tartara (1974), Chazan–Zakai–Ziv (1975), extended ZZLB (Bell, Steinberg, Ephraim, Van Trees, 1997).

General class of outage error probability lower bounds
The probability of outage error is written as an expectation of an indicator of the event {|θ̂(x) − θ| ≥ h/2}. Applying the (reverse) Hölder inequality to this expectation with an auxiliary function and a parameter p > 1, and then choosing the auxiliary function, yields a family of lower bounds; at this stage the bounds still depend on the estimator.
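For reference, the reverse Hölder inequality in the form that matches the parameterization used later (tightest as p → 1⁺); exactly how it is applied to the indicator and the auxiliary function is not recoverable from the transcript:

\[
E[XY] \;\ge\; \big(E[X^{1/p}]\big)^{p}\,\big(E[Y^{1/(1-p)}]\big)^{1-p},
\qquad X, Y \ge 0, \; p > 1 .
\]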

General class of outage error probability lower bounds
Objective: obtain valid bounds that are independent of the estimator θ̂.

General class of outage error probability lower bounds
Theorem: a necessary and sufficient condition for the bound to be valid, i.e., independent of the estimator, is that the auxiliary function weighted by the posterior pdf f(θ|x) is periodic in θ with period h, almost everywhere.

General class of outage error probability lower bounds
Representing the periodic function by its Fourier series (with fundamental period h) yields the general class of bounds, parameterized by the Fourier coefficients and by p.
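The structure of this representation, sketched here since the slide's formula is missing (g denotes the auxiliary function and c_l(x) its Fourier coefficients; the exact form of the resulting bound is omitted):

\[
g(\theta, \mathbf{x}) \, f_{\theta\mid\mathbf{x}}(\theta \mid \mathbf{x})
= \sum_{l=-\infty}^{\infty} c_l(\mathbf{x}) \, e^{\,j 2\pi l \theta / h},
\]

so that each admissible choice of coefficients {c_l(x)} produces a valid lower bound.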

Example: linear Gaussian model
For the linear Gaussian model the posterior pdf is Gaussian, so the minimum h-outage error probability is available in closed form and can be compared against the single-coefficient member of the proposed class of bounds.
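A minimal numerical sketch of the closed-form part, assuming the scalar model x = θ + n with θ ~ N(0, σ_θ²) and independent n ~ N(0, σ_n²); the slide's exact model and parameter values are not recoverable, so these are hypothetical. For this model the posterior is Gaussian with a variance independent of x, the h-MAP estimate is the posterior mean, and the minimum h-outage error probability is 2Q(h/(2σ_post)):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical parameters for the assumed scalar model x = theta + n.
sigma_theta, sigma_n = 1.0, 0.5

# Posterior variance of theta given x (independent of x in this model).
post_var = (sigma_theta**2 * sigma_n**2) / (sigma_theta**2 + sigma_n**2)

def min_outage_prob(h: float) -> float:
    """Minimum h-outage error probability for a Gaussian posterior:
    the error theta - E[theta|x] is zero-mean Gaussian with variance
    post_var, so Pr(|error| >= h/2) = 2 * Q(h / (2 * sigma))."""
    return 2.0 * norm.sf(h / (2.0 * np.sqrt(post_var)))

for h in (0.1, 0.5, 1.0, 2.0):
    print(f"h = {h:.1f}: minimum Pe(h) = {min_outage_prob(h):.4f}")
```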

The tightest subclass of lower bounds
■ The bound is maximized w.r.t. the Fourier coefficients for a given p.
■ Convergence condition: there exist l₀ʰ(θ, x) and α > 0 such that the Fourier coefficients decay sufficiently fast for all |l| ≥ |l₀ʰ(θ, x)|. This mild condition guarantees that the resulting series converges for every p ≥ 1.

The tightest subclass of lower bounds
Under the convergence condition, the tightest bounds in the class are obtained; the maximization is repeated for every observation x and every window size h (h also plays the role of a sampling period in the construction).

The tightest subclass of lower bounds: properties
■ The bound exists.
■ The bound becomes tighter as p decreases.
■ For p → 1⁺, the tightest bound in the subclass is obtained.

General class of MSE lower bounds
The probability of outage error and the MSE are related in two ways: through Chebyshev's inequality, and through a known probability identity that expresses the MSE as an integral of outage probabilities.
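Both relations written out (standard results, reconstructed here since the slide formulas were images; ε = θ̂(x) − θ):

\[
\text{Chebyshev:}\quad
P_e(h) = \Pr\big(|\epsilon| \ge h/2\big) \le \frac{4\,E[\epsilon^2]}{h^2}
\;\;\Longrightarrow\;\;
E[\epsilon^2] \ge \frac{h^2}{4}\,P_e(h),
\]
\[
\text{Identity:}\quad
E[\epsilon^2] = \int_0^\infty \Pr\big(\epsilon^2 \ge t\big)\,dt
= \frac{1}{2}\int_0^\infty h\,P_e(h)\,dh .
\]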

General class of MSE lower bounds
New MSE lower bounds are obtained by substituting a lower bound on the probability of outage error into these relations. For example:
■ General class of MSE bounds: obtained from the general class of outage error probability bounds.
■ The tightest MSE bound: obtained from the tightest outage error probability bound.

General class of lower bounds on different cost functions
Any cost function C(·) that is non-decreasing and differentiable satisfies a similar integral identity; thus the corresponding Bayes risk can be bounded using lower bounds on the probability of outage error. Examples: the absolute error, higher moments of the error.
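The identity in question is the standard layer-cake representation (reconstructed; assuming C(0) = 0):

\[
E\big[C(|\epsilon|)\big]
= \int_0^\infty C'(t)\,\Pr\big(|\epsilon| \ge t\big)\,dt
= \frac{1}{2}\int_0^\infty C'(h/2)\,P_e(h)\,dh ,
\]

so any lower bound on P_e(h) integrates to a lower bound on the risk E[C(|ε|)].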

Properties: relation to the ZZLB
The extended ZZLB and the tightest proposed MSE bound can be rewritten in comparable forms, both as integrals over h of lower bounds on the outage probability.
Theorem: the proposed tightest MSE bound is always tighter than the extended ZZLB.
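For reference, the extended ZZLB of Bell et al. (1997) for a scalar random parameter, up to notation (V{·} denotes the valley-filling operator and P_min(φ, φ + h) the minimum error probability of the binary hypothesis test between θ = φ and θ = φ + h):

\[
E[\epsilon^2] \;\ge\; \frac{1}{2}\int_0^\infty h \; V\!\left\{ \int_{-\infty}^{\infty} \big(f_\theta(\varphi) + f_\theta(\varphi + h)\big)\, P_{\min}(\varphi, \varphi + h)\, d\varphi \right\} dh .
\]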

Properties: relation to the ZZLB
For any convergent sequence of non-negative numbers, the sum is at least as large as its maximal term. Applied to the terms making up the two bounds, this shows that the proposed bound, which retains the full sum, is at least as tight as the ZZLB, which keeps only a single term.
[Comparison figure: ZZLB vs. the proposed bound, indicating where the maximization is taken in each.]

Properties: unimodal symmetric pdf
Theorem:
A. If the posterior pdf f(θ|x) is unimodal, then the proposed tightest outage error probability bound coincides with the minimum probability of outage error for every h > 0.
B. If the posterior pdf f(θ|x) is unimodal and symmetric, then the proposed tightest MSE bound coincides with the minimum MSE.

Example 1
[Plots: statistics of the bounds; the slide content was graphical and is not recoverable from the transcript.]

Example 2
[The model definition and statistics plots; the slide content was graphical and is not recoverable from the transcript.]

Conclusion
The probability of outage error was proposed as a parameter estimation criterion. New classes of lower bounds on the probability of outage error and on the MSE in Bayesian parameter estimation were derived. The proposed tightest MSE bound is shown to be always tighter than the extended Ziv–Zakai lower bound. Tightness conditions:
■ Probability of outage error: unimodal posterior pdf.
■ MSE: unimodal and symmetric posterior pdf.