1
Correntropy as a similarity measure
Weifeng Liu, P. P. Pokharel, Jose Principe
Computational NeuroEngineering Laboratory, University of Florida
http://www.cnel.ufl.edu
weifeng@cnel.ufl.edu
Acknowledgment: This work was partially supported by NSF grants ECS-0300340 and ECS-0601271.
2
Outline
What is correntropy
Interpretation as a similarity measure
Correntropy Induced Metric
Robustness
Applications
3
Correntropy: General Definition
For random variables X and Y, correntropy is
V(X,Y) = E[\kappa_\sigma(X - Y)],
where \kappa_\sigma is the Gaussian kernel with kernel size \sigma,
\kappa_\sigma(u) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{u^2}{2\sigma^2}\right).
Sample estimator:
\hat{V}(X,Y) = \frac{1}{N} \sum_{i=1}^{N} \kappa_\sigma(x_i - y_i).
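A minimal numerical sketch of the sample estimator above, assuming NumPy; the function names and the kernel size are illustrative choices, not from the slides:

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    """Gaussian kernel with kernel size sigma."""
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def correntropy(x, y, sigma=1.0):
    """Sample correntropy: average of the kernel over paired samples."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.mean(gaussian_kernel(x - y, sigma))

# Example: correntropy between a signal and a noisy copy of it
rng = np.random.default_rng(0)
s = np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
print(correntropy(s, s + 0.1 * rng.standard_normal(200), sigma=0.5))
```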
4
Correntropy = ‘Correlation’ + ‘Entropy’
Correlation with higher-order moments:
– Taylor expansion of the Gaussian kernel (see the expansion sketched below)
– When the kernel size is large, the second-order moment dominates
The average over dimensions is the argument of Renyi’s quadratic entropy.
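A sketch of the Taylor-series view referred to above, using the definition V(X,Y) = E[\kappa_\sigma(X - Y)] from the previous slide; this is the standard series for the Gaussian kernel, not copied from the slide itself:

```latex
V_\sigma(X,Y) \;=\; \mathbb{E}\!\left[\kappa_\sigma(X-Y)\right]
            \;=\; \frac{1}{\sqrt{2\pi}\,\sigma}\sum_{n=0}^{\infty}
                  \frac{(-1)^{n}}{2^{n}\, n!\,\sigma^{2n}}\,
                  \mathbb{E}\!\left[(X-Y)^{2n}\right]
```

The n = 1 term is (up to scale) the conventional second-order statistic E[(X - Y)^2], while each higher-order moment is weighted by 1/\sigma^{2n}; for a large kernel size those weights shrink quickly, which is why the second-order moment dominates.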
5
Reproducing Kernel Hilbert Space induced by correntropy (VRKHS)
V(t,s) is symmetric and positive definite, so it defines a unique reproducing kernel Hilbert space, the VRKHS.
– The Wiener filter is an optimal projection in the RKHS defined by the autocorrelation function
– An analytical nonlinear Wiener filter can be framed as an optimal projection in the VRKHS
6
Probabilistic Interpretation
Correntropy is the integral of the joint PDF along the line x = y; as the kernel size shrinks, it approaches the probability density of the event X = Y.
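The interpretation above, written out in terms of the joint density p_{X,Y} (a sketch consistent with the definition of V given earlier):

```latex
V_\sigma(X,Y) \;=\; \iint \kappa_\sigma(x-y)\, p_{X,Y}(x,y)\, dx\, dy,
\qquad
\lim_{\sigma \to 0} V_\sigma(X,Y) \;=\; \int p_{X,Y}(x,x)\, dx
```

For small kernel sizes the kernel acts as an approximate delta function, so correntropy collects the joint density along the line x = y.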
7
Probabilistic Interpretation (figure)
8
Geometric meaning
For two vectors X = (x_1, ..., x_N) and Y = (y_1, ..., y_N), define the function
CIM(X,Y) = \left(\kappa_\sigma(0) - \hat{V}(X,Y)\right)^{1/2},
the correntropy induced metric (CIM).
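A minimal sketch of CIM under the definition above (NumPy assumed; helper names are illustrative):

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def cim(x, y, sigma=1.0):
    """Correntropy induced metric: sqrt(kernel at zero minus sample correntropy)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    v_hat = np.mean(gaussian_kernel(x - y, sigma))
    return np.sqrt(gaussian_kernel(0.0, sigma) - v_hat)

print(cim([1.0, 2.0, 3.0], [1.1, 2.2, 2.7]))
```

Because the Gaussian kernel peaks at zero, the term under the square root is always non-negative.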
9
Correntropy Induced Metric
– CIM is non-negative
– CIM is symmetric
– CIM obeys the triangle inequality
Therefore it is a metric, induced in the input space when one operates with correntropy.
10
Metric contours
Contours of CIM(X, 0) in 2D sample space:
– close to the origin: like the L2 norm
– at intermediate distances: like the L1 norm
– far apart: saturates with large-value elements (direction sensitive)
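A small numerical illustration of the behavior listed above (a sketch only; sigma = 1 and the sample points are arbitrary choices):

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def cim(x, y, sigma=1.0):
    v_hat = np.mean(gaussian_kernel(np.asarray(x, float) - np.asarray(y, float), sigma))
    return np.sqrt(gaussian_kernel(0.0, sigma) - v_hat)

# CIM from the origin for points at the same Euclidean radius r but different directions:
for r in (0.1, 0.5, 1.0, 3.0, 10.0):
    on_axis  = cim([r, 0.0], [0.0, 0.0])                 # one large coordinate
    diagonal = cim([r / np.sqrt(2.0)] * 2, [0.0, 0.0])   # spread over both coordinates
    print(f"r={r:5.1f}  on_axis={on_axis:.4f}  diagonal={diagonal:.4f}")
```

For small r the two directions give nearly equal values (L2-like); for large r both saturate instead of growing, and the value depends on the direction.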
11
CIM versus MSE as a cost function
CIM is a localized similarity measure.
12
CIM is robust to outliers
CIM measures similarity only within a small interval; it does not care how different the samples are outside that interval. This makes it resistant to outliers, in the sense of Huber’s M-estimation (see the comparison sketched below).
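A toy comparison of MSE and CIM under a single gross outlier (illustrative numbers only, reusing the CIM sketch from above):

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def cim(x, y, sigma=1.0):
    v_hat = np.mean(gaussian_kernel(np.asarray(x, float) - np.asarray(y, float), sigma))
    return np.sqrt(gaussian_kernel(0.0, sigma) - v_hat)

rng = np.random.default_rng(1)
target = np.zeros(100)
clean  = 0.1 * rng.standard_normal(100)   # small errors everywhere
spiked = clean.copy()
spiked[0] += 100.0                        # one gross outlier

for name, x in (("clean", clean), ("one outlier", spiked)):
    print(f"{name:12s}  MSE={np.mean((x - target)**2):10.4f}  CIM={cim(x, target):.4f}")
```

The outlier dominates the MSE but only removes one kernel term from the correntropy average, so the CIM barely changes.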
13
Application 1: Matched filter
S: transmitted binary signal
N: channel noise
Y: received signal
14
Application 1: Matched filter
Sampled (1, −1) received signal.
– Linear matched filter: correlate the received samples with the template and threshold.
– Correntropy matched filter: replace the correlation with the correntropy between the received samples and the template (a sketch follows below).
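A minimal sketch of the two detectors for antipodal signaling. The exact correntropy statistic below (comparing correntropy with +s against correntropy with −s) is my reading of the correntropy matched filter, assumed rather than quoted from the slides; NumPy only:

```python
import numpy as np

def gaussian_kernel(u, sigma=1.0):
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def linear_mf_decision(y, s):
    """Classic matched filter: sign of the correlation with the template s."""
    return 1 if np.dot(y, s) >= 0.0 else -1

def correntropy_mf_decision(y, s, sigma=1.0):
    """Decide +1 or -1 according to which template (+s or -s) has larger correntropy with y."""
    v_plus  = np.mean(gaussian_kernel(y - s, sigma))
    v_minus = np.mean(gaussian_kernel(y + s, sigma))
    return 1 if v_plus >= v_minus else -1

# Example with impulsive (heavy-tailed) channel noise:
rng = np.random.default_rng(2)
s = np.ones(20)                              # template for bit +1
y = -s + rng.standard_t(df=1.5, size=20)     # bit -1 transmitted through a noisy channel
print(linear_mf_decision(y, s), correntropy_mf_decision(y, s, sigma=1.0))
```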
15
Application 1: Matched filter (figure: BER versus SNR in dB)
16
Application 2: Robust Regression
X: input variable
f: unknown function
N: noise
Y: observation (Y = f(X) + N)
17
Application 2: Robust Regression
Maximum Correntropy Criterion (MCC): pick the model y = g(x) that maximizes the sample correntropy between the observations and the model outputs (a fitting sketch follows below).
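A minimal sketch of MCC fitting for a linear model by gradient ascent on the sample correntropy of the residuals. Everything here (initialization from least squares, step size, kernel size) is an illustrative assumption, not taken from the slides:

```python
import numpy as np

def gaussian_kernel(u, sigma):
    return np.exp(-u**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def mcc_linear_fit(X, y, sigma=0.5, lr=0.2, n_iter=2000):
    """Maximize mean_i kernel(y_i - X_i @ w) by gradient ascent, starting from least squares."""
    w = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        e = y - X @ w                                             # residuals
        grad = (gaussian_kernel(e, sigma) * e) @ X / (sigma**2 * len(y))
        w += lr * grad                                            # ascend the MCC objective
    return w

# Toy data: a line with Gaussian noise plus a few gross outliers
rng = np.random.default_rng(3)
x = np.linspace(-1.0, 1.0, 200)
X = np.column_stack([x, np.ones_like(x)])                         # slope and intercept
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(200)
y[:10] += 20.0                                                    # outliers
print("least squares:", np.linalg.lstsq(X, y, rcond=None)[0])
print("MCC          :", mcc_linear_fit(X, y))
```

Outliers contribute kernel values near zero, so they are effectively ignored by the gradient; the least-squares fit, in contrast, is pulled toward them.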
18
MCC is M-estimation
Maximizing the correntropy of the error e is equivalent to M-estimation with the robust cost \rho(e) = \kappa_\sigma(0) - \kappa_\sigma(e), so MCC is resistant to outliers in the same sense as Huber-type M-estimators.
19
Significance
Correntropy is a building block of:
– the correntropy nonlinear Wiener filter
– the correntropy matched filter
– the correntropy nonlinear MACE filter
– correntropy Principal Component Analysis
– Renyi’s quadratic entropy
This understanding is crucial to explain the behavior of nonlinear algorithms and high-order statistics!
20
References
[1] I. Santamaria, P. P. Pokharel, and J. C. Principe, “Generalized correlation function: definition, properties and application to blind equalization,” IEEE Trans. Signal Processing, vol. 54, no. 6, pp. 2187-2197, 2006.
[2] P. P. Pokharel, J. Xu, D. Erdogmus, and J. C. Principe, “A closed form solution for a nonlinear Wiener filter,” in Proc. ICASSP 2006.
[3] W. Liu, P. P. Pokharel, and J. C. Principe, “Correntropy: properties and applications in non-Gaussian signal processing,” submitted to IEEE Trans. Signal Processing.