1 Analytic Solution of Hierarchical Variational Bayes Approach in Linear Inverse Problem
Shinichi Nakajima (Nikon Corporation), Sumio Watanabe (Tokyo Institute of Technology)
2 Contents
- Introduction: linear inverse problem; hierarchical variational Bayes [Sato et al.04]; James-Stein estimator; purpose
- Theoretical analysis: setting; solution
- Discussion
- Conclusions
3 Linear inverse problem
Example: magnetoencephalography (MEG). Observation model: B = L J + noise, where
- B: observable, the magnetic field detected by N detectors,
- J: parameter to be estimated, the electric current at M sites,
- L: constant lead field matrix,
- noise: observation noise.
Since the number of sites typically exceeds the number of detectors, the problem is ill-posed!
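The ill-posedness can be illustrated with a short NumPy sketch. The sizes N, M and the matrix entries below are made up for illustration; with fewer detectors than sites, infinitely many currents J fit the data exactly, and the Moore-Penrose pseudoinverse picks the minimum-norm one:

```python
import numpy as np

# Hypothetical toy setup: N = 3 detectors, M = 5 current sites, so the
# lead field matrix L has more columns than rows and B = L @ J is
# underdetermined (ill-posed).
rng = np.random.default_rng(0)
N, M = 3, 5
L = rng.standard_normal((N, M))
J_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
B = L @ J_true  # noiseless observation, for illustration only

# Minimum-norm solution: among all J with L @ J = B, the Moore-Penrose
# pseudoinverse returns the one with smallest Euclidean norm.
J_mn = np.linalg.pinv(L) @ B

print(np.allclose(L @ J_mn, B))  # fits the observation exactly
print(np.linalg.norm(J_mn) <= np.linalg.norm(J_true))  # but with smaller norm
```

Note that `J_mn` generally differs from `J_true`: the data alone cannot distinguish between the many exact solutions, which is why priors (next slide) are needed.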
4 Methods for the ill-posed problem
Model: B = L J + noise.
1. Minimum-norm maximum likelihood.
2. Maximum a posteriori (MAP), with a Gaussian prior on J whose variance b^{-2} is a constant.
3. Hierarchical Bayes, where b^{-2} is also a parameter to be estimated!
Methods 1 and 2 behave similarly; method 3 is very different from 1 and 2.
5 Hierarchical Bayes
Model: B = L J + noise. Prior: each element J_m has its own variance, controlled by a hyperparameter a_m; a hyperprior is placed on the a_m.
Why? We estimate the prior variances from the observation by introducing the hyperprior. If both J and the a_m are estimated by Bayesian methods, many small elements are driven to zero (relevance determination), a.k.a. Automatic Relevance Determination (ARD) [MacKay94, Neal96]. The hierarchy introduces singularities into the model. See [9] if interested.
6 Hierarchical variational Bayes
But full Bayes estimation requires huge computational cost, so we apply variational Bayes (VB) [Sato et al.04]. The variational method minimizes the free energy over a trial posterior, under the restriction that the trial posterior factorizes over the parameter J and the hyperparameter a. Without this restriction, the optimum would be the true Bayes posterior.
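As a rough illustration of how an ARD-style alternating scheme drives irrelevant elements to zero, here is a hedged sketch in the spirit of (but not identical to) the iterative algorithm of [Sato et al.04]. The model sizes, the noise level, and the EM-style precision update a_m = 1/E[J_m^2] are assumptions made for illustration:

```python
import numpy as np

# Illustrative ARD-style iteration: alternate between a Gaussian posterior
# over J (given the per-site precisions a_m) and a point update of each a_m.
rng = np.random.default_rng(1)
N, M, sigma2 = 10, 20, 0.01
L = rng.standard_normal((N, M))
J_true = np.zeros(M)
J_true[[2, 7]] = [3.0, -2.0]          # sparse ground truth: 2 active sites
B = L @ J_true + np.sqrt(sigma2) * rng.standard_normal(N)

a = np.ones(M)                        # per-site ARD precisions
for _ in range(100):
    # Gaussian posterior of J given a: covariance S_J, mean m_J
    S_J = np.linalg.inv(L.T @ L / sigma2 + np.diag(a))
    m_J = S_J @ L.T @ B / sigma2
    # EM-style precision update: a_m = 1 / E[J_m^2]
    a = 1.0 / (m_J**2 + np.diag(S_J))

# count how many posterior means survive pruning
print(np.sum(np.abs(m_J) > 0.1))
```

Irrelevant precisions grow without bound across iterations, shrinking the corresponding posterior means toward zero, which is the relevance-determination effect described on the previous slide.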
7 James-Stein (JS) estimator
K-dimensional mean estimation (a regular model): given samples from a normal distribution with unknown true mean, the ML estimator (the arithmetic mean) is efficient (never dominated by any unbiased estimator), but is inadmissible (dominated by a biased estimator) when K >= 3 [Stein56].
Domination of one estimator over another: its risk is no larger for any true mean, and strictly smaller for a certain true mean.
The James-Stein estimator [James&Stein61] multiplies the ML estimate by a shrinkage factor. A certain relation between empirical Bayes (EB) and JS was discussed in [Efron&Morris73].
[Figure: JS (K=3) vs. ML.]
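Stein's phenomenon can be checked numerically. The Monte-Carlo sketch below uses an arbitrarily chosen true mean and a single observation per trial (n = 1), and compares the empirical total squared-error risks of the ML and JS estimators:

```python
import numpy as np

# Monte-Carlo check of Stein's phenomenon: for K >= 3, the James-Stein
# estimator attains smaller total squared-error risk than ML.
rng = np.random.default_rng(2)
K, trials = 5, 20000
mu = np.ones(K)                              # a fixed (arbitrary) true mean
x = mu + rng.standard_normal((trials, K))    # one observation per trial

# James-Stein: shrink x toward the origin by factor (1 - (K-2)/||x||^2)
norm2 = np.sum(x**2, axis=1, keepdims=True)
x_js = (1.0 - (K - 2) / norm2) * x

risk_ml = np.mean(np.sum((x - mu)**2, axis=1))    # ~ K in expectation
risk_js = np.mean(np.sum((x_js - mu)**2, axis=1))
print(risk_js < risk_ml)   # prints True: JS beats ML on average
```

Note the shrinkage factor can go negative when ||x||^2 < K - 2; clipping it at zero gives the positive-part JS estimator discussed on the next slide, which performs at least as well.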
8 Purpose
[Sato et al.04] derived a simple iterative algorithm based on HVB in the MEG application and experimentally showed good performance.
We theoretically analyze the HVB approach, derive its solution, and discuss the relation between HVB and the positive-part JS estimator, focusing on a simplified version of Sato's approach.
Positive-part JS: the shrinkage factor is clipped at zero, so the estimate is never flipped in sign; its coefficient controls the degree of shrinkage.
10 Setting
Consider time-series data. Following [Sato et al.04], the ARD model and prior are as before, but the hyperparameter is held constant over a time duration U.
[Figure: time series over duration U with a constant hyperparameter.]
11 Summary of setting
Observable: B. Parameter: J (with m-th element J_m). Hyperparameter (constant during U): a.
Model and priors are as on the previous slides, with a constant lead field matrix; the priors are d-dimensional normal distributions with identity covariance scaling, and n denotes the number of samples.
12 Variational condition
Restriction: the trial posterior factorizes over the parameter J and the hyperparameter a. The variational method then yields a stationarity condition for each factor.
13 Theorem 1
Theorem 1: the VB estimator of the m-th element satisfies an implicit equation; it is not explicit!
The HVB solution is similar to the positive-part JS estimator, with degree of shrinkage proportional to U.
15 Proposition
Simply use the positive-part JS estimator instead. It only requires computing a Moore-Penrose inverse, whereas HVB needs iterative calculation.
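A minimal sketch of this non-iterative recipe, assuming elementwise positive-part shrinkage applied to the Moore-Penrose solution. The threshold `chi` below is a placeholder for illustration, not the talk's exact shrinkage factor (which is proportional to U):

```python
import numpy as np

def positive_part_js(L, B, chi):
    """Positive-part JS-type shrinkage of the minimum-norm estimate.

    Single pseudoinverse, no iteration; `chi` is a hypothetical
    shrinkage threshold chosen for illustration.
    """
    J_ml = np.linalg.pinv(L) @ B               # minimum-norm ML estimate
    norm2 = np.maximum(J_ml**2, 1e-12)         # elementwise squared magnitude
    factor = np.maximum(0.0, 1.0 - chi / norm2)  # positive part: never flips sign
    return factor * J_ml

rng = np.random.default_rng(3)
L = rng.standard_normal((4, 6))
B = rng.standard_normal(4)
J = positive_part_js(L, B, chi=0.05)
print(J.shape)   # prints (6,)
```

Small elements (where the shrinkage factor is clipped at zero) are set exactly to zero, mimicking the relevance-determination behavior of HVB without its iterative cost.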
16 Difference between VB and JS
- When the lead fields are orthogonal, the two are asymptotically equivalent.
- When all lead fields are parallel or orthogonal, JS suppresses overfitting more than HVB (it enhances relevance determination).
- Otherwise: future work.
18 Conclusions
HVB provides a result similar to JS estimation in the linear inverse problem. The time duration U affects learning: large U enhances relevance determination.
Future work: the difference from JS in the general case; bounds on the generalization error.
19 Thank you!