1
Independent Factor Analysis
H. Attias, University of California
2
1. Statistical Modeling and Blind Source Separation
BSS (blind source separation) problem: L' sensors, L source signals.
Source signals: mutually independent.
The sources are not observable and unknown; the (linear) mixing process and the noise are also unknown.
Ordinary factor analysis cannot perform BSS: its Gaussian model for p(x_j) captures only 2nd-order statistics and is therefore rotation-invariant in factor space (see the sketch below).
Attacks: projection pursuit, generalized additive models.
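One way to see the rotation invariance (a sketch, assuming the standard factor-analysis model y = Hx + u with x ~ N(0, I) and noise covariance Λ): for any orthogonal Q,

```latex
p(y \mid H) = \mathcal{N}\!\big(y \mid 0,\; HH^{\top} + \Lambda\big)
            = \mathcal{N}\!\big(y \mid 0,\; (HQ)(HQ)^{\top} + \Lambda\big)
            = p(y \mid HQ),
```

so the mixing matrix is identifiable only up to a rotation, and the individual sources cannot be recovered from 2nd-order statistics alone.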
3
ICA: mixing is square (L' = L), invertible, instantaneous, and noiseless.
Non-Gaussian p(x_j): not rotation-invariant, so the maximum-likelihood mixing matrix is unique.
p(x_j): restricted; gradient-ascent maximization methods (the likelihood they climb is sketched below).
IFA: p(x_j) non-Gaussian; generative model with independent sources; EM method with 2 steps:
(1) learning the IF model: mixing matrix, noise covariance, source densities;
(2) source reconstruction.
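For reference, a sketch of the noiseless square-ICA log-likelihood (a standard result under the model y = Hx with invertible H and independent non-Gaussian densities p_j):

```latex
\log p(y \mid H) \;=\; -\log\,\lvert \det H \rvert \;+\; \sum_{j=1}^{L} \log p_j\!\big([H^{-1} y]_j\big).
```

Because the p_j are non-Gaussian, this likelihood is not invariant under rotations of H, which is what makes the maximum unique (up to permutation and scaling).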
4
2. Independent Factor Generative Model
Noise: Gaussian, y = Hx + u with u ~ N(0, Λ).
IF parameters: W = {H, Λ, θ} (mixing matrix, noise covariance, source-density parameters).
Model sensor density: p(y | W) = ∫ p(y | x, W) p(x | θ) dx.
5
Source Model: Factorial Mixture of Gaussians
p(x_i) needs to be general and tractable → MOG (mixture-of-Gaussians) model; the mixture labels q_i act as hidden states.
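A sketch of the MOG source density, assuming the paper's notation (w_{i,q}, μ_{i,q}, ν_{i,q} are the mixing proportions, means, and variances of the states of source i):

```latex
p(x_i \mid \theta_i) \;=\; \sum_{q_i=1}^{n_i} w_{i,q_i}\,\mathcal{N}\!\big(x_i \mid \mu_{i,q_i},\, \nu_{i,q_i}\big),
\qquad
p(x \mid \theta) \;=\; \prod_{i=1}^{L} p(x_i \mid \theta_i).
```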
6
Sensor Model
The factorial MOG is strongly constrained: modifying the mean and variance of a single source state q_i shifts a whole column (line) of the state lattice q, hence "factorial MOG".
7
Generation of sensor signal y
(i) Pick a unit q_i for each source i with probability w_{i,q_i}. (ii) Generate the signals top-down along the first-order Markov chain q → x → y.
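A minimal sketch of this generative process in Python/NumPy, assuming the notation above; the parameter values are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
L, Lp, n_states = 2, 3, 3                       # L sources, L' sensors, states/source

# Illustrative IF parameters W = {H, Lambda, theta}
H = rng.normal(size=(Lp, L))                    # mixing matrix
Lam = 0.01 * np.eye(Lp)                         # noise covariance Lambda
w = np.full((L, n_states), 1.0 / n_states)      # state priors w_{i,q}
mu = rng.normal(size=(L, n_states))             # state means mu_{i,q}
nu = rng.uniform(0.1, 1.0, size=(L, n_states))  # state variances nu_{i,q}

def sample_sensor(T):
    """Generate T sensor vectors y via the chain q -> x -> y."""
    # (i) pick a unit q_i for each source i with probability w_{i,q_i}
    q = np.stack([rng.choice(n_states, size=T, p=w[i]) for i in range(L)], axis=1)
    # (ii) draw each source from its chosen Gaussian state
    x = rng.normal(mu[np.arange(L), q], np.sqrt(nu[np.arange(L), q]))
    # mix linearly and add sensor noise: y = H x + u
    u = rng.multivariate_normal(np.zeros(Lp), Lam, size=T)
    return x @ H.T + u

y = sample_sensor(1000)   # (1000, L') array of sensor signals
```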
8
Co-adaptive MOG: rotation and scaling of a whole line of states.
9
3. Learning the IF Model: Error Function and Maximum Likelihood
Error function: the Kullback-Leibler distance between the observed and model sensor densities. Optimizing E is equivalent to maximizing the likelihood of the data (see the sketch below). There is also a relation to the mean-square point-by-point distance.
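A sketch of the equivalence, assuming E is the KL distance between the empirical sensor density p_0(y) and the model density p(y | W):

```latex
E(W) \;=\; \mathrm{KL}\big[\,p_0(y) \,\big\|\, p(y \mid W)\,\big]
     \;=\; -\frac{1}{N}\sum_{n=1}^{N} \log p(y_n \mid W) \;+\; \text{const},
```

where the constant is the W-independent entropy of p_0, so minimizing the KL distance over W is the same as maximizing the data likelihood.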
10
EM Algorithm
(E) step: calculate the expected value of the complete-data likelihood under the posterior over the hidden sources x and states q.
(M) step: minimize the resulting error function with respect to the IF parameters W.
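In symbols (a standard EM sketch, with current parameters W' and hidden variables x, q):

```latex
\text{E step:}\quad \mathcal{Q}(W, W') = \sum_{n} \mathbb{E}_{p(x, q \mid y_n, W')}\big[\log p(y_n, x, q \mid W)\big],
\qquad
\text{M step:}\quad W \leftarrow \arg\max_{W}\, \mathcal{Q}(W, W').
```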
11
The updated parameters are given in closed form by the M-step equations sketched below.
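A sketch of these updates under the model above (⟨·⟩_n denotes a posterior expectation given data vector y_n; the exact forms in the paper may differ in detail):

```latex
H = \Big(\sum_{n} y_n \langle x^{\top} \rangle_n\Big)\Big(\sum_{n} \langle x x^{\top} \rangle_n\Big)^{-1},
\qquad
\Lambda = \frac{1}{N}\sum_{n}\Big(y_n y_n^{\top} - H \langle x \rangle_n\, y_n^{\top}\Big),
\qquad
w_{i,q} = \frac{1}{N}\sum_{n} p(q_i = q \mid y_n).
```

The state means μ_{i,q} and variances ν_{i,q} follow analogously as responsibility-weighted posterior moments of x_i.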
12
4. Recovering the Sources
In the noise-free case with invertible mixing, the sources follow directly by inverting the mixing. In general there are 2 ways: the LMS estimator and the MAP estimator. Both are non-linear functions of the data, and each satisfies a different optimality criterion.
13
LMS Estimator: minimizes the mean-square reconstruction error E‖x̂(y) − x‖², whose minimizer is the posterior mean x̂_LMS(y) = E[x | y].
MAP Estimator: maximizes the source posterior p(x | y). A simple way to compute it is the iterative method of gradient ascent.
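A minimal sketch of the LMS estimator for the simplified case where each source has a single Gaussian state, so that E[x | y] is linear in y (the full factorial-MOG posterior makes it non-linear); all names here are illustrative:

```python
import numpy as np

def lms_estimate(y, H, Lam, V):
    """Posterior mean E[x | y] for x ~ N(0, V), y = H x + u, u ~ N(0, Lam).

    y: (T, L') sensor data; H: (L', L) mixing matrix;
    Lam: (L', L') noise covariance; V: (L, L) source covariance
    (diagonal for independent sources).
    """
    # Joint Gaussianity gives E[x | y] = V H^T (H V H^T + Lam)^{-1} y
    S = H @ V @ H.T + Lam            # sensor covariance
    A = V @ H.T @ np.linalg.inv(S)   # linear LMS (Wiener) filter
    return y @ A.T                   # (T, L) source estimates
```

In the full IF model the posterior over x is a mixture of Gaussians indexed by the states q, and E[x | y] becomes a state-responsibility-weighted combination of such linear estimates, hence non-linear in y.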
14
5. IFA: Simulation Results
5-second-long speech, music, and synthesized signals
17
6. IFA with Many Sources: Factorized Variational Approximation
EM becomes intractable as the number of sources in the IF model increases (the number of joint source states grows exponentially with the number of sources).
The root of the intractability is the choice of p' as the exact posterior.
Variational approach: feedforward probabilistic models with a factorized posterior (see the sketch below).
In the exact IF model, the sources conditioned on a data vector are correlated: the posterior covariance is non-diagonal. In the factorized variational approximation, the sources are independent even when conditioned on a data vector.
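A sketch of the factorization, with p' standing in for the exact, correlated posterior:

```latex
p'(x, q \mid y) \;=\; \prod_{i=1}^{L} p'(x_i, q_i \mid y) \;\approx\; p(x, q \mid y, W),
```

with the factors chosen to make the resulting bound on the likelihood as tight as possible.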
18
EM learning rule
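These rules follow from minimizing a variational free energy (a standard sketch; F upper-bounds the negative log-likelihood):

```latex
F(p', W) \;=\; \mathbb{E}_{p'}\big[\log p'(x, q \mid y)\big] \;-\; \mathbb{E}_{p'}\big[\log p(y, x, q \mid W)\big] \;\ge\; -\log p(y \mid W),
```

minimized alternately over the variational parameters of p' (E-like step) and over the IF parameters W (M-like step).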
19
Mean-Field Equations
For fixed W = W', the variational parameters satisfy coupled fixed-point (mean-field) equations, solved by iterating them to convergence. Learning rules for the remaining parameters are similarly derived by fixing W = W' and solving the corresponding stationarity conditions.