
1 Modeling and Estimation of Dependent Subspaces
J. A. Palmer¹, K. Kreutz-Delgado², B. D. Rao², Scott Makeig¹
¹Swartz Center for Computational Neuroscience, ²Department of Electrical and Computer Engineering, University of California San Diego
September 11, 2007

2 Outline
Previous work on adaptive source densities
Types of dependency
– Variance dependency
– Skew dependency
– Non-radially symmetric dependency
Normal Variance-Mean Mixtures
Examples from EEG

3 Independent Source Densities
A general classification of sources: sub- and super-Gaussian.
Super-Gaussian = more peaked than Gaussian, with heavier tails.
Sub-Gaussian = flatter and more uniform, with shorter tails than Gaussian.
[Figure: example densities: Gaussian, sub-Gaussian, super-Gaussian, and sub- AND super-Gaussian]
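A quick numerical check of this classification is the excess kurtosis, which is positive for super-Gaussian and negative for sub-Gaussian densities. A minimal sketch (my example, not from the slides):

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
n = 200_000

gauss = rng.standard_normal(n)        # excess kurtosis ~ 0
laplace = rng.laplace(size=n)         # super-Gaussian: peaked, heavy tails
uniform = rng.uniform(-1, 1, size=n)  # sub-Gaussian: flat, short tails

for name, x in [("Gaussian", gauss), ("Laplace", laplace), ("Uniform", uniform)]:
    # Fisher definition: 0 for a Gaussian, >0 super-Gaussian, <0 sub-Gaussian
    print(f"{name:8s} excess kurtosis = {kurtosis(x):+.2f}")
```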

4 Extended Infomax
The (independent) source models used in the Extended Infomax algorithm (Lee) are: super-Gaussian (logistic) and sub-Gaussian (Gaussian mixture).
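As a rough illustration (the slide's exact parameterizations are not preserved here, so these forms and the mixture parameters mu and sigma are my assumptions), the two model densities can be evaluated as follows:

```python
import numpy as np

def logistic_pdf(u):
    # Logistic density, a standard super-Gaussian model: (1/4) sech^2(u/2)
    return 0.25 / np.cosh(u / 2.0) ** 2

def gauss_pair_pdf(u, mu=1.0, sigma=0.5):
    # Symmetric two-Gaussian mixture, a standard sub-Gaussian (bimodal) model
    norm = 1.0 / (sigma * np.sqrt(2 * np.pi))
    return 0.5 * norm * (np.exp(-(u - mu) ** 2 / (2 * sigma ** 2))
                         + np.exp(-(u + mu) ** 2 / (2 * sigma ** 2)))

u = np.linspace(-4, 4, 9)
print(logistic_pdf(u))
print(gauss_pair_pdf(u))
```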

5 Scale Mixture Representation
Gaussian scale mixtures (GSMs) are mixtures of zero-mean Gaussian densities with different variances:
p(x) = ∫ N(x; 0, ξ) p(ξ) dξ
A random variable with a GSM density can be represented as the product of a standard Normal random variable Z and the square root of an arbitrary non-negative random variable ξ: X = ξ^{1/2} Z.
Sums of a random number of random variables lead to GSMs (Rényi).
Multivariate densities can be modeled as the product of a non-negative scalar and a Gaussian random vector: x = ξ^{1/2} z.
[Figure: component Gaussians and the resulting Gaussian scale mixture]
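A minimal numerical check of the representation (my example, not from the slides): with exponential mixing ξ ~ Exp(1), X = ξ^{1/2} Z is Laplace-distributed with scale 1/√2:

```python
import numpy as np
from scipy.stats import laplace, kstest

rng = np.random.default_rng(1)
n = 100_000

xi = rng.exponential(1.0, size=n)   # non-negative mixing variable (variance)
z = rng.standard_normal(n)          # standard Normal
x = np.sqrt(xi) * z                 # GSM sample: X = xi^(1/2) Z

# Exponential mixing of the variance yields a Laplace density, scale 1/sqrt(2)
print(kstest(x, laplace(scale=1 / np.sqrt(2)).cdf))
```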

6 Super-Gaussian Mixture Model
Generalization of the Gaussian mixture model to super-Gaussian mixtures. The update rules are similar to those of the Gaussian mixture model, but include the variational parameters ξ.

7 Gaussian Scale Mixture Examples 1
Generalized Gaussian, 0 < p < 2: p(x) ∝ exp(−|x|^p). The mixing density is related to the positive alpha-stable density S_{p/2}.
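For reference, SciPy implements the generalized Gaussian as gennorm, with shape parameter beta playing the role of the exponent p; a quick check that beta < 2 gives positive excess kurtosis, i.e. a super-Gaussian:

```python
from scipy.stats import gennorm

for beta in (0.5, 1.0, 1.5, 2.0):
    # gennorm pdf: beta / (2 * Gamma(1/beta)) * exp(-|x|**beta)
    k = float(gennorm.stats(beta, moments='k'))  # Fisher (excess) kurtosis
    print(f"beta={beta}: excess kurtosis = {k:+.2f}")
```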

8 Gaussian Scale Mixture Examples 2
Generalized Logistic, α > 0. The mixing density is the Generalized Kolmogorov density.

9 Gaussian Scale Mixture Examples 3
Generalized Hyperbolic. The mixing density is the Generalized Inverse Gaussian.
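A minimal sampling sketch of this construction (the parameter values are illustrative): draw the variance from SciPy's geninvgauss and multiply its square root by a standard normal to obtain a symmetric generalized hyperbolic variate:

```python
import numpy as np
from scipy.stats import geninvgauss, kurtosis

rng = np.random.default_rng(2)
n = 100_000

# Generalized Inverse Gaussian mixing density. SciPy parameterization:
# pdf(x) proportional to x**(p-1) * exp(-b*(x + 1/x)/2); p, b illustrative
p, b = -0.5, 1.0
xi = geninvgauss.rvs(p, b, size=n, random_state=rng)

z = rng.standard_normal(n)
x = np.sqrt(xi) * z                     # symmetric generalized hyperbolic

print("excess kurtosis:", kurtosis(x))  # > 0: super-Gaussian
```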

10 Dependent Subspaces
Dependent sources are modeled by a Gaussian scale mixture, i.e. a Gaussian vector with a common scalar multiplier, yielding "variance dependence": x = ξ^{1/2} z.
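A quick demonstration of variance dependence (my example): with a shared scalar ξ, the components of x remain uncorrelated, but their powers are positively correlated:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

xi = rng.exponential(1.0, size=n)   # one shared scale per sample
z = rng.standard_normal((n, 2))     # independent Gaussian pair
x = np.sqrt(xi)[:, None] * z        # common multiplier -> dependence

print("corr(x1, x2)     =", np.corrcoef(x[:, 0], x[:, 1])[0, 1])         # ~ 0
print("corr(x1^2, x2^2) =", np.corrcoef(x[:, 0]**2, x[:, 1]**2)[0, 1])   # > 0
```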

11 Dependent Multivariate Densities
Multiply a Gaussian vector by a common scalar: x = ξ^{1/2} z. Taking derivatives of both sides of the resulting scale-mixture identity relates the univariate GSM, evaluated at the radius ‖x‖, to its multivariate counterpart.

12 Dependent Multivariate Densities
Define the linear operator V. Then, given a univariate GSM, one can form a multivariate GSM, and the posterior moments needed for EM can be calculated.
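A sketch of the missing equations, reconstructed from the GSM definition on slide 5; this is a standard scale-mixture identity, and the notation is my assumption rather than necessarily the slide's exact form:

```latex
% Operator turning a univariate GSM into higher-dimensional ones:
(Vf)(x) \equiv -\frac{1}{x}\,\frac{df}{dx}(x)
% Applied to a zero-mean Gaussian kernel of variance \xi:
V\,\mathcal{N}(x;0,\xi) = \tfrac{1}{\xi}\,\mathcal{N}(x;0,\xi)
                        = 2\pi\,\mathcal{N}_3(x;0,\xi)
% Hence, for odd d, a d-variate GSM from a univariate one:
p_d(\mathbf{x}) \propto \bigl(V^{(d-1)/2} p\bigr)(\lVert \mathbf{x}\rVert)
% Posterior moment used in the EM update, from p'(x) = -x\,E[\xi^{-1}\mid x]\,p(x):
E[\xi^{-1} \mid x] = -\frac{p'(x)}{x\,p(x)}
```

For example, for the Laplacian p(x) ∝ e^{−|x|} this gives E[ξ^{−1} | x] = 1/|x|, the familiar EM reweighting factor for sparse priors.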

13 Examples in R³
Given a univariate GSM p(x), a dependent multivariate density in R³ is obtained by applying the operator V once. Example: Generalized Gaussian. Example: Generalized Logistic.

14 Non-radial Symmetry
Use Generalized Gaussian vectors to model non-radially symmetric dependence.

15 Generalized Hyperbolic
The Generalized Hyperbolic density (Barndorff-Nielsen, 1982) is a GSM, and the posterior on the scale is Generalized Inverse Gaussian. The same construction applies to a Generalized Gaussian scale mixture.
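Concretely, under a standard GIG parameterization (my choice; the slide's may differ), the normal variance mixture is conjugate: if ξ ~ GIG(λ, δ, γ) and x | ξ ~ N(0, ξ), then

```latex
p(\xi \mid x) \;\propto\;
  \xi^{\lambda-1} e^{-\frac{1}{2}\left(\frac{\delta^2}{\xi} + \gamma^2\xi\right)}
  \cdot \xi^{-1/2} e^{-\frac{x^2}{2\xi}}
\;\Longrightarrow\;
\xi \mid x \;\sim\; \mathrm{GIG}\!\left(\lambda - \tfrac12,\;
  \sqrt{\delta^2 + x^2},\; \gamma\right)
```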

16 Hypergeneralized Hyperbolic
The posterior moment for EM is given in closed form; this yields the "Hypergeneralized Hyperbolic" density.
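Under the same assumed parameterization, the EM moment follows from the standard GIG moment identities in terms of modified Bessel functions K_ν:

```latex
E[\xi^{-1} \mid x] \;=\; \frac{\gamma}{\sqrt{\delta^2 + x^2}}\,
  \frac{K_{\lambda - 3/2}\!\left(\gamma\sqrt{\delta^2 + x^2}\right)}
       {K_{\lambda - 1/2}\!\left(\gamma\sqrt{\delta^2 + x^2}\right)}
```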

17 Generalized Gaussian Scale Mixtures
More generally, evaluating a multivariate GSM at ‖x‖^{p/2} and integrating over R^d to normalize, we can formulate a multivariate GGSM from any multivariate GSM.

18 Skew Dependence
Skew is modeled with "location-scale mixtures".

19 Skew Models
For a multivariate GSM and any vector β, the density can be rewritten as a normal variance-mean mixture, which is equivalent to the generative model x = βξ + ξ^{1/2} z: the common scalar ξ now shifts the mean as well as scaling the variance, producing skew.
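A minimal sketch of the location-scale (normal variance-mean) mixture, with β chosen arbitrarily for illustration:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(4)
n = 200_000

beta = 1.0                          # skew direction/strength (illustrative)
xi = rng.exponential(1.0, size=n)   # non-negative mixing variable
z = rng.standard_normal(n)

x = beta * xi + np.sqrt(xi) * z     # location-scale mixture: skewed
print("sample skewness:", skew(x))  # > 0 for beta > 0
```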

20 EEG
[Figure: source types: brain sources, ocular sources, scalp muscle sources, external EM sources, heartbeat]

21 [figure slide]

22 [figure slide]

23 Pairwise Mutual Information
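The slide itself is a figure; for context, pairwise mutual information between source activations can be estimated from a 2-D histogram. A minimal sketch (my implementation, not the authors'):

```python
import numpy as np

def pairwise_mi(x, y, bins=32):
    """Histogram-based mutual information estimate (in nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                  # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)    # marginals
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0                         # avoid log(0)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(5)
xi = rng.exponential(1.0, 100_000)
z = rng.standard_normal((100_000, 2))
x = np.sqrt(xi)[:, None] * z               # variance-dependent pair
print(pairwise_mi(x[:, 0], x[:, 1]))       # > 0 despite zero correlation
```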

24 Maximize Block Diagonality

25 [figure slide]

26 [figure slide]

27 Variance Dependency
Variance dependence can be estimated directly using 4th-order cross moments: find the covariance of source power, cov(s_i², s_j²). This finds components whose activations are "active" at the same times, i.e. "co-modulated".
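A minimal sketch of the power-covariance computation for a set of source activations S (rows = sources); the variable names are mine:

```python
import numpy as np

def power_covariance(S):
    """Covariance of source power: 4th-order cross moments of activations.

    S: array of shape (n_sources, n_samples) of source activations.
    Returns the (n_sources, n_sources) covariance matrix of S**2.
    """
    P = S ** 2       # instantaneous power of each source
    return np.cov(P) # cov(s_i^2, s_j^2); large off-diagonal entries
                     # indicate co-modulated sources

# Example: two variance-dependent sources plus one independent source
rng = np.random.default_rng(6)
n = 100_000
xi = rng.exponential(1.0, n)
S = np.vstack([np.sqrt(xi) * rng.standard_normal(n),
               np.sqrt(xi) * rng.standard_normal(n),
               rng.standard_normal(n)])
print(np.round(power_covariance(S), 2))
```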

28 Mutual Information / Power Covariance
Most of the dependence in mutual information is captured by the covariance of power (summed over 50 lags). Some pairs of sources are more sensitive to variance dependence.

29 Variance Dependent Sources

30 Marginal Histograms are "Sparse"
However, the joint pairwise density is approximately "radially symmetric". Radially symmetric non-Gaussian densities are dependent.

31 Conclusion
We described a general framework for modeling dependent sources.
Estimation of model parameters is carried out using the EM algorithm.
Models include variance dependency, non-radially symmetric dependence, and skew dependence.
Application to the analysis of EEG sources.

