Application of Independent Component Analysis (ICA) to Beam Diagnosis


Application of Independent Component Analysis (ICA) to Beam Diagnosis
Xiaobiao Huang, Indiana University / Fermilab
5th MAP Meeting at IU, Bloomington, 3/18/2004

Content
Review of MIA*
Principles of ICA
Comparisons (ICA vs. PCA**)
Brief Summary of Booster Results
*Model Independent Analysis (MIA); see J. Irwin, Chun-xi Wang, et al.
**MIA is a Principal Component Analysis (PCA) method.

Review of MIA
1. Organize the BPM turn-by-turn data into an m×T matrix (m BPMs, T turns); each row is made zero mean.
2. Perform SVD of the data matrix.
3. Identify modes: each mode consists of a spatial pattern (m×1 vector) and a temporal pattern (1×T vector).
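
A minimal NumPy sketch of these three steps (the matrix sizes, the random stand-in data, and the variable names are illustrative assumptions, not part of the original analysis):

```python
import numpy as np

# Illustrative sizes: m BPMs, T turns; random numbers stand in for real orbit data
m, T = 48, 1000
B = np.random.randn(m, T)

# 1. Organize the BPM turn-by-turn data: make each row (each BPM) zero mean
B = B - B.mean(axis=1, keepdims=True)

# 2. Perform SVD: B = U S V^T
U, S, Vt = np.linalg.svd(B, full_matrices=False)

# 3. Identify modes: mode k has spatial pattern U[:, k] (m x 1) and
#    temporal pattern Vt[k, :] (1 x T); the singular values S order
#    the modes by how much variance they carry
spatial_0, temporal_0 = U[:, 0], Vt[0, :]
```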

Review of MIA: Features
1. The two leading modes are the betatron modes.
2. Noise reduction.
3. Degree-of-freedom analysis to locate local modes (e.g. a bad BPM).
4. And more …
Comment: MIA is a Principal Component Analysis (PCA) method.

A Model of Turn-by-turn Data
BPM turn-by-turn data are considered as a linear* mixture of source signals**:
(1) Global sources: betatron motion, synchrotron motion, higher-order resonances, coupling, etc.
(2) Local sources: malfunctioning BPMs.
Notes: *A linear transfer function of the BPM system is assumed. **This is also the underlying model of MIA.

A Model of Turn-by-turn Data
The source signals are assumed to be independent, meaning
p(s_1, s_2, …, s_n) = ∏_i p_i(s_i),
where p{·} is the joint probability density function (pdf) and p_i(s_i) is the marginal pdf of s_i. This property is called statistical independence. Independence is a stronger condition than uncorrelatedness: independence ⟹ uncorrelatedness.
The source signals can be identified from the measurements, under some assumptions, with Independent Component Analysis (ICA).

An Introduction to ICA*
Three routes toward source-signal separation; each makes a certain assumption about the source signals:
1. Non-Gaussian: source signals are assumed to have non-Gaussian distributions.
2. Non-stationary: source signals have slowly changing power spectra.
3. Time-correlated: source signals have distinct power spectra. This is the route we are going to explore.
*Often also referred to as Blind Source Separation (BSS).

ICA with Second-order Statistics*
The model: x(t) = A s(t) + n(t), with
x(t): measured signals; s(t): source signals; n(t): random noise; A: mixing matrix.
Note: *See A. Belouchrani, et al., on Second Order Blind Identification (SOBI).
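
To make the mixing model concrete, here is a small synthetic illustration; the two toy source signals, the tune values, the matrix sizes, and the noise level are all invented for the example:

```python
import numpy as np

T = 1000
t = np.arange(T)

# Two toy source signals with distinct spectra (illustrative only)
s = np.vstack([np.cos(2 * np.pi * 0.21 * t),     # fast, betatron-like oscillation
               np.cos(2 * np.pi * 0.005 * t)])   # slow, synchrotron-like oscillation

A = np.random.randn(8, 2)          # unknown mixing matrix: 8 "BPMs", 2 sources
n = 0.01 * np.random.randn(8, T)   # white, spatially uncorrelated noise
x = A @ s + n                      # measured signals, x(t) = A s(t) + n(t)
```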

ICA with Second-order Statistics: Assumptions
(1) Source signals are temporally correlated, with no overlap between the power spectra of different source signals. As a convention, the source signals are normalized so that ⟨s_i(t)²⟩ = 1.
(2) Noises are temporally white and spatially uncorrelated.

ICA with Second-order Statistics
The time-lagged covariance matrix is
C_x(τ) = ⟨x(t) x(t+τ)ᵀ⟩ = A C_s(τ) Aᵀ + C_n(τ),
where C_s(τ) is diagonal by the independence assumption (and the noise term vanishes for τ ≠ 0). So the mixing matrix A is the diagonalizer of the sample covariance matrices C_x(τ). Although in principle A can be found as an approximate joint diagonalizer of C_x(τ) for a selected set of lags τ, a two-phase approach is taken to facilitate the joint diagonalization algorithm and for noise reduction.
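
A short sketch of forming the sample time-lagged covariance matrices; the particular lag set and the symmetrization of the finite-sample estimate are assumptions of this illustration:

```python
import numpy as np

def lagged_cov(x, tau):
    """Sample time-lagged covariance C_x(tau) = <x(t) x(t+tau)^T>
    for a zero-mean m x T data matrix x."""
    T = x.shape[1]
    c = x[:, :T - tau] @ x[:, tau:].T / (T - tau)
    return 0.5 * (c + c.T)        # symmetrize the finite-sample estimate

x = np.random.randn(8, 1000)      # stand-in for zero-mean BPM data
taus = [1, 2, 5, 10]              # example set of lags
C = [lagged_cov(x, tau) for tau in taus]
```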

ICA with Second-order Statistics: Algorithm
1. Data whitening: eigen-decompose the zero-lag covariance C_x(0) = U₁D₁U₁ᵀ + U₂D₂U₂ᵀ, where D₁, D₂ are diagonal; set D₂ (the small, noise-dominated eigenvalues) to zero to remove noise, and whiten the data with the retained part.
Benefits of whitening: reduction of dimension, noise reduction, and only a rotation (unitary W) is then needed to diagonalize.
2. Joint approximate diagonalization: find the unitary W (n×n for n retained modes) that jointly diagonalizes the whitened time-lagged covariance matrices.
The mixing matrix A and the source signals s then follow from the whitening matrix and W.
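
Putting the two phases together, here is a compact NumPy sketch; to stay short it rotates to diagonalize a single whitened lagged covariance (the full SOBI method jointly diagonalizes several lags with a Jacobi-type algorithm), and all variable names are illustrative:

```python
import numpy as np

def second_order_ica(x, n_sources, tau=1):
    """Sketch of second-order ICA on a zero-mean m x T data matrix x:
    whitening, then rotation to diagonalize one time-lagged covariance."""
    m, T = x.shape

    # Phase 1: whitening via the zero-lag covariance, keeping the leading modes
    C0 = x @ x.T / T
    d, U = np.linalg.eigh(C0)                  # ascending eigenvalues
    idx = np.argsort(d)[::-1][:n_sources]      # keep the largest (signal) modes
    V = np.diag(d[idx] ** -0.5) @ U[:, idx].T  # n x m whitening matrix
    z = V @ x                                  # whitened, dimension-reduced data

    # Phase 2: orthogonal rotation diagonalizing a whitened lagged covariance
    C_tau = z[:, :T - tau] @ z[:, tau:].T / (T - tau)
    C_tau = 0.5 * (C_tau + C_tau.T)            # symmetrize
    _, W = np.linalg.eigh(C_tau)               # orthogonal rotation

    s = W.T @ z                                # estimated source signals
    A = np.linalg.pinv(V) @ W                  # estimated mixing matrix
    return A, s
```

In practice W would come from jointly diagonalizing several whitened lagged covariances, but the overall flow is the same: whiten, rotate, and map back to BPM space with the pseudo-inverse of the whitening matrix.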

Linear Optics Function Measurements
The spatial and temporal patterns can be used to measure the beta function (β), the phase advance (ψ), and the dispersion (D_x).
1. Beta function and phase advance: the betatron motion is decomposed into a sine-like signal and a cosine-like signal; β and ψ follow from the two spatial patterns, with constants a, b to be determined.
2. Dispersion: the orbit shift due to synchrotron oscillation, coupled through dispersion.
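
A hedged sketch of the first item: the sum-of-squares and arctangent relations below are the standard sine/cosine-pair formulas rather than a verbatim transcription of the slide, and the constants a, b of the slide are left as a common scale factor and phase offset:

```python
import numpy as np

def beta_and_phase(A_cos, A_sin):
    """From the spatial patterns of the cosine-like and sine-like betatron
    modes (one entry per BPM), return the beta function up to a common
    scale factor 'a' and the betatron phase up to a common offset 'b'."""
    beta_unscaled = A_cos**2 + A_sin**2           # beta_i ~ a * (A_cos_i^2 + A_sin_i^2)
    psi = np.unwrap(np.arctan2(A_sin, A_cos))     # psi_i ~ atan2(A_sin_i, A_cos_i) + b
    return beta_unscaled, psi
```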

Comparison between PCA and ICA
Both take a global view of the BPM data and aim at re-interpreting the data with a linear transform. Both assume no knowledge of the transform matrix in advance. Both find uncorrelated components.
1. However, the two methods use different criteria to define the goal of the linear transform. PCA: express most of the variance of the data in the fewest possible orthogonal components (de-correlation + ordering). ICA: find the components with the least mutual information (independence).
2. ICA makes use of more information in the data than just the covariance matrix (here it uses the time-lagged covariance matrices).

Comparison between PCA and ICA
So ICA modes are more likely to have a single physical origin, while PCA modes (especially the higher modes) can be mixtures. ICA (potentially) has extra benefits while retaining those of the PCA method:
1. More robust betatron-motion measurements (less sensitive to disturbing signals).
2. It facilitates the study of other modes (synchrotron mode, higher-order resonances, etc.).

A case study: PCA vs. ICA
Data taken with the Fermilab Booster in DC mode, starting turn index 4235, length 1000 turns. Horizontal and vertical data were put in the same data matrix, (x, z)ᵀ; similar results are obtained if only x or z is considered. Only the temporal patterns and their FFT spectra are shown, and only the first 4 modes are compared, for lack of space. The example supports the statement made on the previous slide.

A case study: PCA vs. ICA. ICA Modes 1 and 4.

A case study: PCA vs. ICA. ICA Modes 2 and 3.

A case study: PCA vs. ICA. PCA Modes 1 and 4.

A case study: PCA vs. ICA. PCA Modes 2 and 3.

A case study: PCA vs. ICA. ICA Modes 8 and 14.

A case study: PCA vs. ICA. PCA Modes 8 and 14.

Another Case Study with APS Data*: ICA Modes 1 and 3. *Data supplied by Weiming Guo

Another Case Study with APS Data*: PCA Modes 1 and 3. *Data supplied by Weiming Guo

Booster Results (, ) (b) (a) (c) (1915,1000)*, MODE 1: (a) Spatial pattern; (b) temporal pattern; (c) FFT spectrum of (b) *(Starting turn index, number of turns) 11/10/2018

Booster Results (, ) (b) (a) (c) (1915,1000)*, MODE 2: (a) Spatial pattern; (b) temporal pattern; (c) FFT spectrum of (b) 11/10/2018

Booster Results (, ) (a) σ=7% (b) σ =3 deg Comparison of (, ) between MAD model and measurements. (a) Measured  with error bars. (b) phase advance in a period (S-S). Note: Horizontal beam size is about 20-30 mm at large ; Betatron amplitude was about 0.6mm; BPM resolution 0.08mm. 11/10/2018

Booster Results (D_x)
1000 turns from turn index 1. (a) Temporal pattern; (b) spatial pattern. δ(t=0) = -0.3×10⁻³.

Booster Results (D_x)
Comparison of dispersion between the MAD model and measurements: (a) σ_D = 0.11 m.

Summary
ICA provides a new perspective and technique for BPM turn-by-turn data analysis. ICA could be more useful than the PCA method for studying coupling and higher-order modes.
More work is needed to:
Explore new algorithms.
Refine the algorithms to suit BPM data.
Gain a more rigorous understanding of ICA and PCA.