1
LHC beam mode classification
Dr Ing Gianluca Valentino Department of Communications and Computer Engineering University of Malta Data Science Research Group
2
Outline Introduction – LHC Machine Cycle Features used
LSTM model training Results Conclusions
3
Introduction – LHC Machine Cycle
[Diagram: phases of the LHC machine cycle – Injection, Ramp, Flat Top, Squeeze, Adjust, Stable Beams]
4
Problem formulation We want to build a classifier which can predict the following four beam mode transitions using only beam loss data: Transition between injection and ramp Transition between ramp and flat top Transition between flat top and squeeze Transition between squeeze and collisions
5
Dataset generation There are ~3600 Beam Loss Monitors (BLMs) around the LHC to measure local beam losses. They provide 1 Hz data at various running sums (RS09 = 1.31 s was chosen, similar to what is normally used in multi-turn loss analysis). Data was obtained from Timber (the CERN logging service) for 168 pp physics fills, taking +/- 50 seconds around each change in beam mode. Only losses from the 42 BLMs at the IR7 collimators were considered, so each training sample has dimension 100 x 42. BLM Ionization Chambers in LHC
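The windowing step above can be sketched as follows; this is a minimal illustration with random data standing in for the Timber extracts, and the function and variable names are assumptions, not the Timber API:

```python
import numpy as np

# blm_signals: 1 Hz readings for the 42 IR7 collimator BLMs over one fill,
# shape (n_seconds, 42); t_change: index (in seconds) of the logged beam-mode change.
def extract_window(blm_signals, t_change, half_width=50):
    """Cut a +/- 50 s window around the beam-mode change -> one (100, 42) sample."""
    return blm_signals[t_change - half_width : t_change + half_width]

# Toy illustration: one hour of fake 1 Hz data for 42 BLMs.
rng = np.random.default_rng(0)
fill = rng.random((3600, 42))
sample = extract_window(fill, t_change=1800)
print(sample.shape)  # -> (100, 42)
```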
6
Some examples of the multivariate time-series BLM data: Start Ramp, Start Flat Top, Start Squeeze, Start Adjust
7
Dataset generation
Feature scaling: each multivariate time series was normalized by the BLM signal at the TCP.C6L7.B1 (primary) collimator, where we generally expect the highest losses.
Summary of classes:

Beam mode transition   # samples in dataset
Start of ramp          168
Start of flat top      166
Start of squeeze       131
Start of adjust        151
Total                  616
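The per-sample scaling described above can be sketched as below; the column index of the TCP.C6L7.B1 BLM within the 42-channel array is an assumption made for illustration:

```python
import numpy as np

TCP_COL = 0  # assumed position of the primary-collimator BLM among the 42 channels

def normalize_sample(sample, tcp_col=TCP_COL):
    """Divide every BLM channel by the TCP.C6L7.B1 signal, time step by time step."""
    ref = sample[:, tcp_col:tcp_col + 1]  # (100, 1); usually carries the largest losses
    return sample / ref

rng = np.random.default_rng(1)
x = rng.random((100, 42)) + 0.1   # strictly positive toy losses
xn = normalize_sample(x)
print(np.allclose(xn[:, TCP_COL], 1.0))  # reference channel becomes 1 -> True
```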
8
Recurrent Neural Network
Motivation: Not all problems can be solved with a neural network structure having a fixed number of inputs and outputs. Measurement data is often sequential (time-series). In practice we can have different input/output scenarios.
9
Recurrent Neural Network
[Diagram: RNN unrolled through time. At each step t, the input x_t enters through weights w_x, the hidden state h_t feeds forward through w_h to h_t+1, and each h_t produces an output y_t through w_y.]
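One unrolled step of a vanilla RNN, matching the weights w_x, w_h, and w_y in the diagram, can be sketched as below; this is a plain-NumPy sketch of the generic recurrence, not the LSTM used later:

```python
import numpy as np

def rnn_step(x_t, h_prev, w_x, w_h, w_y):
    h_t = np.tanh(x_t @ w_x + h_prev @ w_h)  # new hidden state from input and history
    y_t = h_t @ w_y                          # output at time t
    return h_t, y_t

# Toy dimensions: 42 BLM channels in, 8 hidden units, 4 output classes.
rng = np.random.default_rng(2)
w_x = rng.normal(size=(42, 8))
w_h = rng.normal(size=(8, 8))
w_y = rng.normal(size=(8, 4))

h = np.zeros(8)
for x_t in rng.normal(size=(100, 42)):  # run over one (100, 42) sample
    h, y = rnn_step(x_t, h, w_x, w_h, w_y)
print(y.shape)  # -> (4,)
```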
10
Problem of long-term dependencies
Consider the difference between: “The clouds are in the sky”. “I lived in France for three years while I was working for a software development company. I can speak fluent French”. Predicting “sky” needs only nearby context, but predicting “French” depends on information from much earlier in the sequence. Classical RNNs are not capable of learning these long-term dependencies -> we need LSTMs.
11
Long Short Term Memory (LSTM)
Instead of a single neural network layer, an LSTM cell contains four interacting layers.
12
LSTM cell state The cell state is the horizontal line running across the top of the diagram. Information can flow along it unchanged or with only minor, gate-controlled modifications.
13
“Forget gate” layer Decides which values of the cell state to reset.
14
“Input gate” layer Sigmoid layer: decides which values of the cell state to write to. Tanh layer: creates a vector of new candidate values to write to the cell.
15
Update cell state The LSTM applies the decisions to the memory cell:
16
“Output gate” layer A sigmoid layer decides which values of the cell state to output.
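The four layers from the last slides (forget gate f, input gate i, candidate values g, output gate o) can be sketched as one LSTM step in NumPy; the weight layout here (all four layers stacked in one matrix) is a common convention, assumed for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    z = x_t @ W + h_prev @ U + b      # all four layers computed in one matrix product
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)   # three sigmoid gates
    g = np.tanh(g)                                 # candidate values
    c_t = f * c_prev + i * g          # update the cell state: forget, then write
    h_t = o * np.tanh(c_t)            # output gate filters the new cell state
    return h_t, c_t

# Toy dimensions: 42 inputs, 8 hidden units.
rng = np.random.default_rng(3)
n = 8
W = rng.normal(size=(42, 4 * n))
U = rng.normal(size=(n, 4 * n))
b = np.zeros(4 * n)

h = c = np.zeros(n)
h, c = lstm_step(rng.normal(size=42), h, c, W, U, b)
print(h.shape)  # -> (8,)
```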
17
ML training An RNN-LSTM model was trained to predict the output class.
The output of the LSTM was forwarded to a single Dense layer of size 4 with a softmax activation function. One-hot encoding was used to represent the output (e.g. class #2 -> [0,0,1,0]). Train/test ratio used: 80% / 20%. Implementation: Python and Keras.
18
ML training Cross-validation was done to determine the best parameters for the following: Number of LSTM neurons: [8, 16, 32, 64, 128, 256] -> 32 picked as best Optimizer: [Adam, SGD, RMSprop, Adadelta] -> Adam picked as best Dropout: [0.1, 0.2, 0.3, 0.4] -> 0.2 picked as best
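Putting the slides together, the model under the cross-validated parameters can be sketched in Keras as below; any architecture detail beyond what the slides state (e.g. where exactly dropout is applied) is an assumption:

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense

model = Sequential([
    Input(shape=(100, 42)),          # 100 s window x 42 IR7 collimator BLMs
    LSTM(32),                        # 32 units, picked by cross-validation
    Dropout(0.2),                    # best dropout rate found
    Dense(4, activation="softmax"),  # one output per beam-mode transition
])
model.compile(optimizer="adam",                  # best optimizer found
              loss="categorical_crossentropy",   # matches the one-hot targets
              metrics=["accuracy"])
```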
19
Results Accuracy and loss on training and testing sets
20
Results Classification Report from Scikit-Learn:

Beam mode transition   precision   recall   f1-score
Start of ramp          1.00        0.94     0.97
Start of flat top      0.83        0.89
Start of squeeze       0.75        0.86     0.80
Start of adjust        0.72        0.84     0.78
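A report like the one above is produced with Scikit-Learn's `classification_report`; the labels below are toy values for illustration, not the predictions behind the table:

```python
from sklearn.metrics import classification_report

classes = ["ramp", "flat top", "squeeze", "adjust"]
y_true = [0, 1, 2, 3, 0, 1, 2, 3]   # toy ground-truth beam-mode transitions
y_pred = [0, 1, 2, 3, 0, 1, 3, 3]   # toy model predictions (one squeeze missed)

report = classification_report(y_true, y_pred,
                               target_names=classes,
                               output_dict=True)   # dict form for programmatic use
print(round(report["squeeze"]["recall"], 2))  # -> 0.5 (one of two squeezes found)
```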
21
Conclusions Demonstrated the applicability of LSTMs in classifying beam mode transitions. To our knowledge, this is the first time RNNs have been used on LHC data. Achieved an accuracy of ~87% on unseen data. Future work: also try using orbit data from the BPMs.