Machine Learning Study


Machine Learning Study: SPEAR3 BTS Injection Efficiency
Faya Wang and Xiaobiao Huang, Feb. 19, 2019

Outline: introduction to BTS injection efficiency; raw data study; neural network for machine learning; learning from the model

BOOST to SPEAR3 (BTS) layout. Injection efficiency from BOOST to SPEAR3 is affected by the injected beam emittance and trajectory, the SPEAR3 dynamic aperture, and the environment (e.g. temperature). [Figure: BTS layout between BOOST and SPEAR3]

Injection efficiency data. Injection efficiency is measured by 3 different methods: ACM, ToroidBeam, and BOO-QM; BOO-QM provides the highest-quality data (less noise, better accuracy). [Figure: injection efficiency history for 2017, 2018, 2019]

Injection efficiency data preparation – clean-up. Cut injection-efficiency readings above 200% and below 50%, and remove jitter; a total of 2.96% of the data is modified by this process. [Figure: cleaned data for 2017, 2018, 2019]
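A minimal sketch of this clean-up, assuming the raw data sits in a pandas DataFrame with a hypothetical injeff column in percent; the jitter removal shown here is a simple rolling-median outlier cut, which may differ from the method actually used in the study.

```python
import pandas as pd

def clean_injeff(df: pd.DataFrame, lo: float = 50.0, hi: float = 200.0,
                 window: int = 11, max_dev: float = 10.0) -> pd.DataFrame:
    """Drop implausible injection-efficiency readings and de-jitter the rest."""
    n0 = len(df)
    # Cut readings above 200% or below 50% (outside the plausible range).
    df = df[(df["injeff"] >= lo) & (df["injeff"] <= hi)].copy()
    # Remove jitter: drop points far from a centered rolling median (assumed criterion).
    med = df["injeff"].rolling(window, center=True, min_periods=1).median()
    df = df[(df["injeff"] - med).abs() <= max_dev]
    print(f"modified/removed {100.0 * (1 - len(df) / n0):.2f}% of the entries")
    return df
```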

Beam trajectory in BTS [G=FY17, B=FY18, Y=FY19]

Overall beam trajectory -- upstream

Overall beam trajectory -- downstream

SPEAR3 setup: Undulator gap

Environment: outside temperature

Correlation of beam trajectory and injection efficiency. [Figure: FY17, FY18, FY19; injection efficiency by the 3 different methods]

Correlation of temperatures and injection efficiency. [Figure: FY17, FY18, FY19; ambient temperature and ground temperature]

Correlation of undulator gaps and injection efficiency. [Figure: FY17, FY18, FY19]

2. Model inputs. Injection efficiency is affected by the beam trajectory, the SPEAR3 acceptance, and some hidden features. Because some downstream steering affects the beam in a way that cannot be captured by the BPMs, the upstream BPMs together with the downstream steering settings are used as inputs. Inputs: upstream BPMs (10), downstream steerers (7), temperatures (2), undulator gaps (3); a total of 22 variables.
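A sketch of how the 22 inputs might be assembled, assuming hypothetical column names for the BPM readings, steering currents, temperatures, and gaps (only "Gap05" is named in the slides; the other identifiers below are placeholders).

```python
import numpy as np
import pandas as pd

# Hypothetical column names; the real archiver/PV names will differ.
BPM_COLS   = [f"bpm_u{i:02d}" for i in range(1, 11)]   # 10 upstream BPM readings
STEER_COLS = [f"steer_d{i}"   for i in range(1, 8)]    # 7 downstream steering currents
TEMP_COLS  = ["temp_ambient", "temp_ground"]           # 2 temperatures
GAP_COLS   = ["gap03", "gap04", "gap05"]               # 3 undulator gaps

FEATURES = BPM_COLS + STEER_COLS + TEMP_COLS + GAP_COLS   # 22 input variables in total
TARGET = "injeff"

def make_xy(df: pd.DataFrame):
    """Return the (N, 22) input matrix and the injection-efficiency target vector."""
    X = df[FEATURES].to_numpy(dtype=np.float32)
    y = df[TARGET].to_numpy(dtype=np.float32)
    return X, y
```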

3. Data setup. Total number of data entries: 130,698. 40 days of data are used for test. Split: training 63.8%, validation 27.4%, test 8.8%, with each block = 2 days.
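A minimal sketch of a block-wise split under these numbers; whether the 2-day blocks were assigned chronologically or at random is not stated in the slides, so the random assignment below is an assumption.

```python
import numpy as np

def block_split(timestamps: np.ndarray, block_days: float = 2.0,
                frac=(0.638, 0.274, 0.088), seed: int = 0):
    """Assign whole 2-day blocks of samples to train/val/test; return index arrays."""
    day = 86400.0                                    # timestamps assumed in seconds
    block_id = ((timestamps - timestamps.min()) // (block_days * day)).astype(int)
    blocks = np.unique(block_id)
    rng = np.random.default_rng(seed)
    rng.shuffle(blocks)                              # random block assignment (assumed)
    n_train = int(round(frac[0] * len(blocks)))
    n_val   = int(round(frac[1] * len(blocks)))
    train_b = blocks[:n_train]
    val_b   = blocks[n_train:n_train + n_val]
    test_b  = blocks[n_train + n_val:]
    idx = np.arange(len(timestamps))
    return (idx[np.isin(block_id, train_b)],
            idx[np.isin(block_id, val_b)],
            idx[np.isin(block_id, test_b)])
```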

4. Neural network setup. A 5-layer network (with dropout to reduce overfitting): 1st layer: RNN (LSTM), 30 hidden units, dropout rate 0.5; 2nd layer: CNN, 30 hidden units, dropout rate 0.5; 3rd layer: CNN, 20 hidden units, dropout rate 0.25; 4th layer: CNN, 10 hidden units, dropout rate 0; 5th layer: output. Total params: 5611. Validation STD = 3.36.
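A hedged Keras sketch of such an LSTM-plus-convolution stack; the layer widths follow the slide, but the look-back window, kernel sizes, pooling, and optimizer are assumptions, and the parameter count will not match the quoted 5611 exactly.

```python
import tensorflow as tf
from tensorflow.keras import layers

SEQ_LEN, N_FEATURES = 8, 22    # assumed look-back window over the 22 input variables

def build_model() -> tf.keras.Model:
    """LSTM front end, three Conv1D blocks with dropout, and a scalar output."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(SEQ_LEN, N_FEATURES)),
        layers.LSTM(30, return_sequences=True),                               # 1st: RNN (LSTM), 30 units
        layers.Dropout(0.5),
        layers.Conv1D(30, kernel_size=3, padding="same", activation="relu"),  # 2nd: CNN, 30 filters
        layers.Dropout(0.5),
        layers.Conv1D(20, kernel_size=3, padding="same", activation="relu"),  # 3rd: CNN, 20 filters
        layers.Dropout(0.25),
        layers.Conv1D(10, kernel_size=3, padding="same", activation="relu"),  # 4th: CNN, 10 filters
        layers.GlobalAveragePooling1D(),
        layers.Dense(1),                                                      # 5th: predicted injection efficiency
    ])
    model.compile(optimizer="adam", loss="mse")
    return model
```

Calling build_model().summary() shows the actual parameter count of this sketch for comparison with the slide's figure.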

5. Test results on the 40-day hold-out set: test STD = 4.42.
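The slides do not define the STD metric precisely; the sketch below assumes it is the standard deviation of the prediction residuals on the held-out data.

```python
import numpy as np

def residual_std(model, X_test: np.ndarray, y_test: np.ndarray) -> float:
    """Standard deviation of (predicted - measured) injection efficiency on hold-out data."""
    y_pred = np.asarray(model.predict(X_test)).ravel()
    return float(np.std(y_pred - y_test))
```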

Study of temperature and gap effects with the model

Prediction for all the data with the model

Temperature and “Gap05” effects

Temperature and Gap05 effects: major contribution

Ideal beam orbit at different temperatures. Feed all the beam trajectories (BPM readings) and steering currents to the model, then take the trajectories and steering currents within the top 10% of the injection efficiency. The major contribution is from the ground temperature.
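A minimal sketch of the top-10% selection, assuming the model's predicted efficiency is used for ranking and that the orbit columns are the hypothetical BPM and steering names introduced earlier.

```python
import numpy as np
import pandas as pd

def ideal_settings(df: pd.DataFrame, pred: np.ndarray, orbit_cols) -> pd.Series:
    """Average the BPM readings and steering currents over the samples whose
    predicted injection efficiency lies in the top 10%."""
    cut = np.quantile(pred, 0.90)            # top-10% threshold on predicted efficiency
    best = df.loc[pred >= cut, orbit_cols]   # rows with the best predicted efficiency
    return best.mean()                       # "ideal" orbit and steering settings
```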

Ideal beam orbit at different temperatures – BPMs. Ideal orbit given by the BPM readings at the density peak. [Figure: ideal orbits at 24 degC and 30 degC]

Ideal beam orbit at different temperatures - BPMs

Summary. A shallow neural network model has been built that achieves reasonable prediction accuracy. With the model, we learned that the major hidden factor affecting BTS injection efficiency is the ground temperature, and how the ideal beam orbit is shifted by the ground temperature.