Neural Network Analysis of Dimuon Data within CMS
Shannon Massey, University of Notre Dame

Overview
- The CMS Detector
- Muons and Event Display
- Classification and Machine Learning
- Signal and Background Discrimination in the Dimuon Final State
- Summary
- Acknowledgements

The Compact Muon Solenoid (CMS) Detector
- General-purpose collider detector with a modular design
- Four main parts: Tracker, Electromagnetic Calorimeter (ECAL), Hadron Calorimeter (HCAL), Muon System
- A large solenoid magnet operated at a field of 3.8 T, roughly 100,000 times the magnetic field of the Earth

Compact Muon Solenoid (CMS)
[Figure: view of the CMS detector]

Muons
- CMS identifies and measures muons using the silicon tracker and the outer muon system
- Produced in the decays of Standard Model and potential new particles, e.g. Z→μμ, H→ZZ→4μ, Z′→μμ
- Charge: ±1 e
- Mass of ~105.7 MeV/c²; the mass of an electron is ~0.511 MeV/c²
- Mean lifetime of 2.2 μs
- Good to study since muons can be identified easily and measured well; backgrounds are also smaller in high-p_T muon final states

Fireworks
- Event display through Fireworks (cmsShow)
- Provides the ability to visualize reconstructed data

Binary Classification with Neural Networks
- Discrimination between signal and background events/objects
- Essential for finding interesting and rare signal events within gigantic data sets, analogous to finding a needle in a haystack
- Machine learning algorithms can be used to classify data

Machine Learning
- Supervised learning: trained through numerous labelled examples. Examples: Siri, image recognition, text recognition (spam), many others
- Unsupervised learning: no labels are given to the learning algorithm in the examples; it must find structure in the inputs on its own. Used to recognize patterns within data and to categorize data

Toolkit for Multivariate Analysis (TMVA)
- Integrated into ROOT
- Includes many different multivariate classification algorithms: Fisher discriminants (linear discriminant analysis), k-nearest neighbor, boosted decision and regression trees, artificial neural networks (ANNs)
- All of these algorithms are supervised learning methods
- My work focused on ANNs, more specifically multilayer perceptrons (MLPs); a minimal training sketch follows below
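
As an illustration of the workflow, a minimal sketch of training one TMVA MLP through the classic Factory interface (pre-DataLoader ROOT releases; newer ones route the data calls through TMVA::DataLoader). The file name, tree names, and branch names here are hypothetical stand-ins for the actual analysis inputs.

```cpp
// trainMLP.C -- minimal TMVA classification sketch (classic Factory API)
#include "TFile.h"
#include "TTree.h"
#include "TMVA/Factory.h"
#include "TMVA/Tools.h"
#include "TMVA/Types.h"

void trainMLP() {
    TMVA::Tools::Instance();

    // Hypothetical input file holding separate signal and background trees
    TFile* input   = TFile::Open("dimuon.root");
    TTree* sigTree = (TTree*)input->Get("SignalTree");
    TTree* bkgTree = (TTree*)input->Get("BackgroundTree");

    TFile* output = TFile::Open("TMVAOutput.root", "RECREATE");
    TMVA::Factory factory("TMVAClassification", output,
                          "!V:AnalysisType=Classification");

    // Kinematic input variables (assumed branch names)
    factory.AddVariable("pt",  'F');
    factory.AddVariable("eta", 'F');
    factory.AddVariable("phi", 'F');

    factory.AddSignalTree(sigTree, 1.0);
    factory.AddBackgroundTree(bkgTree, 1.0);
    factory.PrepareTrainingAndTestTree("", "SplitMode=Random:NormMode=NumEvents");

    // One hidden layer of 4 sigmoid nodes, 600 training cycles (the toy-data settings)
    factory.BookMethod(TMVA::Types::kMLP, "MLP",
                       "NCycles=600:HiddenLayers=4:NeuronType=sigmoid:TrainingMethod=BP");

    factory.TrainAllMethods();
    factory.TestAllMethods();
    factory.EvaluateAllMethods();
    output->Close();
}
```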

Multilayer Perceptrons
- A network of "hidden", simple neurons (perceptrons) linked by feed-forward connections
- To learn from a set of inputs, TMVA MLPs use back-propagation (of errors)
- To change the weights, we must compute the partial derivative of the error with respect to each weight (gradient descent method)
- The derivatives give us the direction (+/-) in which the weights must change to reduce the error
- Goal: minimize the misclassification error
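
In symbols, each gradient-descent step moves a weight against the error gradient, scaled by a learning rate $\eta$:

$$ w_{ij} \leftarrow w_{ij} - \eta\,\frac{\partial E}{\partial w_{ij}} $$

where $E$ is the classification error over the training events and $w_{ij}$ is the weight on the connection from node $i$ to node $j$.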

Multilayer Perceptrons
[Figure: schematic of a multilayer perceptron network]

Toy Data within TMVA
- TMVA provides an example with a toy data set
- Signal and background distributions for var1, var2, var3, var4

MLP Output from Toy Data
- MLP attributes: sigmoid activation function, 600 training cycles
- 8 different MLPs: 1 and 2 hidden layers, with 4, 9, 14, or 19 nodes (see the booking sketch below)
- Difficult to determine the effectiveness of the algorithms by eye
[Plots: MLP output distributions for signal and background, one and two hidden layers, HN = 4, 9, 14, 19]
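
A hedged sketch of how those eight configurations could be booked, using TMVA's HiddenLayers option string (comma-separated node counts per hidden layer); it assumes the two-layer networks repeat the same node count in each layer, which the slide does not spell out.

```cpp
#include "TString.h"
#include "TMVA/Factory.h"
#include "TMVA/Types.h"

// Book eight MLPs: one and two hidden layers with 4, 9, 14, or 19 nodes per layer.
void bookToyMLPs(TMVA::Factory& factory) {
    const int nodeCounts[4] = {4, 9, 14, 19};
    for (int layers = 1; layers <= 2; ++layers) {
        for (int n : nodeCounts) {
            // TMVA encodes the architecture as comma-separated nodes per hidden layer
            TString arch = (layers == 1) ? TString::Format("%d", n)
                                         : TString::Format("%d,%d", n, n);
            factory.BookMethod(TMVA::Types::kMLP,
                               TString::Format("MLP_%dL_%dN", layers, n),
                               "NCycles=600:NeuronType=sigmoid:HiddenLayers=" + arch);
        }
    }
}
```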

MLP Analysis of Toy Data
- The Receiver Operating Characteristic (ROC) curve is useful for determining the effectiveness of an algorithm
- It is built by scanning a cut over the classifier output and plotting, at each cut value, the fraction of background passing (relative to total background) against the fraction of signal passing (relative to total signal)
- Area under the curve (AUC) of a perfect algorithm = 1
- AUC of a completely random classifier = 0.5
[Table: area under the ROC curve (AUC) by total hidden layers and nodes in layers one and two]
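
As a sketch of what the AUC numbers measure, under the common convention where the curve traces signal efficiency against background efficiency as the cut varies (the classifier output is assumed here to lie in [0, 1]):

```cpp
#include <algorithm>
#include <vector>

// Approximate the area under the ROC curve by scanning a threshold over the
// classifier output and accumulating with the trapezoidal rule.
double rocAuc(std::vector<double> sig, std::vector<double> bkg) {
    std::sort(sig.begin(), sig.end());
    std::sort(bkg.begin(), bkg.end());
    const int nSteps = 1000;
    double auc = 0.0, prevSigEff = 1.0, prevBkgEff = 1.0;
    for (int i = 0; i <= nSteps; ++i) {
        double cut = i / static_cast<double>(nSteps);
        // Efficiency = fraction of events with output at or above the cut
        double sigEff = (sig.end() - std::lower_bound(sig.begin(), sig.end(), cut))
                        / static_cast<double>(sig.size());
        double bkgEff = (bkg.end() - std::lower_bound(bkg.begin(), bkg.end(), cut))
                        / static_cast<double>(bkg.size());
        auc += 0.5 * (sigEff + prevSigEff) * (prevBkgEff - bkgEff);
        prevSigEff = sigEff;
        prevBkgEff = bkgEff;
    }
    return auc;  // 1.0 = perfect separation, 0.5 = random guessing
}
```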

Event Reconstruction
- Many thanks to Grace for the dimuon samples!
- Applied a Python script with a C++ analyzer to the .root generation file
- Selects the desired events (muons) and includes the necessary attributes (p_T, eta, phi)

Classification with Dimuons (Case 1)
- MLP implementation to discriminate dimuons in the mass peak from those in the side-bands
- Signal input: kinematic variables in the dimuon mass peak, 80 GeV - 100 GeV (1,758 events within the range)
- Background input: surrounding Drell-Yan background, mass 60 GeV - 80 GeV (1,867 events within the range); a selection sketch follows below
[Plot: dimuon mass distribution (number of events vs. mass in GeV) with the Case 1, Case 2, and Case 3 regions marked]
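
One way the mass-window split could be expressed in TMVA, assuming a single candidate tree with a hypothetical dimuon_mass branch: book the same tree twice, with TCut selections defining the Case 1 signal and background samples.

```cpp
#include "TCut.h"
#include "TTree.h"
#include "TMVA/Factory.h"

// Case 1: Z mass peak (80-100 GeV) as signal, lower side-band (60-80 GeV)
// as Drell-Yan background, carved out of one candidate tree with TCuts.
void bookCase1(TMVA::Factory& factory, TTree* candidates) {
    factory.AddTree(candidates, "Signal",     1.0,
                    TCut("dimuon_mass > 80 && dimuon_mass < 100"));
    factory.AddTree(candidates, "Background", 1.0,
                    TCut("dimuon_mass > 60 && dimuon_mass < 80"));
}
```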

Classification of Dimuon Mass Peak (Case 1)
- The input variables are naturally not as well separated as those of the toy data
[Plots: signal and background distributions of the input variables]

Classification (Case 1)
- MLP attributes: sigmoid activation function, 5,000 training cycles
- 2 MLPs: 1 and 2 hidden layers, 4 nodes

Classification (Case 2): Signal and Background
- Signal input: kinematic variables in the dimuon mass peak, 80 GeV - 100 GeV (1,758 events)
- Background input: surrounding Drell-Yan side-band (1,275 events)

Classification (Case 2)
- MLP attributes: sigmoid activation function, 5,000 training cycles
- 2 MLPs: 1 and 2 hidden layers, 4 nodes

Classification (Case 3): Signal and Background
- Signal input: dimuon mass peak, 80 GeV - 100 GeV (1,758 events)
- Background input: both surrounding Drell-Yan side-bands, the Case 1 band at 60 GeV - 80 GeV (1,867 events) and the Case 2 band (1,275 events)

Classification (Case 3)
- MLP attributes: sigmoid activation function, 5,000 training cycles
- 2 MLPs: 1 and 2 hidden layers, 4 nodes

MLP Analysis of Dimuon Mass
[Plots: MLP performance on the dimuon cases]

Summary
- Neural networks can be used to separate signal and background events in collisions at CMS
- We used TMVA to apply neural networks to dimuon final-state events
- Applied to Z→μμ and Drell-Yan
- Deeper MLPs do not increase the separation much in these examples

Acknowledgements
- Advisors: Dr. Pushpa Bhat and Dr. Leonard Spiegel
- Co-workers: Graham Stoddard (Northern Illinois University) and Grace Cummings (Virginia Commonwealth University)
- Mentor: Dr. Elliot McCrory
- Staff: Sandra Charles and the SIST Committee

Simple Example
[Diagram: inputs X and Y feed weighted connections into an activation function F, which produces the output]
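
A minimal sketch of what the diagram shows, with purely hypothetical weights and inputs: a single neuron takes the weighted sum of its inputs X and Y and passes it through an activation function F (here the sigmoid used by the TMVA MLPs in this talk).

```cpp
#include <cmath>
#include <cstdio>

// Sigmoid activation function F
double F(double a) { return 1.0 / (1.0 + std::exp(-a)); }

// One neuron: weighted sum of the inputs, then the activation
double neuron(double x, double y, double wX, double wY, double bias) {
    return F(wX * x + wY * y + bias);
}

int main() {
    // Hypothetical weights and inputs, purely for illustration
    double out = neuron(/*x=*/1.0, /*y=*/2.0, /*wX=*/0.5, /*wY=*/-0.3, /*bias=*/0.1);
    std::printf("neuron output = %f\n", out);  // value in (0, 1)
    return 0;
}
```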

Advances in Neural Networks: Deep Learning
- Deep neural networks were very difficult to train in the past due to their many layers; recent advances in many research labs have made them easier to train
- Relatively new Python modules such as Theano and PyLearn2 provide the necessary framework for new deep-network studies
- Deep belief networks: greedy layer-wise unsupervised learning
- Self-organizing maps
- Check out the LISA Lab at the University of Montreal for interesting work regarding deep learning