Position Reconstruction in a Miniature Detector Using a Multilayer Perceptron, by Adam Levine

Presentation transcript:

Position Reconstruction in a Miniature Detector Using a Multilayer Perceptron, by Adam Levine

Introduction The detector needs an algorithm to reconstruct the point of interaction in the horizontal plane.

Geant4 Simulation Implemented with the Geant4 C++ libraries. Generate primary particles at random positions and map the PMT signal to the primary's position. Simulate S2 light to get the horizontal position and drift time to get the vertical. (A sketch of such a generator action follows.)
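A minimal sketch of the kind of primary-generator action this implies, assuming a simple G4ParticleGun setup; the class name, detector half-width, and surface height are illustrative assumptions, not taken from the original code:

```cpp
// Sketch of a Geant4 primary-generator action for this simulation.
// kHalfWidth and kSurfaceZ are assumed geometry values, not the real ones.
#include "G4VUserPrimaryGeneratorAction.hh"
#include "G4ParticleGun.hh"
#include "G4ParticleTable.hh"
#include "G4SystemOfUnits.hh"
#include "G4ThreeVector.hh"
#include "G4Event.hh"
#include "Randomize.hh"

class PrimaryGeneratorAction : public G4VUserPrimaryGeneratorAction {
public:
  PrimaryGeneratorAction() : fGun(new G4ParticleGun(1)) {
    // 1 keV electron fired downward, as on the "Simulation Stats" slide
    fGun->SetParticleDefinition(
        G4ParticleTable::GetParticleTable()->FindParticle("e-"));
    fGun->SetParticleEnergy(1. * keV);
    fGun->SetParticleMomentumDirection(G4ThreeVector(0., 0., -1.));
  }

  ~PrimaryGeneratorAction() override { delete fGun; }

  void GeneratePrimaries(G4Event* event) override {
    // Draw a uniformly random horizontal position; the true (x, y) is
    // stored elsewhere and paired with the PMT response for training.
    const G4double kHalfWidth = 10. * mm;  // assumed detector half-width
    const G4double kSurfaceZ  = 0. * mm;   // assumed GXe/LXe surface height
    G4double x = (2. * G4UniformRand() - 1.) * kHalfWidth;
    G4double y = (2. * G4UniformRand() - 1.) * kHalfWidth;
    fGun->SetParticlePosition(G4ThreeVector(x, y, kSurfaceZ));
    fGun->GeneratePrimaryVertex(event);
  }

private:
  G4ParticleGun* fGun;
};
```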

Simulation µ_ij = number of photons that hit PMT i during cycle j; x_j = position of primary j. Each cycle j: generate primary j, fill and store µ_ij, store x_j. (A minimal bookkeeping sketch follows.)
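A minimal bookkeeping sketch of this cycle, assuming plain C++ containers; CycleRecord and kNumPMTs are hypothetical names:

```cpp
// One record per simulated cycle j: photon hit counts plus true position.
#include <array>
#include <vector>

constexpr int kNumPMTs = 4;

struct CycleRecord {
  std::array<int, kNumPMTs> mu{};  // mu[i] = photon hits on PMT i this cycle
  double x = 0.0, y = 0.0;         // true horizontal position of the primary
};

std::vector<CycleRecord> gCycles;  // filled once per cycle j
```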

PMT Construction

Simulation Stats Ran 8000 cycles on a campus computer. Each cycle fired a 1 keV e- into the GXe just above the LXe surface. The scintillation yield of the GXe was set to an artificially high value per keV (unphysical, used only to generate photons); the number was chosen so that the average number of photon hits per PMT per run was ~10000.

PMT hits versus Position of Primary [four panels: PMT 1, PMT 2, PMT 3, PMT 4]

Making the Algorithm Goal: find a function ƒ: R^N -> R^2 (where N is the number of PMTs) that assigns a PMT signal to its primary's position; N = 4 with the bare PMTs, N = 16 with the extra sensitive regions described on the final slide. Work backwards from the simulated data to train a neural network.

What is a Neural Network? A neural network is a structure that processes and transmits information, modeled after the biological neuron.

What is a MultiLayer Perceptron? A subset of artificial neural networks. It uses the structure of neurons together with a training algorithm and an objective functional, reducing the problem to the extremization of a functional/function. Implemented here with FLOOD, an open-source neural networking library.

MultiLayer Perceptron Structure Take in the scaled input and calculate the hidden-layer vector with N components, where N is the number of hidden neurons. Send each component through an "activation function", often a threshold-like function ranging between 0 and 1 or between -1 and 1. Repeat until out of hidden layers, then send the result through the output activation function and unscale the output. (A sketch in code follows.)
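A self-contained sketch of this forward pass, assuming one hidden layer with tanh activations and a linear output layer; the actual network was built with FLOOD, so all names here are illustrative:

```cpp
// Forward pass: scale input -> hidden layer -> activation -> linear output -> unscale.
#include <cmath>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // Mat[j][i] = weight from input i to neuron j

Vec layer(const Mat& W, const Vec& b, const Vec& in, bool activate) {
  Vec out(W.size());
  for (size_t j = 0; j < W.size(); ++j) {
    double s = b[j];
    for (size_t i = 0; i < in.size(); ++i) s += W[j][i] * in[i];
    out[j] = activate ? std::tanh(s) : s;  // tanh ranges between -1 and 1
  }
  return out;
}

// mu: raw PMT signal (N components); returns reconstructed (x, y).
Vec forward(const Mat& W1, const Vec& b1, const Mat& W2, const Vec& b2,
            const Vec& mu, const Vec& mean, const Vec& stdev,
            double outScale) {
  Vec in(mu.size());
  for (size_t i = 0; i < mu.size(); ++i)
    in[i] = (mu[i] - mean[i]) / stdev[i];        // scale the input
  Vec h = layer(W1, b1, in, /*activate=*/true);  // hidden layer
  Vec o = layer(W2, b2, h, /*activate=*/false);  // linear output layer
  for (double& v : o) v *= outScale;             // unscale to mm
  return o;
}
```

With the four PMT signals in and (x, y) out, this realizes the ƒ: R^4 -> R^2 from the "Making the Algorithm" slide.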

Training Structure

The Math Behind the MultiLayer Perceptron KEY: W_ij = weight matrix; µ_i = input vector; f_a = activation function; O = output activation function; o_j = output vector. Flow: repeat until out of hidden layers; unscale the output; send it through the objective function and train (if needed).
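In symbols, the key suggests a forward pass of the form (my reconstruction, for a single hidden layer): o_j = O( Σ_k W^(2)_jk · f_a( Σ_i W^(1)_ki · µ_i ) ), after which the output is unscaled and, during training, passed to the objective function.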

Objective Function and Training Algorithm Used the Conjugate Gradient algorithm to train: it calculates the gradient of the objective function in parameter space and steps down the function until the stopping criteria are reached. KEY: x_i = ideal position; o_i = outputted position. (The objective presumably penalizes the distance between them, e.g. E = Σ_i |x_i - o_i|^2.) A toy illustration follows.
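A toy illustration of the conjugate-gradient idea on a stand-in quadratic objective, using Polak-Ribiere direction updates and a backtracking line search; this shows the general method, not FLOOD's actual implementation:

```cpp
// Conjugate gradient on a toy objective: follow the gradient, but mix in
// the previous search direction, stepping until a stopping criterion is met.
#include <cmath>
#include <cstdio>
#include <vector>

using Vec = std::vector<double>;

double objective(const Vec& p) {  // stand-in for the network's objective
  return 4.0 * p[0] * p[0] + p[1] * p[1];
}
Vec gradient(const Vec& p) { return {8.0 * p[0], 2.0 * p[1]}; }

double dot(const Vec& a, const Vec& b) {
  double s = 0.0;
  for (size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
  return s;
}

int main() {
  Vec p = {1.0, 1.0};
  Vec g = gradient(p);
  Vec d = {-g[0], -g[1]};  // first direction: steepest descent
  for (int epoch = 0; epoch < 100 && dot(g, g) > 1e-12; ++epoch) {
    // Backtracking line search along d until the objective improves
    double alpha = 1.0;
    Vec trial(p.size());
    while (true) {
      for (size_t i = 0; i < p.size(); ++i) trial[i] = p[i] + alpha * d[i];
      if (objective(trial) < objective(p) || alpha < 1e-12) break;
      alpha *= 0.5;
    }
    p = trial;
    Vec gNew = gradient(p);
    // Polak-Ribiere coefficient mixes the old direction into the new one
    double beta = std::max(0.0, (dot(gNew, gNew) - dot(gNew, g)) / dot(g, g));
    for (size_t i = 0; i < d.size(); ++i) d[i] = -gNew[i] + beta * d[i];
    g = gNew;
  }
  std::printf("minimum near (%.4f, %.4f)\n", p[0], p[1]);
  return 0;
}
```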

Radial Error vs. Epoch Used to check if overtraining has occurred.

Final Radial Error vs. Number of Hidden Neurons Odd Point: Overtraining doesn’t seem to be happening even up to 19 hidden layer neurons!

Ideal coordinates minus outputted coordinates (mm)

Error (mm) of 2000 primaries after the perceptron has been trained. GOAL: get the mean down to ~1 mm. Note: these 2000 points were not used to train the perceptron. (An evaluation sketch follows.)
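A sketch of this held-out evaluation, with the trained network passed in as a generic callable so the snippet stays self-contained; Event and Net are hypothetical names:

```cpp
// Mean radial error (mm) over events never shown to the network in training.
#include <cmath>
#include <functional>
#include <vector>

struct Event {
  std::vector<double> mu;  // PMT signal
  double x, y;             // true position (mm)
};

using Net = std::function<std::pair<double, double>(const std::vector<double>&)>;

double meanRadialErrorMM(const std::vector<Event>& heldOut, const Net& net) {
  double sum = 0.0;
  for (const Event& e : heldOut) {
    auto [px, py] = net(e.mu);              // reconstructed position
    sum += std::hypot(px - e.x, py - e.y);  // radial miss distance
  }
  return heldOut.empty() ? 0.0 : sum / heldOut.size();
}
```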

Error(mm) vs. primary position

Example Both outputs used a perceptron trained with just 4 PMTs.

What's Next Still need to figure out why the radial error seems to plateau at around 3 mm. Possible solutions: simulate extra regions of sensitivity to effectively increase the number of PMTs. Also: not getting 100% reflectivity in the TPC.

With Extra SubDetectors Quickly ran the simulation 3000 times with this added sensitivity (16 distinct sensitive regions). Preliminary graphs: still need to run more simulations…