Multivariate Methods of Data Analysis in Cosmic Ray Astrophysics
A. Chilingarian, A. Vardanyan
Cosmic Ray Division, Yerevan Physics Institute, Armenia


Presented at ACAT 2002, June 2002, Moscow, Russia

Topics
- Main tasks to be solved in cosmic ray astrophysics
- Analysis methods
- Preprocessing and indication of the best parameters
- Neural Networks for the main data analysis
- Multi-start Random Search learning algorithm
- Training, validation and generalization errors
- Overtraining control

- Individual event weights
- Results of NN classification and estimation
- Examples of applications

The MAGIC telescope for detecting γ-rays from point sources

The MAKET-ANI installation for the registration of Extensive Air Showers

The development of an extensive air shower induced by a primary cosmic-ray particle in the atmosphere

Monte Carlo simulation is the key problem of any physical inference in indirect experiments

What tasks do we want to solve by measuring EAS characteristics? An inverse problem has to be solved:

Experimental data: ?, ? (Ne, Nμ, Nh, S, …)  →  Simulated data: E, A (Ne, Nμ, Nh, S, …)

For simulated showers the primary energy E and mass A behind the observables (Ne, Nμ, Nh, S, …) are known; for experimental showers they must be inferred from the same observables. The two tasks: identification of the primary particle type and estimation of the primary particle energy.
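A minimal Python sketch of this setup; the file names, column layout, and observable order below are hypothetical and serve only to show how simulations with known (E, A) supply the training targets that experimental events lack:

```python
import numpy as np

# Hypothetical simulated sample: columns (Ne, Nmu, Nh, S, E, A); the file
# name and column order are illustrative assumptions, not the experiment's
# actual data format.
sim = np.loadtxt("simulated_showers.dat")
X_sim, E_sim, A_sim = sim[:, :4], sim[:, 4], sim[:, 5]

# Experimental sample: the same observables, but E and A are the unknown "?, ?".
X_exp = np.loadtxt("measured_showers.dat")

# Solving the inverse problem = fitting a map f: (Ne, Nmu, Nh, S) -> (E, A)
# on (X_sim, E_sim, A_sim) and then applying f to X_exp.
```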

Why Neural Networks?
- Neural Networks belong to the general class of nonparametric methods, which require no assumption about the parametric form of the underlying statistical model
- They are an appropriate technique for both classification and estimation tasks
- They are able to treat multidimensional input data

The neural information techniques
The central issue of Neural Networks is a bounded mapping of an n-dimensional input onto an m-dimensional output:

F: Rⁿ → Rᵐ,  y = F(x, w)

The functional form of F is accumulated in the NN parameters (weights) w during the NN training process. Training consists in the iterative processing of simulated events; its aim is to find the weight vector w* that provides the minimum of the error (quality) function Q(w), which measures the deviation of the NN output from the known targets over the training sample.
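The slides fix neither the architecture nor the exact form of Q; as a concrete but non-authoritative instance, here is a minimal sketch assuming one sigmoid hidden layer and a squared-error quality function:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w):
    """Map an n-dimensional input x to an m-dimensional output y = F(x, w).
    w = (W1, b1, W2, b2) holds the weights of a single hidden layer --
    an assumed architecture, chosen only for illustration."""
    W1, b1, W2, b2 = w
    h = sigmoid(W1 @ x + b1)        # hidden-layer activations
    return sigmoid(W2 @ h + b2)     # m-dimensional output

def quality(w, events, targets):
    """Quality (error) function Q(w): here a squared-error sum, which is an
    assumption -- the slides only require Q to measure the deviation of the
    NN output from the known targets over the training sample."""
    return sum(np.sum((forward(x, w) - t) ** 2)
               for x, t in zip(events, targets))
```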

A Feed-Forward Neural Network

An example of the NN output distribution in the case of a classification task
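For a two-class task the one-dimensional NN output clusters near 0 for one class and near 1 for the other; a hedged sketch of turning that distribution into class labels, assuming the conventional 0.5 decision threshold (the slide shows the distribution only):

```python
import numpy as np

# nn_outputs: hypothetical NN outputs for a set of events, with the two
# classes trained toward 0 and 1; the 0.5 threshold is an assumption.
nn_outputs = np.array([0.05, 0.92, 0.40, 0.77])
labels = (nn_outputs > 0.5).astype(int)   # 0 = first class, 1 = second class
```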

Common drawbacks of the NN training process
- Training only one network can lead to suboptimal generalization
- An insufficient number of training events carries a risk of overtraining

Multi-start random search algorithm
The random search learning algorithm implements the following steps (a code sketch of steps 1-4 is given after this slide):
1. The initial values of the NN weights are chosen randomly from a Gaussian distribution with mean μ = 0 and a fixed dispersion σ.
2. A random step in the multidimensional space of NN weights is performed from the initial point. The weight vector w(i) obtained at the i-th iteration is modified by a random increment whose size is controlled by the step-size parameter h, by a random number RNDM drawn from the [0, 1] interval, and by a term that introduces and controls the degree of dependence of the random step on the already achieved value of the quality function Q.

3. The quality function Q_i is calculated at each iteration by presenting all the training events to the NN.
4. If Q_i ≤ Q_(i-1), the new vector is kept as the NN weights and the next step starts from that point in the space of NN weights; otherwise a return to the previous point is made and a new random step is performed.

The multi-start technique consists in training many neural nets, starting from different initial weights and using different step-size parameters, which allows scanning many points in the multidimensional space of NN weights.
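A minimal sketch of one random-search run (steps 1-4 above). The slide's exact update formula is lost from the transcript, so the step below, a Gaussian increment scaled by the step size h, a uniform RNDM from [0, 1], and the achieved quality Q, is an assumption that merely mirrors the ingredients the text names; quality here is any callable of a flat weight vector (e.g., the earlier Q with weights packed and the training data bound in):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_search(quality, n_weights, h, sigma=1.0, n_iter=2000):
    """One random-search training run over a flat weight vector."""
    w = rng.normal(0.0, sigma, n_weights)      # step 1: Gaussian initialization
    q = quality(w)
    for _ in range(n_iter):
        rndm = rng.uniform()                   # RNDM from the [0, 1] interval
        # step 2: random step; the Q-dependent scaling is an assumed form
        w_new = w + h * rndm * q * rng.standard_normal(n_weights)
        q_new = quality(w_new)                 # step 3: Q over all training events
        if q_new <= q:                         # step 4: keep the improvement,
            w, q = w_new, q_new                # otherwise stay at the old point
    return w, q
```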

Training and validation errors, overtraining control

An acceptable procedure to avoid overtraining (sketched in code after this list):
- After each successful iteration of the learning process the net error is calculated on the validation sample.
- If the validation error is smaller than the one obtained at the previous iteration, the NN weights from the current training iteration are memorized; otherwise the NN weights from the previous iteration are kept.
- At the end of the training process, the weights that provide the minimal error on the validation sample are taken as the final best weights of the NN.
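A hedged sketch of this validation bookkeeping layered on the random-search run; tracking the running minimum of the validation error realizes the final selection rule above, and the step form and parameter values remain the assumptions of the previous sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_search_validated(quality_train, quality_valid, n_weights,
                            h, sigma=1.0, n_iter=2000):
    """Random search with overtraining control: after each successful
    training step the error on the validation sample is computed, and the
    weights with the smallest validation error seen so far are memorized
    and returned at the end of training."""
    w = rng.normal(0.0, sigma, n_weights)
    q_train = quality_train(w)
    best_w, best_q_valid = w.copy(), quality_valid(w)
    for _ in range(n_iter):
        step = h * rng.uniform() * q_train * rng.standard_normal(n_weights)
        w_new = w + step
        q_new = quality_train(w_new)
        if q_new <= q_train:                   # successful training iteration
            w, q_train = w_new, q_new
            q_valid = quality_valid(w)         # net error on validation sample
            if q_valid < best_q_valid:         # memorize the current weights
                best_w, best_q_valid = w.copy(), q_valid
    return best_w, best_q_valid
```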

The multi-start RS technique makes it possible to select the NN with the best performance on the control data set
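A sketch of that selection, reusing the hypothetical random_search_validated from the previous sketch; the (h, σ) grid and the quality_control callable (Q evaluated on the control set) are illustrative assumptions:

```python
# Multi-start: train several nets from different random initial weights and
# different step-size parameters, then keep the one that performs best on a
# separate control data set.
candidates = [
    random_search_validated(quality_train, quality_valid,
                            n_weights=50, h=h, sigma=sigma)
    for h in (0.01, 0.05, 0.1)      # hypothetical step-size grid
    for sigma in (0.5, 1.0)         # hypothetical initialization dispersions
]
best_w, _ = min(candidates, key=lambda pair: quality_control(pair[0]))
```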

Results of energy estimation by NN

Results of mass classification by NN

Application of NN to the gamma/hadron separation task in gamma-ray astronomy

Cosmic ray differential energy spectra obtained by NN classification and estimation