1 Updated results from a new Neural Network tool: e-/π- separation and energy measurement from e.m. showers
E. Barbuto, C. Bozza, A. Cioffi, M. Giorgini
OPERA Collaboration Meeting, Nagoya, Japan, 30 Jan – 4 Feb 2004

2 Outline
Features of the Neural Network (NN) software
MC simulation used for the training
Results on:
  particle (e-/π-) separation
  energy reconstruction in e.m. showers
Conclusions

3 Neural Network library structure
(see C. Bozza, Gran Sasso Coll. Meeting, May '03)
Full generality/modularity
Platform and language independence (the library has been tested and debugged under both Windows and Linux)
It can be used by any user program, even in server mode
Users can extend the library even without recompiling
GUI and console interfaces allow interactive monitoring of the training and testing phases
Network type: Multilayer Perceptron, i.e. several layers of "neurons", each with its own transfer function
[Diagram: input layer, hidden layer, output layer]

4 MC simulation for training and test
GEANT simulation of a complete OPERA brick (56 lead targets, 1 mm thick, and 56 emulsion-base-emulsion sheets)
First step: e/π discrimination only through their multiple Coulomb scattering before interacting/showering (see E. Barbuto, Phys. Coord. July '03)
New step: algorithm taking the showers into account
Full analysis: MCS of the primary particle + shower analysis
[Diagram: π, e incident on the brick; axes x, y, z]

5 Shower analysis (MC data)
[Shower displays: 1, 5 and 10 GeV e- and π- events]

6 Description of the shower analysis
Goals: electron/pion separation; electron energy reconstruction
Requirements for a good shower description:
  include as many tracks belonging to the shower as possible;
  do not integrate over background tracks.
To meet these requirements:
  we opened a 50 mrad cone around the primary;
  for each track, we required a maximum relative angle of 400 mrad with respect to the primary direction;
  the tracking threshold was set at ~10 MeV (due to the tracking efficiency).
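The three cuts above can be sketched as a simple track filter. The track dictionary layout and all function names are illustrative assumptions, not the collaboration's actual data model:

```python
import math

CONE_HALF_ANGLE = 0.050   # 50 mrad cone opened around the primary
MAX_REL_ANGLE   = 0.400   # 400 mrad maximum angle w.r.t. the primary direction
P_THRESHOLD_GEV = 0.010   # ~10 MeV tracking threshold

def in_cone(track_xy, primary_xy, depth):
    """True if the track's transverse position lies inside the 50 mrad cone
    opened around the primary at the given longitudinal depth (same units)."""
    dx = track_xy[0] - primary_xy[0]
    dy = track_xy[1] - primary_xy[1]
    return math.hypot(dx, dy) <= depth * math.tan(CONE_HALF_ANGLE)

def angle_ok(track_angles, primary_angles):
    """True if the track's space angle w.r.t. the primary is within 400 mrad."""
    dtx = track_angles[0] - primary_angles[0]
    dty = track_angles[1] - primary_angles[1]
    return math.hypot(dtx, dty) <= MAX_REL_ANGLE

def select_shower_tracks(tracks, primary):
    """Apply the three cuts of the shower definition above."""
    return [t for t in tracks
            if t["p"] >= P_THRESHOLD_GEV
            and in_cone(t["xy"], primary["xy"], t["depth"])
            and angle_ok(t["angles"], primary["angles"])]
```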

7 NN structure
INPUT LAYER: 672 neurons
  Emulsion films crossed by the primary particle before showering (1 var.)
  Number of the film where the cone starts (1 var.)
  Number of charged particles detected per film inside the cone (112 var.)
  x-coordinate sigma for charged particles in the cone (112 var.)
  y-coordinate sigma for charged particles in the cone (112 var.)
  x-angular sigma for charged particles in the cone (112 var.)
  y-angular sigma for charged particles in the cone (112 var.)
  Δθx from Multiple Coulomb Scattering of the primary (55 var.)
  Δθy from Multiple Coulomb Scattering of the primary (55 var.)
HIDDEN LAYER: 100 sigmoid neurons
OUTPUT LAYER: 1 logistic neuron giving a number between 0 and 1
67300 (672×100 + 100×1) weights
[Flow diagram: input variables, NN training, NN output]
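A minimal forward pass with the stated geometry (672 inputs, 100 hidden neurons, 1 logistic output) can be sketched as follows. Taking "sigmoid" as tanh is an assumption about the library's transfer function, and the random weights are only placeholders for trained ones:

```python
import numpy as np

rng = np.random.default_rng(0)
W_hidden = rng.normal(0.0, 0.05, size=(100, 672))  # 67200 hidden weights
W_output = rng.normal(0.0, 0.05, size=(1, 100))    # 100 output weights

def forward(x):
    """Evaluate the perceptron on one 672-component input vector."""
    h = np.tanh(W_hidden @ x)                      # "sigmoid" hidden layer
    out = 1.0 / (1.0 + np.exp(-(W_output @ h)))    # logistic output in ]0, 1[
    return float(out[0])

# 672*100 + 100*1 = 67300 weights, matching the count on the slide
n_weights = W_hidden.size + W_output.size
```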

8 A special feature : missing data
Real emulsion data come with inefficiencies (missing tracks in one or more sheets)
Each missing track corresponds to some missing network inputs
In order to have a tool able to analyze real data, the situation of missing data must be handled correctly
The lower the primary particle energy, the larger the number of missing input data
Example: percentage of events with a minimum number of films crossed by particles of the shower

E (GeV) | nfilm ≥ 28 | nfilm ≥ 72 | nfilm ≥ 108
1       | 67%        | 2%         | 0%
3       | 90%        | 88%        | 3%
5       | 95%        | 94%        | 18%
7       | 96%        | –          | 29%
9       | 97%        | –          | 35%
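The slide does not spell out how the library represents missing inputs internally; one common scheme, sketched here purely as an assumption, is to flag absent measurements with NaN and zero them out before evaluation so they contribute nothing to the weighted sums:

```python
import numpy as np

def fill_missing(x):
    """Replace NaN-flagged (missing) inputs with 0 and return the mask
    of which entries were missing."""
    x = np.asarray(x, dtype=float)
    mask = np.isnan(x)
    filled = np.where(mask, 0.0, x)
    return filled, mask

def missing_fraction(x):
    """Fraction of the input vector that is missing."""
    return float(np.isnan(np.asarray(x, dtype=float)).mean())
```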

9 Electron ID: training the network
Training events: 2000 e- with a continuous energy spectrum in [0, 10] GeV
Weight updating every 10 events
90 epochs
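The schedule above (a weight update every 10 events, 90 epochs) can be sketched as follows; the single logistic neuron and the random toy sample stand in for the full perceptron and the real training set:

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.normal(0.0, 0.1, size=3)                     # toy weight vector
events = [(rng.normal(size=3), float(rng.integers(0, 2)))
          for _ in range(100)]                       # toy (input, target) pairs

ETA, BATCH, EPOCHS = 0.05, 10, 90
n_updates = 0
for epoch in range(EPOCHS):
    grad = np.zeros_like(w)
    for i, (x, target) in enumerate(events, start=1):
        out = 1.0 / (1.0 + np.exp(-(w @ x)))         # logistic output
        grad += (out - target) * out * (1 - out) * x  # accumulate gradient
        if i % BATCH == 0:                           # update every 10 events
            w -= ETA * grad
            grad[:] = 0.0
            n_updates += 1
# 100 events / 10 per update, over 90 epochs -> 900 weight updates
```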

10 Electron ID: test with variable energies
First test: MC data with a continuous energy spectrum in [0, 10] GeV
Testing events: electrons (5000 ev.), pions (2500 ev.)
Separation value x ∈ ]0, 1[:
  if network output < x → electron
  if network output > x → pion
A suitable value was found to be x = 0.9
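The cut described above can be expressed directly in code (labels and helper names are illustrative):

```python
SEPARATION_VALUE = 0.9  # the value found suitable on the slide

def classify(nn_output, x=SEPARATION_VALUE):
    """Map a network output in ]0, 1[ to a particle hypothesis:
    below the separation value -> electron, above it -> pion."""
    return "electron" if nn_output < x else "pion"

def id_efficiency(outputs, true_label, x=SEPARATION_VALUE):
    """Fraction of events of one true species identified correctly."""
    correct = sum(1 for o in outputs if classify(o, x) == true_label)
    return correct / len(outputs)
```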

11 Electron ID: test on fixed energies
We also tested the NN with e-/π- particles at fixed energies
[NN output distributions: 3 GeV e-, 3 GeV π-, 5 GeV e-, 5 GeV π-]

12 e-/p- separation efficiency
ID(e-): fraction of electrons correctly identified
ID(π-): fraction of pions correctly identified
f = ID(e-) × ID(π-)

E (GeV) | f     | ID(e-) | ID(π-)
1       | 77.9% | 81.3%  | 95.8%
3       | –     | 98.3%  | 99.9%
5       | –     | 99.3%  | 100%
7       | 99%   | 99.2%  | 99.8%
9       | –     | 98%    | 99.7%
0-10    | 82%   | 83.3%  | 98.5%

13 Energy reconstruction: variable e- energies
[Ereal − Erec (GeV) distributions]
…-… GeV electrons (5000 entries): RMS 1.667, fit sigma 0.885
0-2 GeV electrons (1000 entries): RMS 2.084, fit sigma 0.625
…-… GeV electrons (1000 entries): RMS 1.366, fit sigma 0.664

14 Energy reconstruction: fixed e- energies
[Erec (GeV) distributions, 5000 entries each]

Sample           | RMS   | Fit sigma
1 GeV electrons  | 1.957 | 0.795
5 GeV electrons  | 1.502 | 1.164
7 GeV electrons  | 1.562 | 0.980
9 GeV electrons  | 1.523 | 0.512

15 Energy reconstruction resolution
As already said, this NN can work with missing input data, so it was trained and tested with ALL THE EVENTS, independently of the number of crossed layers or particles produced.
The reconstruction precision could be improved by selecting data with a minimum number of emulsion films crossed by charged particles in the selected cone (the usual procedure with standard NNs).
3 GeV electrons (all events): 5000 entries, RMS 1.444, fit sigma 0.914 → ΔE/E ≈ 31%
3 GeV electrons, ≥ 28 films: 4532 entries, fit sigma 0.631 → ΔE/E ≈ 23%
[Erec (GeV) distributions]
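The quoted resolutions follow from ΔE/E = σ_fit / E. Evaluating at the nominal 3 GeV reproduces the slide's numbers approximately; the small differences presumably come from dividing by the fitted mean energy, which is not recoverable here:

```python
def energy_resolution(sigma_fit, energy):
    """Relative energy resolution Delta-E/E = sigma_fit / E (both in GeV)."""
    return sigma_fit / energy

# 3 GeV samples from the slide (nominal energy used as an approximation)
all_events = energy_resolution(0.914, 3.0)  # ~0.30, quoted as ~31%
selected   = energy_resolution(0.631, 3.0)  # ~0.21, quoted as ~23%
```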

16 Conclusions
A general neural network software package has been developed
Special feature: it deals with missing input data
It can be used in various modes and environments (Windows/Linux, interactive/server)
Results:
(1) e-/π- separation with f > 98% for 3 < E < 10 GeV
(2) Electron momentum reconstruction can be done without selecting the data (as in the real case!)

17 Electron ID: 7 and 9 GeV
[NN output distributions: 7 GeV e-, 7 GeV π-, 9 GeV e-, 9 GeV π-]

18 Energy reconstruction: 3 GeV, 72films
3 GeV electrons, ≥ 72 films: 4386 entries, RMS 0.652, fit sigma 0.620 → ΔE/E ≈ 22%
[Erec (GeV) distribution]

19 Class hierarchy
NeuronDescriptor (Constructor, TransferFunction, DerivativeOfTransferFunction)
  BiasNeuronDescriptor
  ExpNeuronDescriptor
  GaussianNeuronDescriptor
  HardLimNeuronDescriptor
  LinearNeuronDescriptor
  LogisticNeuronDescriptor
  SigmoidNeuronDescriptor
TrainingHistoryEntry
TrainingSet
MLPerceptron (Constructor, Evaluate, Train, Weights)
  MLPerceptronWithBackPropagation
  MLPerceptronWithLangevinTraining
  MLPerceptronWithManhattanTraining
  MLPerceptronWithStabilizedBPTraining
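A Python sketch of part of this hierarchy (the actual library is not written in Python; the class and method names follow the slide, the bodies are illustrative):

```python
import math

class NeuronDescriptor:
    """Base class: each neuron type supplies a transfer function
    and its derivative, as in the hierarchy above."""
    def TransferFunction(self, x):
        raise NotImplementedError
    def DerivativeOfTransferFunction(self, x):
        raise NotImplementedError

class LogisticNeuronDescriptor(NeuronDescriptor):
    def TransferFunction(self, x):
        return 1.0 / (1.0 + math.exp(-x))
    def DerivativeOfTransferFunction(self, x):
        y = self.TransferFunction(x)
        return y * (1.0 - y)

class LinearNeuronDescriptor(NeuronDescriptor):
    def TransferFunction(self, x):
        return x
    def DerivativeOfTransferFunction(self, x):
        return 1.0

class MLPerceptron:
    """Base perceptron exposing Evaluate/Train/Weights (bodies omitted)."""
    ...

class MLPerceptronWithBackPropagation(MLPerceptron):
    ...
```

Extending the library then amounts to subclassing `NeuronDescriptor` or `MLPerceptron`, which matches the slide's claim that users can extend it without recompiling.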

20 Standard back propagation
Weight update: Δw_ij = η · δ_j · o_i, with δ_j = (t_j − o_j) · f′(net_j) for output neurons, where η is the learning parameter, t the teaching value and o the neuron output.
η is a global parameter:
  if η is small, the training can be too slow in flat regions and can be trapped in a local minimum;
  if η is too big, the training may never converge because it always overshoots the minimum.
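The η trade-off can be demonstrated on a one-dimensional quadratic error surface E(w) = w², where each gradient step multiplies w by (1 − 2η); this toy surface is an illustration, not the network's actual error function:

```python
def descend(eta, steps=50, w0=1.0):
    """Plain gradient descent on E(w) = w**2 (dE/dw = 2*w)."""
    w = w0
    for _ in range(steps):
        w -= eta * 2.0 * w   # standard step: w <- w - eta * dE/dw
    return w

slow     = descend(eta=0.01)  # creeps slowly toward the minimum at w = 0
good     = descend(eta=0.4)   # converges quickly
unstable = descend(eta=1.2)   # |1 - 2*eta| > 1: every step overshoots more
```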

