“Statistical Approach Using Neural Nets”: Nuclear Masses and Half-lives
E. Mavrommatis, S. Athanassopoulos, A. Dakos (University of Athens); K. A. Gernoth (UMIST); J. W. Clark (Washington University, Saint Louis)
NUPECC Town Meeting, GSI, 2003

Contents
- Introduction
- ANNs for global modeling of nuclear properties
- Nuclear masses
- Half-lives of β⁻-decaying nuclides
- Conclusions and prospects

Global models
- Hamiltonian-based:
  - Masses: Möller et al. (FRDM); Pearson et al. (HFBCS-1)
  - Half-lives: Möller et al. (FRDM); Klapdor et al.
- Statistical: neural networks, …
(Models compared by number of parameters and input.)

Artificial Neural Networks (ANNs)
ANNs are systems of neuron-like units, arranged in some architecture and connected with each other through weights. ANNs have found many different applications to scientific problems [J. W. Clark, T. Lindenau & M. L. Ristig, Scientific Applications of Neural Nets (Springer, 1999)]. We focus on one task: approximation of a fundamental mapping from one set of physical quantities to another. The ANN is trained on a subset of the existing database to create a statistical model for subsequent use in prediction.

Neural network models
We use multi-layered feed-forward supervised neural networks, specified by:
1. Architecture and dynamics: [I-H1-H2-…-HL-O], activation function
2. Training algorithms: standard back-propagation (SB), modified back-propagation (MB1)
3. Data sets: learning, validation, test
4. Coding at the input and output interfaces
5. Performance measures
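The performance measure quoted in the results below is σRMS. The slides do not write it out; presumably it is the standard root-mean-square deviation between experimental and network values, e.g. for the mass excess:

\sigma_{\mathrm{RMS}} = \left[\frac{1}{N}\sum_{i=1}^{N}\left(\Delta M_i^{\mathrm{exp}} - \Delta M_i^{\mathrm{net}}\right)^{2}\right]^{1/2}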

Neural network elements
[Figure: a small feed-forward net with input units x1, x2 connected through weights (e.g. w13, w36) to an output unit o.]
1. Independent variables of a known pattern are presented at the input units.
2. The information proceeds towards the output unit.
3. The output value o is compared with the target value t.
4. The connection weights are changed so that the cost function, proportional to (t − o)², decreases.
5. The procedure is repeated for the next pattern.
6. Learning continues until the error criterion is satisfied.

Supervised learning (on-line updating)
Training data consist of input patterns with target outputs. Each input pattern is passed through the feed-forward neural network, the error between the network output and the target output is calculated, and the weights are adjusted to reduce the error. If the error criterion is satisfied, training stops; otherwise the next pattern is presented. A minimal code sketch of this loop is given below.
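A minimal sketch of this on-line (pattern-by-pattern) back-propagation loop in Python/NumPy, for a single-hidden-layer net with a sigmoid hidden layer and a linear output unit. The [4-20-1] architecture, learning rate, and stopping threshold are illustrative assumptions, not the values used in the talk:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 20, 1                  # illustrative [4-20-1] net
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))
eta = 0.05                                     # learning rate (assumed)

def train_online(patterns, targets, max_epochs=1000, tol=1e-3):
    """On-line back-propagation: weights are updated after every pattern."""
    global W1, W2
    for epoch in range(max_epochs):
        sq_err = 0.0
        for x, t in zip(patterns, targets):
            h = sigmoid(W1 @ x)                # forward pass, hidden layer
            o = W2 @ h                         # linear output unit
            e = t - o                          # error against the target
            sq_err += float(e @ e)
            dh = (W2.T @ e) * h * (1.0 - h)    # back-propagated hidden delta
            W2 += eta * np.outer(e, h)         # gradient step for 0.5*(t-o)^2
            W1 += eta * np.outer(dh, x)
        if sq_err / len(patterns) < tol:       # error criterion satisfied
            return epoch
    return max_epochs
```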

References
Calculations of ΔΜ with artificial neural networks:
- S. Gazula, J. W. Clark and H. Bohr, Nucl. Phys. A540 (1992) 1
- K. A. Gernoth, J. W. Clark, J. S. Prater and H. Bohr, Phys. Lett. B300 (1993) 1
- K. A. Gernoth and J. W. Clark, Comp. Phys. Commun. 88 (1995) 1
- E. Mavrommatis, S. Athanassopoulos, K. A. Gernoth and J. W. Clark, Condensed Matter Theories, Vol. 15, G. S. Anagnostatos et al., eds. (Nova Science Publishers, N.Y. 2000) p. …
- J. W. Clark, E. Mavrommatis, S. Athanassopoulos, A. Dakos and K. A. Gernoth, in Proceedings of the Conference on “Fission Dynamics of Atomic Clusters and Nuclei”, D. M. Brink et al., eds. (World Scientific, Singapore 2002) p. 76
- S. Athanassopoulos, E. Mavrommatis, K. A. Gernoth and J. W. Clark, submitted to Phys. Rev. C

Calculations of T1/2 with artificial neural networks:
- E. Mavrommatis, A. Dakos, K. A. Gernoth and J. W. Clark, Condensed Matter Theories, Vol. 13, J. Da Providencia and F. B. Malik, eds. (Nova Science Publishers, Commack, NY, 1998) p. …
- E. Mavrommatis, S. Athanassopoulos, A. Dakos, K. A. Gernoth and J. W. Clark, in Proceedings of the International Conference on “Structure of the Nucleus at the Dawn of the 21st Century”, Bonsignori et al., eds. (World Scientific, Singapore 2001) p. …
- A. Dakos, E. Mavrommatis, K. A. Gernoth and J. W. Clark, to be submitted for publication

Results (mass excess ΔΜ)
[Table: σRMS (MeV) on the learning, validation, and test sets for several network architectures (I-H1-…-HL-O) [P = number of parameters] and input codings, compared with Möller et al. (FRDM), ADNDT 59 (1995), and Pearson et al. (HFBCS-1), ADNDT 77 (2001). The column placement of most entries was lost in transcription; the surviving values are:]
- Net ( ) [421], Z & N in binary, A & Z−N in analog: (O), (N)
- Net (4-40-1) [245]: 1.068
- Net ( )+ [281], Z & N in analog and parity: 2.280 (O), 2.158 (NB)
- Möller et al. (FRDM): 0.735 (O), 0.697 (N)
- Pearson et al. (HFBCS-1): (NB)
- Net ( )* [281], Z & N in analog and parity: 0.962 (O), 1.485 (N)
- Net ( )** [281], Z & N in analog and parity: (NB)
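For concreteness, a small helper for the σRMS entries above, under the assumption stated earlier that σRMS is the plain root-mean-square deviation:

```python
import numpy as np

def sigma_rms(dm_exp, dm_net):
    """RMS deviation (MeV) between experimental and model mass excesses."""
    dm_exp = np.asarray(dm_exp, dtype=float)
    dm_net = np.asarray(dm_net, dtype=float)
    return float(np.sqrt(np.mean((dm_exp - dm_net) ** 2)))
```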

Nuclear Masses with ANNs
Mass excess ΔΜ [binding energies, separation energies, Q-values]. Experimental values from NUBASE (G. Audi et al., Nucl. Phys. A624 (1997) 1).

Net [ ]* [281]
- Data sets: learning: 1323 (O), validation: 351 (N), from MN 1981; prediction: NUBASE 158 (NB)
- Training: modified back-propagation algorithm (modification of the learning and momentum parameters)
- Coding: 4 input units: Z, N in analog plus Z, N parities; 1 output unit: ΔΜ in analog (S3 scaling)
- Performance measure: σRMS

Net [ ]** [281]
- Data sets: learning: 1303, validation: 351, from mixed MN (FRDM); prediction: NUBASE 158 (NB)
- Training: as above
- Coding: 4 input units: Z, N in analog plus Z, N parities; 1 output unit: ΔΜ in analog (S3 scaling)
- Performance measure: σRMS

A code sketch of this input/output coding follows.
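A hedged sketch of the "Z, N in analog plus parities" input coding and the scaled analog output; the normalisation ranges and the output map are placeholders, since the actual constants (and the S3 scaling) are not given on the slide:

```python
import numpy as np

Z_MAX, N_MAX = 120.0, 180.0   # assumed normalisation ranges for the analog inputs

def encode_mass_pattern(Z, N):
    """4 input units: Z and N as scaled analog values plus their parities."""
    return np.array([Z / Z_MAX, N / N_MAX, Z % 2, N % 2], dtype=float)

def encode_mass_target(delta_m, dm_min=-100.0, dm_max=200.0):
    """1 output unit: mass excess (MeV) mapped into (0, 1).
    Placeholder linear map standing in for the S3 scaling of the talk."""
    return np.array([(delta_m - dm_min) / (dm_max - dm_min)])
```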

[Figure: ΔΜ results for separate O & N data sets (MN), net [ ]*]

[Figure: ΔΜ results for mixed data sets (O & N) (MN), net [ ]**]

Nuclear Half-lives of β⁻-decaying Nuclides with ANNs
Half-life T1/2 (ln T1/2) [ground state, β⁻ mode, branching ratio = 1]. Experimental values from NUBASE (G. Audi et al., Nucl. Phys. A624 (1997) 1).

Best net: [ ]* [191]
- Data sets: T1/2 ≤ 10^6 s [learning: 518, prediction: 174] (base B)
- Training: standard back-propagation (with momentum term)
- Coding: 16 input units: Z, N in binary; 1 input unit: Q in analog; 1 output unit: ln T1/2 in analog
- Performance measures: σRMS, compared with Möller et al. and Klapdor et al.

A sketch of this coding is given below.
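A minimal sketch of the half-life coding, assuming 8 binary units each for Z and N (16 in total, as the slide's unit count implies); the Q-value normalisation and the absence of further ln T1/2 scaling are assumptions:

```python
import numpy as np

def to_binary(n, bits=8):
    """Code an integer on `bits` binary input units, most significant bit first."""
    return np.array([(n >> i) & 1 for i in reversed(range(bits))], dtype=float)

def encode_halflife_pattern(Z, N, Q, q_max=25.0):
    """16 binary units for Z and N, plus 1 analog unit for the Q-value (q_max assumed, MeV)."""
    return np.concatenate([to_binary(Z), to_binary(N), [Q / q_max]])

def encode_halflife_target(T_half_sec):
    """1 output unit: ln T1/2 in analog (any further scaling omitted)."""
    return np.array([np.log(T_half_sec)])
```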

Conclusions
- Global models based on ANNs for nuclear masses are approaching the accuracy of models based on Hamiltonian theory.
- Global models based on ANNs for half-lives of β⁻ decay are promising.

Prospects
- Further development of global models based on ANNs for nuclear masses, half-lives, etc. (optimization techniques, pruning, construction, etc.)
- Further investigation of models of the mass differences DM = ΔM_exp − ΔM_FRDM
- Further insight into the statistical interpretation and modeling with ANNs
- The inverse problem

Neural network modeling, as well as other statistical strategies based on new algorithms of artificial intelligence, may prove to be a useful asset in the further exploration of nuclear phenomena far from β stability.