ARTIFICIAL NEURAL NETWORK ANALYSIS OF MULTIPLE IBA SPECTRA
H. F. R. Pinho 1,2, A. Vieira 2, N. R. Nené 1, N. P. Barradas 1,3
1 Instituto Tecnológico e Nuclear, E.N. 10, 2685 Sacavém, Portugal; 2 ISEP, R. São Tomé, 4200 Porto, Portugal; 3 Centro de Física Nuclear da Universidade de Lisboa, Avenida Prof. Gama Pinto 2, 1699 Lisboa, Portugal


1. INTRODUCTION

We have previously used artificial neural networks (ANNs) for the automated analysis of Rutherford backscattering (RBS) data [1,2]. One of the limitations was that only a single spectrum could be analyzed from each sample. When more than one spectrum is collected, each has to be analyzed separately, leading to different results and hence reduced accuracy. Furthermore, complementary data are often collected, either under different experimental conditions or even with a different technique such as elastic recoil detection analysis (ERDA) or particle-induced x-ray emission. In that case, a simultaneous and self-consistent analysis of all the data is essential.

In this work we developed an ANN-based code to analyze multiple RBS and ERDA spectra collected from the same sample. The ANN developed was applied to a very simple case: determination of the stoichiometry of TiNOH samples.

3. EXPERIMENTAL CONDITIONS

Samples: 15 TiNxOyHz samples were deposited by reactive rf magnetron sputtering from a high-purity Ti target onto polished high-speed stainless steel (AISI M2).

Spectra: 35 MeV 35Cl7+ ERDA spectra were collected. Five spectra were obtained simultaneously from each sample: one recoil spectrum for each element, plus the backscattering spectrum from the Ti (fig. 2). Conventional data analysis was done with the code NDF [4].

2. ARTIFICIAL NEURAL NETWORKS

ANNs recognize recurring patterns in input data without any knowledge of the underlying physics. They are ideal for doing automatically what analysts have long done by hand, relating specific data features to specific sample properties, because spectra can be treated just as pictures. How?
By supervised learning from known examples:
- training phase: give the ANN many examples where the solution is known beforehand;
- let the ANN adjust itself to this training set;
- check it against an independent set of data, the test set;
- repeat until no further improvement is found.

Network architecture (N, I1, ..., In, M) (fig. 1): N is the number of inputs, M the number of outputs, and Ii the number of nodes in hidden layer i.

Present work:
- Without pre-processing: N = 130 yield values (the 26 channels of each spectrum, one after the other, including the leading edge); M = 4 (the concentration of each element).
- With pre-processing: N = 5 (the integral of the 26 channels of each spectrum); M = 4 (the concentration of each element).

Network connectivity:
- Fully linked: full connectivity (the standard architecture) from all nodes in one layer to all nodes in the next layer.
- Cluster linked: the inputs corresponding to each spectrum (either its 26 yields or their integral) form one cluster, connected only to a given set of nodes in the first hidden layer. The other clusters, corresponding to the other spectra, are not connected to that set of nodes.

Backpropagation:
- Each connection has a given weight, initially random.
- Give known values to the inputs and outputs.
- Adjust the weights to minimize the difference between the given outputs and the calculated ones.

4. TRAIN AND TEST SET

ANN training: examples corresponding to a very broad range of possible elemental concentrations were fed to the net during training. These examples were constructed from simulated experimental data (fig. 3) with Poisson noise added. The training consisted of 50 iteration steps.

ANN testing: independent examples (not used during training) were used to test the coefficients. A thin carbon layer of varying thickness was included, in order to emulate real-life conditions, where such layers are often deposited during the experiment due to poor vacuum conditions.
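The construction of a training example as described above can be sketched in a few lines. This is a minimal illustration, not the original code: the `simulate_spectra` callable stands in for the actual simulation (done with NDF in this work), and the toy simulator below is an assumption made purely so the sketch runs.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 26   # channels kept per spectrum (including the leading edge)
N_SPECTRA = 5     # 4 recoil spectra + 1 backscattering spectrum

def make_example(concentrations, simulate_spectra):
    """Build one training example from simulated spectra with Poisson noise.

    `simulate_spectra` maps the 4 elemental concentrations to an array of
    5 spectra x 26 channels. Poisson noise emulates counting statistics.
    """
    clean = simulate_spectra(concentrations)          # shape (5, 26)
    noisy = rng.poisson(clean).astype(float)
    x_raw = noisy.reshape(-1)                         # 130 inputs: no pre-processing
    x_pre = noisy.sum(axis=1)                         # 5 inputs: integral of each spectrum
    return x_raw, x_pre, np.asarray(concentrations)   # targets: 4 concentrations

def toy_sim(c):
    """Hypothetical stand-in simulator: flat spectra scaled by concentration."""
    base = np.linspace(50.0, 200.0, N_CHANNELS)
    scale = np.append(np.asarray(c), c[0])            # 5th spectrum from the Ti signal
    return np.outer(scale, base)

x_raw, x_pre, y = make_example([0.4, 0.3, 0.2, 0.1], toy_sim)
print(x_raw.shape, x_pre.shape, y.shape)              # (130,) (5,) (4,)
```

The same noisy spectra feed either network variant: `x_raw` matches the (130, ..., 4) architectures and `x_pre` the (5, ..., 4) ones.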
5. RESULTS

e_rms: root mean square error. δ: relative error averaged over all samples measured, taking as reference the results given by NDF (fig. 3 and tabs. 1, 2, 3).

[Fig. 1: network layout, from inputs to outputs.]
[Fig. 2: the five spectra obtained simultaneously from each sample.]
[Fig. 3: train and test flow — NDF-generated RBS and ERDA spectra form the train set and test set; the ANN training process yields the results.]
[Tab. 1: e_rms (train and test sets) and δ(Ti), δ(N), δ(O), δ(H) for fully linked networks without pre-processing; architectures (130,5,4), (130,15,4), (130,25,4), (130,50,4), (130,10,5,4), (130,20,10,4), (130,20,20,4), (130,30,20,4), (130,40,20,4), (130,40,30,4), (130,50,30,4), (130,50,40,4).]
[Tab. 2: the same quantities for fully linked networks with pre-processing; architectures (5,5,4), (5,15,4), (5,25,4), (5,50,4), (5,10,5,4), (5,20,10,4), (5,20,20,4), (5,30,20,4), (5,40,20,4), (5,40,30,4), (5,50,30,4), (5,50,40,4).]
[Tab. 3: the same quantities for cluster-linked networks with and without pre-processing; architectures (130,5,4), (130,10,4), (130,15,4), (130,25,4), (130,50,4), (5,5,4), (5,10,4), (5,15,4), (5,25,4), (5,50,4).]

6. SUMMARY AND OUTLOOK

We developed artificial neural networks capable of analyzing multiple ion beam analysis spectra collected from the same sample. The ANNs were applied to a simple problem, the determination of the stoichiometry of TiNOH samples measured with heavy-ion ERDA. This allowed us to make a thorough study of network architecture, connectivity, and the effectiveness of pre-processing.

Small networks using the spectral yields with no pre-processing achieve the best results on real experimental data. However, the ANNs must be cluster-linked, so that each spectrum is connected only to a subset of nodes in the first hidden layer exclusively dedicated to that spectrum. Effective automatic pre-processing (as opposed to a priori pre-processing) is thus achieved, leading to very efficient networks that are easy to train. We expect this type of architecture to scale easily to complex multiple-IBA-spectra problems.

The authors gratefully acknowledge the financial support of FCT under grant POCTI/CTM/40059/

REFERENCES
[1] N. P. Barradas and A. Vieira, Phys. Rev. E 62 (2000).
[2] V. Matias, G. Öhl, J. C. Soares, N. P. Barradas, A. Vieira, P. P. Freitas, S. Cardoso, Phys. Rev. E 67 (2003).
[3] E. Alves, A. Ramos, N. P. Barradas, F. Vaz, P. Cerqueira, L. Rebouta, U. Kreissig, Surf. Coatings Technology (2004) 372.
[4] N. P. Barradas, C. Jeynes, and R. P. Webb, Appl. Phys. Lett. 71 (1997) 291.
[5] C. M. Bishop, Neural Networks for Pattern Recognition (Oxford: Oxford University Press, 1995).
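The cluster-linked connectivity that proved essential above can be sketched as a mask on the first-layer weight matrix, so that each spectrum's inputs reach only the hidden nodes dedicated to that spectrum, while backpropagation proceeds as usual. The layer sizes, learning rate, and single-sample update below are assumptions for a minimal runnable sketch, not the original implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

N_SPECTRA, N_CHANNELS = 5, 26
N_IN = N_SPECTRA * N_CHANNELS           # 130 inputs (no pre-processing)
N_HID, N_OUT = 25, 4                    # e.g. a (130,25,4) network
NODES_PER_CLUSTER = N_HID // N_SPECTRA  # hidden nodes dedicated to each spectrum

# Mask: input block s connects only to the hidden nodes of cluster s.
mask = np.zeros((N_IN, N_HID))
for s in range(N_SPECTRA):
    rows = slice(s * N_CHANNELS, (s + 1) * N_CHANNELS)
    cols = slice(s * NODES_PER_CLUSTER, (s + 1) * NODES_PER_CLUSTER)
    mask[rows, cols] = 1.0

W1 = rng.normal(0, 0.1, (N_IN, N_HID)) * mask   # cluster-linked first layer
W2 = rng.normal(0, 0.1, (N_HID, N_OUT))         # fully linked output layer

def forward(x):
    h = np.tanh(x @ W1)
    return h @ W2, h

def backprop_step(x, y, lr=1e-3):
    """One gradient-descent step; multiplying by the mask keeps forbidden links at zero."""
    global W1, W2
    out, h = forward(x)
    err = out - y                                    # gradient of the squared error
    gW2 = np.outer(h, err)
    gW1 = np.outer(x, (err @ W2.T) * (1 - h**2)) * mask
    W1 -= lr * gW1
    W2 -= lr * gW2
    return float(np.mean(err**2))

x = rng.random(N_IN)
y = np.array([0.4, 0.3, 0.2, 0.1])                   # target concentrations
losses = [backprop_step(x, y) for _ in range(50)]    # 50 iteration steps, as in the poster
print(losses[0] > losses[-1])                        # the error decreases during training
```

A fully linked network is recovered by setting the mask to all ones; the cluster mask is what lets the first hidden layer perform the automatic per-spectrum pre-processing noted in the summary.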