Training and Testing Neural Networks. Seoul National University, Department of Industrial Engineering, Production Information Systems Laboratory. Lee Sang-jin.



Contents Introduction When Is the Neural Network Trained? Controlling the Training Process with Learning Parameters Iterative Development Process Avoiding Over-training Automating the Process

Introduction (1) Training a neural network –to perform a specific processing function 1) which parameters are involved? 2) how are they used to control the training process? 3) how does management of the training data affect the training process? –Development process 1) data preparation 2) selection of the neural network model and architecture 3) training the neural network –determined by the structure of the neural network and its function –Application –once "trained"

Introduction (2) Learning parameters for neural networks A disciplined approach to iterative neural network development

Introduction (3)

When Is the Neural Network Trained? When the network is trained depends on –the type of neural network –the function being performed: classification, clustering data, building a model, or time-series forecasting –the acceptance criteria: whether the network meets the specified accuracy Once trained, the connection weights are "locked" –they cannot be adjusted

When Is the Neural Network Trained? Classification (1) Measure of success: percentage of correct classifications –incorrect classifications –no classification: "unknown" or "undecided" outputs, set by a threshold limit
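The threshold idea above can be sketched in a few lines of Python. This is a minimal illustration, not from the original slides; the function names and the 0.5 default are assumptions.

```python
def classify(outputs, threshold=0.5):
    """Return the winning class index, or None ("unknown"/"undecided")
    when the strongest output does not clear the threshold limit."""
    best = max(range(len(outputs)), key=lambda i: outputs[i])
    return best if outputs[best] >= threshold else None

def accuracy(predictions, targets):
    """Percentage of correct classifications; an "unknown" (None)
    prediction counts as incorrect."""
    correct = sum(1 for p, t in zip(predictions, targets) if p == t)
    return 100.0 * correct / len(targets)
```

Counting "unknown" as incorrect is one design choice; some applications track it separately from misclassifications.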

When Is the Neural Network Trained? Classification (2) Confusion matrix: the possible output categories and the corresponding percentages of correct and incorrect classifications
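A confusion matrix of the kind the slide describes can be computed directly from label pairs. A minimal sketch (hypothetical helper, assuming every class listed actually appears in the actual labels):

```python
from collections import Counter

def confusion_matrix(actual, predicted, classes):
    """Rows = actual category, columns = predicted category; each entry
    is the percentage of that actual class assigned to that column."""
    counts = Counter(zip(actual, predicted))
    totals = Counter(actual)
    return [[100.0 * counts[(a, p)] / totals[a] for p in classes]
            for a in classes]
```

Correct classifications sit on the diagonal; everything off-diagonal shows which categories the network confuses.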

When Is the Neural Network Trained? Clustering (1) Output of a clustering network –open to analysis by the user The training regimen is determined by: –the number of times the data is presented to the neural network –how fast the learning rate and the neighborhood decay Adaptive resonance theory (ART) network training –vigilance parameter –learning rate

When Is the Neural Network Trained? Clustering (2) Locking the ART network weights –disadvantage: no further online learning ART networks are sensitive to the order of the training data

When Is the Neural Network Trained? Modeling (1) Modeling or regression problems Usual error measure –RMS (root mean square) error Measures of prediction accuracy –average error –MSE (mean squared error) –RMS (root mean square) error The expected behavior –the RMS error starts out very high but gradually settles toward a stable minimum
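The RMS error measure above is straightforward to compute. A self-contained sketch (the function name is assumed, not from the slides):

```python
import math

def rms_error(targets, outputs):
    """Root-mean-square error between desired (target) and actual
    network outputs: sqrt of the mean of the squared differences."""
    return math.sqrt(
        sum((t - o) ** 2 for t, o in zip(targets, outputs)) / len(targets)
    )
```

MSE is the same quantity without the square root; RMS is often preferred because it is in the same units as the output.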

When Is the Neural Network Trained? Modeling (2)

When Is the Neural Network Trained? Modeling (3) When the error does not stabilize –the network falls into a local minimum the prediction error doesn't fall, or oscillates up and down –remedies reset (randomize) the weights and start again revisit the training parameters, the data representation, or the model architecture
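Detecting the stalled behavior the slide describes can be automated by watching the recent error history. A minimal plateau detector (names and thresholds are assumptions for illustration):

```python
def is_stuck(error_history, window=10, min_improvement=1e-4):
    """Detect a plateau: over the last `window` epochs the prediction
    error has not fallen by at least `min_improvement`. When this
    returns True, reset (randomize) the weights and start again."""
    if len(error_history) < window:
        return False
    recent = error_history[-window:]
    return recent[0] - min(recent) < min_improvement
```

A training loop would call this each epoch and trigger a weight reset, or a change of parameters or architecture, when it fires.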

When Is the Neural Network Trained? Forecasting (1) Forecasting –a prediction problem –RMS (root mean square) error –visualize: a time plot of the actual and desired network output Time-series forecasting –long-term trend influenced by cyclical factors etc. –a random component adds variability and uncertainty –neural networks are excellent tools for modeling complex time-series problems recurrent neural networks: nonlinear dynamic systems –no self-feedback loops & no hidden neurons
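Framing a time series as a prediction problem usually means turning it into (input window, next value) pairs. A minimal sketch of that preprocessing step (the helper name is an assumption):

```python
def sliding_windows(series, n_inputs):
    """Turn a time series into (input window, next value) training
    pairs for a forecasting network: each window of n_inputs past
    values predicts the value that immediately follows it."""
    return [(series[i:i + n_inputs], series[i + n_inputs])
            for i in range(len(series) - n_inputs)]
```

The same pairs also support the time plot the slide mentions: feed each window to the network and plot predictions against the true next values.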

When Is the Neural Network Trained? Forecasting (2)

Controlling the Training Process with Learning Parameters (1) Learning parameters depend on –the type of learning algorithm –the type of neural network

Controlling the Training Process with Learning Parameters (2) - Supervised training [diagram: a training pattern is fed into the neural network, and its prediction is compared with the desired output] 1) How the error is computed 2) How big a step we take when adjusting the connection weights

Controlling the Training Process with Learning Parameters (3) - Supervised training Learning rate –magnitude of the change when adjusting the connection weights –based on the current training pattern and desired output A large rate –giant oscillations A small rate –slowly learns the major features of the problem and generalizes to new patterns
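The learning-rate step size can be shown concretely with a single gradient-descent weight update. A minimal sketch (not the slides' own algorithm; names are assumptions):

```python
def sgd_step(weights, gradients, learning_rate):
    """One weight update: each weight moves against its error gradient
    by a step scaled by the learning rate. A large rate takes big steps
    (risking oscillation); a small rate learns slowly but smoothly."""
    return [w - learning_rate * g for w, g in zip(weights, gradients)]
```

Running this repeatedly with a rate that is too large makes the error bounce; with a small rate the error falls gradually.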

Controlling the Training Process with Learning Parameters (4) - Supervised training Momentum –filters out high-frequency changes in the weight values –prevents oscillating around a set of values –past errors influence the updates for a long time Error tolerance –how close is close enough –often 0.1 –necessity: the net input must be quite large to drive the output all the way to 0 or 1
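The momentum term can be sketched as a running velocity blended into each weight update. This is a generic momentum update for illustration, with assumed names and default coefficients:

```python
def momentum_step(weights, gradients, velocity, lr=0.1, momentum=0.9):
    """Momentum blends the previous update (velocity) into the current
    one, filtering out high-frequency changes in the weight values and
    damping oscillation around a set of values."""
    velocity = [momentum * v - lr * g for v, g in zip(velocity, gradients)]
    weights = [w + v for w, v in zip(weights, velocity)]
    return weights, velocity
```

Because the velocity carries over between steps, each past error keeps influencing the weights for many subsequent updates, which is exactly the long-lasting influence the slide refers to.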

Controlling the Training Process with Learning Parameters (5) - Unsupervised learning Parameters –selection of the number of outputs the granularity of the segmentation (clustering, segmentation) –learning parameters (once the architecture is set) neighborhood parameter: Kohonen maps vigilance parameter: ART

Controlling the Training Process with Learning Parameters (6) - Unsupervised learning Neighborhood –the area around the winning unit, where the non-winning units will also be modified –roughly half the size of the maximum dimension of the output layer –2 methods of control: a square neighborhood function with a linear decrease in the learning rate, or a Gaussian-shaped neighborhood with an exponential decay of the learning rate –the number-of-epochs parameter –important in keeping the locality of the topographic maps
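The second control method above, a Gaussian neighborhood with exponentially decaying learning rate, can be sketched in two small functions (names and decay schedule are assumptions, not the slides' formulas):

```python
import math

def gaussian_neighborhood(dist, radius):
    """Gaussian-shaped neighborhood: units near the winning unit are
    updated strongly, distant units hardly at all."""
    return math.exp(-(dist ** 2) / (2 * radius ** 2))

def decayed_rate(initial_rate, epoch, total_epochs):
    """Exponential decay of the learning rate over the training run."""
    return initial_rate * math.exp(-epoch / total_epochs)
```

In a Kohonen map, each non-winning unit's update would be scaled by the product of these two factors, and the radius itself typically shrinks over epochs as well.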

Controlling the Training Process with Learning Parameters (7) - Unsupervised learning Vigilance –controls how picky the neural network is going to be when clustering data –how discriminating it is when evaluating the differences between two patterns –what counts as close enough –too-high vigilance uses up all of the output units
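The vigilance test can be illustrated with an ART-style match check on binary patterns. This is a simplified sketch of the ART 1 match criterion, with assumed names; real ART networks add resonance, search, and weight updates around it:

```python
def passes_vigilance(pattern, prototype, vigilance):
    """ART-style match test on binary patterns: the overlap with the
    stored prototype must reach the vigilance fraction of the input.
    High vigilance makes the network pickier, creating more clusters."""
    overlap = sum(p & q for p, q in zip(pattern, prototype))
    norm = sum(pattern)
    return norm > 0 and overlap / norm >= vigilance
```

When the test fails for every existing prototype, the network recruits a new output unit, which is why a too-high vigilance can use up all of the output units.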

Iterative Development Process (1) Network convergence issues –error falls quickly and then stays flat / reaches the global minimum –error oscillates up and down / trapped in a local minimum –remedies add some random noise reset the network weights and start all over again revisit the design decisions

Iterative Development Process (2)

Iterative Development Process (3) Model selection –an inappropriate neural network model for the function to perform –add hidden units or another layer of hidden units –a strong temporal or time element embedded in the problem: recurrent backpropagation or radial basis function networks Data representation –a key parameter is not scaled or coded –a key parameter is missing from the training data –experience

Iterative Development Process (4) Model architecture –does not converge: the problem is too complex for the architecture –a few additional hidden units: good –adding many more? the network just memorizes the training patterns –keeping the hidden layers as thin as possible gets the best results

Avoiding Over-training Over-training –repeatedly training on the same patterns –the network cannot generalize –it fails on new patterns –remedy: switch between training and testing data
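Switching between training and testing data is the basis of early stopping, which can be sketched as a loop that halts once the test error stops improving. The function names, the callback interface, and the patience mechanism are assumptions for illustration:

```python
def train_with_early_stopping(train_step, test_error, max_epochs=100,
                              patience=5):
    """Alternate training and testing each epoch; stop when the test
    error has not improved for `patience` epochs, to avoid
    over-training (memorizing the training patterns)."""
    best, since_best = float("inf"), 0
    for epoch in range(max_epochs):
        train_step(epoch)          # one pass over the training data
        err = test_error(epoch)    # evaluate on held-out test data
        if err < best:
            best, since_best = err, 0
        else:
            since_best += 1
            if since_best >= patience:
                break              # test error rising: stop training
    return best
```

The training error keeps falling as the network memorizes; the held-out test error is what reveals the onset of over-training.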

Automating the Process Automate the selection of the appropriate number of hidden layers and hidden units –pruning out nodes and connections –genetic algorithms –the opposite approach to pruning: incrementally growing the network –the use of intelligent agents