Neural networks – Hands on

Neural networks – Hands on
- Delta rule and the backpropagation algorithm
- MetaNeural format for predictive data mining
- Iris data
- Magnetocardiogram data

Neural net yields weights to map inputs to outputs
[Figure: a feed-forward network whose inputs are molecular descriptors (molecular weight, H-bonding, hydrophobicity, electrostatic interactions) and whose weighted connections (w11, w23, w34, ...) and hidden units project them onto observables (boiling point, biological response).]
There are many algorithms that can determine the weights for ANNs.
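
As a concrete illustration of one such algorithm (a sketch, not the course code): the delta rule from the title slide nudges each weight in proportion to the prediction error. A minimal NumPy version for a single linear neuron, on made-up data:

    import numpy as np

    # Delta rule: w <- w + eta * (t - y) * x, applied pattern by pattern.
    # Toy data (assumed for illustration): the target is t = 2*x1 - x2.
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    t = np.array([0.0, -1.0, 2.0, 1.0])

    eta = 0.1              # learning rate
    w = np.zeros(2)        # initial weights
    for epoch in range(200):
        for x_i, t_i in zip(X, t):
            y_i = w @ x_i                  # linear neuron output
            w += eta * (t_i - y_i) * x_i   # error-driven update
    print(w)               # converges toward [2, -1]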

McCulloch-Pitts neuron
[Figure: inputs x_1 ... x_N are multiplied by weights w_1 ... w_N, summed (S), and passed through a transfer function f() to produce the output y = f(sum_i w_i x_i).]
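
A minimal sketch of the computation the figure depicts; the transfer function here is assumed to be the sigmoid used later in these slides:

    import numpy as np

    def neuron(x, w, f=lambda s: 1.0 / (1.0 + np.exp(-s))):
        """McCulloch-Pitts-style unit: weighted sum, then transfer function f."""
        return f(np.dot(w, x))

    x = np.array([0.5, -1.0, 2.0])   # inputs x_1 .. x_N (toy values)
    w = np.array([0.1, 0.4, -0.3])   # weights w_1 .. w_N
    print(neuron(x, w))              # y = f(sum_i w_i * x_i)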

Neural network as a collection of M-P neurons
[Figure: inputs x1, x2, ... feed a first hidden layer through weights w11, w12, w13, w21, w22, w23, w32, ..., which in turn feeds a second hidden layer and a single output neuron producing y.]
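
Since the title slide also promises backpropagation, here is a minimal sketch of it for a network of such neurons (one hidden layer for brevity, squared error, sigmoid units; layer sizes and the single training pattern are made up):

    import numpy as np

    sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))

    rng = np.random.default_rng(0)
    W1 = rng.uniform(-0.2, 0.2, size=(3, 2))    # input -> hidden weights
    w2 = rng.uniform(-0.2, 0.2, size=3)         # hidden -> output weights

    x, t, eta = np.array([0.7, 0.1]), 1.0, 0.5  # one pattern, its target, learning rate
    for epoch in range(1000):
        h = sigmoid(W1 @ x)                     # forward pass
        y = sigmoid(w2 @ h)
        delta_out = (y - t) * y * (1 - y)       # backward pass: output-layer delta
        delta_hid = (w2 * delta_out) * h * (1 - h)  # hidden-layer deltas
        w2 -= eta * delta_out * h               # gradient-descent weight updates
        W1 -= eta * np.outer(delta_hid, x)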

Standard Data Mining Terminology
Basic terminology:
- MetaNeural format
- Descriptors, features, response (or activity) and ID
- Classification versus regression
- Modeling / feature detection
- Training / validation / calibration
- Vertical and horizontal views of the data
Outliers, rare events and minority classes
Data preparation:
- Data cleansing
- Scaling
Leave-one-out and leave-several-out validation
Confusion matrix and ROC curves
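
Two of the ideas above, leave-one-out validation and the confusion matrix, in a short scikit-learn sketch (scikit-learn is a stand-in here, not the course toolchain, and the classifier choice is arbitrary):

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import confusion_matrix
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    X, y = load_iris(return_X_y=True)
    # Leave-one-out: each pattern is predicted by a model trained on all the others.
    pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=LeaveOneOut())
    print(confusion_matrix(y, pred))   # rows: true class, columns: predicted class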

TERMINOLOGY (demo: iris_view.bat)
Standard data mining problem: header and data
MetaNeural format:
- descriptors and/or features
- response (or activity to predict)
- pattern ID
- data matrix
Validation/calibration: training/validation/test sets
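
The slide does not spell out the exact file layout; assuming each row holds the descriptor values followed by the response and the pattern ID (the bullet order above), a writer sketch with made-up rows:

    import csv

    # Hypothetical patterns: (descriptors, response, pattern ID) in an assumed layout.
    patterns = [
        ([5.1, 3.5, 1.4, 0.2], 1, "iris_001"),
        ([7.0, 3.2, 4.7, 1.4], 2, "iris_051"),
    ]
    with open("root.pat", "w", newline="") as fh:
        writer = csv.writer(fh, delimiter=" ")
        for descriptors, response, pid in patterns:
            writer.writerow([*descriptors, response, pid])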

UC IRVINE DATA REPOSITORY
Datafile name: Fisher's Iris
Subjects: Agriculture, famous datasets
Description: This dataset was made famous by Fisher, who used it to illustrate principles of discriminant analysis. It contains 6 variables with 150 observations.
Reference: Fisher, R. A. (1936). The Use of Multiple Measurements in Taxonomic Problems. Annals of Eugenics 7, 179-188.
Story name: Fisher's Irises
Authorization: free use
Number of cases: 150
Variables:
1. Species_No: flower species as a code
2. Species_Name: species name
3. Petal_Width: petal width
4. Petal_Length: petal length
5. Sepal_Width: sepal width
6. Sepal_Length: sepal length
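
The same dataset ships with scikit-learn, which is a convenient way to inspect it outside the course tools:

    from sklearn.datasets import load_iris

    iris = load_iris()
    print(iris.feature_names)   # sepal/petal length and width, in cm
    print(iris.data.shape)      # (150, 4): 150 observations, 4 measurements
    print(iris.target[:5])      # species as a code (0, 1, 2)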

The ANALYZE code has neural network modules built in.
Run one of:
- analyze root.pat 4331 (single training and testing)
- analyze root.pat 4332 (leave-one-out)
- analyze root.pat 4333 (bootstrap mode)
Results from analyze are written to resultss.xxx and resultss.ttt.
Note that patterns have to be properly scaled first.
A file named meta overrides the default input file for analyze.
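
For readers unfamiliar with bootstrap mode, this is the resampling idea in general terms (a sketch, not ANALYZE's internals; the classifier is an arbitrary stand-in):

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    rng = np.random.default_rng(0)
    n = len(y)
    scores = []
    for b in range(100):                         # e.g., 100 bootstrap replicates
        train = rng.integers(0, n, size=n)       # resample patterns with replacement
        oob = np.setdiff1d(np.arange(n), train)  # out-of-bag patterns form the test set
        model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
        scores.append(model.score(X[oob], y[oob]))
    print(np.mean(scores), np.std(scores))       # accuracy estimate with its spread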

Neural Network Module in the Analyze Code
[Diagram: ROOT.PAT and ROOT.TES, plus optional (ROOT.WGT), (ROOT.FWT) and (ROOT.DBD), flow into Analyze; Analyze writes resultss.XXX, resultss.TTT, ROOT.TRN, (ROOT.DBD), ROOT.WGT and ROOT.FWT.]
Use "analyze root 4331" for the easy way (the file meta lets you override the defaults).

MetaNeural Input File for the ROOT Problem

4         => 4 layers
2         => 2 inputs
16        => # hidden neurons in layer #1
4         => # hidden neurons in layer #2
1         => # outputs
300       => epoch length (hint: always use 1, for the entire batch)
0.01      => learning parameters by weight layer (hint: 1/# patterns or 1/# epochs)
0.01
0.5       => momentum parameters by weight layer (hint: use 0.5)
0.5
10000000  => some very large number of training epochs
200       => error display refresh rate
1         => sigmoid transfer function
1         => temperature of the sigmoid
check.pat => name of file with training patterns (test patterns in root.tes)
0         => not used (legacy entry)
100       => not used (legacy entry)
0.02      => exit training if error < 0.02
0         => initial weights from a flat random distribution
0.2       => initial random weights all fall between –2 and +2
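
A rough modern equivalent of the architecture this file describes, using scikit-learn (parameter names are scikit-learn's, not MetaNeural's, and the stopping rule is only a loose analogue):

    from sklearn.neural_network import MLPRegressor

    net = MLPRegressor(
        hidden_layer_sizes=(16, 4),   # two hidden layers, as in the config above
        activation="logistic",        # sigmoid transfer function
        solver="sgd",
        learning_rate_init=0.01,      # "learning parameters by weight layer"
        momentum=0.5,                 # "momentum parameters by weight layer"
        max_iter=10000,               # stand-in for the very large epoch count
        tol=0.02,                     # loose analogue of the 0.02 error exit
    )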

Generating and Scaling Iris Data
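
A common recipe for sigmoid networks is min-max scaling of each descriptor column; a sketch, assuming a [0, 1] target range:

    import numpy as np
    from sklearn.datasets import load_iris

    X, y = load_iris(return_X_y=True)
    # Scale each descriptor column to [0, 1] so the sigmoid units do not saturate.
    X_scaled = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))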

Run Neural Net for Iris Data
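
A stand-in for the training run itself, with scikit-learn instead of ANALYZE (the course tools read root.pat/root.tes files instead, and the architecture here is arbitrary):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import MinMaxScaler

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    scaler = MinMaxScaler().fit(X_train)   # patterns have to be scaled first
    net = MLPClassifier(hidden_layer_sizes=(8,), activation="logistic",
                        max_iter=2000, random_state=0)
    net.fit(scaler.transform(X_train), y_train)
    print(net.score(scaler.transform(X_test), y_test))   # held-out accuracy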