Multi-Valued Neuron with Sigmoid Activation Function
Shin-Fu Wu
2013/6/21

MVN-sig Review 
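For reference, here is a minimal sketch of the pieces this deck builds on: the standard continuous MVN (complex weighted sum projected onto the unit circle) together with a sigmoid read-out whose slope is the parameter C discussed later. The sigmoid step is only an assumed illustration of how C could enter the model, not the exact MVN-sig formulation from these slides.

    import numpy as np

    def mvn_output(weights, inputs):
        # Standard continuous MVN: complex weighted sum z = w0 + sum_i w_i * x_i,
        # then the activation P(z) = z / |z| = exp(i * arg(z)) on the unit circle.
        z = weights[0] + np.dot(weights[1:], inputs)
        return z / abs(z)

    def sigmoid_readout(z, C=5.0):
        # Assumed illustration only: rescale the phase arg(z) to [-1, 1) and
        # squash it through a sigmoid whose slope is the parameter C.
        t = (np.angle(z) % (2 * np.pi)) / np.pi - 1.0
        return 1.0 / (1.0 + np.exp(-C * t))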

Outline
- Stopping Criteria
- Parameter C
  - Learning Rule for Parameter C
  - Simulation Results
- Binary Classification
  - MVN-P approach
  - Simulation Results
- Complex-output Model
  - Model Architecture
  - Simulation Results
- Future Work

Stopping Criteria 

[Plots (three slides): training accuracy vs. epoch; squared error vs. epoch]
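The plots track training accuracy and squared error per epoch, which suggests stopping rules of the form sketched below. This is only a generic sketch: the thresholds and the train_one_epoch helper are placeholders, not values taken from the slides.

    def train(model, data, targets, max_epochs=1000, acc_target=1.0, mse_tol=1e-3):
        # Stop when training accuracy reaches a target, the squared error falls
        # below a tolerance, or the epoch budget is exhausted.
        acc, mse = 0.0, float("inf")
        for epoch in range(max_epochs):
            acc, mse = train_one_epoch(model, data, targets)  # hypothetical helper
            if acc >= acc_target or mse <= mse_tol:
                break
        return model, epoch, acc, mse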

Parameter C 
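The outline lists a learning rule for parameter C. One plausible form, stated purely as an assumption rather than the rule used in the deck, is a gradient step on the squared output error with respect to the sigmoid slope:

    import numpy as np

    def update_C(C, t, target, lr=0.01):
        # Assumed rule: gradient descent on E = 0.5 * (y - target)^2, where
        # y = 1 / (1 + exp(-C * t)) and dy/dC = t * y * (1 - y).
        y = 1.0 / (1.0 + np.exp(-C * t))
        dE_dC = (y - target) * y * (1.0 - y) * t
        return C - lr * dE_dC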

Simulation Results: Wine Dataset
[Table: MVN, MVN-sig (C=5), and MVN-sig (learned C) compared by epochs, time (sec.), and accuracy]

Simulation Results: Glass Identification Dataset
[Table: MVN, MVN-sig (C=5), and MVN-sig (learned C) compared by epochs, time (sec.), and accuracy]

Binary Classification
- MVN-P approach: k = 2, l = 2, m = k*l = 4 (see the mapping sketch below)
- The MVN-sig-P results are about 10% worse than MVN-P … why?
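Assuming the usual MVN-P convention, the m = k*l = 4 sectors of the unit circle are assigned to classes cyclically (sector j gets class j mod k), so for k = 2 and l = 2 the four sectors map to classes 0, 1, 0, 1:

    import numpy as np

    def mvnp_class(z, k=2, l=2):
        # Periodic (MVN-P) read-out: split the unit circle into m = k*l sectors
        # and assign sector j to class j mod k (usual convention, assumed here).
        m = k * l
        sector = int((np.angle(z) % (2 * np.pi)) // (2 * np.pi / m))
        return sector % k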

Simulation Results

Dataset           MVN-P     MVN-sig-P
Breast Cancer     96.14%    89% ~ 95.94%
Parkinson's       89.19%    68.51% ~ 82.35%
Heart             76.78%    59.52% ~ 73.04%

Complex-output Model 

Simulation Results: Wine Dataset
[Table: MVN, MVN-sig (C=5), and Complex MVN-sig (C=5) compared by epochs, time (sec.), and accuracy]

Simulation Results: Iris Dataset
[Table: MVN (96% trained), MVN-sig (C=5), and Complex MVN-sig (C=5) compared by epochs, time (sec.), and accuracy]

Future Work
- Synthetic Data Analysis
  - Why did the binary classification fail?
  - Why is this model feasible?
- Regression Problems
  - How can regression problems be solved with this model?
- Multilayer Structure
  - Construct an MLMVN using complex-output MVN-sig neurons
  - How should the activation functions in the hidden layers be chosen?