Back-Propagation MLP Neural Network Optimizer
ECE 539 - Andrew Beckwith

Back-Propagation MLP Network Optimizer
- Purpose
- Methods
- Features

Purpose
- Configuring a neural network and its parameters is often a long, experimental process involving much guesswork. Let the computer do it for you.
- Design and implement a program that can test multiple network configurations with minimal setup; a rough sketch of such a search loop follows this list.
- Allow the user to modify the data appropriately by enhancing important features and minimizing features with little importance or detrimental qualities.
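
As a rough illustration of the configuration search mentioned above, here is a minimal Python sketch. `train_mlp` and `classification_rate` are hypothetical stand-ins for the program's actual training and evaluation routines, and the parameter lists stand for the user-supplied ranges described later in the Features slide.

```python
import itertools
import random

def search_configurations(train_set, test_set, max_epochs, etas, momenta, depths):
    """Brute-force sweep: try every combination of the user-supplied
    parameter values and keep the best-performing configuration."""
    best_rate, best_config = 0.0, None
    for eta, alpha, n_layers in itertools.product(etas, momenta, depths):
        # Fixed per-layer widths (2, 3, 5, 10), plus one trial with a
        # random width between 1 and 10 for each layer (see Methods).
        for width in (2, 3, 5, 10, None):
            sizes = ([width] * n_layers if width is not None
                     else [random.randint(1, 10) for _ in range(n_layers)])
            net = train_mlp(train_set, sizes, eta, alpha, max_epochs)  # hypothetical helper
            rate = classification_rate(net, test_set)                  # hypothetical helper
            if rate > best_rate:
                best_rate, best_config = rate, (sizes, eta, alpha)
    return best_config, best_rate
```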

Methods
- Train with the back-propagation algorithm with momentum; a minimal sketch of the update appears after this list.
- To test multiple configurations, use a brute-force search and keep track of the most successful configuration.
- The only parameter the user cannot control directly is the number of neurons per hidden layer: each configuration is tested with 2, 3, 5, and 10 neurons per hidden layer, and a final trial uses a random size between 1 and 10 for each layer.
- Use the hyperbolic tangent activation function for hidden neurons and the sigmoidal activation function for output neurons; either choice can be changed in the source code if desired.
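
The update the slide describes can be sketched in a few lines. Below is a minimal Python illustration, assuming a single hidden layer, squared-error loss, and no bias terms; the layer sizes, learning rate, and momentum constant in the usage example are arbitrary placeholders, not values from the project.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_step(x, t, W1, W2, eta, alpha, vel):
    """One back-propagation step with momentum on a one-hidden-layer MLP:
    tanh hidden units and sigmoid output units, as in the slide above."""
    # Forward pass
    h = np.tanh(W1 @ x)                            # hidden activations
    y = sigmoid(W2 @ h)                            # output activations
    # Backward pass (squared-error loss)
    delta_out = (y - t) * y * (1.0 - y)            # sigmoid derivative: y(1 - y)
    delta_hid = (W2.T @ delta_out) * (1.0 - h**2)  # tanh derivative: 1 - h^2
    # Momentum: new step = -eta * gradient + alpha * previous step
    vel['W2'] = -eta * np.outer(delta_out, h) + alpha * vel['W2']
    vel['W1'] = -eta * np.outer(delta_hid, x) + alpha * vel['W1']
    W2 += vel['W2']
    W1 += vel['W1']
    return y

# Usage with arbitrary placeholder sizes (3 inputs, 5 hidden, 2 outputs)
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(5, 3))
W2 = rng.normal(scale=0.1, size=(2, 5))
vel = {'W1': np.zeros_like(W1), 'W2': np.zeros_like(W2)}
y = train_step(np.array([0.2, -1.0, 0.5]), np.array([1.0, 0.0]),
               W1, W2, eta=0.1, alpha=0.8, vel=vel)
```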

Features
- Allow the user to open a data file and view the mean and standard deviation of each feature for each class, to guide feature modification; a rough sketch of this computation follows this list.
- Allow the user to enter ranges and numbers of trials for parameters such as maximum epoch, epoch size, learning rate, momentum constant, and the number of hidden layers.
- Allow the user to set a tolerance for achieving the maximum classification rate.
- Allow the user to view the entire network: network configuration, weight values, etc.
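
As a rough sketch of the statistics view in the first feature, assuming the data are loaded into a NumPy feature matrix `X` with one integer class label per row:

```python
import numpy as np

def class_statistics(X, labels):
    """Per-class mean and standard deviation of every feature, as shown
    to the user to guide feature enhancement or removal."""
    stats = {}
    for c in np.unique(labels):
        Xc = X[labels == c]                          # rows belonging to class c
        stats[c] = (Xc.mean(axis=0), Xc.std(axis=0))
    return stats
```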