Jonathan Reagan, UMass Dartmouth, CSUMS Summer '11, August 3rd, 2011.

Presentation transcript:

Jonathan Reagan, UMass Dartmouth, CSUMS Summer '11, August 3rd, 2011

• What is a Neural Network?
• How does it work?
• Why do we care?
• Results
• Issues encountered
• Future work

[Diagram of the network structure: input layer, hidden layers, output layer]
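The slides contain no code; as a rough illustration of the structure above, here is a minimal NumPy sketch of a forward pass through a single hidden layer. The layer sizes, the sigmoid activation, and all names are assumptions for illustration, not taken from the presentation.

```python
import numpy as np

def sigmoid(z):
    # Squash values into (0, 1); the activation choice is an assumption.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W_hidden, W_output):
    # Input layer -> hidden layer -> output layer.
    h = sigmoid(W_hidden @ x)    # hidden-layer activations
    y = sigmoid(W_output @ h)    # output-layer activation
    return y

# Example: 3 inputs, 4 hidden units, 1 output, with random (untrained) weights.
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(4, 3))
W_output = rng.normal(size=(1, 4))
print(forward(np.array([0.5, 0.2, 0.9]), W_hidden, W_output))
```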

• It is not realistic to study every possible case
• A smaller sample can be used to model the whole population
• We assume the relationships seen in the sample hold in general

• Input X = [age, income, credit score, etc.]
• Output Y = [dependability]
• We want to find the weights α
• Through the hidden layer, Xα = Y
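Written out as a matrix equation, this is a system with one row per sample (a sketch; the three features and the single output are illustrative, and the slides do not show this expansion):

```latex
\underbrace{\begin{pmatrix}
\mathrm{age}_1 & \mathrm{income}_1 & \mathrm{credit}_1\\
\vdots & \vdots & \vdots\\
\mathrm{age}_n & \mathrm{income}_n & \mathrm{credit}_n
\end{pmatrix}}_{X\ (n \times p)}
\underbrace{\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}}_{\alpha\ (p \times 1)}
=
\underbrace{\begin{pmatrix}y_1\\ \vdots\\ y_n\end{pmatrix}}_{Y\ (n \times 1)}
```

Each row of X holds one person's features, and the matching entry of Y is that person's dependability.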

• Use a learning method to find α
• Goal: ‖Y − Xα‖ = 0
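The results that follow compare perceptron and least-squares training, so here is a minimal sketch of both for a single-layer linear model. The synthetic data, variable names, and number of training passes are assumptions for illustration, not the presenter's code.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3
X = rng.normal(size=(n, p))           # rows are samples, columns are features
alpha_true = np.array([0.5, -1.0, 2.0])
Y = np.sign(X @ alpha_true)           # deterministic +/-1 labels

# Least squares: choose alpha to minimize ||Y - X alpha|| directly.
alpha_ls, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Perceptron: nudge alpha whenever a sample is classified incorrectly.
alpha_pc = np.zeros(p)
for _ in range(100):                  # training passes over the data
    for x, y in zip(X, Y):
        if y * (x @ alpha_pc) <= 0:   # wrong (or undecided) prediction
            alpha_pc += y * x

for name, a in [("least squares", alpha_ls), ("perceptron", alpha_pc)]:
    acc = np.mean(np.sign(X @ a) == Y)
    print(f"{name}: training accuracy = {acc:.2f}")
```

Least squares solves for α in one step, while the perceptron adjusts α sample by sample; both try to make ‖Y − Xα‖ small.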

[Table comparing the Perceptron and Least Squares methods: for each sample size N, the minimum, average, and maximum accuracy, and the number of failed trials out of 100]

[Table for the 2 million and 4 million convergence runs: for each N, the training time in seconds and the number of failed trials out of 50]

• Random data cannot be learned
• Deterministic data can be learned
• Adding random variance decreases accuracy
• Larger values of N give better accuracy
• But larger values of N take longer

• Increase the speed of the neural network
• Find more applicable data for testing the neural network
• Try multi-layer neural networks and compare