Hybrid AI & Machine Learning Systems Using Neural Networks and Subsumption Architecture
By Logan Kearsley


Purpose
1. To design a hybrid system combining the capabilities of neural networks and subsumption architectures, and to demonstrate that it affords increased performance.
2. To produce C libraries that allow others to make use of these algorithms as 'black boxes' in other AI/ML projects.

Similar Projects
- The Reactive Accompanist: generalized subsumption architecture beyond robotic control.
- “Evolution of the layers in a subsumption architecture robot controller”: combined subsumption architecture with genetic algorithms.
- My project: focuses primarily on neural networks; the major test problem is character recognition.

Design & Programming
- Modular / black-box design: the end user should be able to put together a working AI system with minimal knowledge of how the internals work.
- Extensibility: the data structures and the test program are designed to be scalable and to make use of the modularity of the AI libraries.
- All programming was done in C.
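The slides do not show the library's actual interface, so the header below is only a sketch of what a black-box C API in this spirit might look like; every name in it (nn_net, nn_create, nn_train, nn_run) is hypothetical.

    /* Hypothetical sketch of a black-box neural-network API in C.
     * None of these names come from the actual library; they only
     * illustrate the modular design described above. */
    #ifndef NN_H
    #define NN_H

    typedef struct nn_net nn_net;   /* opaque: internals hidden from the user */

    /* Allocate a feed-forward net with the given layer sizes. */
    nn_net *nn_create(int n_layers, const int *layer_sizes);
    void    nn_free(nn_net *net);

    /* Train on one input/target pair; returns the accumulated output error. */
    double  nn_train(nn_net *net, const double *input, const double *target);

    /* Run the net forward; writes the output layer's activations to out. */
    void    nn_run(nn_net *net, const double *input, double *out);

    #endif /* NN_H */

Keeping nn_net opaque means callers never touch the internals, which is the point of the black-box design: the same training code works no matter how the network is represented underneath.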

Testing
- Forced learning: make sure the network will learn arbitrary noiseless input-output mappings after a certain number of exposures (very successful, provided the datasets aren't too large).
- Scalability: try input-output sets of different dimensions and with different numbers of associations to check learning times; optimal network dimensions were found through trial and error.
- Extensibility: feed a previously trained system new data to see how quickly and accurately it can be assimilated.
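As an illustration of the forced-learning check, the loop below trains on a fixed noiseless mapping until the error drops below a threshold and counts the exposures. It reuses the hypothetical nn_* API sketched earlier; the error threshold, exposure cap, and dataset dimensions are all assumptions.

    /* Forced-learning test sketch, built on the hypothetical nn_* API
     * above: train on a fixed noiseless mapping until the total error
     * falls below a threshold, and report how many exposures it took. */
    #include <stdio.h>
    #include "nn.h"            /* the hypothetical header sketched above */

    #define N_ASSOC       4        /* number of associations (assumed) */
    #define MAX_EXPOSURES 100000   /* give up after this many passes   */

    int forced_learning_test(nn_net *net,
                             const double inputs[N_ASSOC][2],
                             const double targets[N_ASSOC][1])
    {
        for (int exposure = 1; exposure <= MAX_EXPOSURES; exposure++) {
            double err = 0.0;
            for (int i = 0; i < N_ASSOC; i++)
                err += nn_train(net, inputs[i], targets[i]);
            if (err < 0.01) {      /* pass criterion is an assumption */
                printf("learned after %d exposures\n", exposure);
                return 1;
            }
        }
        return 0;                  /* failed to converge */
    }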

Algorithms: Neural Nets
- Back-propagation learning: weights are adjusted based on the distance between the net's current output and the optimal output; errors are calculated based on the changes made to lower layers.
- Hebbian learning: weights are adjusted to strengthen connections between co-firing neurons.
- Training: one back-propagation run is done for each association to be learned, and the cycle repeats until accumulated errors fall below a threshold; Hebbian reinforcement should prevent corruption of old associations when new data is added (not highly successful so far).
- Matrix simulation: weights are stored in an I (number of inputs) by O (number of outputs) matrix for each layer, rather than simulating each neuron individually; each layer's outputs are calculated separately and used as inputs to the next layer.
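A minimal sketch of the matrix scheme for a single layer: the weights sit in an I-by-O array, the forward pass is a matrix-vector product passed through a sigmoid, and a textbook delta-rule update nudges each weight toward the target. This is a generic formulation, not the project's actual code; the layer sizes and learning rate are assumptions.

    /* Single-layer sketch of the weight-matrix scheme: w[i][o] connects
     * input i to output o. Textbook delta-rule back-propagation for one
     * layer; not the project's actual implementation. */
    #include <math.h>

    #define N_IN  4            /* layer sizes are assumptions */
    #define N_OUT 2
    #define RATE  0.5          /* learning rate (assumed) */

    static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

    /* Forward pass: out[o] = sigmoid(sum_i in[i] * w[i][o]). */
    void layer_forward(double w[N_IN][N_OUT],
                       const double in[N_IN], double out[N_OUT])
    {
        for (int o = 0; o < N_OUT; o++) {
            double sum = 0.0;
            for (int i = 0; i < N_IN; i++)
                sum += in[i] * w[i][o];
            out[o] = sigmoid(sum);
        }
    }

    /* One delta-rule update: adjust weights toward the target output.
     * Returns the accumulated absolute error before the update, so a
     * training loop can repeat until the error is below a threshold. */
    double layer_train(double w[N_IN][N_OUT], const double in[N_IN],
                       const double target[N_OUT])
    {
        double out[N_OUT], err = 0.0;
        layer_forward(w, in, out);
        for (int o = 0; o < N_OUT; o++) {
            double delta = (target[o] - out[o]) * out[o] * (1.0 - out[o]);
            err += fabs(target[o] - out[o]);
            for (int i = 0; i < N_IN; i++)
                w[i][o] += RATE * delta * in[i];
        }
        return err;
    }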

Algorithms: Neural Nets (Network Structure)
- Individual neurons: allow more complex network topologies.
- Weight matrix: allows simpler, faster learning algorithms and more efficient use of memory, given a known simple network topology.
- Only feed-forward networks are used.
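To make the trade-off concrete, the two representations might look roughly like the structures below; both are illustrative sketches, not the library's actual types.

    /* Option 1: individual neurons. Arbitrary topologies are possible
     * because each neuron lists its own upstream connections, but every
     * connection must be chased through pointers. */
    struct neuron {
        int             n_inputs;
        struct neuron **inputs;    /* upstream neurons */
        double         *weights;   /* one weight per upstream neuron */
        double          output;
    };

    /* Option 2: one weight matrix per layer. Topology is fixed to
     * feed-forward, but a whole layer is one dense array, so the forward
     * pass reduces to a matrix-vector product. */
    struct layer {
        int     n_in, n_out;
        double *w;                 /* n_in * n_out weights, w[i*n_out + o] */
    };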

Algorithms: Subsumption Architecture
- The scheduler calls a list of task-specific functions; here, these query character-specific neural networks.
- Each task function returns an output or null.
- On each iteration, the highest-priority non-null task has its output returned.
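The scheduler described above maps naturally onto a priority-ordered array of function pointers; the sketch below is a generic rendering of that behavior, with hypothetical names and types.

    /* Subsumption scheduler sketch: task functions are tried in priority
     * order and the first non-null output wins. Names and types are
     * hypothetical; the slide only describes the behavior. */
    #include <stddef.h>

    typedef const char *(*task_fn)(const void *sensor_input);

    /* Tasks are stored highest-priority first. On each iteration the
     * scheduler returns the output of the first task that produced one. */
    const char *subsume(task_fn tasks[], size_t n_tasks, const void *input)
    {
        for (size_t i = 0; i < n_tasks; i++) {
            const char *out = tasks[i](input);
            if (out != NULL)       /* highest-priority non-null output wins */
                return out;
        }
        return NULL;               /* no task claimed this input */
    }

In this project, each task function might wrap one character-specific neural network, returning its character when the network's output is confident and null otherwise, so the scheduler arbitrates between the specialized recognizers.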

Problems
- Network dimensions require compromise.
- Training new networks from scratch seems to take an unusually long time.
- It is *very* difficult to write a generic subsumption architecture library.

Results & Conclusions
- Moderately large datasets require an extremely long time to train a single network.
- Splitting datasets up among many different networks allows rapid training while keeping enough variance in the outputs to be useful in a subsumption architecture.
- Conclusion: for complex or multi-purpose AIs, it is highly beneficial to split sub-tasks among many specialized sub-AIs (in this case, many differently trained neural networks). However, it is not very practical to write a completely generic subsumption wrapper; the I/O requirements are too variable.