1
Hybrid AI & Machine Learning Systems Using Neural Network and Subsumption Architecture Libraries By Logan Kearsley
2
Purpose
1. To design a hybrid system combining the capabilities of neural networks and subsumption architectures.
2. To produce C libraries that allow others to use these algorithms as 'black boxes' in other AI/ML projects.
3
Similar Projects
The Reactive Accompanist: generalized subsumption architecture beyond robotic control.
"Evolution of the layers in a subsumption architecture robot controller": combined subsumption architecture with genetic algorithms.
My project primarily focuses on neural networks; the major test problem is character recognition.
4
Design & Programming
Modular / black-box design: the end user should be able to put together a working AI system with minimal knowledge of how the internals work.
Programming done in C.
5
Testing
Forced learning: make sure the network learns arbitrary noiseless input-output mappings after a certain number of exposures.
Scalability: try different input-output sets with different dimensions and different numbers of associations to check learning times.
Noise filtering: alter individual bits of the test inputs and check that the network still gives high output values for close matches.
6
Algorithms
Neural nets:
Back-propagation learning: weights are adjusted based on the distance between the net's current output and the optimal output; errors are calculated based on changes made to lower layers.
Training: one back-propagation run is done for every association to be learned, and the cycle repeats until accumulated errors fall below a threshold.
Matrix simulation: weights are stored in an I (# of inputs) by O (# of outputs) matrix for each layer, rather than simulating each neuron individually; outputs for each layer are calculated separately and used as inputs to the next layer.
7
Algorithms
Network structure: individual neurons (allows more complex network topologies) vs. a weight matrix (allows simpler, faster learning algorithms and more efficient use of memory).
8
Algorithms
Subsumption architecture:
The scheduler takes a list of function pointers to task-specific functions.
Task functions return an output or null.
On each iteration, the highest-priority non-null task has its output returned.
9
Problems
Saving/loading network data between program runs.
Segfaults are annoying.
10
Results & Conclusions
Moderately large datasets require an extremely long time to train the network; this may actually be an asset for demonstrating the ideas behind the project.