Hybrid AI & Machine Learning Systems Using Neural Network and Subsumption Architecture Libraries
By Logan Kearsley
Purpose
1. To design a hybrid system combining the capabilities of neural networks and subsumption architectures.
2. To produce C libraries that allow others to use these algorithms as 'black boxes' in other AI/ML projects.
Similar Projects
- The Reactive Accompanist: generalized subsumption architecture beyond robotic control.
- “Evolution of the layers in a subsumption architecture robot controller”: combined subsumption architecture with genetic algorithms.
- My project: primarily focuses on neural networks; the major test problem is character recognition.
Design & Programming
- Modular / black-box design: the end user should be able to put together a working AI system with minimal knowledge of how the internals work (see the interface sketch below).
- Programming done in C.
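As a rough illustration of the black-box goal, the libraries might expose an interface along these lines. This is only a sketch: the names nnet, nnet_create, nnet_train, nnet_run, and nnet_free are hypothetical, not the project's actual API.

```c
/* Hypothetical black-box interface; names and signatures are
   illustrative, not the project's actual API. */
#include <stddef.h>

typedef struct nnet nnet;   /* opaque handle: internals hidden from the user */

nnet *nnet_create(size_t n_inputs, size_t n_outputs, size_t n_layers);
void  nnet_train(nnet *net, const double *input, const double *target);
void  nnet_run(nnet *net, const double *input, double *output);
void  nnet_free(nnet *net);
```

Keeping the struct opaque means callers can only go through these functions, which is what lets the library behave as a true black box.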
Testing
- Forced learning: make sure the network will learn arbitrary noiseless input-output mappings after a certain number of exposures.
- Scalability: try different input-output sets with different dimensions and different numbers of associations to check learning times.
- Noise filtering: alter individual bits of the test inputs and check that the network still gives high output values for close matches (see the sketch after this list).
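A minimal sketch of the noise-filtering test, assuming the hypothetical nnet_run() from the interface sketch above; flipping one bit at a time follows the test description, but the exact signature and pass threshold are assumptions.

```c
/* Noise-filtering test sketch: flip one input bit at a time and check
   that the trained network still responds strongly to the pattern.
   nnet and nnet_run() are the hypothetical interface sketched above. */
#include <stddef.h>

typedef struct nnet nnet;
void nnet_run(nnet *net, const double *input, double *output);

/* Returns 1 if every single-bit corruption of `input` still drives the
   target output unit above `threshold`, 0 otherwise. */
int noise_test(nnet *net, double *input, size_t n_in,
               double *output, size_t target, double threshold)
{
    for (size_t i = 0; i < n_in; i++) {
        input[i] = 1.0 - input[i];   /* corrupt one bit */
        nnet_run(net, input, output);
        input[i] = 1.0 - input[i];   /* restore it */
        if (output[target] < threshold)
            return 0;                /* close match no longer recognized */
    }
    return 1;
}
```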
Algorithms: Neural Nets
- Back-propagation learning: weights are adjusted based on the distance between the net's current output and the optimal output; errors are calculated based on changes made to lower layers.
- Training: one back-propagation run is done for every association to be learned, and the cycle repeats until accumulated errors fall below a threshold.
- Matrix simulation: weights are stored in an I (# of inputs) by O (# of outputs) matrix for each layer, rather than simulating each neuron individually; outputs for each layer are calculated separately and used as inputs to the next layer (see the sketch below).
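The matrix simulation and the output-layer step of back-propagation can be sketched as follows. This is a minimal illustration, assuming a sigmoid activation and an arbitrary learning rate; propagating error back to hidden layers is omitted, and the project's actual data layout may differ.

```c
/* Matrix simulation sketch: one layer's weights live in an I-by-O
   array instead of per-neuron structures. Sigmoid activation and the
   learning rate ETA are assumptions for illustration. */
#include <stddef.h>
#include <math.h>

#define ETA 0.5   /* learning rate (assumed value) */

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

/* Forward pass: out[o] = sigmoid(sum over i of in[i] * w[i][o]);
   each layer's outputs become the next layer's inputs. */
void layer_forward(size_t ni, size_t no,
                   const double in[ni], double w[ni][no], double out[no])
{
    for (size_t o = 0; o < no; o++) {
        double sum = 0.0;
        for (size_t i = 0; i < ni; i++)
            sum += in[i] * w[i][o];
        out[o] = sigmoid(sum);
    }
}

/* Output-layer update: each weight moves in proportion to the error
   (target - output), scaled by the sigmoid derivative out*(1-out). */
void layer_update(size_t ni, size_t no,
                  const double in[ni], double w[ni][no],
                  const double out[no], const double target[no])
{
    for (size_t o = 0; o < no; o++) {
        double delta = (target[o] - out[o]) * out[o] * (1.0 - out[o]);
        for (size_t i = 0; i < ni; i++)
            w[i][o] += ETA * delta * in[i];
    }
}
```

A training cycle then runs one such update for each association and repeats until the accumulated error drops below the threshold.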
Algorithms: Network Structure
- Individual neurons: allows more complex network topologies.
- Weight matrix: allows simpler, faster learning algorithms and more efficient use of memory (both options are illustrated below).
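The trade-off shows up directly in the shape of the data structures; these declarations are hypothetical, for illustration only.

```c
#include <stddef.h>

/* Per-neuron representation (hypothetical): arbitrary topologies are
   possible, but every connection is chased through pointers. */
typedef struct neuron {
    struct neuron **inputs;    /* any neuron may feed any other */
    double         *weights;   /* one weight per incoming connection */
    size_t          n_inputs;
    double          output;
} neuron;

/* Weight-matrix representation (hypothetical): topology is fixed to
   layers, but each layer is one contiguous memory block and the
   learning math reduces to simple nested loops. */
typedef struct layer {
    size_t  n_in, n_out;
    double *w;                 /* n_in * n_out weights, row-major */
} layer;
```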
Algorithms: Subsumption Architecture
- The scheduler takes a list of function pointers to task-specific functions.
- Task functions return an output or NULL.
- On each iteration, the output of the highest-priority non-NULL task is returned (see the sketch below).
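A minimal sketch of that scheduler loop; the task signature (returning a string or NULL) is an assumption chosen for illustration.

```c
/* Subsumption scheduler sketch: walk a priority-ordered array of task
   functions and return the first (highest-priority) non-NULL output.
   The const char * output type is an assumption for illustration. */
#include <stddef.h>

typedef const char *(*task_fn)(void);   /* task returns output or NULL */

const char *schedule(task_fn tasks[], size_t n_tasks)
{
    for (size_t i = 0; i < n_tasks; i++) {   /* index 0 = highest priority */
        const char *out = tasks[i]();
        if (out != NULL)
            return out;                      /* subsumes all lower tasks */
    }
    return NULL;                             /* no task produced output */
}
```

Ordering the array from highest to lowest priority is what makes higher layers automatically subsume lower ones on each iteration.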
Problems
- Saving/loading network data between program runs (one possible approach is sketched below).
- Segfaults are annoying.
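One workable approach to the save/load problem is to write the matrix dimensions followed by the flattened weights in binary. This sketch uses the hypothetical layer struct from above and omits niceties like a format version or endianness handling.

```c
/* Save/load sketch for one layer's weights; the layer struct is
   hypothetical and error handling is kept minimal. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    size_t  n_in, n_out;
    double *w;                 /* n_in * n_out weights, row-major */
} layer;

int save_layer(const layer *l, const char *path)
{
    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    size_t n = l->n_in * l->n_out;
    if (fwrite(&l->n_in, sizeof l->n_in, 1, f) != 1 ||
        fwrite(&l->n_out, sizeof l->n_out, 1, f) != 1 ||
        fwrite(l->w, sizeof *l->w, n, f) != n) {
        fclose(f);
        return -1;
    }
    return fclose(f);
}

int load_layer(layer *l, const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return -1;
    if (fread(&l->n_in, sizeof l->n_in, 1, f) != 1 ||
        fread(&l->n_out, sizeof l->n_out, 1, f) != 1) {
        fclose(f);
        return -1;
    }
    size_t n = l->n_in * l->n_out;
    l->w = malloc(n * sizeof *l->w);
    if (!l->w || fread(l->w, sizeof *l->w, n, f) != n) {
        free(l->w);            /* free(NULL) is a no-op */
        fclose(f);
        return -1;
    }
    return fclose(f);
}
```

Writing the dimensions first lets the loader allocate the right amount of memory before reading the weights back in, which also helps avoid the segfaults mentioned above.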
Results & Conclusions
Moderately large datasets require an extremely long time to train the network, which may actually be an asset for demonstrating my ideas.