
Hybrid AI & Machine Learning Systems Using Neural Networks and Subsumption Architecture By Logan Kearsley.



Presentation transcript:

1 Hybrid AI & Machine Learning Systems Using Neural Networks and Subsumption Architecture By Logan Kearsley

2 Purpose
1. To design a hybrid system combining the capabilities of neural networks and subsumption architectures, and to demonstrate that it affords increased performance.
2. To produce C libraries that allow others to use these algorithms as 'black boxes' in other AI/ML projects.

3 Similar Projects
The Reactive Accompanist: generalized subsumption architecture beyond robotic control.
"Evolution of the layers in a subsumption architecture robot controller": combined subsumption architecture with genetic algorithms.
My project: primarily focuses on neural networks; the major test problem is character recognition.

4 Design & Programming
Modular / black-box design: the end user should be able to put together a working AI system with minimal knowledge of how the internals work.
Extensibility: data structures and the test program are designed to be scalable and to make use of the modularity of the AI libraries.
Programming done in C.

5 Testing
Forced learning: make sure the system learns arbitrary noiseless input-output mappings after a certain number of exposures (very successful, provided the datasets are not too large).
Scalability: try input-output sets with different dimensions and different numbers of associations to check learning times; optimal network dimensions were found through trial and error.
Extensibility: feed a previously trained system new data to see how quickly and accurately it can be assimilated.

6 Algorithms: Neural Nets
Back-propagation learning: weights are adjusted based on the distance between the net's current output and the optimal output; errors are calculated based on changes made to lower layers.
Hebbian learning: weights are adjusted to strengthen connections between co-firing neurons.
Training: one back-propagation run is done for every association to be learned, and the cycle repeats until accumulated errors fall below a threshold; Hebbian reinforcement should prevent corruption of old associations when new data is added (not highly successful so far).
Matrix simulation: weights are stored in an I (# of inputs) by O (# of outputs) matrix for each layer, rather than simulating each neuron individually; outputs for each layer are calculated separately and used as inputs to the next layer.

7 Algorithms: Neural Nets
Network structure: individual neurons (allow more complex network topologies) vs. a weight matrix (allows simpler, faster learning algorithms and more efficient use of memory, given a known simple network topology).
Only feed-forward networks are used.

8 Algorithms: Subsumption Architecture
The scheduler calls a list of task-specific functions; here, queries to character-specific neural networks.
Task functions return an output or null.
On each iteration, the highest-priority non-null task has its output returned.

9 Problems
Have to compromise on network dimensions.
Training new networks from scratch seems to take an unusually long time.
*Very* difficult to write a generic subsumption architecture library.

10 Results & Conclusions
Moderately large datasets require an extremely long time to train a single network.
Splitting datasets up among many different networks allows for rapid training and sufficient variance in outputs to be useful in a subsumption architecture.
Conclusion: for complex or multi-purpose AIs, it is highly beneficial to split sub-tasks among many specialized sub-AIs (in this case, many differently trained neural networks). However, it is not very practical to write a completely generic subsumption wrapper: I/O requirements are too variable.




