Hybrid AI & Machine Learning Systems Using Neural Network and Subsumption Architecture Libraries By Logan Kearsley.

Presentation transcript:

Hybrid AI & Machine Learning Systems Using Neural Network and Subsumption Architecture Libraries By Logan Kearsley

Purpose 1. To design a hybrid system combining the capabilities of neural networks and subsumption architectures. 2. To produce C libraries that allow others to use these algorithms as 'black boxes' in other AI/ML projects.

Similar Projects The Reactive Accompanist: generalized subsumption architecture beyond robotic control. “Evolution of the layers in a subsumption architecture robot controller”: combined subsumption architecture with genetic algorithms. My project: focuses primarily on neural networks; the major test problem is character recognition.

Design & Programming Modular / black-box design: the end user should be able to put together a working AI system with minimal knowledge of how the internals work. Programming done in C.
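
A minimal sketch of what such a black-box C interface might look like; the type and function names (nn_net, nn_create, nn_train, nn_run, nn_free) are hypothetical placeholders, not the library's actual API:

    #include <stddef.h>

    typedef struct nn_net nn_net;   /* opaque handle: callers never see the internals */

    nn_net *nn_create(size_t n_in, size_t n_hidden, size_t n_out);         /* build a two-layer net */
    void    nn_train(nn_net *net, const double *in, const double *target); /* one back-prop step */
    void    nn_run(nn_net *net, const double *in, double *out);            /* forward pass only */
    void    nn_free(nn_net *net);

Keeping nn_net opaque (defined only inside the .c file) is what makes the black-box guarantee enforceable: user code cannot depend on the network's internal layout.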

Testing Forced learning: make sure the network will learn arbitrary noiseless input-output mappings after a certain number of exposures. Scalability: try different input-output sets with different dimensions and different numbers of associations to check learning times. Noise filtering: alter individual bits in the test inputs and check that the network still gives high output values for close matches.
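
A sketch of how the forced-learning check might be driven, reusing the hypothetical nn_* API above; N_ASSOC, N_IN, N_OUT, the 0.1 error threshold, and the exposure cap are illustrative values:

    #include <math.h>

    #define N_ASSOC 4
    #define N_IN    8
    #define N_OUT   2

    /* largest absolute output error over all stored associations */
    static double max_error(nn_net *net, double in[N_ASSOC][N_IN],
                            double tgt[N_ASSOC][N_OUT])
    {
        double worst = 0.0, out[N_OUT];
        for (int a = 0; a < N_ASSOC; a++) {
            nn_run(net, in[a], out);
            for (int j = 0; j < N_OUT; j++)
                worst = fmax(worst, fabs(tgt[a][j] - out[j]));
        }
        return worst;
    }

    /* forced learning: count exposures until every mapping is reproduced */
    static int exposures_to_learn(nn_net *net, double in[N_ASSOC][N_IN],
                                  double tgt[N_ASSOC][N_OUT])
    {
        int exposures = 0;
        while (max_error(net, in, tgt) > 0.1 && exposures < 100000) {
            for (int a = 0; a < N_ASSOC; a++)
                nn_train(net, in[a], tgt[a]);
            exposures++;
        }
        return exposures;   /* hits the cap if the mapping was never learned */
    }

The scalability test is then just this routine timed over sets of different dimensions, and the noise-filtering test reruns nn_run on inputs with individual bits flipped.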

Algorithms Neural Nets:  Back-propagation learning: weights are adjusted based on the distance between the net's current output and the optimal output; errors are calculated based on changes made to lower layers  Training: one back-propagation run is done for every association to be learned, and the cycle repeats until accumulated errors fall below a threshold  Matrix simulation: weights are stored in an I (# of inputs) by O (# of outputs) matrix for each layer, rather than simulating each neuron individually; outputs for each layer are calculated separately and used as inputs to the next layer
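
A minimal sketch of the matrix simulation and the weight update it enables, assuming sigmoid units and a squared-error measure; the sizes and the learning-rate parameter are illustrative (N_IN and N_OUT play the role of the I and O above):

    #include <math.h>

    #define N_IN  4   /* inputs to this layer  (the slide's I) */
    #define N_OUT 3   /* outputs of this layer (the slide's O) */

    static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

    /* forward pass: out[j] = sigmoid(sum_i in[i] * w[i][j]) */
    void layer_forward(const double in[N_IN], const double w[N_IN][N_OUT],
                       double out[N_OUT])
    {
        for (int j = 0; j < N_OUT; j++) {
            double sum = 0.0;
            for (int i = 0; i < N_IN; i++)
                sum += in[i] * w[i][j];
            out[j] = sigmoid(sum);
        }
    }

    /* one back-propagation step for the output layer: each weight moves
       along the error gradient, scaled by the learning rate */
    void layer_backprop(const double in[N_IN], double w[N_IN][N_OUT],
                        const double out[N_OUT], const double target[N_OUT],
                        double rate)
    {
        for (int j = 0; j < N_OUT; j++) {
            double delta = (target[j] - out[j]) * out[j] * (1.0 - out[j]);
            for (int i = 0; i < N_IN; i++)
                w[i][j] += rate * delta * in[i];
        }
    }

For hidden layers the same update applies, except each neuron's error term is the weighted sum of the deltas propagated back from the layer above it.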

Algorithms Network Structure  Individual neurons (allows more complex network topologies)  vs.  Weight matrix (allows simpler, faster learning algorithms and more efficient use of memory)
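
The trade-off in data-structure terms, sketched with illustrative types (neither struct is the library's actual layout):

    #include <stddef.h>

    /* per-neuron representation: any neuron can feed any other,
       so arbitrary topologies are possible */
    typedef struct neuron {
        struct neuron **inputs;   /* the neurons feeding this one */
        double         *weights;  /* one weight per incoming connection */
        size_t          n_inputs;
        double          output;
    } neuron;

    /* per-layer weight matrix: topology is fixed to a layered feed-forward
       net, but the weights sit in one contiguous block */
    typedef struct layer {
        size_t  n_in, n_out;
        double *w;                /* n_in * n_out weights, row-major: w[i * n_out + j] */
    } layer;

With the matrix form, a whole layer's forward pass is a single nested loop over a contiguous array, which is both simpler to write and friendlier to the cache than chasing per-neuron pointers.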

Algorithms Subsumption Architecture:  Scheduler takes a list of function pointers to task-specific functions  Task functions return an output or null  The highest-priority non-null task has its output returned on each iteration
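
A minimal sketch of that scheduler in C; the task_fn signature and the string outputs are illustrative choices, not the library's actual types:

    #include <stddef.h>
    #include <stdio.h>

    typedef const char *(*task_fn)(void);   /* a task returns an output, or NULL to pass */

    /* tasks[] is ordered highest-priority first; the first non-NULL output wins */
    static const char *schedule(task_fn tasks[], size_t n_tasks)
    {
        for (size_t i = 0; i < n_tasks; i++) {
            const char *out = tasks[i]();
            if (out != NULL)
                return out;   /* this task subsumes everything below it */
        }
        return NULL;          /* no task fired this iteration */
    }

    /* two toy tasks: obstacle avoidance outranks wandering */
    static const char *avoid_obstacle(void) { return NULL; }      /* nothing detected */
    static const char *wander(void)         { return "forward"; }

    int main(void)
    {
        task_fn tasks[] = { avoid_obstacle, wander };
        const char *action = schedule(tasks, 2);
        puts(action ? action : "(idle)");    /* prints "forward" */
        return 0;
    }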

Problems Saving/loading network data between program runs. Segfaults are annoying.
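
One plausible approach to the save/load problem, sketched against the hypothetical layer struct above: dump the weight matrix to disk with fwrite and read it back with fread (note the raw-byte format is not portable across machines with different endianness or floating-point layouts):

    #include <stdio.h>

    int layer_save(const layer *l, const char *path)
    {
        FILE *f = fopen(path, "wb");
        if (!f) return -1;
        size_t n = l->n_in * l->n_out;
        size_t written = fwrite(l->w, sizeof(double), n, f);
        fclose(f);
        return written == n ? 0 : -1;
    }

    int layer_load(layer *l, const char *path)   /* l->w must already be allocated */
    {
        FILE *f = fopen(path, "rb");
        if (!f) return -1;
        size_t n = l->n_in * l->n_out;
        size_t got = fread(l->w, sizeof(double), n, f);
        fclose(f);
        return got == n ? 0 : -1;
    }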

Results & Conclusions Moderately large datasets require an extremely long time to train the network; this may actually be an asset as far as demonstrating my ideas goes.