Using Recurrent Neural Networks and Hebbian Learning

Artificial Life Using Recurrent Neural Networks and Hebbian Learning
Michael Tauraso, CS 152

Organisms
- Behavior is decided by a Recurrent Neural Network (RNN)
- The RNN is derived from a genome
- Organisms have a 360-degree visual system
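The decision step described above might be sketched as follows. This is an illustrative reconstruction, not the presentation's actual code: the sector count, hidden size, action set, and random weights are all assumptions.

```python
import math
import random

random.seed(0)

SECTORS = 8                      # 360-degree visual field split into 45-degree sectors (assumed)
HIDDEN = 6                       # hidden-state size (assumed)
ACTIONS = ["forward", "left", "right", "eat"]   # hypothetical action set

# Random weights stand in for the genome-derived ones.
Wxh = [[random.uniform(-1, 1) for _ in range(SECTORS)] for _ in range(HIDDEN)]
Whh = [[random.uniform(-1, 1) for _ in range(HIDDEN)] for _ in range(HIDDEN)]
Why = [[random.uniform(-1, 1) for _ in range(HIDDEN)] for _ in range(len(ACTIONS))]

def decide(vision, h):
    """One RNN step: new hidden state from vision + previous state, then pick an action."""
    h_new = [math.tanh(sum(Wxh[i][j] * vision[j] for j in range(SECTORS)) +
                       sum(Whh[i][j] * h[j] for j in range(HIDDEN)))
             for i in range(HIDDEN)]
    scores = [sum(Why[a][i] * h_new[i] for i in range(HIDDEN))
              for a in range(len(ACTIONS))]
    return ACTIONS[scores.index(max(scores))], h_new

h = [0.0] * HIDDEN
action, h = decide([1.0] + [0.0] * (SECTORS - 1), h)   # food seen in one sector
```

The recurrent hidden state is what lets the organism's past observations influence its current action.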

Gene Encoding
- Direct Encoding (boring)
- Graph Grammar (interesting)
- Cellular Encoding (interesting)
- Functional Block Problem

Graph Grammar
- The goal is to generate an adjacency matrix
- Each non-terminal generates a 2x2 matrix
- Each 2x2 matrix contains either numbers or other non-terminals
- The start symbol is expanded to build up an adjacency matrix
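The expansion process described above can be sketched as repeated block substitution: each derivation step replaces every cell with a 2x2 block, doubling the matrix until only numbers remain. The grammar below is a made-up example; the presentation's actual rules are not given.

```python
# Hypothetical grammar: non-terminals map to 2x2 blocks whose entries are
# either 0/1 connections or further non-terminals.
GRAMMAR = {
    "S": [["A", 0],
          [1,   "A"]],
    "A": [[1, 1],
          [0, 1]],
}

def expand(cell):
    """A non-terminal becomes its block; a number becomes a constant 2x2 block."""
    if cell in GRAMMAR:
        return GRAMMAR[cell]
    return [[cell, cell], [cell, cell]]

def step(matrix):
    """One derivation step: every cell becomes a 2x2 block, doubling the size."""
    n = len(matrix)
    out = [[0] * (2 * n) for _ in range(2 * n)]
    for i in range(n):
        for j in range(n):
            block = expand(matrix[i][j])
            for di in range(2):
                for dj in range(2):
                    out[2 * i + di][2 * j + dj] = block[di][dj]
    return out

def derive(start="S", steps=2):
    """Expand the start symbol; k steps yield a 2^k x 2^k adjacency matrix."""
    matrix = [[start]]
    for _ in range(steps):
        matrix = step(matrix)
    return matrix

adj = derive("S", steps=2)   # 4x4 adjacency matrix for a 4-neuron network
```

Because whole sub-matrices come from single non-terminals, repeated structure in the network is cheap to encode.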

Cellular Encoding
- The fundamental unit of the genome is a transformation
- Transformations act on a particular neuron
- A sequence of transformations defines a neural network
- This preserves functional blocks better
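A minimal sketch of the idea, assuming division-style transformations in the spirit of Gruau's cellular encoding (the presentation does not list its operator set): the network starts as a single cell, and each genome instruction splits one cell in series or in parallel.

```python
class Network:
    """Toy developing network: neuron id -> set of successor ids."""

    def __init__(self):
        self.edges = {0: set()}   # start from a single cell, neuron 0
        self.next_id = 1

    def seq(self, cell):
        """Series split: the new neuron inherits the cell's outgoing edges."""
        new = self.next_id
        self.next_id += 1
        self.edges[new] = self.edges[cell]
        self.edges[cell] = {new}
        return new

    def par(self, cell):
        """Parallel split: the new neuron copies the cell's inputs and outputs."""
        new = self.next_id
        self.next_id += 1
        self.edges[new] = set(self.edges[cell])
        for src, dsts in self.edges.items():
            if cell in dsts and src != new:
                dsts.add(new)
        return new

# A "genome" is then just a sequence of (operation, target) transformations.
net = Network()
a = net.seq(0)   # 0 -> 1
b = net.par(a)   # duplicate neuron 1 in parallel: 0 -> 1 and 0 -> 2
```

Since one instruction can duplicate a whole neuron with its connections, a functional block can be copied or grown without being rebuilt edge by edge, which is the "preserves functional blocks" advantage the slide mentions.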

Grid World
- The world is a wrap-around grid
- Each grid cell can hold one organism
- Each grid cell can have food on it for the organisms to eat
- Organisms have a 360-degree visual system
- Organisms use an RNN to decide on their actions
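The wrap-around (toroidal) geometry amounts to taking coordinates modulo the grid size. A minimal sketch, with the grid dimensions and cell contents assumed:

```python
W, H = 8, 8   # grid dimensions (illustrative)

# One organism slot and a food flag per cell, per the slide.
grid = {(x, y): {"organism": None, "food": False}
        for x in range(W) for y in range(H)}

def wrap(x, y):
    """Map any coordinate onto the wrap-around grid."""
    return x % W, y % H

def move(pos, dx, dy):
    """Step an organism; walking off one edge re-enters on the opposite edge."""
    return wrap(pos[0] + dx, pos[1] + dy)
```

Python's `%` operator returns a non-negative result for a positive modulus, so stepping off the left edge (`x = -1`) correctly lands on the right edge without a special case.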

Learning Method
- Recurrent networks
- Hebbian Learning
- Clamping outputs
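The three ingredients above fit together roughly as follows: the Hebb rule strengthens weights between co-active units (dw_ij = eta * x_i * x_j), and "clamping" fixes the output units to a target pattern while that update runs. The network size, learning rate, and patterns below are illustrative assumptions, not the presentation's values.

```python
N = 4       # neurons (assumed)
ETA = 0.1   # learning rate (assumed)

w = [[0.0] * N for _ in range(N)]   # recurrent weight matrix, no self-connections

def hebb(w, x, eta=ETA):
    """Hebbian update: dw_ij = eta * x_i * x_j for co-active units."""
    for i in range(N):
        for j in range(N):
            if i != j:
                w[i][j] += eta * x[i] * x[j]

def step(w, x):
    """One recurrent pass: each unit takes a weighted sum of all activations."""
    return [sum(w[i][j] * x[j] for j in range(N)) for i in range(N)]

# Training: clamp the "output" units (here, the last two) to the target
# pattern, then apply the Hebb rule to the full activation vector.
x = [1.0, 0.0, 0.0, 0.0]     # input pattern
x[2], x[3] = 1.0, 1.0        # clamped outputs
hebb(w, x)

recalled = step(w, x)        # co-trained units now reinforce each other
```

Clamping is what turns the purely correlational Hebb rule into supervised-style learning: the desired output is imposed during the update, so the weights come to reproduce it.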

Results?
- The Grid World was difficult to implement
- The two interesting encodings remain unimplemented
- Interesting extensions to the world model remain unimplemented