SE367 Project Final Presentation. By: Sujith Thomas, Parimi Krishna Chaitanya. In charge: Prof. Amitabha Mukerjee.


 To make a neural network learn the rules of Conway's Game of Life and predict the next generation of cells.
 To identify oscillators and other emergent patterns using recurrent neural networks.

 Simple rules of Conway's Game of Life
 Emergence of complex patterns
 Backpropagation-trained neural networks
 Recurrent neural networks
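The rules of the game are simple: a live cell survives with 2 or 3 live neighbours, and a dead cell comes alive with exactly 3. A minimal sketch of one update step follows; Python is an assumption here, since the slides show no code (the cited Bullinaria reference implements networks in C).

```python
import numpy as np

def next_generation(grid):
    """Apply Conway's rules to a 0/1 grid: a live cell survives with
    2 or 3 live neighbours; a dead cell becomes alive with exactly 3."""
    rows, cols = grid.shape
    new = np.zeros_like(grid)
    for r in range(rows):
        for c in range(cols):
            # Count the 8 neighbours (cells beyond the board edge count as dead).
            n = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].sum() - grid[r, c]
            if grid[r, c] == 1 and n in (2, 3):
                new[r, c] = 1
            elif grid[r, c] == 0 and n == 3:
                new[r, c] = 1
    return new
```

The neural network is trained to reproduce exactly this mapping from a cell's 3×3 neighbourhood to its next state.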

 Training a neural network to learn the rules of Conway's Game of Life.
 Training a recurrent neural network to detect repeated patterns.

Features of the training model:
1. Input vector of size 9
2. Hidden layer with 9 nodes
3. Output layer with 1 node
4. Bias at the input and hidden layers
5. Sigmoid activation function
6. Weights updated through the backpropagation algorithm
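The 9-9-1 model above can be sketched as follows. The learning rate and weight-initialization scale are assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 9 inputs (a cell's 3x3 neighbourhood), 9 hidden units, 1 output,
# with bias at the input and hidden layers.
W1 = rng.normal(scale=0.5, size=(9, 9))   # input -> hidden weights
b1 = np.zeros(9)                          # hidden-layer bias
W2 = rng.normal(scale=0.5, size=(9, 1))   # hidden -> output weights
b2 = np.zeros(1)                          # output bias

def forward(x):
    h = sigmoid(x @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    return h, y

def train_step(x, target, lr=0.5):
    """One backpropagation update for the squared error (y - target)^2 / 2."""
    global W1, b1, W2, b2
    h, y = forward(x)
    delta_out = (y - target) * y * (1 - y)        # output-layer error term
    delta_hid = (delta_out @ W2.T) * h * (1 - h)  # hidden-layer error term
    W2 -= lr * np.outer(h, delta_out)
    b2 -= lr * delta_out
    W1 -= lr * np.outer(x, delta_hid)
    b1 -= lr * delta_hid
```

In training, `x` is a flattened 3×3 neighbourhood and `target` is that cell's state in the next generation.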

 Input vector of size 18
 Hidden layer with 18 nodes
 Output layer with 2 nodes
 Bias present at each layer
 Sigmoid activation function
 Weights again updated through backpropagation
 The last 9 dimensions of the input vector correspond to the previous, delayed state
 An array stores the previous 12 output states (the size may change later)
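A minimal sketch of this recurrent model's forward pass, assuming the delayed 9-dimensional state is simply concatenated with the current input; folding the biases into extra weight rows is an implementation choice, not something the slides specify:

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 18 inputs (current 9-dim patch + delayed 9-dim state), 18 hidden, 2 outputs.
# The extra row in each weight matrix carries the layer's bias.
W1 = rng.normal(scale=0.5, size=(19, 18))
W2 = rng.normal(scale=0.5, size=(19, 2))

# Buffer holding the last 12 output states (the delay line).
history = deque(maxlen=12)

def forward(current_patch, delayed_patch):
    x = np.concatenate([current_patch, delayed_patch, [1.0]])  # bias input
    h = np.append(sigmoid(x @ W1), 1.0)                        # bias at hidden layer
    y = sigmoid(h @ W2)
    history.append(y)   # remember this output for later delayed feedback
    return y
```

Training would update `W1` and `W2` by backpropagation exactly as in the feedforward case, with the delayed state treated as part of the input.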

 The game board has 12 rows and 12 columns.
 We use a seed of size 3×3 or 4×4 to initialize the game.
 We feed the output layer's activation back as input with a delay of 12 ticks.
 This lets us detect oscillators with periods 1, 2, 3, 4 and 6.
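The choice of a 12-tick delay works because the periods 1, 2, 3, 4 and 6 all divide 12. A sketch of period detection by comparing the current grid against a 12-state delay buffer (Python and toroidal board edges are assumptions for brevity):

```python
import numpy as np
from collections import deque

def step(grid):
    """One Game of Life tick on a 0/1 grid, with wraparound edges."""
    n = sum(np.roll(np.roll(grid, dr, 0), dc, 1)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

def find_period(initial_grid, max_delay=12, max_ticks=36):
    """Return the smallest p <= max_delay with grid(t) == grid(t - p),
    or None. A 12-state buffer catches periods 1, 2, 3, 4 and 6."""
    buffer = deque(maxlen=max_delay)   # most recent state first
    grid = initial_grid
    for _ in range(max_ticks):
        buffer.appendleft(grid.copy())
        grid = step(grid)
        for p in range(1, len(buffer) + 1):
            if np.array_equal(grid, buffer[p - 1]):
                return p
    return None
```

A still life shows up as period 1; a blinker as period 2. The recurrent network approximates this comparison with learned weights instead of exact equality checks.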

 So far we have detected still lifes and oscillators.
 By the final demonstration we will show gliders being recognized. The difficulty with gliders is their property of translation: the pattern repeats only up to a shift across the grid.
 To handle this we can either use a four-layer neural network or apply a re-seeding heuristic.

Oscillators

Still Lifes

 A Guide to Recurrent Neural Networks and Backpropagation, Mikael Bodén, Halmstad University
 Pattern Classification (2nd ed.), R. O. Duda, P. E. Hart and D. G. Stork, John Wiley
 Wikipedia: Conway's Game of Life
 Implementation of Neural Networks in C, John Bullinaria, University of Birmingham