CS206 Evolutionary Robotics: Artificial Neural Networks


CS206 Evolutionary Robotics: Artificial Neural Networks. [Title diagram: two small networks, each with an input layer feeding an output layer; the numeric neuron values were lost in transcription.]

Synaptic weights. [Diagram: three copies of the network, each with an input layer connected to an output layer by weighted synapses; the weight values were lost in transcription.]

Activation functions: keeping neuron values within limits. [Worked example: an output neuron's value is σ applied to the weighted sum of its inputs, e.g. σ(0.6·? + ?·0.3) = σ(0.09) = 0.09 and σ(0.6·? + ?·0.9) = σ(1.29) = ?; the missing factors were lost in transcription.] σ(x) = ?
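The slide ends by asking what σ(x) is; a minimal Python sketch, assuming the common logistic sigmoid (one of several activation functions that keep a neuron's value within fixed limits; the input and weight values here are illustrative, not the slide's lost originals):

```python
import math

def sigma(x):
    """Logistic sigmoid: squashes any real number into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# An output neuron's value is sigma applied to the weighted sum of its inputs.
inputs  = [0.6, 0.9]   # illustrative input-layer values
weights = [0.6, 0.3]   # illustrative synaptic weights
weighted_sum = sum(i * w for i, w in zip(inputs, weights))
output = sigma(weighted_sum)   # stays within (0, 1) no matter how large the sum grows
```

tanh is another common choice when neuron values should instead stay within (-1, 1).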

Neural networks as functions: and. Truth table (input1, input2 → output): 0 0 → 0; 0 1 → 0; 1 0 → 0; 1 1 → 1. Here σ is a threshold function, σ(x) = 1 if x > θ, 0 if x ≤ θ (the threshold θ and the synaptic weights were lost in transcription). The network evaluates each row as σ(input1·w1 + input2·w2): σ(0·w1 + 0·w2) = 0, σ(0·w1 + 1·w2) = 0, σ(1·w1 + 0·w2) = 0, σ(1·w1 + 1·w2) = 1.
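The truth table above is the and function; a sketch of one threshold unit that computes it (the weight and threshold values below are an arbitrary working choice, since the slide's originals were lost):

```python
def step(x, theta):
    """Threshold activation: sigma(x) = 1 if x > theta, else 0."""
    return 1 if x > theta else 0

def and_net(i1, i2):
    # w1 = w2 = 1.0 and theta = 1.5 are illustrative, not the slide's (lost)
    # values; any choice with 0 <= w1, w2 <= theta < w1 + w2 realizes and.
    w1, w2, theta = 1.0, 1.0, 1.5
    return step(i1 * w1 + i2 * w2, theta)

truth_table = [(i1, i2, and_net(i1, i2)) for i1 in (0, 1) for i2 in (0, 1)]
# only the (1, 1) row fires
```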

Neural networks as functions: or. Truth table (input1, input2 → output): 0 0 → 0; 0 1 → 1; 1 0 → 1; 1 1 → 1. With the same threshold activation σ(x) = 1 if x > θ, 0 if x ≤ θ, the network evaluates σ(0·w1 + 0·w2) = 0, σ(0·w1 + 1·w2) = 1, σ(1·w1 + 0·w2) = 1, σ(1·w1 + 1·w2) = 1 (the weight and threshold values were lost in transcription).

Neural networks as functions: xor. Truth table (input1, input2 → output): 0 0 → 0; 0 1 → 1; 1 0 → 1; 1 1 → 0. The slide lists the desired evaluations σ(0·w1 + 0·w2) = 0, σ(0·w1 + 1·w2) = 1, σ(1·w1 + 0·w2) = 1, σ(1·w1 + 1·w2) = 0, but no choice of weights and threshold satisfies all four rows at once: xor is not linearly separable.
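A single threshold unit cannot reproduce the xor table above. A brute-force sketch over a grid of candidate weights and thresholds (the grid range is an illustrative choice) finds no solution:

```python
# xor truth table: output is 1 exactly when the inputs differ
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def single_unit(i1, i2, w1, w2, theta):
    """One threshold neuron: fires iff the weighted input sum exceeds theta."""
    return 1 if i1 * w1 + i2 * w2 > theta else 0

# Scan weight/threshold candidates from -2 to 2 in steps of 0.25.
grid = [x / 4 for x in range(-8, 9)]
solutions = [
    (w1, w2, theta)
    for w1 in grid for w2 in grid for theta in grid
    if all(single_unit(i1, i2, w1, w2, theta) == out
           for (i1, i2), out in XOR.items())
]
# xor is not linearly separable, so the list stays empty
```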

Hidden neurons allow for nonlinear transformations. [Diagram: a network with hidden neurons between the input layer (input1, input2) and the output, computing xor; the weight and threshold values of σ were lost in transcription.] Truth table (xor): 0 0 → 0; 0 1 → 1; 1 0 → 1; 1 1 → 0.

Hidden neurons allow for nonlinear transformations. [Diagram: the same hidden-layer network evaluated on another input pattern; the numeric values were lost in transcription.] Truth table (xor): 0 0 → 0; 0 1 → 1; 1 0 → 1; 1 1 → 0.
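One concrete way a hidden layer solves xor (the slide's own weights were lost, so these are illustrative): one hidden unit computes or, the other and, and the output fires when or is on but and is off:

```python
def step(x, theta):
    """Threshold activation: 1 if x > theta, else 0."""
    return 1 if x > theta else 0

def xor_net(i1, i2):
    # Illustrative weights/thresholds, not the slide's lost originals:
    h_or  = step(i1 + i2, 0.5)      # hidden unit 1: fires if either input is on
    h_and = step(i1 + i2, 1.5)      # hidden unit 2: fires only if both are on
    return step(h_or - h_and, 0.5)  # output: "or but not and" = xor

table = [(i1, i2, xor_net(i1, i2)) for i1 in (0, 1) for i2 in (0, 1)]
```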

Backpropagation: learning synaptic weights through training. [Diagram: a network with inputs input1, input2 and outputs out1, out2, alongside a truth table of target outputs for each input pattern; the error between the network's actual and desired outputs is shown. The numeric values were lost in transcription.]

Backpropagation: learning synaptic weights through training. [Diagram: the same network after a weight update; the error on the trained pattern has dropped.] Backpropagation makes a localized change to the synaptic weights → low chance of increasing error for another input pattern → "using tweezers on the neural network".
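A minimal sketch of learning weights through repeated small error-reducing updates, using a single sigmoid unit trained on the or table with the delta rule (full backpropagation through hidden layers extends this same gradient idea; all values here are illustrative, not the slide's):

```python
import math
import random

def sigma(x):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-x))

# Training data: the or truth table (stand-in targets; the slide's were lost).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
b = 0.0          # bias term, playing the role of a (negated) threshold
lr = 0.5         # learning rate

def total_error():
    return sum((t - sigma(x1 * w[0] + x2 * w[1] + b)) ** 2
               for (x1, x2), t in data)

before = total_error()
for _ in range(1000):                  # many passes of small corrections
    for (x1, x2), t in data:
        y = sigma(x1 * w[0] + x2 * w[1] + b)
        delta = (t - y) * y * (1 - y)  # gradient of the squared error
        w[0] += lr * delta * x1        # localized nudge to each weight
        w[1] += lr * delta * x2
        b    += lr * delta
after = total_error()                  # training has reduced the error
```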

Overfitting: failing to learn the proper relationship between input and output. [Diagram: a diagnostic network whose inputs are a patient's symptoms (symptom 1 ... symptom k) and whose output is a yes/no disease diagnosis.] Training table: patient 1, symptoms 1 0 1, diagnosis yes; patient 2, symptoms 0 0 1, diagnosis no; ...; patient n, symptoms 1 0 0, diagnosis no.
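A toy sketch of the failure mode, with invented data in which the true rule is "disease iff symptom 1 is present": a model that simply memorizes its training patients is perfect on them, yet wrong on a new patient:

```python
# Invented training data; the true rule is: disease iff symptom 1 is present.
train = [((1, 0, 1), "yes"), ((0, 0, 1), "no"),
         ((1, 1, 0), "yes"), ((0, 1, 0), "no")]

# An "overfit" model: memorize every training patient exactly.
table = dict(train)
fallback = "no"   # arbitrary answer for any symptom pattern never seen before

def predict(symptoms):
    return table.get(symptoms, fallback)

train_acc = sum(predict(s) == label for s, label in train) / len(train)
# Perfect on the memorized patients, but a new patient with symptom 1
# (true diagnosis "yes") gets the fallback answer "no".
new_patient = (1, 0, 0)
```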

Recurrent connections: adding memory to a neural network. [Diagram: the robot's network at time step t and at time step t+1; sensor1 and sensor2 drive motor1 and motor2, and motor2's value from time step t feeds back into motor1 at time step t+1.] The value of motor 1 is a function of the current sensor values and the value of motor 2 from the previous time step.
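A sketch of one recurrent time step in Python (the weights are illustrative, not the slide's): motor1 depends on the current sensors plus motor2's value from the previous time step, so identical sensor input can yield different motor output:

```python
import math

def sigma(x):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-x))

def time_step(sensors, prev_motors, w_in, w_rec):
    """Advance the network one time step: motor1 receives the current sensors
    plus a recurrent connection from motor2's previous value; motor2 receives
    the current sensors only."""
    s1, s2 = sensors
    m1 = sigma(w_in[0] * s1 + w_in[1] * s2 + w_rec * prev_motors[1])
    m2 = sigma(w_in[2] * s1 + w_in[3] * s2)
    return (m1, m2)

# Illustrative weights (the slide's values were lost in transcription).
w_in, w_rec = (1.0, -1.0, 0.5, 0.5), 2.0
motors = (0.0, 0.0)                      # motor values before the first step
step1 = time_step((1, 0), motors, w_in, w_rec)
step2 = time_step((1, 0), step1, w_in, w_rec)
# Same sensor input both times, yet motor1 changes: the network has memory.
```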

Recurrent connections: adding memory to a neural network. [Diagram: the truth-table training setup from the backpropagation slides, now applied to a network with a recurrent connection; the numeric values were lost in transcription.]

Backpropagation: does not work for recurrent connections. [Diagram: the recurrent network with its truth table and error; the numeric values were lost in transcription.] Updating a recurrent network's weights is a global change to the synaptic weights → good chance of increasing error for another input pattern → "using a sledge hammer on the neural network".

Supervised learning: the correct output for each input pattern is known. Training table (Input1 ... Inputn → Output1 ... Outputm): 1 ... 0 → 1 ... 1; 0 ... 1 → 0 ... 1; 1 ... 1 → 0 ... 0.

Unsupervised learning: only a reward signal is returned for a series of input patterns. Table (Sensor1 ... Sensorn → Motor1 ... Motorm): 1 ... 0; 0 ... 1; 1 ... 1, with no target motor values given. Reward: distance travelled = 6.3 meters.
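Since only a scalar reward comes back, gradient-based training does not directly apply; a minimal sketch of one reward-driven alternative, a serial hill climber over the synaptic weights, with an invented stand-in reward function (in the robot setting, reward(weights) would instead run the robot and return its distance travelled):

```python
import random

def reward(weights):
    """Stand-in for the scalar reward a robot would earn (e.g. distance
    travelled): here, closeness of the weights to an invented target."""
    target = [0.5, -0.3, 0.8]
    return -sum((w - t) ** 2 for w, t in zip(weights, target))

random.seed(1)
parent = [random.uniform(-1, 1) for _ in range(3)]   # random initial weights
parent_reward = reward(parent)
initial_reward = parent_reward

for _ in range(500):                     # serial hill climber
    child = [w + random.gauss(0, 0.05) for w in parent]  # mutate every weight
    child_reward = reward(child)
    if child_reward > parent_reward:     # keep the mutant only if it earns more
        parent, parent_reward = child, child_reward
# Reward never decreases, so the final weights are at least as good as the start.
```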