CS206 Evolutionary Robotics

Artificial Neural Networks
[Figure: two small networks, each with an input layer connected to an output layer]
Synaptic Weights
[Figure: input layer connected to output layer; each connection carries a synaptic weight]
Activation functions: keeping neuron values within limits
[Figure: input and output layer with worked examples, e.g. σ(0.09) = 0.09 while σ(1.29) is kept within limits]
σ(x) = ?
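The slide leaves σ(x) = ? open; here is a minimal sketch of the idea in Python, assuming two common squashing choices (a hard clip and tanh). The input and weight values are illustrative, not the ones from the figure:

import math

def neuron_value(inputs, weights, sigma):
    """A neuron's value: the weighted sum of its inputs, passed through sigma."""
    x = sum(i * w for i, w in zip(inputs, weights))
    return sigma(x)

# Two common choices for sigma; both keep neuron values within limits.
clip = lambda x: max(-1.0, min(1.0, x))   # hard clipping to [-1, 1]
tanh = math.tanh                          # smooth squashing to (-1, 1)

print(neuron_value([0.6, 0.3], [0.3, 0.0], clip))   # small sum stays almost unchanged
print(neuron_value([0.6, 0.9], [1.0, 1.0], clip))   # 1.5 gets clipped back to 1.0
print(neuron_value([0.6, 0.9], [1.0, 1.0], tanh))   # tanh squashes it to about 0.91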
Neural networks as functions: and
Truth table (and):
input1  input2  output
0       0       0
0       1       0
1       0       0
1       1       1
Step activation: σ(x) = 1 if x > θ, 0 if x ≤ θ
σ(0·w1 + 0·w2) = 0
σ(0·w1 + 1·w2) = 0
σ(1·w1 + 0·w2) = 0
σ(1·w1 + 1·w2) = 1
Neural networks as functions: or
Truth table (or):
input1  input2  output
0       0       0
0       1       1
1       0       1
1       1       1
Step activation: σ(x) = 1 if x > θ, 0 if x ≤ θ
σ(0·w1 + 0·w2) = 0
σ(0·w1 + 1·w2) = 1
σ(1·w1 + 0·w2) = 1
σ(1·w1 + 1·w2) = 1
Neural networks as functions: xor
Truth table (xor):
input1  input2  output
0       0       0
0       1       1
1       0       1
1       1       0
Step activation: σ(x) = 1 if x > θ, 0 if x ≤ θ
σ(0·w1 + 0·w2) = 0
σ(0·w1 + 1·w2) = 1
σ(1·w1 + 0·w2) = 1
σ(1·w1 + 1·w2) = 0
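To make the three tables above concrete, here is a sketch of a single-layer network with the step activation σ(x) = 1 if x > θ, else 0. The weights and thresholds are one possible choice (not necessarily the lecture's values), and no choice of them reproduces xor:

def sigma(x, theta):
    """Step activation: 1 if x > theta, else 0."""
    return 1 if x > theta else 0

def single_layer(i1, i2, w1, w2, theta):
    """One output neuron fed directly by the two inputs."""
    return sigma(i1 * w1 + i2 * w2, theta)

for i1, i2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    and_out = single_layer(i1, i2, 1.0, 1.0, 1.5)  # fires only when both inputs are 1
    or_out  = single_layer(i1, i2, 1.0, 1.0, 0.5)  # fires when either input is 1
    print(i1, i2, "and:", and_out, "or:", or_out)

# xor (0, 1, 1, 0) cannot be produced by any w1, w2, theta in a single layer:
# it is not linearly separable.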
Hidden neurons allow for nonlinear transformations
[Figure: network with a hidden layer between the inputs and the output, shown for two input patterns; step activation σ(x) = 1 if x > θ, 0 if x ≤ θ]
Truth table (xor):
input1  input2  output
0       0       0
0       1       1
1       0       1
1       1       0
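A minimal sketch of xor with one hidden layer, again using step activations. This particular construction (hidden neurons computing or and and, output firing when or is on but and is off) is my own illustrative wiring, not necessarily the one drawn on the slide:

def sigma(x, theta):
    """Step activation: 1 if x > theta, else 0."""
    return 1 if x > theta else 0

def xor_net(i1, i2):
    # Hidden layer: one neuron computes OR, the other computes AND.
    h_or  = sigma(i1 + i2, 0.5)
    h_and = sigma(i1 + i2, 1.5)
    # Output: fires when OR is on but AND is off (weights +1 and -1).
    return sigma(h_or - h_and, 0.5)

for i1, i2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(i1, i2, "xor:", xor_net(i1, i2))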
Backpropagation: Learning synaptic weights through training
[Figure: network with two inputs and two outputs; the outputs produced for one input pattern are compared against the truth table entries, giving an error for each output neuron]
Backpropagation: Learning synaptic weights through training
[Figure: same network and truth table; after training, the error has been reduced to zero]
Backpropagation makes a localized change to the synaptic weights, so there is a low chance of increasing the error for another input pattern: "using tweezers on the neural network".
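A compact backpropagation sketch in Python/numpy for a small two-input, two-output network with one hidden layer, in the spirit of the figures above. The sigmoid activation, learning rate, epoch count, and the example target table (out1 = or, out2 = and) are all assumptions, since the slide's actual table is not readable here:

import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Input patterns and example target outputs (out1 = or, out2 = and).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0, 0], [1, 0], [1, 0], [1, 1]], dtype=float)

def with_bias(A):
    """Append a constant 1 to each row so every neuron gets a bias weight."""
    return np.hstack([A, np.ones((A.shape[0], 1))])

W1 = rng.normal(scale=0.5, size=(3, 2))   # input (+bias) -> hidden weights
W2 = rng.normal(scale=0.5, size=(3, 2))   # hidden (+bias) -> output weights

for epoch in range(5000):
    H = sigmoid(with_bias(X) @ W1)         # forward pass
    Y = sigmoid(with_bias(H) @ W2)
    E = T - Y                              # error: desired minus produced output
    dY = E * Y * (1 - Y)                   # backward pass: push error through sigmoid
    dH = (dY @ W2[:2].T) * H * (1 - H)     # ...and back through the hidden layer
    W2 += 0.5 * with_bias(H).T @ dY        # small, targeted weight adjustments
    W1 += 0.5 * with_bias(X).T @ dH

print(np.round(Y, 2))   # after training, close to the target table T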
Overfitting: Failing to learn the proper relationship between input and output
Example: each patient is described by k symptom inputs, and the network must output the diagnosis (yes/no).

           symptom 1 ... symptom k   Disease?
patient 1      1    0    1            yes
patient 2      0    0    1            no
...
patient n      1    0    0            no

[Figure: network with k symptom inputs and a yes/no diagnosis output for each patient]
An overfit network memorizes the training patients instead of learning which symptoms actually predict the disease, so it fails on new patients.
Recurrent connections: Adding memory to a neural network
[Figure: robot at time step t and at time step t+1; sensor1 and sensor2 feed motor1 and motor2, and a recurrent connection carries motor2's value from time step t forward to time step t+1]
The value of motor 1 is a function of the current sensor values, and the value of motor 2 from the previous time step.
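A sketch of that update rule, assuming two sensors, two motors, and tanh neurons; the weight values in w are placeholders:

import math

def step(sensors, prev_motors, w):
    """One time step of a small recurrent controller.

    motor1 sees the current sensors plus motor2's value from the previous step;
    motor2 sees only the current sensors (weights w are placeholder values).
    """
    s1, s2 = sensors
    m1_prev, m2_prev = prev_motors
    m1 = math.tanh(w["s1_m1"] * s1 + w["s2_m1"] * s2 + w["m2_m1"] * m2_prev)
    m2 = math.tanh(w["s1_m2"] * s1 + w["s2_m2"] * s2)
    return m1, m2

w = {"s1_m1": 0.8, "s2_m1": -0.4, "m2_m1": 0.6, "s1_m2": 0.3, "s2_m2": 0.9}
motors = (0.0, 0.0)                        # motor values at time step t = 0
for t, sensors in enumerate([(1, 0), (1, 1), (0, 1)]):
    motors = step(sensors, motors, w)
    print(t, sensors, [round(m, 3) for m in motors])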
Recurrent connections: Adding memory to a neural network
[Figure: the two-input, two-output training example from before, now with a recurrent connection; truth table and output error shown as in the backpropagation slides]
Backpropagation: Does not work for recurrent connections
[Figure: same training example with a recurrent connection; the error remains]
With recurrent connections, a weight update becomes a global change to the synaptic weights, so there is a good chance of increasing the error for another input pattern: "using a sledge hammer on the neural network".
Supervised Learning: Correct output for each input pattern is known.

Input1 ... Inputn    Output1 ... Outputm
1      ...  0         1       ...  1
0      ...  1         0       ...  1
1      ...  1         0       ...  0
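A minimal sketch of what "the correct output is known" buys us: an explicit per-pattern error. The pattern values mirror the table above, and summed absolute difference is just one common error measure:

# Each training pattern pairs an input vector with its known correct output vector.
patterns = [
    ([1, 0], [1, 1]),
    ([0, 1], [0, 1]),
    ([1, 1], [0, 0]),
]

def supervised_error(network, patterns):
    """Sum over all patterns of the difference between desired and produced outputs."""
    total = 0.0
    for inputs, targets in patterns:
        outputs = network(inputs)
        total += sum(abs(t - o) for t, o in zip(targets, outputs))
    return total

# Example: a trivial 'network' that always outputs [0, 1].
print(supervised_error(lambda inputs: [0, 1], patterns))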
Unsupervised Learning: Reward signal returned for a series of input patterns.

Sensor1 ... Sensorn    Motor1 ... Motorm
1       ...  0          ...
0       ...  1          ...
1       ...  1          ...

Distance: 6.3 meters
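Here no per-pattern target exists; a single reward (e.g., the distance travelled, 6.3 meters) comes back only after the whole series of sensor-to-motor mappings. A toy sketch, with the "simulation" reduced to a placeholder that nudges the robot forward by its mean motor command:

import random

def random_controller(sensors):
    """Placeholder controller: maps sensor values to two motor commands."""
    return [random.uniform(-1, 1), random.uniform(-1, 1)]

def evaluate(controller, num_steps=100):
    """Run one episode and return a single scalar reward (distance travelled).

    There is no correct motor value for any individual sensor pattern;
    only the final distance comes back as feedback.
    """
    distance = 0.0
    for _ in range(num_steps):
        sensors = [random.random(), random.random()]    # placeholder sensor readings
        m1, m2 = controller(sensors)
        distance += (m1 + m2) / 2.0 * 0.1               # toy forward progress per step
    return distance

print(f"Distance: {evaluate(random_controller):.1f} meters")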