Cognitive Science Computational modelling

Cognitive Science Computational modelling, Week 3
- Linear separability
- Configuration files
- Reconstructing Cohen's model of autism

Objectives of this workshop
- To gain more familiarity with Tlearn
- To learn how to set up a network in Tlearn
- To train and evaluate a backprop network learning "Exclusive OR"
- To appreciate the difficulty of analysing network performance
- To train and evaluate a backprop network model of "autism"

"Exclusive OR“ & hidden units "John is a Tory or John is a Marxist" Either Tory, or Marxist, but not both. Compositional

Linear separability
Plot the XOR truth table as a graph: two dimensions, one for each input ("Tory", "Marxist"), with the corresponding target output plotted at each point.
[Figure: the four input combinations plotted on the Tory and Marxist axes, each labelled with its target output.]
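Why can't a single unit do this? Suppose one output unit turns on (outputs 1) whenever w1*i1 + w2*i2 >= t, and stays off otherwise. The four XOR cases would then require:
  (0,0) -> 0: so 0 < t
  (1,0) -> 1: so w1 >= t
  (0,1) -> 1: so w2 >= t
  (1,1) -> 0: so w1 + w2 < t
Adding the middle two gives w1 + w2 >= 2t, and since t > 0 this contradicts w1 + w2 < t. No single line through the graph separates the 1s from the 0s, which is why the network needs a hidden layer.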

Exercise
Draw the corresponding graph for 'and', e.g. "Sue likes Radiohead and chocolate cake".
Is 'and' linearly separable?

Setting up the XOR network in Tlearn
Number of inputs: 2 (i1, i2)
Number of hidden units: how many? Two here (#1, #2)
Number of outputs: 1 (#3)
xor-1501.wts contains the weights saved after 1501 learning trials with the set of training patterns.
For the exercise, follow from p117, Chapter 5.
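For reference, a Tlearn configuration (.cf) file for this 2-2-1 network would look roughly like the sketch below; this assumes the NODES / CONNECTIONS / SPECIAL layout used in the handbook, so check the exact syntax against the example on p117 of Chapter 5. Node 0 is the bias, nodes 1-2 are the hidden units, and node 3 is the output.

    NODES:
    nodes = 3
    inputs = 2
    outputs = 1
    output node is 3
    CONNECTIONS:
    groups = 0
    1-2 from i1-i2
    3 from 1-2
    1-3 from 0
    SPECIAL:
    selected = 1-3
    weight_limit = 1.00

Here "1-2 from i1-i2" connects both inputs to both hidden units, "3 from 1-2" connects the hidden units to the output, "1-3 from 0" gives every unit a trainable bias connection, and weight_limit bounds the random initial weights.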

Cohen's model of learning in autism
Too many and too few neurons and/or connections:
- some things hard to learn
- poor generalisation
The model looks at the effect of irrelevant inputs and extra hidden units.
Rote learning is good, but the network is distracted by task-irrelevant aspects of the situation.
Poor generalisation: e.g. change the teacher or the environment and performance drops.
Too many neurons: e.g. hippocampus; too few: e.g. cerebellum.

Happy face encoding
Mouth up: -1 … 1 (+ve = smile)
Eyebrows: 0 … -1 (-ve = smile, roughly)
See Figure 11.3, but note that the vertical axis has the wrong values.
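Purely as an illustration (these numbers are not from the handbook), a strongly happy face might then be coded with mouth = 0.9 and eyebrows = -0.8, while an unhappy face might have mouth = -0.7 and eyebrows = 0.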

Reconstructing Cohen
- Re-create the input patterns
- Re-create the target for each input pattern
- Put those patterns into .data and .teach files
- Create the configuration file

Input patterns
5 input values in each pattern:
1st: mouth
2nd: eyebrow
3rd, 4th, 5th: mimic task-irrelevant features of the situation
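As an illustration, the start of the .data file and the matching .teach file might look like the sketch below. The file names and the particular numbers are invented; the first two values in each pattern are the mouth and eyebrow codes, the last three are the task-irrelevant noise values, and each target is assumed to be a single value coding happy (1) v not happy (0). Tlearn data and teach files begin with a "distributed" (or "localist") line followed by the number of patterns, but check the exact header format against the handbook.

    face.data (first few lines):
    distributed
    16
    0.9 -0.8 0.42 0.17 0.66
    -0.7 0.3 0.08 0.91 0.25
    ...

    face.teach (first few lines):
    distributed
    16
    1
    0
    ...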

Values for the 'xtra' inputs
The random numbers should just be noise. An easy way to generate them is in SPSS, then "Save as…" comma delimited.
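If SPSS is not to hand, a few lines of Python will do the same job. This is only a sketch: the output file name and the total of 24 patterns (16 training plus 8 generalisation) are assumptions made to match the exercise.

    # generate_noise.py: comma-delimited random values for the three 'xtra' inputs
    import random

    n_patterns = 16 + 8   # 16 training patterns plus 8 generalisation patterns (assumed)

    with open("xtra_noise.csv", "w") as f:
        for _ in range(n_patterns):
            # three task-irrelevant values per pattern, uniform noise in [0, 1)
            noise = [random.random() for _ in range(3)]
            f.write(",".join("%.2f" % x for x in noise) + "\n")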

Overview
- Create training-pattern inputs with 5 input values, n = 16, and the corresponding targets in a .teach file
- Create 8 more patterns in a separate .data file [why?] NB no .teach file is needed for these
- Create the configuration file
- Train; every so many trials, test on both the training set and the generalisation set

Overview (continued)
Do it all again, with just one irrelevant 'xtra' input.
Hint: you only need to make small changes to some of the files you already have.

Overview (concluded)
Evaluate the results
Quantitatively:
- error as learning progresses, on the training set
- error as learning progresses, on the generalisation set
- compare results for 1 irrelevant input v 3 irrelevant inputs
Qualitatively:
- mapping parameters onto the theory, e.g. the number of inputs: what does it stand for in the theory?
- mapping to cognitive performance
- mapping to biology
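For the quantitative comparison, the key number at each test point is an error score for the training set and one for the generalisation set. A generic sketch of that calculation is below; the output and target lists are placeholder values for illustration only, since in practice they come from Tlearn's own output and error reporting.

    # rms_error.py: compare training-set and generalisation-set error
    import math

    def rms_error(outputs, targets):
        # root-mean-square difference between network outputs and targets
        total = sum((o - t) ** 2 for o, t in zip(outputs, targets))
        return math.sqrt(total / len(outputs))

    # placeholder values purely for illustration
    train_outputs, train_targets = [0.92, 0.11, 0.85, 0.07], [1, 0, 1, 0]
    gen_outputs, gen_targets = [0.64, 0.38, 0.55, 0.41], [1, 0, 1, 0]

    print("training RMS error:      ", rms_error(train_outputs, train_targets))
    print("generalisation RMS error:", rms_error(gen_outputs, gen_targets))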