
CS 182 Leon Barrett and Will Chang Thanks to Eva Mok and Joe Makin Q:What did the Hollywood film director say after he finished making a movie about myelin? A:“That’s a wrap!”

Announcements a2 is out, due next Tuesday 11:59pm –play with fannExplorer –you can run it on the server, but if you want to do a lot of playing, you might want to download it; search for FANN

Where we stand Last Week –Neurons –Neural development This Week –Idealized neurons and connectionist units –Spreading Activation, triangle nodes Coming up –Backprop (review your calculus!)

Serial and Parallel What is the difference between serial and parallel computation? Can you give an example of this contrast? –Serial computation takes time for every piece of the computation –Parallel computations occur at the same time –For example, face recognition is parallel – we don't have to search through all possible people –Listing family members is serial – we can't think of them all at once

Neuron models What is a McCulloch-Pitts neuron? –Calculate "activation" as a weighted sum of the inputs –Apply a function to map activation to output What sorts of output functions does it use? –Linear – why is this bad? –Threshold –Sigmoid – a smoothed threshold

The McCulloch-Pitts Neuron y_j: output from unit j; w_ij: weight on the connection from unit j to unit i; x_i: weighted sum of inputs to unit i; t_i: target output. The unit computes x_i = ∑_j w_ij y_j and outputs y_i = f(x_i).
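A minimal sketch of such a unit in Python (the function and variable names are ours, not from the slides), with the threshold and sigmoid output functions from the previous slide:

```python
import math

def step(x):
    """Hard threshold: output 1 if net input > 0, else 0."""
    return 1 if x > 0 else 0

def sigmoid(x):
    """Smoothed threshold."""
    return 1.0 / (1.0 + math.exp(-x))

def mp_neuron(weights, inputs, f):
    """McCulloch-Pitts unit: x_i = sum_j w_ij * y_j, then y_i = f(x_i)."""
    x = sum(w * y for w, y in zip(weights, inputs))
    return f(x)
```

Swapping `step` for `sigmoid` changes the unit from a hard to a smoothed threshold without touching the weighted-sum computation.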

Let's try an example: the OR function Assume you have a threshold function centered at the origin. With inputs i_1 and i_2 and a bias input b = 1, what should you set w_01, w_02, and w_0b to so that you get the right answers for y_0?

Many answers would work y_0 = f(w_01 i_1 + w_02 i_2 + w_0b b). Recall the threshold function: the separation happens where w_01 i_1 + w_02 i_2 + w_0b b = 0. Moving things around, the decision boundary is the line i_2 = -(w_01/w_02) i_1 - (w_0b b/w_02).
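One concrete choice that works, sketched in Python: w_01 = w_02 = 1 and w_0b = -0.5. These particular values are just one of the many valid answers:

```python
def threshold(x):
    # threshold function centered at the origin: fires when net > 0
    return 1 if x > 0 else 0

def or_unit(i1, i2, w01=1.0, w02=1.0, w0b=-0.5):
    # bias input b is clamped to 1
    return threshold(w01 * i1 + w02 * i2 + w0b * 1)
```

With these weights, the net input is negative only when both inputs are 0, so the unit reproduces the OR truth table.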

Neuron models If neurons spike, what biological feature do non-integer activation values correspond to? –Firing rate What is unbiological about the McCulloch-Pitts model? –Time is ignored –Weights are unbounded –It hugely simplifies the computation Some people even propose that real neurons do complicated quantum-mechanical things

Neuron models How does a Sigma-Pi neuron differ from a McCulloch-Pitts neuron? –It multiplies groups of inputs together, then sums the weighted products

Sigma-Pi units

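A sketch of what such a unit computes, in Python; the grouping of inputs into multiplied sets is passed in explicitly, and the names are ours:

```python
from math import prod

def sigma_pi(weights, input_groups, f=lambda x: x):
    """Sigma-Pi unit: each weight scales the *product* of one group of
    inputs, and the unit sums these weighted products before applying f."""
    net = sum(w * prod(group) for w, group in zip(weights, input_groups))
    return f(net)
```

Note that any zero within a group gates out that whole term, which is exactly the multiplicative behavior that distinguishes Sigma-Pi units from plain weighted sums.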

How can we model the triangle node with McCulloch-Pitts neurons? Three units A, B, and C, each connected to the other two.
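One way to sketch this in Python: three threshold units, each wired to the other two so that it fires whenever both of its partners are active. The weights 1, 1 with bias -1.5 implement an AND of the two partners; this wiring is our illustration, not the slides' exact construction:

```python
def step(x):
    return 1 if x > 0 else 0

def triangle_update(a, b, c):
    """One synchronous update of a triangle node built from three
    McCulloch-Pitts units: a unit stays on if already active, and
    also fires if the other two units are both active."""
    return (a or step(b + c - 1.5),
            b or step(a + c - 1.5),
            c or step(a + b - 1.5))
```

Starting from any two active units, one update activates the third; a single active unit is not enough to recruit the others.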

Necker-Cube like illusion

Stroop effect What is the Stroop effect? –It takes longer to say what color a word is printed in if the word names a different color –This suggests an interaction of form and meaning (as opposed to an encapsulated 'language module') Demo stimuli: Red Blue Green Orange Red Yellow Green Yellow Red Blue Yellow Orange

'Word superiority effect' What is the word superiority effect? –It is easier to recognize letters when they are seen in the context of a word –This militates against a purely bottom-up model in which word recognition is built up from individual letters –Suggestion: there are top-down and bottom-up processes which interact

Priming and Related Concepts What is priming, and how do we measure it? –Priming is faster completion of a task after exposure to a related stimulus –We present the priming stimulus while the subject performs the task –Different stimulus timings cause different effects What is the effect of suggesting an alternate meaning of a word, as in "They all rose"? –If presented early, it primes –If presented late, it decreases priming

"They all rose" Triangle nodes: when two of the three connected units fire, the third also fires – a model of spreading activation
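A toy sketch of spreading activation in Python; the three-node network for "rose" is hypothetical, and the update rule (each node passes a fixed fraction of its activation to its neighbours) is a deliberately simple one:

```python
def spread(activation, links, rate=0.5, steps=1):
    """Toy spreading activation: on each step, every node sends
    rate * (its activation) along each of its outgoing links."""
    for _ in range(steps):
        incoming = {node: 0.0 for node in activation}
        for node, a in activation.items():
            for neighbour in links.get(node, []):
                incoming[neighbour] += rate * a
        activation = {node: activation[node] + incoming[node]
                      for node in activation}
    return activation

# hypothetical mini-network: "rose" links to both of its meanings
links = {"rose": ["flower", "stood up"],
         "flower": ["rose"],
         "stood up": ["rose"]}
start = {"rose": 1.0, "flower": 0.0, "stood up": 0.0}
```

Activating "rose" spreads activation to both candidate meanings at once, which is the behavior the priming results above probe.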

Body part understanding task Tests priming and interference when showing images of actions It turns out that viewing actions affects recognizing action words Can you explain the results (next slide)?

Preliminary Behavior Results 40 native speakers; reaction times over 2 seconds were eliminated. Conditions: Same Action, Same Effector, Other Effector. [results figure]

Eye-tracking setup: an eye camera and a scene camera feed an eye-tracking computer. Allopenna, Magnuson & Tanenhaus (1998): 'Pick up the beaker'

Do rhymes compete? Cohort (Marslen-Wilson): onset similarity is primary because of the incremental nature of speech (serial/staged; Shortlist/Merge) –'Cat' activates cap, cast, cattle, camera, etc. –Rhymes won't compete NAM (Neighborhood Activation Model; Luce): global similarity is primary –'Cat' activates bat, rat, cot, cast, etc. –Rhymes are among the set of strong competitors TRACE (McClelland & Elman): global similarity constrained by the incremental nature of speech –Cohorts and rhymes both compete, but with different time courses [TRACE predictions figure]
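The models' similarity criteria can be illustrated with a toy sketch in Python. The lexicon, the two-letter onset, and the one-substitution neighborhood rule are our simplifications, not the models' full specifications:

```python
def cohort_competitors(target, lexicon, onset_len=2):
    """Cohort-style competitors: words sharing the target's onset."""
    return [w for w in lexicon
            if w != target and w[:onset_len] == target[:onset_len]]

def neighborhood_competitors(target, lexicon):
    """NAM-style competitors, simplified: same-length words that
    differ from the target by exactly one letter substitution."""
    return [w for w in lexicon
            if w != target and len(w) == len(target)
            and sum(a != b for a, b in zip(w, target)) == 1]

lexicon = ["cat", "cap", "cast", "camera", "bat", "rat", "cot", "cut"]
```

Running both on "cat" shows the contrast: the cohort set contains onset matches like "cast" and "camera", while the neighborhood set admits rhymes like "bat" and "rat" that the Cohort model excludes.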

Allopenna et al. Results

Study 1 Conclusions As predicted by interactive models, cohorts and rhymes are both activated, with different time courses The eye-movement paradigm is –more sensitive than conventional paradigms –more naturalistic –able to measure multiple items simultaneously –transparently linkable to a computational model –time-locked to speech at a fine grain

Theoretical conclusions Natural contexts provide strong constraints, and those constraints are used When the constraints are extremely predictive, they are integrated as quickly as we can measure This suggests rapid, continuous interaction among linguistic levels and with nonlinguistic context, even for processes assumed to be low-level and automatic This constrains processing theories and also has implications for, e.g., learnability