Computational Cognitive Neuroscience Lab Today: Model Learning.


Computational Cognitive Neuroscience Lab Today: Model Learning

Computational Cognitive Neuroscience Lab »Today: »Homework is due Friday, Feb 17 »Chapter 4 homework is shorter than the last one! »Undergrads omit 4.4, 4.5, 4.7c, 4.7d

Hebbian Learning »“Neurons that fire together, wire together” »Correlation between sending and receiving activity strengthens the connection between them »“Don’t fire together, unwire” »Anti-correlation between sending and receiving activity weakens the connection
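A minimal sketch of the basic Hebbian rule in NumPy (function and variable names here are illustrative, not from the course software):

```python
import numpy as np

def hebb_update(w, x, y, lrate=0.1):
    """Hebbian update: dw = lrate * (receiving activity) * (sending activity)."""
    return w + lrate * np.outer(y, x)

w = np.zeros((1, 2))          # one receiving unit, two sending units
x = np.array([1.0, 0.0])      # sender 0 fires, sender 1 is silent
y = np.array([1.0])           # receiver fires
w = hebb_update(w, x, y)
# Only the co-active connection (to sender 0) is strengthened;
# the connection to the silent sender is unchanged.
```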

LTP/D via NMDA receptors »NMDA receptors allow calcium to enter the (postsynaptic) cell »NMDA receptors are blocked by Mg++ ions, which are expelled when the membrane potential increases »Glutamate (excitatory) binds to the unblocked NMDA receptor, causing a structural change that allows Ca++ to pass through

Calcium and Synapses »Calcium initiates multiple chemical pathways, depending on the level of calcium »Low Ca++ --> long term depression (LTD) »High Ca++ --> long term potentiation (LTP) »LTP/D effects: new postsynaptic receptors, increased dendritic spine size, or increased presynaptic release processes (via retrograde messenger)
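The level-dependence can be captured in a toy threshold model (the threshold values here are made-up illustrative numbers, not measured ones):

```python
def ca_plasticity(ca, theta_d=0.2, theta_p=0.5):
    """Sign of the weight change as a function of postsynaptic Ca++ level."""
    if ca < theta_d:
        return 0.0    # too little calcium: no change
    elif ca < theta_p:
        return -1.0   # moderate calcium: LTD
    else:
        return +1.0   # high calcium: LTP
```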

Fixing Hebbian learning »Plain Hebbian learning drives weights to infinity! »Oja’s normalization (savg_corr) »When to learn? »Conditional PCA--learn only when you see something interesting »A single unit hogs everything? »kWTA and contrast enhancement --> specialization
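Oja's normalization adds a decay term to the Hebbian update that keeps the weight vector bounded. A sketch (the 2-D input distribution and learning rate are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=2) * 0.1
for _ in range(2000):
    x = rng.multivariate_normal([0.0, 0.0], [[3.0, 1.0], [1.0, 1.0]])
    y = w @ x
    # Oja's rule: Hebbian term y*x minus a normalizing decay y^2 * w
    w += 0.01 * (y * x - y * y * w)
# Instead of growing without bound, |w| settles near 1, and w aligns
# with the direction of greatest variance in the input.
```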

Principal Components Analysis (PCA) »Principal, as in primary, not principle, as in some idea »PCA seeks a linear combination of variables such that maximum variance is extracted from the variables. It then removes this variance and seeks a second linear combination which explains the maximum proportion of the remaining variance, and so on until you run out of variance.

PCA continued »This is like linear regression, except you take the whole collection of variables (a vector) and correlate it with itself to make a matrix »The line of best fit through this regression is the first principal component!
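The same computation can be written out in NumPy terms (the data here is synthetic, stretched along one direction so the first component is easy to see):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 2-D data with most variance along one direction
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.5], [0.0, 1.0]])
X = X - X.mean(axis=0)                  # center the variables
cov = X.T @ X / len(X)                  # correlate the vector with itself
eigvals, eigvecs = np.linalg.eigh(cov)  # eigendecomposition of the matrix
pc1 = eigvecs[:, np.argmax(eigvals)]    # the "line of best fit" direction
proj = X @ pc1                          # projections onto the first PC
```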

PCA cartoon

Conditional PCA »“Perform PCA only when a particular input is received” »Condition: The forces that determine when a receiving unit is active »Competition means hidden units will specialize for particular inputs »So hidden units only learn when their favorite input is available
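The conditional aspect can be sketched as a gated update, dw = lrate * y * (x - w): when the receiving unit is active (y > 0) its weights move toward the current input, and when it is silent nothing is learned. Names and values here are illustrative:

```python
import numpy as np

def cpca_update(w, x, y, lrate=0.1):
    """Move weights toward input x, gated by receiving-unit activity y."""
    return w + lrate * y * (x - w)

w = np.full(3, 0.5)
x = np.array([1.0, 0.0, 1.0])
w_on = cpca_update(w, x, y=1.0)    # active receiver: weights track x
w_off = cpca_update(w, x, y=0.0)   # inactive receiver: no learning
```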

Self-organizing learning »kWTA determines which hidden units are active for a given input »CPCA ensures those hidden units learn only about a single aspect of that input »Contrast enhancement -- drive high weights higher, low weights lower »Contrast enhancement helps units specialize (and share)
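Contrast enhancement can be sketched as a sigmoidal remapping of the weights (the gain and offset values are illustrative, loosely patterned on a wt_gain-style parameter):

```python
import numpy as np

def contrast_enhance(w, gain=6.0, off=1.0):
    """Push weights above 0.5 toward 1 and weights below 0.5 toward 0."""
    w = np.clip(w, 1e-6, 1 - 1e-6)                  # avoid division by zero
    return 1.0 / (1.0 + (off * (1.0 - w) / w) ** gain)

out = contrast_enhance(np.array([0.3, 0.5, 0.7]))
# Weak weights shrink toward 0, strong weights grow toward 1, 0.5 stays put.
```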

Bias-variance dilemma »High bias--actual experience changes the model little, so the biases had better be good! »Low bias--experience (including its random error!) strongly shapes learning, so the learned model could come out differently each time: high model variance

Architecture as Bias »Inhibition drives competition, and competition determines which units are active, and the unit activity determines learning »Thus, deciding which units share inhibitory connections (are in the same layer) will affect the learning »This architecture is the learning bias!

Fidelity and Simplicity of representations »Information must be lost in the world-to-brain transformation (p118) »There is a tradeoff between the amount of information lost and the complexity of the representation »The fidelity/simplicity tradeoff is set by »Conditional PCA (first principal component only) »Competition (k value) »Contrast enhancement (savg_corr, wt_gain)