Dante R. Chialvo: Learning, Memory and Criticality


Dante R. Chialvo. Learning, Memory and Criticality. “Most of our life is devoted to learning. Although its importance in medicine is tremendous, the field does not yet have an understanding of what the essence of brain learning is. We have the intuition that brain learning must be a collective process (in the strong sense), for which there is as yet no theory. Mainstream efforts run in a direction that, we argue, will not lead to the solution. In this “motivational” talk we briefly illustrate the main point.”

1. Complex vs. Complicated
2. Toy model of learning (numerics) -> is critical

Why Do We Do What We Do? 1. Brains self-organize to survive predators (by escaping, moving). 2. Immune systems self-organize to survive predators (when the predator is inside and escaping is useless). 3. Societies self-organize to survive predators (when an individual response is useless). 4. … more. All of these are complex dynamical systems with a very large number of nonlinear degrees of freedom, and they curiously share a property: memory. Would it be possible to learn something relevant about memory by studying societies, brains, etc.? Brains found it useful to be the way they are.

Complicated or Complex? Complicated system: many linear pieces + a central supervisor + a blueprint = a “whole”. Example: a TV set. Complex system: many nonlinear pieces + coupling + injected energy = “emergent properties”. Example: a society.

Is Learning & Memory a Complex or a Complicated Problem? If learning & memory is just complicated, then somebody will eventually figure out the whole problem. But if it happens to be complex… we can sit and wait forever. Note that current experiments explore isolated details (i.e., one neuron, a few synapses, etc.).

What Is the Problem? To understand how billions of neurons learn, remember and forget in a self-organized way. The current emphasis is to find a relationship between hippocampal long-term potentiation (“LTP”) of synapses and memory. I don’t know the solution! The problem belongs to biology, but the solution to physics.

Steps of Long-Term Potentiation
1. Rapid stimulation of neurons depolarizes them.
2. Their NMDA receptors open; Ca2+ ions flow into the cell and bind to calmodulin.
3. This activates calcium/calmodulin-dependent kinase II (CaMKII).
4. CaMKII phosphorylates AMPA receptors, making them more permeable to the inflow of Na+ ions (i.e., increasing the neuron’s sensitivity to future stimulation).
5. The number of AMPA receptors at the synapse also increases.
6. Gene expression increases (i.e., protein synthesis, perhaps of AMPA receptors) and additional synapses form.

Biology is concerned with “Long-Term Potentiation”: if A and B together succeed in firing the neuron (often enough), synapse B will be reinforced.

What Is Wrong With “LTP”? First of all: there is no evidence* linking memory to LTP. Furthermore: it is a purely local process (lacking any global coupling); it implies a positive feedback (“addictive”); it needs multiple trials (“rehearsal”). Finally: the network components are not constant; neurons are replaced (even in adults). *(non-circumstantial)

How difficult would it be for a neuronal network to learn? The idea was not to invent another “learning algorithm” but to play with the simplest one that is still biologically realistic. Chialvo and Bak, Neuroscience (1999); Bak and Chialvo, Phys. Rev. E (2001); Wakeling, Physica A (2003); Wakeling and Bak, Phys. Rev. E (2001).

Self-organized Learning: Toy Model
1) Input neuron “i*” fires.
2) The neuron “j*” with the largest w(j*, i*) fires, and so on: the neuron with the largest w(k*, j*) fires…
3) If the firing leads to success: do nothing. Otherwise: decrease the active w by δ.
That is all. Bak and Chialvo, Phys. Rev. E (2001); Chialvo and Bak, Neuroscience (1999); Wakeling, Physica A (2003).
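The three rules above can be sketched in a few lines of Python. This is a minimal illustration of the winner-take-all, depress-on-failure idea, not the authors’ original code; the layer sizes, the identity task, the depression step, and the random depression amount are all assumptions chosen for the demo:

```python
import random

rng = random.Random(0)

# Assumed toy sizes: 4 input, 20 middle, 4 output neurons.
N_IN, N_MID, N_OUT = 4, 20, 4
w1 = [[rng.random() for _ in range(N_MID)] for _ in range(N_IN)]   # input -> middle
w2 = [[rng.random() for _ in range(N_OUT)] for _ in range(N_MID)]  # middle -> output

DELTA = 0.1                          # depression step (assumed value)
task = {i: i for i in range(N_IN)}   # hypothetical task: route input i to output i

def respond(i):
    """Extremal ('winner-take-all') dynamics: at each layer only the
    neuron with the strongest synapse from the active neuron fires."""
    j = max(range(N_MID), key=lambda k: w1[i][k])
    o = max(range(N_OUT), key=lambda k: w2[j][k])
    return j, o

MAX_STEPS = 200_000
steps = 0
while steps < MAX_STEPS and not all(respond(i)[1] == task[i] for i in range(N_IN)):
    i = rng.randrange(N_IN)
    j, o = respond(i)
    if o != task[i]:
        # Failure: depress only the synapses that were just active.
        # Success requires no action at all -- learning from mistakes.
        w1[i][j] -= DELTA * rng.random()
        w2[j][o] -= DELTA * rng.random()
    steps += 1
```

Because synapses are only ever depressed, never rewarded, a path that leads to success is simply left alone; that is the sense in which the rule is self-organized and needs no central supervisor.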

How It Works on a Simple Task Connect one (or more) input neurons with a given output neuron. Chialvo and Bak, Neuroscience (1999)

A simple gizmo. [Figure: panels a) left/right task, b) 10% “blind”, c) 10% “stroke”, d) 40% “stroke”.] Chialvo and Bak, Neuroscience (1999)

How It Scales With Brain Size: more neurons -> faster learning. It makes sense! This is the only model where larger is better. Chialvo and Bak, Neuroscience (1999)
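A rough way to probe this size effect in simulation is to wrap the same winner-take-all depression rule in a function and count how many trials it takes to learn a fixed task with different middle-layer sizes. Everything here (sizes, the identity task, δ, the number of seeds averaged) is an assumption for illustration, not a value taken from the paper:

```python
import random

def learning_time(n_mid, seed, n_io=4, delta=0.1, max_steps=200_000):
    """Trials until the winner-take-all depression rule routes input i -> output i."""
    rng = random.Random(seed)
    w1 = [[rng.random() for _ in range(n_mid)] for _ in range(n_io)]  # input -> middle
    w2 = [[rng.random() for _ in range(n_io)] for _ in range(n_mid)]  # middle -> output

    def respond(i):
        j = max(range(n_mid), key=lambda k: w1[i][k])
        return j, max(range(n_io), key=lambda k: w2[j][k])

    for step in range(max_steps):
        if all(respond(i)[1] == i for i in range(n_io)):
            return step
        i = rng.randrange(n_io)
        j, o = respond(i)
        if o != i:  # failure: depress the active synapses only
            w1[i][j] -= delta * rng.random()
            w2[j][o] -= delta * rng.random()
    return max_steps

# Average over a few random seeds for a small and a large middle layer.
small = sum(learning_time(8, s) for s in range(20)) / 20
large = sum(learning_time(80, s) for s in range(20)) / 20
```

With more middle neurons the inputs are less likely to collide on the same intermediate winner, so there is less interference between the paths being learned; this sketch is, of course, far too small to establish the scaling claim on its own.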

How It Scales With Problem Size (on the Parity Problem). A) Mean error vs. time for various problem sizes (i.e., N = 2^m bit strings). B) Rescaled mean error (with k = 1.4). Chialvo and Bak, Neuroscience (1999)

Order-Disorder Transition. Learning time is optimized for δ > 1.

Order-Disorder Transition. At δ = 1 the network is critical!

The synaptic landscape remains rough: the least-fit connections are eliminated, activity propagates through the best-fit ones, and at all times the synaptic landscape is rough, allowing fast re-learning. Chialvo and Bak, Neuroscience (1999)

Summing up:
1. We discussed why we don’t share the mainstream idea that learning in the brain is based on LTP. LTP is probably an epiphenomenon.
2. Intuition tells us that learning in brains must be a collective process. Theory is needed here.
3. As an exercise, we showed an alternative toy model of self-organized learning (not based on LTP) which is biologically plausible.