1 Complexity, Day 7: Ecology, Biology, Psychology, Meteorology, Macroeconomics, Geophysics. UBA, June 19, 2012.

2 Today: "Learning as a collective". Chialvo and Bak, Neuroscience (1999). Bak and Chialvo, Phys. Rev. E (2001). Wakeling, Physica A (2003). Wakeling and Bak, Phys. Rev. E (2001).

3 Learning is never smooth

What Is the Problem? To understand how billions of neurons learn, remember and forget in a self-organized way. The current emphasis is on finding a relationship between long-term potentiation of synapses (so-called "LTP") and memory.

Biology is concerned with "Long-Term Potentiation": if inputs A and B succeed together in firing the neuron (often enough), synapse B will be reinforced.

Steps of Long-Term Potentiation
1. Rapid stimulation of neurons depolarizes them.
2. Their NMDA receptors open; Ca2+ ions flow into the cell and bind to calmodulin.
3. This activates calcium-calmodulin-dependent kinase II (CaMKII).
4. CaMKII phosphorylates AMPA receptors, making them more permeable to the inflow of Na+ ions (i.e., increasing the neuron's sensitivity to future stimulation).
5. The number of AMPA receptors at the synapse also increases.
6. Gene expression increases (i.e., protein synthesis, perhaps of AMPA receptors) and additional synapses form.

What Is Wrong With the Emphasis on "LTP"? Nothing, but there is no direct evidence linking memory and LTP, and LTP by itself is not the explanation of how memory works.

How difficult would it be for a neuronal network to learn? The idea was not to invent another "learning algorithm" but to play with the simplest one that is still biologically realistic. Chialvo and Bak, Neuroscience (1999). Bak and Chialvo, Phys. Rev. E (2001). Wakeling, Physica A (2003). Wakeling and Bak, Phys. Rev. E (2001).

Self-organized Learning: Toy Model
1) Input neuron i* fires.
2) The neuron j* with the largest weight W(j*, i*) fires, and so on: the neuron k* with the largest W(k*, j*) fires…
3) If the firing leads to success: do nothing. Otherwise: decrease the weights W along the active path by a small amount δ.
That is all.
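To make the rule concrete, here is a minimal Python sketch of the toy model under stated assumptions: one intermediate layer, winner-take-all ("extremal") propagation, and depression of the active synapses only after a failure. The layer sizes, the value of δ, the noise term and the example task are illustrative choices, not values taken from the papers.

import numpy as np

rng = np.random.default_rng(0)

# Minimal sketch of the toy model. Layer sizes, delta, the noise amplitude and
# the example task are illustrative choices, not values from the papers.
n_in, n_mid, n_out = 4, 40, 4
W1 = rng.random((n_mid, n_in))      # input -> middle weights
W2 = rng.random((n_out, n_mid))     # middle -> output weights
delta = 0.1                         # depression applied to a synapse after a failure
target = np.array([0, 1, 2, 3])     # example task: input i should activate output i

def present(i):
    """Fire input i, propagate by winner-take-all (extremal) dynamics,
    and depress the active path only if the output is wrong."""
    j = int(np.argmax(W1[:, i]))    # middle neuron with the largest incoming weight fires
    k = int(np.argmax(W2[:, j]))    # output neuron with the largest incoming weight fires
    if k != target[i]:              # failure: punish the two synapses just used
        W1[j, i] -= delta + 0.01 * rng.random()
        W2[k, j] -= delta + 0.01 * rng.random()
    return k == target[i]           # success: do nothing at all

sweeps = 0
while True:
    results = [present(i) for i in range(n_in)]
    if all(results):
        break
    sweeps += 1
print("task learned after", sweeps, "sweeps")

Success leaves the weights untouched, so whatever the network has already learned stays in place; only mistakes reshape the synaptic landscape.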

How It Works on a Simple Task Connect one (or more) input neurons with a given output neuron. Chialvo and Bak, Neuroscience (1999)

A simple gizmo. Panels: (a) left/right task; (b) 10% "blind"; (c) 10% "stroke"; (d) 40% "stroke". Chialvo and Bak, Neuroscience (1999)

How performance scales with “brain” size More neurons -> faster learning. It makes sense! The only model where larger is better Chialvo and Bak, Neuroscience (1999)

How It Scales With Problem Size (on the Parity Problem). (A) Mean error vs. time for various problem sizes (i.e., N = 2^m bit strings). (B) Rescaled mean error (with exponent k = 1.4). Chialvo and Bak, Neuroscience (1999)
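As a hedged illustration of the benchmark and of the rescaling quoted on the slide, the helpers below generate the N = 2^m parity targets and divide the time axis by N^k (with k = 1.4) so that learning curves for different problem sizes can be collapsed onto one another; the function names are illustrative, not from the original papers.

import numpy as np
from itertools import product

def parity_task(m):
    """All N = 2**m bit strings of length m together with their parity (the target)."""
    inputs = np.array(list(product([0, 1], repeat=m)))
    targets = inputs.sum(axis=1) % 2
    return inputs, targets

def collapse(time, mean_error, N, k=1.4):
    """Rescale the time axis by N**k so that curves for different N overlap."""
    return np.asarray(time) / N ** k, np.asarray(mean_error)

inputs, targets = parity_task(3)    # N = 8 bit strings of length m = 3
print(inputs.shape, targets)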

Order-Disorder Transition: learning time is optimized when the control parameter is greater than 1.

Order-Disorder Transition: at a control-parameter value of exactly 1, the network is critical.

The synaptic landscape remains rough. The least-fit connections are eliminated, and activity propagates through the best-fit ones. Because the landscape stays rough at all times, re-learning is fast. Chialvo and Bak, Neuroscience (1999)

18 If you make a mistake, next time do something different. H. Ohta, Y.P. Gunji / Neural Networks 19 (2006) 1106–1119

19 By “inhibiting” the past states H. Ohta, Y.P. Gunji / Neural Networks 19 (2006) 1106–1119

20 So you can learn new things without deleting the old ones. H. Ohta, Y.P. Gunji / Neural Networks 19 (2006) 1106–1119
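A minimal sketch of this principle (an illustration only, not the Ohta-Gunji algorithm itself): instead of permanently depressing a synapse after a mistake, attach a temporary inhibition tag to it, so the next attempt explores a different path while the long-term weights, i.e. the old memories, are never modified. All sizes and the example mapping below are assumptions.

import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 4, 4
W = rng.random((n_out, n_in))       # long-term weights: the "old memories", never modified
inhibition = np.zeros_like(W)       # short-lived penalty attached to recent mistakes
target = np.array([1, 0, 3, 2])     # an arbitrary new mapping to be learned

def attempt(i):
    k = int(np.argmax(W[:, i] - inhibition[:, i]))   # avoid choices that recently failed
    if k != target[i]:
        inhibition[k, i] += 1.0                      # next time, do something different
        return False
    return True

for sweep in range(20):
    if all([attempt(i) for i in range(n_in)]):
        break
print("new task solved after", sweep + 1, "sweeps; W was never changed")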

Solid-State Atomic Switches, or "memristors": nano-resistors with memory (also called "dry electrochemistry" or "solid electrolytes").

Tsuyoshi Hasegawa et al., "Learning Abilities Achieved by a Single Solid-State Atomic Switch", Advanced Materials 22 (2010).

Experimental result: a gradual increase in the current. Tsuyoshi Hasegawa et al., "Learning Abilities Achieved by a Single Solid-State Atomic Switch", Advanced Materials 22 (2010).

Memory is in the spatial configuration of the Ag cations: a collective memory… (Device schematic: Ag electrode, Ag2S, nanogap, metal electrode; an Ag atomic bridge forms across the gap.) Tsuyoshi Hasegawa et al., "Learning Abilities Achieved by a Single Solid-State Atomic Switch", Advanced Materials 22 (2010).

a, Schematics of an Ag2S inorganic synapse and of signal transmission at a biological synapse. b,c, Change in the conductance of the inorganic synapse when the input pulses were applied with intervals of T = 20 s (b) and T = 2 s (c). The inorganic synapse shows STP or LTP depending on the input-pulse repetition time. "Short-term plasticity and long-term potentiation mimicked in single inorganic synapses", Takeo Ohno et al., Nature Materials 10, 591–595 (2011).
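A toy calculation of why the pulse interval matters (a sketch under assumed parameters, not the physical model of the Ag2S device, in which accumulated Ag cations eventually form a persistent atomic bridge): here the conductance simply gains a fixed increment per pulse and relaxes back toward rest between pulses, so closely spaced pulses (T = 2 s) accumulate while widely spaced ones (T = 20 s) do not.

import numpy as np

tau = 10.0    # assumed decay time constant of the conductance, in seconds
dg = 1.0      # assumed conductance gain per input pulse

def conductance_after_pulses(n_pulses, interval):
    """Apply n_pulses spaced by `interval` seconds and return the final conductance."""
    g = 0.0
    for _ in range(n_pulses):
        g += dg                         # potentiation by one input pulse
        g *= np.exp(-interval / tau)    # relaxation until the next pulse arrives
    return g

print("T = 20 s ->", round(conductance_after_pulses(10, 20.0), 2))   # decays between pulses: STP-like
print("T =  2 s ->", round(conductance_after_pulses(10, 2.0), 2))    # accumulates: LTP-like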

"Emergent Criticality in Complex Turing B-Type Atomic Switch Networks", Stieg et al., Advanced Materials 24 (2011). Fabrication scheme for complex electronic networks.

(a) Experimental I–V curve demonstrating hysteresis. (b) Ultrasensitive IR image of a distributed device conductance. (c,e) Representative experimental network current response to a 2 V pulse, showing switching between discrete, metastable conductance states. (d,f) Metastable-state residence times for (d) a single 10 ms pulse and (f) 2.5 s of extended pulsed stimulation. "Emergent Criticality in Complex Turing B-Type Atomic Switch Networks", Stieg et al., Advanced Materials 24 (2011).

Challenge: 1) Efficiently model the physics of the memristor collective; that is, build efficient numerical models of an arbitrary network of memristors (a likely starting point: the random fuse model). 2) Model learning on that network; that is, find self-organized learning algorithms that can be implemented in silico.
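As a possible starting point for challenge (1), a minimal sketch: nodal analysis of an arbitrary network of memristive edges, solving Kirchhoff's current law at every step and then letting each conductance drift with the current it carried. The network, the update rule and all parameters below are illustrative assumptions; the random fuse model mentioned above would instead break an edge once its current exceeds a threshold.

import numpy as np

edges = [(0, 1), (1, 2), (0, 2), (2, 3)]      # a small example network
g = np.full(len(edges), 1.0)                  # edge conductances: the state variables
n_nodes, src, gnd, V = 4, 0, 3, 1.0           # drive node 0 at 1 V, ground node 3

def node_voltages():
    """Solve Kirchhoff's current law (L v = rhs) with the two boundary nodes fixed."""
    L = np.zeros((n_nodes, n_nodes))
    for (a, b), ge in zip(edges, g):
        L[a, a] += ge; L[b, b] += ge
        L[a, b] -= ge; L[b, a] -= ge
    rhs = np.zeros(n_nodes)
    for node, val in ((src, V), (gnd, 0.0)):   # impose the boundary voltages
        L[node, :] = 0.0; L[node, node] = 1.0; rhs[node] = val
    return np.linalg.solve(L, rhs)

for _ in range(100):                           # crude explicit time stepping
    v = node_voltages()
    for e, (a, b) in enumerate(edges):
        i_e = g[e] * (v[a] - v[b])             # current carried by edge e
        g[e] = np.clip(g[e] + 0.05 * abs(i_e), 0.1, 10.0)   # use strengthens the edge

print("final edge conductances:", np.round(g, 2))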