A computational paradigm for dynamic logic-gates in neuronal activity Sander Vaus 15.10.2014.

A computational paradigm for dynamic logic-gates in neuronal activity Sander Vaus

Background “A logical calculus of the ideas immanent in nervous activity” (McCulloch and Pitts, 1943)

Background “A logical calculus of the ideas immanent in nervous activity” (McCulloch and Pitts, 1943) von Neumann’s generalized Boolean framework (1956)

Background “A logical calculus of the ideas immanent in nervous activity” (McCulloch and Pitts, 1943) von Neumann’s generalized Boolean framework (1956) Shannon’s simplification of Boolean circuits (Shannon, 1938)

Problems Static logic-gates (SLGs)

Problems Static logic-gates (SLGs) – Influential in developing artificial neural networks and machine learning

Problems Static logic-gates (SLGs) – Influential in developing artificial neural networks and machine learning – Limited influence on neuroscience

Problems Static logic-gates (SLGs) – Influential in developing artificial neural networks and machine learning – Limited influence on neuroscience Alternative: – Dynamic logic-gates (DLGs)

Problems Static logic-gates (SLGs) – Influential in developing artificial neural networks and machine learning – Limited influence on neuroscience Alternative: – Dynamic logic-gates (DLGs) Their functionality depends on the history of their activity, the stimulation frequencies, and the activity of their interconnections

Problems Static logic-gates (SLGs) – Influential in developing artificial neural networks and machine learning – Limited influence on neuroscience Alternative: – Dynamic logic-gates (DLGs) Their functionality depends on the history of their activity, the stimulation frequencies, and the activity of their interconnections These will require new systematic methods and practical tools beyond those of traditional Boolean algebra
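
To make the distinction concrete, here is a minimal Python sketch (a toy illustration, not the paper's model): a static AND-gate has a fixed truth table, whereas the hypothetical dynamic gate below changes its input-output rule as a function of how much it has been driven; the switching rule and the threshold are assumptions chosen only for illustration.

    # Toy illustration only (not the paper's model): a static gate has a fixed
    # truth table, whereas a "dynamic" gate's rule drifts with its own history.
    def static_and(a, b):
        return a and b                       # no internal state

    class ToyDynamicGate:
        """Hypothetical gate whose input-output rule depends on how much it has been driven."""
        def __init__(self, switch_after=4):
            self.count = 0                   # accumulated input spikes (its "history")
            self.switch_after = switch_after

        def __call__(self, a, b):
            self.count += a + b
            if self.count < self.switch_after:
                return a and b               # behaves as AND while activity is low
            return a or b                    # drifts towards OR after sustained activity

    gate = ToyDynamicGate()
    for _ in range(4):
        print(static_and(1, 0), gate(1, 0))  # static output stays 0; the toy gate eventually flips to 1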

Elastic response latency Neuronal response latency – The time-lag between a stimulation and its corresponding evoked spike

Elastic response latency Neuronal response latency – The time-lag between a stimulation and its corresponding evoked spike – Typically in the order of several milliseconds

Elastic response latency Neuronal response latency – The time-lag between a stimulation and its corresponding evoked spike – Typically in the order of several milliseconds – Repeated stimulations cause the delay to stretch

Elastic response latency Neuronal response latency – The time-lag between a stimulation and its corresponding evoked spike – Typically in the order of several milliseconds – Repeated stimulations cause the delay to stretch – Three distinct states/trends

Elastic response latency Neuronal response latency – The time-lag between a stimulation and its corresponding evoked spike – Typically in the order of several milliseconds – Repeated stimulations cause the delay to stretch – Three distinct states/trends – The higher the stimulation rate, the greater the increase in latency

Elastic response latency Neuronal response latency – The time-lag between a stimulation and its corresponding evoked spike – Typically in the order of several milliseconds – Repeated stimulations cause the delay to stretch – Three distinct states/trends – The higher the stimulation rate, the greater the increase in latency – In neuronal chains, the increase in latency is cumulative

(Vardi et al., 2013b)

Experimentally examined DLGs Dynamic AND-gate

(Vardi et al., 2013b)

Experimentally examined DLGs Dynamic AND-gate Dynamic OR-gate

(Vardi et al., 2013b)

Experimentally examined DLGs Dynamic AND-gate Dynamic OR-gate Dynamic NOT-gate

(Vardi et al., 2013b)

Experimentally examined DLGs Dynamic AND-gate Dynamic OR-gate Dynamic NOT-gate Dynamic XOR-gate

(Vardi et al., 2013b)

Theoretical analysis A simplified theoretical framework

Theoretical analysis A simplified theoretical framework l(q) = l₀ + qΔ (1) l₀ – the neuron’s initial response latency q – number of evoked spikes Δ – a constant (typically in the range of 2–7 μs)

Theoretical analysis A simplified theoretical framework l(q) = l₀ + qΔ (1) τ(q) = τ₀ + nqΔ (2) τ₀ – initial time delay of the chain n – number of neurons in the chain

Theoretical analysis A simplified theoretical framework l(q) = l₀ + qΔ (1) τ(q) = τ₀ + nqΔ (2) Simplifying assumption: the number of evoked spikes of a neuron equals the number of its stimulations
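
A minimal Python sketch of equations (1) and (2) is shown below; the numeric values (l₀ ≈ 4 ms, Δ = 5 μs, τ₀ = 40 ms, n = 10) are assumptions chosen only for illustration and are not taken from the paper.

    # Minimal sketch of equations (1) and (2); all numeric values are illustrative assumptions.
    def neuron_latency(q, l0=4e-3, delta=5e-6):
        """Eq. (1): response latency of a single neuron after q evoked spikes."""
        return l0 + q * delta

    def chain_delay(q, n, tau0, delta=5e-6):
        """Eq. (2): total delay of an n-neuron chain after q spikes, assuming
        each neuron evokes exactly one spike per stimulation."""
        return tau0 + n * q * delta

    print(neuron_latency(1000))                    # 4 ms + 1000 * 5 us = 9 ms
    print(chain_delay(1000, n=10, tau0=40e-3))     # 40 ms + 10 * 1000 * 5 us = 90 ms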

Theoretical analysis Dynamic AND-gate

(Vardi et al., 2013b)

Theoretical analysis Dynamic AND-gate – Generalized AND-gate

(Vardi et al., 2013b)

Theoretical analysis Dynamic AND-gate – Generalized AND-gate – number of intersections of k non-parallel lines: 0.5k(k – 1)
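
A quick check of the counting argument (illustrative only): every pair of mutually non-parallel latency lines crosses exactly once, so k lines give k(k - 1)/2 crossings.

    # k lines, no two parallel: each pair contributes exactly one intersection.
    def intersections(k):
        return k * (k - 1) // 2                       # = 0.5 * k * (k - 1)

    print([intersections(k) for k in (2, 3, 4, 5)])   # [1, 3, 6, 10]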

(Vardi et al., 2013b)

Theoretical analysis Dynamic AND-gate – Generalized AND-gate – number of intersections of k non-parallel lines: 0.5k(k – 1) Dynamic XOR-gate

(Vardi et al., 2013b)

Theoretical analysis Dynamic AND-gate – Generalized AND-gate – number of intersections of k non-parallel lines: 0.5k(k – 1) Dynamic XOR-gate Transitions among multiple modes

(Vardi et al., 2013b)

Theoretical analysis Dynamic AND-gate – Generalized AND-gate – number of intersections of k non-parallel lines: 0.5k(k – 1) Dynamic XOR-gate Transitions among multiple modes Varying inputs
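
The intersection mechanism can be sketched in a few lines of Python. This is a toy rendering, not the paper's model: the chain parameters, the 2 ms summation window, and the rule that each input alone is sub-threshold are all assumptions made for illustration. Two converging chains stretch their delays at different rates, so the gate realizes AND only while the two arrival times stay within the summation window.

    # Toy rendering of the intersection mechanism (all parameters assumed).
    def arrival(q, tau0, n, delta=5e-6):
        """Eq. (2): arrival time of a chain's spike at the gate neuron after q stimulations."""
        return tau0 + n * q * delta

    def dynamic_and(a, b, q, window=2e-3):
        """The gate fires only when both inputs are active and their spikes
        arrive within the summation window (single inputs assumed sub-threshold)."""
        if not (a and b):
            return 0
        t_a = arrival(q, tau0=30e-3, n=10)   # input chain A
        t_b = arrival(q, tau0=31e-3, n=11)   # input chain B stretches slightly faster
        return int(abs(t_a - t_b) < window)

    for q in (0, 100, 400):
        print(q, dynamic_and(1, 1, q))       # 1, 1, 0: the (1,1) response vanishes as the latencies drift apart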

(Vardi et al., 2013b)

Multiple-component networks and signal processing Basic edge detector: (Vardi et al., 2013b)
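
For intuition about what such a network computes, a generic sketch (not the specific circuit from the paper): an edge detector over a binary signal can be built by XOR-ing neighbouring samples, so it outputs 1 exactly where the signal switches between 0 and 1; in the DLG setting each XOR would itself be a dynamic gate.

    # Generic edge detector: XOR of each sample with its predecessor marks every transition.
    def edge_detect(signal):
        return [a ^ b for a, b in zip(signal, signal[1:])]

    print(edge_detect([0, 0, 1, 1, 1, 0, 0]))   # -> [0, 1, 0, 0, 1, 0]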

Suitability of DLGs to brain functionality Short synaptic delays

Suitability of DLGs to brain functionality Short synaptic delays – The examined cases set the synaptic delays to a few tens of milliseconds, as opposed to those of several milliseconds in the brain

Suitability of DLGs to brain functionality Short synaptic delays – The examined cases set the synaptic delays to a few tens of milliseconds, as opposed to those of several milliseconds in the brain Can be remedied with the help of long synfire chains
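
A rough back-of-the-envelope illustration of the remedy (both numbers are assumptions, not measurements from the paper): with per-neuron delays of a few milliseconds, a sufficiently long synfire chain accumulates the tens-of-milliseconds delays used in the examined cases.

    per_neuron_delay = 2.5e-3                 # ~2.5 ms synaptic/neuronal delay (assumed)
    chain_length = 20                         # neurons in the synfire chain (assumed)
    print(chain_length * per_neuron_delay)    # 0.05 s, i.e. a ~50 ms effective delay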

(Vardi et al., 2013b)

Suitability of DLGs to brain functionality Short synaptic delays – The examined cases set the synaptic delays to a few tens of milliseconds, as opposed to those of several milliseconds in the brain Can be remedied with the help of long synfire chains Population dynamics – DLGs assume

(Vardi et al., 2013b)

References
1. Goldental, A., Guberman, S., Vardi, R., and Kanter, I. (2014). “A computational paradigm for dynamic logic-gates in neuronal activity,” Frontiers in Computational Neuroscience 8:52.
2. Vardi, R., Guberman, S., Goldental, A., and Kanter, I. (2013b). “An experimental evidence-based computational paradigm for new logic-gates in neuronal activity,” EPL 103.
3. McCulloch, W. S., and Pitts, W. (1943). “A logical calculus of the ideas immanent in nervous activity,” Bull. Math. Biophys. 5:115–133.
4. Shannon, C. (1938). “A symbolic analysis of relay and switching circuits,” Trans. AIEE 57.