Multi-scale Models of the Cerebellum: Role of the Adaptive Filter Model
Paul Dean, Christian Rössert & John Porrill, University of Sheffield, REALNET

Slide 2 (Sardinia 2013): Adaptive Filter Models of the Cerebellum

First proposed by Fujita in 1982, based on the original Marr-Albus framework. We have argued that many models of the cerebellar role in motor control (especially eye or arm movement) are based on the adaptive filter. Such popularity suggests that the adaptive-filter model has a role to play in multi-scale modelling of the cerebellum. What role? And what is the basis for the popularity?

Dean, P., Porrill, J., Ekerot, C. F., & Jorntell, H. (2010). The cerebellar microcircuit as an adaptive filter: experimental and computational evidence. Nature Reviews Neuroscience, 11(1).

Slide 3: Fixed Filters

Fixed filters are familiar from audio. The example here (B, C) is a low-pass filter that selectively attenuates high frequencies. (It can equivalently be described in terms of its impulse response (A).)
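The low-pass behaviour described on this slide can be sketched in a few lines of Python. This is an illustrative first-order (leaky-integrator) filter, not taken from the slides; the smoothing parameter `alpha` is a hypothetical choice.

```python
def low_pass(signal, alpha):
    """First-order low-pass filter: each output moves a fraction
    `alpha` of the way towards the input, so rapid changes are
    attenuated while slow ones pass through."""
    y, out = 0.0, []
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

# Impulse response (cf. panel A): a geometrically decaying tail,
# alpha * (1 - alpha)**n at step n.
print(low_pass([1.0, 0.0, 0.0, 0.0], alpha=0.5))
# → [0.5, 0.25, 0.125, 0.0625]
```

The impulse-response and frequency-response views are two descriptions of the same fixed filter, as the slide notes.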

Slide 4: Adjustable Filters

Again familiar from audio: knobs (A) alter the gain (volume) and the frequency response (B).

Slide 5: Adjustable Filters

These can be quite fancy, e.g. a graphic equalizer, in which the gain of each frequency channel is individually adjustable for the ultimate listening experience. BUT it still has to be adjusted by hand.

Slide 6: Adaptive Filters

What we want is an adjustable filter in which the adjustments are made automatically. This requires some sort of 'adjuster signal' to tell the filter what to do. Demonstrated by the analysis-synthesis filter.

Slide 7: Analysis-Synthesis Filter

The input is 'analysed' into component signals by a bank of fixed filters (here, leaky integrators). The components are then weighted and recombined ('synthesised') to produce the output. Because the weight values can be adjusted, the shape of the output can be altered: the output is in effect 'sculpted' from the components.
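The two stages described on this slide can be sketched directly. This is a minimal illustration assuming a bank of leaky integrators; the time constants (`alphas`) and weights are hypothetical values chosen for the example.

```python
def analyse(inputs, alphas):
    """Analysis stage: pass the input through a bank of fixed leaky
    integrators, one per alpha, yielding one component signal each."""
    states = [0.0] * len(alphas)
    components = []
    for x in inputs:
        states = [s + a * (x - s) for s, a in zip(states, alphas)]
        components.append(list(states))
    return components

def synthesise(components, weights):
    """Synthesis stage: weighted recombination of the components.
    Changing the weights reshapes ('sculpts') the output."""
    return [sum(w * c for w, c in zip(weights, row)) for row in components]

comps = analyse([1.0, 0.0, 0.0], alphas=[0.5, 0.25])
out = synthesise(comps, weights=[2.0, -1.0])
```

A different choice of weights produces a differently shaped output from the same fixed components, which is exactly the adjustability the slide describes.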

Slide 8: How Are Weights Adjusted?

This is where the adjuster signal comes in: each weight receives the same adjuster signal (more usually referred to as the 'error' or 'teaching' signal).

Slide 9: Learning Rule

The adaptive filter changes its weights according to the correlation between each input component and the teaching signal:
- if the correlation is positive, reduce the weight
- if the correlation is negative, increase the weight
Learning stops when there is no correlation between the component and teaching signals. In effect, the analysis-synthesis filter implements a decorrelation algorithm.
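The rule on this slide can be written as a one-line update using an instantaneous estimate of the correlation. The learning rate `beta` is a hypothetical parameter introduced for the sketch.

```python
def update_weights(weights, components, teaching, beta):
    """Decorrelation learning rule: move each weight *against* the
    (instantaneous) product of its component and the teaching signal.
    Weights stop changing when component and teaching signal are
    uncorrelated."""
    return [w - beta * c * teaching for w, c in zip(weights, components)]

# A positively correlated component (c > 0 when teaching > 0) has its
# weight reduced; a negatively correlated one has its weight increased.
w = update_weights([0.5, 0.5], components=[1.0, -1.0], teaching=1.0, beta=0.1)
```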

Slide 10: Adaptive Filters Work

Adaptive filters are very widely used in signal processing, e.g. in communications, radar, sonar, navigation, seismology, biomedical engineering, and financial engineering. As already mentioned, they are widely used to model the cerebellum in the control of eye, eyelid, and arm movements. More recently, they have been shown to be a good candidate for the adaptive element in 'internal models'.

Slide 11: The Forward Model

An example of an internal model: the 'forward model'. Suggested in relation to the cerebellum by e.g. Miall and Wolpert. Used to predict the sensory effects of movement, which is useful for e.g. distinguishing external sensory signals from those produced by one's own movement.

Miall, R. C., & Wolpert, D. M. (1996). Forward models for physiological motor control. Neural Networks, 9(8).

Slide 12: Role of the Adaptive Filter

It can be shown that the adaptive filter is capable of learning forward models. This expands the cerebellar role from motor control to sensory prediction (and possibly 'cognitive' prediction?).

Porrill, J., Dean, P., & Anderson, S. R. (2013). Adaptive filters and internal models: multilevel description of cerebellar function. Neural Networks, in press.
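As a toy illustration of this slide's claim, a single-weight adaptive filter can learn to predict the sensory consequence of a motor command. Everything here is hypothetical: the 'plant' (a leaky integrator with gain 0.5), the learning rate, and the number of trials. Note the sign convention: with the teaching signal defined as actual minus predicted sensory feedback, the update carries a plus sign, which is equivalent to the decorrelation rule of slide 9.

```python
import random
random.seed(0)

def leaky(state, x, alpha=0.2):
    """One step of a leaky integrator."""
    return state + alpha * (x - state)

# Hypothetical plant: sensory feedback is 0.5 times a leaky integral
# of the motor command. The filter has one matching leaky-integrator
# component and must learn the plant gain.
w, beta = 0.0, 0.1
plant_state = filter_state = 0.0
for _ in range(2000):
    u = random.uniform(-1.0, 1.0)          # motor command (mossy-fibre-like input)
    plant_state = leaky(plant_state, u)
    sensory = 0.5 * plant_state            # actual sensory consequence
    filter_state = leaky(filter_state, u)  # filter's analysed component
    predicted = w * filter_state           # forward-model prediction
    teaching = sensory - predicted         # climbing-fibre-like error signal
    w += beta * teaching * filter_state    # drives the correlation to zero
# w converges towards the true plant gain of 0.5
```

Learning stops once the teaching signal is decorrelated from the component, i.e. once the filter's prediction matches the plant's sensory output.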

Slide 13: How Realistic Is the Adaptive Filter Model?

1. Analysis: the granular layer produces components of the input (mossy fibre) signals
2. The components are weighted by parallel fibre-Purkinje cell synapses
3. The weights are adjusted by the climbing fibre signal
4. The Purkinje cell combines the weighted components to produce the output

Slide 14: Explains Two Striking Features

1. A very large number of fixed filters is required at the analysis stage to ensure adequate coverage of all contingencies; in biology, the precise nature of the required response cannot be known in advance. Granule cells constitute ~80% of the neurons in the human brain (Herculano-Houzel 2010).
2. The adjuster signal must not interfere with system output, but must be able to affect all weights. This seems to fit climbing fibre properties.

Herculano-Houzel, S. (2010). Frontiers in Neuroanatomy, 4, Article 12.

Slide 15: Broad-Brush Realism?

"The structure of the granular layer network and its mossy fibre inputs is well suited for spreading diverse sets of information (referred to here as 'diversity spreading')." (p. 625)

"The huge diversity of parallel fibre codings, which are widely distributed over the molecular layer, has the advantage that guiding signals (provided by climbing fibres) can select and sculpt those codings that are needed to improve behaviour as required in a particular spatiotemporal context." (p. 630)

Consistent with the analysis-synthesis adaptive filter.

Gao, Z. Y., van Beugen, B. J., & De Zeeuw, C. I. (2012). Distributed synergistic plasticity and cerebellar learning. Nature Reviews Neuroscience, 13(9).

Slide 16: What's the Problem?

Although there may be a broad sense in which adaptive filter models are 'realistic', the details are clearly not realistic:
- synapses can be either positive or negative
- firing rates, not spikes
- neither single neurons nor populations are represented explicitly
However, this can't simply be solved by switching to current very detailed compartmental models: it can be difficult to determine just what these can do functionally. The problem:
- abstract models can be used for control but lack detail
- detailed models have not been shown to be capable of control

Slide 17: Bridging the Gap

A natural question: is there a suitable intermediate level of modelling? Sydney Brenner: "There is a theory called the 'cell theory' that is about 150 years old. So I think studying the cell gives the proper perspective. You can then look downwards onto the molecule and upwards to the organism. So it is neither top down nor bottom up, rather it is middle out, and I think that is going to be the correct approach."

Slide 18: Bridging the Gap

Criterion #1: neurons and synapses are represented explicitly, but in as simplified a form as possible
Criterion #2: the network's signal-processing characteristics can be specified (and shown to be capable, or not, of adaptive filter functionality)

Slide 19: Possible Benefits

It may be possible to assess the computational impact of specific additional detailed features. Are there features related to e.g. homeostasis rather than computational power? Are there features that enable the circuit to do computations that adaptive filters cannot do?

Slide 20: Examples

- An update of a model started in 1991
- Integrate-and-fire neurons
- Used for classical conditioning of the eyeblink
- As yet no analysis of its computational properties?

Slide 21: Examples

- Conductance-based integrate-and-fire neurons
- Used for eyeblink conditioning and the OKR
- Talked about this morning

Slide 22: Examples

- Integrate-and-fire neurons using the EDLUT simulator
- Used to investigate possible roles of granular-layer plasticity
- Will be talked about shortly

Slide 23: Question

Candidates exist for bridging the gap between abstract (adaptive-filter) and detailed (compartmental) models. How can they be used to increase understanding of the connection between features at the cellular level and algorithmic competence?