Cognitive Computing…. Computational Neuroscience

Presentation transcript:

Cognitive Computing… Computational Neuroscience
Jerome Swartz, The Swartz Foundation
May 10, 2006

Large Scale Brain Modeling
Science IS modeling. Models have power:
- to explain
- to predict
- to simulate
- to augment
Why model the brain?

Brains are not computers…
But they are supported by the same physics:
- energy conservation
- entropy increase
- least action
- time direction
Brains are supported by the same logic, but implemented differently:
- low speed; parallel processing
- no symbolic software layer
- fundamentally adaptive / interactive
- organic vs. inorganic

Brain research must be multi-level
Scientific collaboration is needed:
- across spatial scales
- across time scales
- across measurement techniques
Current field borders should not remain boundaries… Curtail scale chauvinism!

…both scientifically and mathematically
To understand, both theoretically and practically, how brains support behavior and experience.
To model brain / behavior dynamics as active requires:
- better behavioral measures and modeling
- better brain dynamic imaging / analysis
- better joint brain / behavior analysis

… the next research frontier
Brains are active and multi-scale / multi-level.
The dominant multi-level model: computers, with their physical / logical hierarchy:
- the OSI stack
- physical / implementation levels
- logical / instruction levels

A Multi-Level View of Learning

LEVEL    | UNIT       | INTERACTIONS                              | LEARNING
society  | organism   | behaviour, ecology (predation, symbiosis) | natural selection, sensory-motor learning
organism | cell       | spikes                                    | synaptic plasticity
cell     | synapse    | voltage, Ca                               | bulk molecular changes
synapse  | protein    | direct, V, Ca                             | molecular changes (= STDP)
protein  | amino acid | molecular forces                          | gene expression, protein recycling
(Increasing Timescale)

LEARNING at a LEVEL is CHANGE IN INTERACTIONS between its UNITS, implemented by INTERACTIONS at the LEVEL beneath, and by extension resulting in CHANGE IN LEARNING at the LEVEL above.
Interactions = fast; learning = slow. Separation of timescales allows INTERACTIONS at one LEVEL to be LEARNING at the LEVEL above.

A Multi-Level View of Learning (T. Bell)
(same table as on the previous slide, with the INTERACTIONS column labeled DYNAMICS)

LEARNING at one LEVEL is implemented by DYNAMICS between UNITS at the LEVEL below.
Dynamics = fast; learning = slow. Separation of timescales allows DYNAMICS at one LEVEL to be LEARNING at the LEVEL above.
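
To make the "dynamics = fast, learning = slow" separation concrete, here is a small toy sketch (not from the talk): unit activities relax on a fast timescale while a Hebbian-style change in the connections between the units accumulates roughly a thousand times more slowly. The network size, time constants, and learning rule are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    W = 0.1 * rng.standard_normal((n, n))   # slow variables: the "learning" level
    x = rng.standard_normal(n)              # fast variables: the "dynamics" level
    inp = rng.standard_normal(n)            # fixed external drive

    tau_fast, tau_slow = 1.0, 1000.0        # separation of timescales
    dt = 0.1

    for step in range(20000):
        # fast dynamics: activities relax toward recurrent + external drive
        x += (dt / tau_fast) * (-x + np.tanh(W @ x + inp))
        # slow learning: Hebbian change in the interactions between units
        W += (dt / tau_slow) * (np.outer(x, x) - W)

Because W barely moves in the time it takes x to settle, the weights effectively see only the settled activity statistics, which is the sense in which slow learning at one level "reads" the fast dynamics of the level below.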

What idea will fill in the question mark? (T. Bell)
physiology (of STDP) + physics of self-organisation + probabilistic machine learning → ?
(STDP = spike timing-dependent plasticity)

? = the Levels Hypothesis. Learning in the brain is:
- unsupervised probability density estimation across scales: the smaller (molecular) models the larger (spikes)…
- suggested by STDP physiology, where information flow from neurons to synapses is inter-level…
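
As a purely illustrative sketch of what "unsupervised probability density estimation" of spike activity could look like, the snippet below fits a kernel density estimate to simulated inter-spike intervals; the gamma-distributed intervals and the KDE choice are assumptions of the toy, not anything stated on the slide.

    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    # simulated inter-spike intervals (seconds), standing in for "spike data"
    isi = rng.gamma(shape=2.0, scale=0.05, size=2000)

    kde = gaussian_kde(isi)                 # unsupervised density estimate of p(ISI)
    grid = np.linspace(0.0, isi.max(), 200)
    density = kde(grid)
    print(f"estimated ISI density peaks near {grid[np.argmax(density)]:.3f} s")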

Multi-level modeling: networks within networks
- network of 2 brains
- 1 brain
- network of neurons
- 1 cell
- network of protein complexes (e.g., synapses)
- network of macromolecules

Infomax between Levels (1) vs. ICA/Infomax between Layers (2) (T. Bell)
(1) e.g., synapses density-estimate spikes; (2) e.g., V1 density-estimates the retina

[Figure: time axis t; all neural spikes x, flowing from retina to V1 via synaptic weights; synapses and dendrites; all synaptic readouts y; pdf of all spike times; pdf of all synaptic 'readouts']

Between-level (1):
- includes all feedback
- molecular net models / creates; social net is boundary condition
- permits arbitrary activity dependencies
- models input and intrinsic activity together

Within-level (2):
- feedforward
- molecular sublevel is 'implementation'; social superlevel is 'reward'
- predicts independent activity only
- models outside input

The ICA transform minimises statistical dependence between outputs. The bases produced are data-dependent, not fixed as in Fourier or wavelet transforms. If we can make this pdf uniform, then we have a model constructed from all synaptic and dendritic causality.
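
Since the slide appeals directly to the Infomax/ICA principle, here is a minimal NumPy sketch of the natural-gradient Infomax ICA update (Bell & Sejnowski 1995, with Amari's natural gradient), assuming zero-mean, whitened mixtures of super-Gaussian sources; the learning rate, iteration count, and toy data are assumptions, not parameters from the talk.

    import numpy as np

    def infomax_ica(X, lr=0.01, n_iter=200, seed=0):
        """Natural-gradient Infomax ICA; X is (n_signals, n_samples), whitened."""
        rng = np.random.default_rng(seed)
        n, T = X.shape
        W = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # unmixing matrix
        for _ in range(n_iter):
            U = W @ X                        # current source estimates
            Y = 1.0 / (1.0 + np.exp(-U))     # logistic nonlinearity
            # natural-gradient update: dW = (I + (1 - 2Y) U^T / T) W
            W += lr * (np.eye(n) + (1.0 - 2.0 * Y) @ U.T / T) @ W
        return W

    # toy demo: unmix two super-Gaussian (Laplacian) sources
    rng = np.random.default_rng(1)
    S = rng.laplace(size=(2, 5000))
    A = np.array([[1.0, 0.6], [0.4, 1.0]])    # unknown mixing matrix
    X = A @ S
    X -= X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))           # whiten the mixtures
    Xw = (E / np.sqrt(d)) @ E.T @ X
    S_hat = infomax_ica(Xw) @ Xw               # recovered up to order and scale

Maximising the joint entropy of Y under the logistic squashing is what "minimises statistical dependence between outputs" amounts to here, and the rows of the learned unmixing matrix are the data-dependent bases the slide contrasts with fixed Fourier or wavelet bases.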

The Infomax principle / ICA algorithms (T. Bell)
Many applications (6 international ICA workshops)…
- audio separation in real acoustic environments (as above)
- biomedical data-mining: EEG, fMRI
- image coding
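
As a quick illustration of the audio-separation style of application, the sketch below uses scikit-learn's FastICA (a different ICA algorithm from Infomax, used here only for convenience) to unmix two synthetic signals; the signals, mixing matrix, and component count are assumptions for the toy.

    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 8, 4000)
    s1 = np.sin(2 * t)                        # source 1: sinusoid
    s2 = np.sign(np.sin(3 * t))               # source 2: square wave
    S = np.c_[s1, s2]
    S += 0.02 * np.random.default_rng(0).standard_normal(S.shape)

    A = np.array([[1.0, 0.5], [0.5, 1.0]])    # "room" mixing matrix
    X = S @ A.T                               # two observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    S_hat = ica.fit_transform(X)              # estimated sources (order and scale arbitrary)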