Cognitive Computing… Computational Neuroscience
Jerome Swartz, The Swartz Foundation
May 10, 2006

Large Scale Brain Modeling
Science IS modeling. Models have power:
– To explain
– To predict
– To simulate
– To augment
Why model the brain?

Brains are not computers…
But they are supported by the same physics:
– Energy conservation
– Entropy increase
– Least action
– Time direction
Brains are supported by the same logic, but implemented differently:
– Low speed; parallel processing; no symbolic software layer; fundamentally adaptive / interactive; organic vs. inorganic

Brain research must be multi-level
Scientific collaboration is needed:
– Across spatial scales
– Across time scales
– Across measurement techniques
Current field borders should not remain boundaries…
Curtail Scale Chauvinism!

…both scientifically and mathematically
To understand, both theoretically and practically, how brains support behavior and experience.
To model brain / behavior dynamics as Active requires:
– Better behavioral measures and modeling
– Better brain dynamic imaging / analysis
– Better joint brain / behavior analysis

… the next research frontier
Brains are active and multi-scale / multi-level.
The dominant multi-level model: computers, with their physical / logical hierarchy:
– the OSI stack
– physical / implementation levels
– logical / instruction levels

A Multi-Level View of Learning
LEARNING at a LEVEL is CHANGE IN INTERACTIONS between its UNITS, implemented by INTERACTIONS at the LEVEL beneath, and by extension resulting in CHANGE IN LEARNING at the LEVEL above.
Separation of timescales allows INTERACTIONS at one LEVEL to be LEARNING at the LEVEL above (interactions = fast, learning = slow; timescale increases going up the table).

LEVEL      UNIT         INTERACTIONS            LEARNING
ecology    society      predation, symbiosis    natural selection
society    organism     behaviour               sensory-motor learning
organism   cell         spikes                  synaptic plasticity ( = STDP)
cell       protein      voltage, Ca             gene expression, protein recycling
synapse    protein      direct, V, Ca           bulk molecular changes
protein    amino acid   molecular forces        molecular changes

A Multi-Level View of Learning
LEARNING at one LEVEL is implemented by DYNAMICS between UNITS at the LEVEL below.
Separation of timescales allows DYNAMICS at one LEVEL to be LEARNING at the LEVEL above (dynamics = fast, learning = slow; timescale increases going up the table).

LEVEL      UNIT         DYNAMICS                LEARNING
ecology    society      predation, symbiosis    natural selection
society    organism     behaviour               sensory-motor learning
organism   cell         spikes                  synaptic plasticity ( = STDP)
cell       protein      voltage, Ca             gene expression, protein recycling
synapse    protein      direct, V, Ca           bulk molecular changes
protein    amino acid   molecular forces        molecular changes

T. Bell
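To make the separation-of-timescales point concrete, here is a minimal numerical sketch (not from the talk; the Hebbian rule and all constants are illustrative assumptions): fast activity dynamics between units evolve every step, while the couplings between those units, the quantity being "learned", change orders of magnitude more slowly.

```python
import numpy as np

# Sketch of "dynamics = fast, learning = slow" at the organism/cell row of the table:
# fast unit activity relaxes within a few steps, while the couplings between units
# drift only over tens of thousands of steps. All constants are assumptions.
rng = np.random.default_rng(0)

n_units = 5
W = 0.1 * rng.standard_normal((n_units, n_units))   # slow variables: couplings (learning)
x = np.zeros(n_units)                                # fast variables: activity (dynamics)

dt_fast = 0.1     # activity relaxes within ~10 steps
eta = 1e-5        # couplings change appreciably only over ~100,000 steps

for step in range(10_000):
    drive = np.sin(0.01 * step) + 0.1 * rng.standard_normal(n_units)
    # Fast dynamics: leaky recurrent activity of the UNITS
    x += dt_fast * (-x + np.tanh(W @ x + drive))
    # Slow learning: Hebbian drift of the interactions themselves (with decay toward 0)
    W += eta * (np.outer(x, x) - W)

print("mean |coupling| after slow learning:", np.abs(W).mean())
```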

What idea will fill in the question mark?
physiology (of STDP), physics of self-organisation, probabilistic machine learning, ?
(STDP = spike timing-dependent plasticity)
? = the Levels Hypothesis. Learning in the brain is:
– unsupervised probability density estimation across scales
– the smaller (molecular) models the larger (spikes)… suggested by STDP physiology, where information flow from neurons to synapses is inter-level…
T. Bell
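Since the slide turns on STDP, a minimal sketch of the standard pair-based exponential STDP window may help; the amplitudes and time constants below are generic textbook-style assumptions, not values from the talk.

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for delta_t = t_post - t_pre (ms).
    Pre-before-post (delta_t > 0) -> potentiation; post-before-pre -> depression.
    Parameter values are illustrative assumptions."""
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t >= 0,
                    a_plus * np.exp(-delta_t / tau_plus),
                    -a_minus * np.exp(delta_t / tau_minus))

# Depression for negative lags, potentiation for positive lags
print(stdp_dw([-40, -10, 10, 40]))
```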

Multi-level modeling: Networks within networks
– a network of 2 brains
– 1 brain: a network of neurons
– 1 cell: a network of macromolecules
– a network of protein complexes (e.g., synapses)

1. Infomax between Levels (e.g., synapses density-estimate spikes)
– between-level; includes all feedback
– molecular net models / creates; social net is boundary condition
– permits arbitrary activity dependencies
– models input and intrinsic activity together
Figure: all neural spikes → synapses, dendrites → all synaptic readouts; pdf of all spike times → pdf of all synaptic 'readouts'. If we can make this pdf uniform, then we have a model constructed from all synaptic and dendritic causality.

2. ICA/Infomax between Layers (e.g., V1 density-estimates Retina)
– within-level; feedforward
– molecular sublevel is 'implementation'; social superlevel is 'reward'
– predicts independent activity
– only models outside input
Figure: retina (x) → synaptic weights → V1 (y).

The ICA transform minimises statistical dependence between outputs. The bases produced are data-dependent, not fixed as in Fourier or Wavelet transforms.
T. Bell
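A minimal sketch of the Infomax/ICA idea referred to on this slide, in the Bell & Sejnowski style with a natural-gradient update and a logistic nonlinearity; the toy Laplacian sources, mixing matrix, learning rate, and batch size are illustrative assumptions, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two toy super-Gaussian sources, linearly mixed (illustrative assumption)
n, T = 2, 20_000
S = rng.laplace(size=(n, T))          # hidden sources
A = rng.standard_normal((n, n))       # unknown mixing matrix
X = A @ S                             # observed mixtures

W = np.eye(n)                         # unmixing matrix to be learned
lr = 0.01
batch = 100

for epoch in range(20):
    for start in range(0, T, batch):
        x = X[:, start:start + batch]
        u = W @ x                          # candidate sources
        y = 1.0 / (1.0 + np.exp(-u))       # logistic nonlinearity
        # Natural-gradient Infomax update: dW ∝ (I + (1 - 2y) u^T) W
        dW = (np.eye(n) + (1.0 - 2.0 * y) @ u.T / batch) @ W
        W += lr * dW

# If separation worked, W @ A is close to a scaled permutation matrix,
# i.e., the outputs have become statistically independent.
print(np.round(W @ A, 2))
```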

The Infomax principle / ICA algorithms (T. Bell)
Many applications (6 international ICA workshops)…
– audio separation in real acoustic environments (as above)
– biomedical data-mining: EEG, fMRI
– image coding

Cognitive Computing… Computational Neuroscience
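As a usage note for the applications bullet, here is a toy blind-source-separation example using scikit-learn's FastICA (a different ICA algorithm than Infomax, used here only because it is widely available); the mixed sinusoid and square wave stand in for the audio or EEG mixtures mentioned on the slide.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Toy stand-in for mixed recordings (e.g., microphones or EEG channels)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * 1.0 * t)                 # sinusoidal source
s2 = np.sign(np.sin(2 * np.pi * 0.3 * t))        # square-wave source
S = np.c_[s1, s2]                                # shape (samples, sources)
A = np.array([[1.0, 0.5], [0.7, 1.0]])           # mixing matrix
X = S @ A.T                                      # observed mixtures (samples, channels)

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)   # recovered sources, up to order and scale
print(S_est.shape)             # (2000, 2)
```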