Self-Organized Recurrent Neural Learning for Language Processing (ORGANIC). www.reservoir-computing.org. April 1, 2009 - March 31, 2012. Status as of June 2009.


Slide 1: Self-Organized Recurrent Neural Learning for Language Processing. April 1, 2009 - March 31, 2012. Status as of June 2009.

Slide 2: The task
[Figure: writing/speech source -> feature stream -> AI machine; images from introspectreangel.wordpress.com, coli.uni-saarland.de/~steiner/, compuskills.com.cy]
- Speech and handwriting recognition are essentially the same problem.
- Humans can do it, but only after years of learning: thus it is a very difficult problem.
- No human-level AI solution is in sight.

Slide 3: Mission
Establish neurodynamical architectures as a viable alternative to statistical methods for speech and handwriting recognition.

State of the art:
- Speech recognition is treated as a statistical data-analysis problem.
- This leads to data-driven, feedforward "serial" learning and representation techniques (HMMs).
- Performance appears to asymptote well below human performance.
(From Rabiner 1990, a classical speech recognition tutorial)

ORGANIC alternative:
- Speech recognition is regarded as an achievement of human brains.
- This leads to neural computation and cognitive neuroscience modelling with recurrent dynamics (cyclic top-down and bottom-up paths).
- Potential to come closer to human performance.
(From Dominey et al. 1995)

Slide 4: Basic paradigm: reservoir computing (RC)
- Also known as Echo State Networks and Liquid State Machines.
- Discovered in 2000, now an established paradigm in computational neuroscience and machine learning.
- RC makes training of recurrent neural networks practically feasible for the first time: a major enabling technology.
- RC is biologically plausible.
- The consortium comprises pioneers and leading investigators of the RC field.

Principle of RC:
- Use a large, fixed, random recurrent network as an excitable medium.
- Excite it with the input signal.
- Read out the desired output through trainable output weights (shown in red on the slide).
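To make the RC principle concrete, here is a minimal echo state network sketch in Python/NumPy. It is not the project's Engine: the reservoir size, the spectral radius of 0.9, the ridge parameter, the toy sine-prediction task, and names such as run_reservoir and W_out are illustrative assumptions. Only the readout weights are trained; the input and reservoir weights stay fixed and random, exactly as the slide describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not taken from the slides)
n_in, n_res = 1, 200

# Fixed, random input and reservoir weights -- these are never trained
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep the spectral radius below 1

def run_reservoir(inputs):
    """Drive the fixed reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next sample of a sine wave
u = np.sin(0.2 * np.arange(1000))
X = run_reservoir(u)[:-1]          # reservoir states x(t)
Y = u[1:, None]                    # targets: u(t+1)

# Only the linear readout is trained, here by ridge regression
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

print("training MSE:", np.mean((X @ W_out - Y) ** 2))
```

Because only the readout is learned, training reduces to a single linear regression over collected reservoir states, which is what makes recurrent-network training practically feasible in this paradigm.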

Slide 5: Scientific objectives
- Basic blueprints: design and proof-of-principle tests of fundamental architecture layouts for hierarchical neural systems that can learn multi-scale sequence tasks.
- Reservoir adaptation: investigate mechanisms of unsupervised adaptation of reservoirs (a simplified illustration follows this list).
- Spiking vs. non-spiking neurons, role of noise: clarify the functional implications of spiking vs. non-spiking neurons and the role of noise.
- Single-shot model extension, lifelong learning capability: develop learning mechanisms that allow a learning system to be extended in "single-shot" learning episodes, enabling lifelong learning.
- Working memory and grammatical processing: extend the basic paradigm with a neural, index-addressable working memory.
- Interactive systems: extend the adaptive capabilities of human-robot cooperative interaction systems with on-line and lifelong learning capabilities.
- Integration of dynamical mechanisms: integrate biological mechanisms of learning, optimization, adaptation and stabilization into coherent architectures.
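As a side note on the reservoir adaptation objective: one well-known family of unsupervised reservoir adaptation rules (intrinsic plasticity) adjusts each neuron's gain and bias so that its output statistics approach a target distribution. The slide does not name any specific rule, so the following Python/NumPy snippet is only a simplified moment-matching sketch under that assumption, not the mechanism the project actually develops; all constants and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res = 100

# Fixed random reservoir, as in the basic RC setup
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, size=n_res)

# Per-neuron gain and bias: the only quantities the unsupervised rule touches
gain = np.ones(n_res)
bias = np.zeros(n_res)

eta = 1e-3                              # adaptation rate (illustrative)
target_mean, target_var = 0.0, 0.09     # desired output statistics per neuron

x = np.zeros(n_res)
mean_est = np.zeros(n_res)
var_est = np.full(n_res, target_var)

for t in range(5000):
    u = np.sin(0.2 * t)                 # any driving signal will do
    x = np.tanh(gain * (W @ x + w_in * u) + bias)

    # Running estimates of each neuron's output mean and variance
    mean_est += 0.01 * (x - mean_est)
    var_est += 0.01 * ((x - mean_est) ** 2 - var_est)

    # Push the bias toward the target mean and the gain toward the target variance
    bias -= eta * (mean_est - target_mean)
    gain -= eta * (var_est - target_var)
    gain = np.clip(gain, 0.1, 3.0)

print("output mean:", x.mean(), " output variance:", x.var())
```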

Slide 6: Community service and dissemination objectives
- High-performing, well-formalized core Engine: collaborative development of a well-formalized, high-performing core Engine, which will be made publicly accessible.
- Compliance with FP6 unification initiatives: ensure that the Engine integrates with the standards set in the FACETS FP6 IP and with other existing code.
- Benchmark repository: create a database of temporal, multi-scale benchmark data sets that can serve as an international touchstone for comparing algorithms.

Slide 7: Consortium (institution, group, research focus)
- Jacobs University Bremen, Machine Learning group (Herbert Jaeger): recurrent neural networks, nonlinear dynamics, pattern recognition.
- Technical University Graz, Computational Neuroscience group (Wolfgang Maass): spiking neurodynamics, generic neural microcircuits, reinforcement learning.
- INSERM Lyon, Human and Robot Interactive Cognitive Systems Team (Peter F. Dominey): cognitive neuroscience, human cortical sequence processing and speech recognition.
- Universiteit Gent, Reservoir Computing Lab (Benjamin Schrauwen): reservoir computing applications, algorithm design.
- Universiteit Gent, Speech Processing group (Jean-Pierre Martens): speech recognition methods research and application development.
- Planet intelligent systems GmbH, Research and Development (Welf Wustlich): text and handwriting recognition solutions, address recognition.

Slide 8: Workpackages and collaboration scheme