Chapter 10: The cognitive brain

Presentation transcript:

Chapter 10: The cognitive brain (Fundamentals of Computational Neuroscience, Dec 09)

Hierarchical maps and attentive vision

Attention in visual search and object recognition (Gustavo Deco)

Model

Example results

The interconnecting workspace hypothesis (Stanislas Dehaene, M. Kerszberg, and J.-P. Changeux, PNAS 1998)

Stroop task modelling

The anticipating brain: The brain can develop a model of the world, which can be used to anticipate or predict the environment. The inverse of the model can be used to recognize causes by evoking internal concepts. Hierarchical representations are essential to capture the richness of the world. Internal concepts are learned by matching the brain's hypotheses against input from the world. An agent can learn actively by testing hypotheses through actions. The temporal domain is an important degree of freedom.
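
As a rough illustration of this anticipate-and-recognize loop (not taken from the lecture; the function names, toy patterns, and learning rates below are illustrative assumptions), the following sketch learns a small linear generative model x ≈ Wc and recognizes causes by inverting it through error-driven inference, in the spirit of predictive-coding accounts.

```python
# Minimal sketch of an "anticipating" model: a linear generative model
# x ~ W @ c predicts the input, and recognition inverts the model by
# adjusting the internal causes c until the prediction error is small.
# Learning then reduces whatever error remains.
import numpy as np

rng = np.random.default_rng(0)

def infer_causes(W, x, steps=200, lr=0.1):
    """Estimate hidden causes c for input x by minimizing ||x - W @ c||^2."""
    c = np.zeros(W.shape[1])
    for _ in range(steps):
        error = x - W @ c           # prediction error: hypothesis vs. input
        c += lr * (W.T @ error)     # move the hypothesis toward the data
    return c

def learn_model(X, n_causes=2, epochs=50, lr=0.01):
    """Learn generative weights W from inputs X (one row per observation)."""
    W = 0.1 * rng.standard_normal((X.shape[1], n_causes))
    for _ in range(epochs):
        for x in X:
            c = infer_causes(W, x)
            W += lr * np.outer(x - W @ c, c)             # reduce residual error
        W /= np.maximum(np.linalg.norm(W, axis=0), 1e-8)  # keep columns bounded
    return W

# Toy data: noisy mixtures of two fixed "world" patterns.
basis = np.array([[1.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 1.0]])
X = rng.random((100, 2)) @ basis + 0.05 * rng.standard_normal((100, 4))
W = learn_model(X)
print(infer_causes(W, basis[0]))  # internal causes evoked by a familiar pattern
```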

Outline

Recurrent networks with hidden nodes

Training Boltzmann machines

The restricted Boltzmann machine
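
Since the slides only name the topic, here is a hedged, minimal sketch of a restricted Boltzmann machine trained with one-step contrastive divergence (CD-1); the network size, learning rate, and toy data are arbitrary choices rather than values from the lecture.

```python
# Minimal RBM with CD-1 training on binary toy patterns.
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_update(self, v0, lr=0.05):
        ph0, h0 = self.sample_h(v0)          # positive phase
        pv1, v1 = self.sample_v(h0)          # one reconstruction step
        ph1, _ = self.sample_h(v1)           # negative phase
        self.W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
        self.b += lr * (v0 - v1)
        self.c += lr * (ph0 - ph1)
        return np.mean((v0 - pv1) ** 2)      # reconstruction error

# Toy binary data: two prototype patterns with occasional bit flips.
prototypes = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], dtype=float)
rbm = RBM(n_visible=6, n_hidden=2)
for epoch in range(200):
    for p in prototypes:
        v = np.abs(p - (rng.random(6) < 0.05))   # noisy training sample
        err = rbm.cd1_update(v)
print("final reconstruction error:", err)
```

Stacking such RBMs layer by layer, each trained on the hidden activities of the one below, is the standard route to the deep generative models named on the next slide.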

Deep generative models

Adaptive resonance theory (ART)
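
The slide only names ART, so the following simplified ART1-style sketch is a bare-bones approximation rather than the full ART1 dynamics; the class name SimpleART1 and the vigilance value are hypothetical choices. It shows the vigilance-controlled resonance/reset cycle for binary patterns.

```python
# Simplified ART1-style categorizer for binary input patterns.
import numpy as np

class SimpleART1:
    def __init__(self, n_inputs, vigilance=0.7):
        self.n = n_inputs
        self.rho = vigilance          # vigilance: how strict a match must be
        self.prototypes = []          # top-down templates (binary vectors)

    def train(self, x):
        x = np.asarray(x, dtype=float)
        # Rank existing categories by a bottom-up choice function.
        scores = [np.sum(np.minimum(x, p)) / (0.5 + np.sum(p))
                  for p in self.prototypes]
        for j in np.argsort(scores)[::-1]:
            match = np.sum(np.minimum(x, self.prototypes[j])) / np.sum(x)
            if match >= self.rho:                          # resonance: accept
                self.prototypes[j] = np.minimum(self.prototypes[j], x)
                return j
            # Otherwise reset and try the next candidate category.
        self.prototypes.append(x.copy())                   # no match: new node
        return len(self.prototypes) - 1

art = SimpleART1(n_inputs=6, vigilance=0.6)
for pattern in ([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 0], [0, 0, 0, 1, 1, 1]):
    print(pattern, "-> category", art.train(pattern))
```

Raising the vigilance parameter makes the match test harder to pass, so the network commits more category nodes and forms finer-grained clusters.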

Further readings